E5-base-en-ru

Model info

This is a vocabulary-pruned version of intfloat/multilingual-e5-base.

It keeps only Russian and English tokens.

Size

| | intfloat/multilingual-e5-base | d0rj/e5-base-en-ru |
| --- | --- | --- |
| Model size (MB) | 1060.65 | 504.89 |
| Params (count) | 278,043,648 | 132,354,048 |
| Word embedding params (count) | 192,001,536 | 46,311,936 |
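
The pruning is easy to verify by comparing tokenizer vocabulary sizes. A minimal sketch, assuming both repositories are reachable on the Hugging Face Hub:

from transformers import XLMRobertaTokenizer

original = XLMRobertaTokenizer.from_pretrained('intfloat/multilingual-e5-base')
pruned = XLMRobertaTokenizer.from_pretrained('d0rj/e5-base-en-ru')

print(len(original))  # full multilingual vocabulary (~250k tokens)
print(len(pruned))    # reduced English + Russian vocabulary (~60k tokens, per the embedding counts above)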

Performance

Performance on the SberQuAD dev benchmark.

| Metric on SberQuAD (4122 questions) | intfloat/multilingual-e5-base | d0rj/e5-base-en-ru |
| --- | --- | --- |
| recall@3 | | |
| map@3 | | |
| mrr@3 | | |
| recall@5 | | |
| map@5 | | |
| mrr@5 | | |
| recall@10 | | |
| map@10 | | |
| mrr@10 | | |
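
For reference, a minimal sketch of how these metrics are typically computed, assuming a single relevant passage per question (as in SberQuAD); this is not the original evaluation script:

def recall_at_k(ranked_ids, relevant_id, k):
    # 1 if the relevant passage appears in the top-k retrieved results, else 0.
    return 1.0 if relevant_id in ranked_ids[:k] else 0.0

def mrr_at_k(ranked_ids, relevant_id, k):
    # Reciprocal rank of the relevant passage within the top-k, 0 if absent.
    for rank, doc_id in enumerate(ranked_ids[:k], start=1):
        if doc_id == relevant_id:
            return 1.0 / rank
    return 0.0

# With one relevant passage per question, map@k coincides with mrr@k.
# The table values are these per-question scores averaged over all questions.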

Usage

  • Use dot-product distance for retrieval.

  • Use the "query: " and "passage: " prefixes, respectively, for asymmetric tasks such as passage retrieval in open QA and ad-hoc information retrieval.

  • Use the "query: " prefix for symmetric tasks such as semantic similarity, bitext mining, and paraphrase retrieval (see the sketch after this list).

  • Use the "query: " prefix if you want to use embeddings as features, e.g. for linear-probing classification or clustering.

transformers

Direct usage

import torch.nn.functional as F
from torch import Tensor
from transformers import XLMRobertaTokenizer, XLMRobertaModel


def average_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor:
    # Mean-pool the token embeddings, ignoring padded positions.
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


input_texts = [
  'query: How does a corporate website differ from a business card website?',
  'query: Π“Π΄Π΅ Π±Ρ‹Π» создан ΠΏΠ΅Ρ€Π²Ρ‹ΠΉ троллСйбус?',  # "Where was the first trolleybus created?"
  'passage: The first trolleybus was created in Germany by engineer Werner von Siemens, probably influenced by the idea of his brother, Dr. Wilhelm Siemens, who lived in England, expressed on May 18, 1881 at the twenty-second meeting of the Royal Scientific Society. The electrical circuit was carried out by an eight-wheeled cart (Kontaktwagen) rolling along two parallel contact wires. The wires were located quite close to each other, and in strong winds they often overlapped, which led to short circuits. An experimental trolleybus line with a length of 540 m (591 yards), opened by Siemens & Halske in the Berlin suburb of Halensee, operated from April 29 to June 13, 1882.',
  'passage: ΠšΠΎΡ€ΠΏΠΎΡ€Π°Ρ‚ΠΈΠ²Π½Ρ‹ΠΉ сайт β€” содСрТит ΠΏΠΎΠ»Π½ΡƒΡŽ ΠΈΠ½Ρ„ΠΎΡ€ΠΌΠ°Ρ†ΠΈΡŽ ΠΎ ΠΊΠΎΠΌΠΏΠ°Π½ΠΈΠΈ-Π²Π»Π°Π΄Π΅Π»ΡŒΡ†Π΅, услугах/ΠΏΡ€ΠΎΠ΄ΡƒΠΊΡ†ΠΈΠΈ, событиях Π² ΠΆΠΈΠ·Π½ΠΈ ΠΊΠΎΠΌΠΏΠ°Π½ΠΈΠΈ. ΠžΡ‚Π»ΠΈΡ‡Π°Π΅Ρ‚ΡΡ ΠΎΡ‚ сайта-Π²ΠΈΠ·ΠΈΡ‚ΠΊΠΈ ΠΈ ΠΏΡ€Π΅Π΄ΡΡ‚Π°Π²ΠΈΡ‚Π΅Π»ΡŒΡΠΊΠΎΠ³ΠΎ сайта ΠΏΠΎΠ»Π½ΠΎΡ‚ΠΎΠΉ прСдставлСнной ΠΈΠ½Ρ„ΠΎΡ€ΠΌΠ°Ρ†ΠΈΠΈ, Π·Π°Ρ‡Π°ΡΡ‚ΡƒΡŽ содСрТит Ρ€Π°Π·Π»ΠΈΡ‡Π½Ρ‹Π΅ Ρ„ΡƒΠ½ΠΊΡ†ΠΈΠΎΠ½Π°Π»ΡŒΠ½Ρ‹Π΅ инструмСнты для Ρ€Π°Π±ΠΎΡ‚Ρ‹ с ΠΊΠΎΠ½Ρ‚Π΅Π½Ρ‚ΠΎΠΌ (поиск ΠΈ Ρ„ΠΈΠ»ΡŒΡ‚Ρ€Ρ‹, ΠΊΠ°Π»Π΅Π½Π΄Π°Ρ€ΠΈ событий, Ρ„ΠΎΡ‚ΠΎΠ³Π°Π»Π΅Ρ€Π΅ΠΈ, ΠΊΠΎΡ€ΠΏΠΎΡ€Π°Ρ‚ΠΈΠ²Π½Ρ‹Π΅ Π±Π»ΠΎΠ³ΠΈ, Ρ„ΠΎΡ€ΡƒΠΌΡ‹). ΠœΠΎΠΆΠ΅Ρ‚ Π±Ρ‹Ρ‚ΡŒ ΠΈΠ½Ρ‚Π΅Π³Ρ€ΠΈΡ€ΠΎΠ²Π°Π½ с Π²Π½ΡƒΡ‚Ρ€Π΅Π½Π½ΠΈΠΌΠΈ ΠΈΠ½Ρ„ΠΎΡ€ΠΌΠ°Ρ†ΠΈΠΎΠ½Π½Ρ‹ΠΌΠΈ систСмами ΠΊΠΎΠΌΠΏΠ°Π½ΠΈΠΈ-Π²Π»Π°Π΄Π΅Π»ΡŒΡ†Π° (КИБ, CRM, бухгалтСрскими систСмами). ΠœΠΎΠΆΠ΅Ρ‚ ΡΠΎΠ΄Π΅Ρ€ΠΆΠ°Ρ‚ΡŒ Π·Π°ΠΊΡ€Ρ‹Ρ‚Ρ‹Π΅ Ρ€Π°Π·Π΄Π΅Π»Ρ‹ для Ρ‚Π΅Ρ… ΠΈΠ»ΠΈ ΠΈΠ½Ρ‹Ρ… Π³Ρ€ΡƒΠΏΠΏ ΠΏΠΎΠ»ΡŒΠ·ΠΎΠ²Π°Ρ‚Π΅Π»Π΅ΠΉ β€” сотрудников, Π΄ΠΈΠ»Π΅Ρ€ΠΎΠ², ΠΊΠΎΠ½Ρ‚Ρ€Π°Π³Π΅Π½Ρ‚ΠΎΠ² ΠΈ ΠΏΡ€.',
]

tokenizer = XLMRobertaTokenizer.from_pretrained('d0rj/e5-base-en-ru', use_cache=False)
model = XLMRobertaModel.from_pretrained('d0rj/e5-base-en-ru', use_cache=False)

batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')

outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])

# L2-normalize so that the dot product below equals cosine similarity (the x100 is only for readability)
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
# [[68.59542846679688, 81.75910949707031], [80.36100769042969, 64.77748107910156]]
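
The score matrix is queries Γ— passages, so the row-wise argmax picks the best passage for each query; note that both matches in this example are cross-lingual:

best = scores.argmax(dim=1)
print(best.tolist())
# [1, 0] -> the English query matches the Russian corporate-site passage,
#           and the Russian trolleybus query matches the English trolleybus passage.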

Pipeline

from transformers import pipeline


pipe = pipeline('feature-extraction', model='d0rj/e5-base-en-ru')
embeddings = pipe(input_texts, return_tensors=True)
embeddings[0].size()
# torch.Size([1, 17, 768])
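
The pipeline returns token-level features, so a sentence vector still has to be pooled from them. A minimal sketch (a plain mean over tokens, which matches the masked average_pool above because a single text has no padding):

sentence_embedding = embeddings[0].mean(dim=1).squeeze(0)
sentence_embedding.size()
# torch.Size([768])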

sentence-transformers

from sentence_transformers import SentenceTransformer


sentences = [
    'query: Π§Ρ‚ΠΎ Ρ‚Π°ΠΊΠΎΠ΅ ΠΊΡ€ΡƒΠ³Π»Ρ‹Π΅ Ρ‚Π΅Π½Π·ΠΎΡ€Ρ‹?',  # "What are round tensors?"
    'passage: Abstract: we introduce a novel method for compressing round tensors based on their inherent radial symmetry. We start by generalising PCA and eigen decomposition on round tensors...',
]

model = SentenceTransformer('d0rj/e5-base-en-ru')
embeddings = model.encode(sentences, convert_to_tensor=True)
embeddings.size()
# torch.Size([2, 768])
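
To score the pair, cosine similarity can be computed with the sentence-transformers utilities (encode does not L2-normalize by default):

from sentence_transformers import util

score = util.cos_sim(embeddings[0], embeddings[1])
print(float(score))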