MTEB Leaderboard
Massive Text Embedding Benchmark (MTEB) Leaderboard - Compare embedding models across multiple tasks
Rank | Model | Model Size (M) | Memory (GB) | Dimensions | Max Tokens | Average (56) | Classification (12) | Clustering (11) | Pair Class. (3) | Reranking (4) | Retrieval (15) | STS (10) | Summary (1) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | NV-Embed-v2 | 7,851 | 29.25 | 4,096 | 32,768 | 72.31 | 90.37 | 58.46 | 88.67 | 60.65 | 62.65 | 84.31 | 30.70 |
2 | jasper_en_vision_language_v1 | N/A | N/A | N/A | N/A | 72.02 | 88.49 | 58.04 | 88.07 | 60.91 | 63.12 | 84.67 | 31.42 |
3 | bge-en-icl | 7,111 | 26.49 | 4,096 | 32,768 | 71.67 | 88.95 | 57.89 | 88.14 | 59.86 | 62.16 | 84.24 | 30.77 |
4 | bge-en-icl | 7,111 | 26.49 | 4,096 | 32,768 | 71.67 | 88.95 | 57.89 | 88.14 | 59.86 | 62.16 | 84.24 | 30.77 |
5 | stella_en_1.5B_v5 | 1,543 | 5.75 | 8,192 | 131,072 | 71.19 | 87.63 | 57.69 | 88.07 | 61.21 | 61.01 | 84.51 | 31.49 |
6 | SFR-Embedding-2_R | 7,111 | 26.49 | 4,096 | 32,768 | 70.31 | 89.05 | 56.17 | 88.07 | 60.14 | 60.18 | 81.26 | 30.71 |
7 | gte-Qwen2-7B-instruct | 7,613 | 28.36 | 3,584 | 131,072 | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
8 | gte-Qwen2-7B-instruct-Q4_K_M-GGUF | N/A | N/A | N/A | N/A | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
9 | gte-Qwen2-7B-instruct-fp16 | 7,069 | 26.33 | 3,584 | 131,072 | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
10 | gte-Qwen2-7B-instruct | 7,613 | 28.36 | 3,584 | 131,072 | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
11 | gte-Qwen2-7B-instruct-Q4-mlx | 1,190 | 4.43 | 3,584 | 131,072 | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
12 | stella_en_400M_v5 | 435 | 1.62 | 8,192 | 8,192 | 70.11 | 86.67 | 56.70 | 87.74 | 60.16 | 58.97 | 84.22 | 31.66 |
13 | bge-multilingual-gemma2 | 9,242 | 34.43 | 3,584 | 8,192 | 69.88 | 88.08 | 54.65 | 85.84 | 59.72 | 59.24 | 83.88 | 31.20 |
14 | NV-Embed-v1 | 7,851 | 29.25 | 4,096 | 32,768 | 69.32 | 87.35 | 52.80 | 86.91 | 60.54 | 59.36 | 82.84 | 31.20 |
15 | voyage-large-2-instruct | N/A | N/A | 1,024 | 16,000 | 68.23 | 81.49 | 53.35 | 89.25 | 60.09 | 58.28 | 84.31 | 30.84 |
16 | Linq-Embed-Mistral | 7,111 | 26.49 | 4,096 | 32,768 | 68.17 | 80.20 | 51.42 | 88.35 | 60.29 | 60.19 | 84.97 | 30.98 |
17 | hui-embedding | N/A | N/A | N/A | N/A | 68.02 | 85.71 | 54.36 | 87.28 | 60.25 | 54.14 | 83.74 | 30.17 |
18 | KaLM-embedding-multilingual-max-instruct-v1 | N/A | N/A | N/A | N/A | 67.60 | 88.46 | 50.71 | 86.37 | 57.87 | 54.62 | 82.86 | 27.86 |
19 | SFR-Embedding-Mistral | 7,111 | 26.49 | 4,096 | 32,768 | 67.56 | 78.33 | 51.67 | 88.54 | 60.64 | 59.00 | 85.05 | 31.16 |
20 | Zeta-Alpha-E5-Mistral | 7,111 | 26.49 | 4,096 | 32,768 | 67.55 | 78.77 | 51.89 | 88.19 | 59.02 | 59.50 | 84.24 | 31.42 |
21 | gte-Qwen1.5-7B-instruct | 7,099 | 26.45 | 4,096 | 32,768 | 67.35 | 79.60 | 55.83 | 87.41 | 60.13 | 56.24 | 82.42 | 31.46 |
22 | gte-Qwen2-1.5B-instruct-Q4_K_M-GGUF | N/A | N/A | N/A | N/A | 67.16 | 82.47 | 48.75 | 87.51 | 59.98 | 58.29 | 82.73 | 31.17 |
23 | gte-Qwen2-1.5B-instruct | 1,776 | 6.62 | 1,536 | 131,072 | 67.16 | 82.47 | 48.75 | 87.51 | 59.98 | 58.29 | 82.73 | 31.17 |
24 | voyage-lite-02-instruct | 1,220 | 4.54 | 1,024 | 4,000 | 67.13 | 79.25 | 52.42 | 86.87 | 58.24 | 56.60 | 85.79 | 31.01 |
25 | GritLM-7B-vllm | 7,242 | 26.98 | 4,096 | 32,768 | 66.76 | 79.46 | 50.61 | 87.16 | 60.49 | 57.41 | 83.35 | 30.37 |
26 | e5-mistral-7b-instruct | 7,111 | 26.49 | 4,096 | 32,768 | 66.63 | 78.47 | 50.26 | 88.34 | 60.21 | 56.89 | 84.63 | 31.40 |
27 | GritLM-7B | 7,240 | 26.97 | 4,096 | N/A | 66.58 | 78.65 | 50.61 | 87.29 | 60.48 | 57.36 | 83.35 | 30.39 |
28 | speed-embedding-7b-instruct | 7,111 | 26.49 | 4,096 | 32,768 | 66.51 | 78.44 | 49.28 | 88.19 | 60.76 | 56.50 | 85.48 | 31.06 |
29 | e5-mistral-7b-instruct | 7,111 | 26.49 | 4,096 | 32,768 | 66.40 | 77.37 | 50.26 | 88.42 | 60.21 | 56.87 | 84.62 | 31.53 |
30 | text-embedding-004 | 1,200 | 4.47 | 768 | 2,048 | 66.31 | 81.17 | 47.48 | 87.61 | 58.90 | 55.70 | 85.07 | 32.63 |
31 | TDTE | N/A | N/A | N/A | N/A | 65.96 | 77.17 | 47.86 | 88.27 | 60.46 | 57.05 | 84.82 | 30.83 |
32 | BinGSE-Meta-Llama-3-8B-Instruct | N/A | N/A | N/A | N/A | 65.72 | 76.05 | 47.18 | 88.84 | 59.28 | 57.38 | 85.31 | 31.19 |
33 | GritLM-8x7B | 46,703 | 173.98 | 4,096 | 32,768 | 65.66 | 78.53 | 50.14 | 84.97 | 59.80 | 55.09 | 83.26 | 29.82 |
34 | jina-embeddings-v3 | 572 | 2.13 | 1,024 | 8,194 | 65.51 | 82.58 | 45.21 | 84.01 | 58.13 | 53.88 | 85.81 | 29.71 |
35 | gte-large-en-v1.5 | 434 | 1.62 | 1,024 | 8,192 | 65.39 | 77.75 | 47.96 | 84.53 | 58.50 | 57.91 | 81.43 | 30.91 |
36 | learning2_model | 434 | 1.62 | 1,024 | 8,192 | 65.39 | 77.75 | 47.96 | 84.53 | 58.50 | 57.91 | 81.43 | 30.91 |
37 | LLM2Vec-Meta-Llama-3-supervised | 7,505 | 27.96 | 4,096 | 8,192 | 65.01 | 75.92 | 46.45 | 87.79 | 59.68 | 56.63 | 83.58 | 30.94 |
38 | cde-small-v1 | 143 | 0.53 | 768 | 512 | 65.00 | 81.71 | 48.32 | 84.69 | 56.75 | 53.27 | 81.63 | 31.23 |
39 | KaLM-embedding-multilingual-mini-instruct-v1.5 | 494 | 1.84 | 896 | 131,072 | 64.94 | 84.74 | 47.82 | 83.26 | 55.41 | 51.65 | 82.24 | 25.23 |
40 | LLM2Vec-Mistral-supervised | 7,111 | 26.49 | 4,096 | 32,768 | 64.80 | 76.63 | 45.54 | 87.99 | 58.42 | 55.99 | 84.09 | 29.96 |
41 | KaLM-embedding-multilingual-mini-instruct-v1 | 494 | 1.84 | 896 | 131,072 | 64.74 | 85.10 | 45.46 | 83.92 | 55.58 | 51.88 | 82.40 | 27.88 |
42 | echo-mistral-7b-instruct-lasttoken | 7,111 | 26.49 | 4,096 | 32,768 | 64.68 | 77.43 | 46.32 | 87.34 | 58.14 | 55.52 | 82.56 | 30.73 |
43 | mxbai-embed-large-v1 | 335 | 1.25 | 1,024 | 512 | 64.68 | 75.64 | 46.71 | 87.20 | 60.11 | 54.39 | 85.00 | 32.71 |
44 | UAE-Large-V1 | 335 | 1.25 | 1,024 | 512 | 64.64 | 75.58 | 46.73 | 87.25 | 59.88 | 54.66 | 84.54 | 32.03 |
45 | text-embedding-3-large | N/A | N/A | 3,072 | 8,191 | 64.59 | 75.45 | 49.01 | 85.72 | 59.16 | 55.44 | 81.73 | 29.92 |
46 | voyage-lite-01-instruct | N/A | N/A | 1,024 | 4,000 | 64.49 | 74.79 | 47.40 | 86.57 | 59.74 | 55.58 | 82.93 | 30.97 |
47 | Cohere-embed-english-v3.0 | N/A | N/A | N/A | N/A | 64.47 | 76.49 | 47.43 | 85.84 | 58.01 | 55.00 | 82.62 | 30.18 |
48 | text-embedding-004-256 | 1,200 | 4.47 | 256 | 2,048 | 64.37 | 79.00 | 45.07 | 87.29 | 57.78 | 52.44 | 84.93 | 32.36 |
49 | GIST-large-Embedding-v0 | 335 | 1.25 | 1,024 | 512 | 64.34 | 76.01 | 46.55 | 86.70 | 60.05 | 53.44 | 84.59 | 30.96 |
50 | bge-large-en-v1.5 | 335 | 1.25 | 1,024 | 512 | 64.23 | 75.97 | 46.08 | 87.12 | 60.03 | 54.29 | 83.11 | 31.61 |
51 | b1ade-embed | 335 | 1.25 | 1,024 | 512 | 64.21 | 75.16 | 46.46 | 87.07 | 60.00 | 53.30 | 85.04 | 31.93 |
52 | MUG-B-1.6 | 335 | 1.25 | 1,024 | 512 | 64.20 | 74.50 | 46.93 | 87.14 | 59.97 | 53.46 | 84.99 | 32.12 |
53 | LLM2Vec-Llama-2-7b-chat-hf-mntp-supervised | 6,607 | 24.61 | 4,096 | 4,096 | 64.14 | 76.33 | 45.24 | 88.03 | 57.38 | 54.60 | 83.73 | 28.49 |
54 | gte-base-en-v1.5 | 137 | 0.51 | 768 | 8,192 | 64.11 | 77.17 | 46.82 | 85.33 | 57.66 | 54.09 | 81.97 | 31.17 |
55 | Cohere-embed-multilingual-v3.0 | N/A | N/A | 1,024 | 512 | 64.01 | 76.01 | 46.60 | 86.15 | 57.86 | 53.84 | 83.15 | 30.99 |
56 | GIST-Embedding-v0 | 109 | 0.41 | 768 | 512 | 63.71 | 76.03 | 46.21 | 86.32 | 59.37 | 52.31 | 83.51 | 30.87 |
57 | multilingual-e5-large-instruct | 560 | 2.09 | 1,024 | 514 | 63.61 | 73.32 | 47.07 | 86.24 | 58.92 | 52.64 | 85.03 | 30.46 |
58 | bge-base-en-v1.5 | 109 | 0.41 | 768 | 512 | 63.56 | 75.53 | 45.81 | 86.55 | 58.86 | 53.25 | 82.40 | 31.07 |
59 | ember-v1 | 335 | 1.25 | 1,024 | 512 | 63.54 | 75.99 | 45.58 | 87.37 | 60.04 | 51.92 | 83.34 | 30.82 |
60 | sf_model_e5 | 335 | 1.25 | 1,024 | 512 | 63.34 | 73.96 | 46.61 | 86.85 | 59.86 | 51.80 | 83.85 | 31.61 |
61 | mxbai-embed-2d-large-v1 | 335 | 1.25 | 1,024 | 512 | 63.25 | 74.14 | 46.07 | 85.89 | 58.94 | 51.42 | 84.90 | 31.55 |
62 | gte-large | 335 | 1.25 | 1,024 | 512 | 63.13 | 73.33 | 46.84 | 85.00 | 59.13 | 52.22 | 83.35 | 31.66 |
63 | NoInstruct-small-Embedding-v0 | 33 | 0.12 | 384 | 512 | 63.12 | 75.97 | 44.95 | 84.99 | 58.30 | 51.99 | 83.00 | 30.60 |
64 | GIST-small-Embedding-v0 | 33 | 0.12 | 384 | 512 | 62.72 | 76.11 | 44.82 | 84.68 | 58.56 | 50.43 | 83.03 | 31.14 |
65 | stella-base-en-v2 | 55 | 0.20 | 768 | 512 | 62.61 | 75.28 | 44.90 | 86.45 | 58.78 | 50.10 | 83.02 | 32.52 |
66 | gte-base | 109 | 0.41 | 768 | 512 | 62.39 | 73.01 | 46.20 | 84.57 | 58.61 | 51.14 | 82.30 | 31.17 |
67 | nomic-embed-text-v1 | 137 | 0.51 | 768 | 8,192 | 62.39 | 74.12 | 43.91 | 85.15 | 55.69 | 52.81 | 82.06 | 30.08 |
68 | nomic-embed-text-v1.5 | 137 | 0.51 | 768 | 8,192 | 62.28 | 73.55 | 43.93 | 84.61 | 55.78 | 53.01 | 81.94 | 30.40 |
69 | text-embedding-3-small | N/A | N/A | 1,536 | 8,191 | 62.26 | 73.21 | 46.65 | 85.04 | 56.72 | 51.08 | 81.58 | 31.12 |
70 | e5-large-v2 | 335 | 1.25 | 1,024 | 512 | 62.20 | 75.24 | 44.26 | 86.03 | 56.61 | 50.56 | 82.05 | 30.19 |
71 | concat-e5-small-bge-small-01 | N/A | N/A | N/A | N/A | 62.18 | 73.33 | 44.36 | 86.09 | 58.14 | 51.40 | 82.24 | 30.07 |
72 | bge-small-retail-finetuned | 33 | 0.12 | 384 | 512 | 62.17 | 74.14 | 43.82 | 84.92 | 58.36 | 51.68 | 81.59 | 30.12 |
73 | bge-small-en-v1.5 | 33 | 0.12 | 384 | 512 | 62.17 | 74.14 | 43.82 | 84.92 | 58.36 | 51.68 | 81.59 | 30.12 |
74 | bge-small-en | N/A | N/A | N/A | N/A | 62.11 | 74.37 | 44.31 | 83.78 | 57.97 | 51.82 | 80.72 | 30.53 |
75 | Cohere-embed-english-light-v3.0 | N/A | N/A | N/A | N/A | 62.01 | 74.31 | 44.64 | 85.05 | 56.09 | 51.34 | 80.92 | 31.29 |
76 | text-embedding-3-large-256 | N/A | N/A | 256 | 8,191 | 62.00 | 71.97 | 46.23 | 84.22 | 57.99 | 51.66 | 81.04 | 29.92 |
77 | nomic-embed-text-v1.5-512 | 138 | 0.51 | 512 | 8,192 | 61.96 | 73.24 | 43.71 | 84.59 | 55.65 | 52.40 | 81.70 | 30.47 |
78 | KaLM-embedding-multilingual-mini-v1 | 494 | 1.84 | 896 | 131,072 | 61.87 | 72.88 | 45.37 | 84.09 | 55.86 | 50.81 | 82.42 | 29.15 |
79 | LLM2Vec-Sheared-Llama-supervised | 1,280 | 4.77 | 2,048 | 4,096 | 61.85 | 72.21 | 43.57 | 86.21 | 55.38 | 51.44 | 83.58 | 30.01 |
80 | instructor-xl | 1,241 | 4.62 | 768 | 512 | 61.79 | 73.12 | 44.74 | 86.62 | 57.29 | 49.26 | 83.06 | 32.32 |
81 | MedEmbed-small-v0.1 | 33 | 0.12 | 384 | 512 | 61.79 | 71.83 | 43.45 | 84.71 | 58.38 | 52.49 | 81.48 | 30.49 |
82 | instructor-large | 335 | 1.25 | 768 | 512 | 61.59 | 73.86 | 45.29 | 85.89 | 57.54 | 47.57 | 83.15 | 31.84 |
83 | BinGSE-Sheared-LLaMA | N/A | N/A | N/A | N/A | 61.58 | 70.56 | 43.59 | 86.61 | 54.16 | 52.28 | 83.10 | 30.79 |
84 | e5-base-v2 | 110 | 0.41 | 768 | 512 | 61.56 | 73.84 | 44.10 | 85.73 | 55.91 | 50.29 | 81.05 | 30.28 |
85 | e5-base-4k | 112 | 0.42 | 768 | 4,096 | 61.50 | 73.84 | 43.80 | 85.73 | 55.91 | 50.29 | 81.05 | 30.28 |
86 | e5-large | 335 | 1.25 | 1,024 | 512 | 61.42 | 73.14 | 43.33 | 85.94 | 56.53 | 49.99 | 82.06 | 30.97 |
87 | onnx-gte-multilingual-base | N/A | N/A | N/A | N/A | 61.40 | 70.89 | 44.31 | 84.24 | 57.47 | 51.08 | 82.11 | 30.58 |
88 | gte-multilingual-base | 305 | 1.14 | 768 | 8,192 | 61.40 | 70.89 | 44.31 | 84.24 | 57.47 | 51.08 | 82.11 | 30.58 |
89 | nomic-embed-text-v1-ablated | 137 | 0.51 | 768 | 8,192 | 61.36 | 73.65 | 43.70 | 84.59 | 53.32 | 51.43 | 80.22 | 31.28 |
90 | gte-small | 33 | 0.12 | 384 | 512 | 61.36 | 72.31 | 44.89 | 83.54 | 57.70 | 49.46 | 82.07 | 30.42 |
91 | nomic-embed-text-v1.5-256 | 138 | 0.51 | 256 | 8,192 | 61.04 | 72.10 | 43.16 | 84.09 | 55.18 | 50.81 | 81.34 | 30.05 |
92 | text-embedding-ada-002 | N/A | N/A | 1,536 | 8,191 | 60.99 | 70.93 | 45.90 | 84.89 | 56.32 | 49.25 | 80.97 | 30.80 |
93 | multilingual-e5-large | 560 | 2.09 | 1,024 | 514 | 60.89 | 71.77 | 41.23 | 84.75 | 55.96 | 51.40 | 81.62 | 29.64 |
94 | udever-bloom-7b1 | 7,069 | 26.33 | 4,096 | 2,048 | 60.63 | 72.13 | 40.81 | 85.40 | 55.91 | 49.34 | 83.01 | 30.97 |
95 | snowflake-arctic-embed-l-v2.0 | 568 | 2.12 | 1,024 | 8,194 | 60.50 | 67.18 | 41.65 | 83.00 | 55.78 | 55.65 | 78.62 | 30.51 |
96 | e5-base | 109 | 0.41 | 768 | 512 | 60.44 | 72.63 | 42.11 | 85.09 | 55.70 | 48.75 | 80.96 | 31.01 |
97 | jina-embeddings-v2-base-en | 137 | 0.51 | 768 | 8,192 | 60.38 | 73.45 | 41.73 | 85.38 | 56.98 | 47.87 | 80.70 | 31.60 |
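The Memory (GB) figures in the table appear to correspond to raw fp32 weight storage, i.e. parameters × 4 bytes, reported in GiB. That interpretation is an inference from the numbers, not something the leaderboard states, but it can be spot-checked in a few lines of Python:

```python
def fp32_memory_gib(params_millions: float) -> float:
    """Estimate raw fp32 weight memory in GiB: params * 4 bytes / 1024^3."""
    return params_millions * 1e6 * 4 / 1024**3

# Spot-check against three rows of the table above.
for name, params, listed in [
    ("NV-Embed-v2", 7851, 29.25),
    ("stella_en_1.5B_v5", 1543, 5.75),
    ("bge-large-en-v1.5", 335, 1.25),
]:
    est = round(fp32_memory_gib(params), 2)
    print(f"{name}: estimated {est} GiB, listed {listed} GiB")
```

All three estimates match the listed values to two decimals; actual serving memory is higher once activations and context buffers are included, and quantized variants (Q4, fp16) need correspondingly less.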
About MTEB
The Massive Text Embedding Benchmark (MTEB) evaluates embedding models across 56 datasets spanning seven task categories: classification, clustering, pair classification, reranking, retrieval, semantic textual similarity (STS), and summarization.
Dataset Categories
- Classification (12 datasets)
- Clustering (11 datasets)
- Pair Classification (3 datasets)
- Reranking (4 datasets)
- Retrieval (15 datasets)
- STS (10 datasets)
- Summarization (1 dataset)
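Most of these categories ultimately reduce to comparing embedding vectors, typically by cosine similarity: retrieval ranks documents by similarity to a query, and STS scores are derived from pairwise similarities. A minimal sketch with made-up 4-dimensional vectors (real models produce the 256- to 8,192-dimensional vectors listed in the Dimensions column):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings; a real model maps text to vectors like these.
query = [0.1, 0.9, 0.2, 0.0]
doc_relevant = [0.2, 0.8, 0.1, 0.1]
doc_unrelated = [0.9, 0.0, 0.1, 0.7]

# Retrieval-style ranking: the relevant document scores higher.
print(cosine_similarity(query, doc_relevant))
print(cosine_similarity(query, doc_unrelated))
```

In practice one would obtain the vectors from a model on this leaderboard and run the official `mteb` evaluation harness rather than hand-rolling the metric.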