MTEB Leaderboard

Massive Text Embedding Benchmark (MTEB) Leaderboard - Compare embedding models across multiple tasks

| Rank | Model | Model Size (M) | Memory (GB) | Dimensions | Max Tokens | Average (56) | Classification (12) | Clustering (11) | Pair Class. (3) | Reranking (4) | Retrieval (15) | STS (10) | Summary (1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | NV-Embed-v2 | 7,851 | 29.25 | 4,096 | 32,768 | 72.31 | 90.37 | 58.46 | 88.67 | 60.65 | 62.65 | 84.31 | 30.70 |
| 2 | jasper_en_vision_language_v1 | N/A | N/A | N/A | N/A | 72.02 | 88.49 | 58.04 | 88.07 | 60.91 | 63.12 | 84.67 | 31.42 |
| 3 | bge-en-icl | 7,111 | 26.49 | 4,096 | 32,768 | 71.67 | 88.95 | 57.89 | 88.14 | 59.86 | 62.16 | 84.24 | 30.77 |
| 4 | bge-en-icl | 7,111 | 26.49 | 4,096 | 32,768 | 71.67 | 88.95 | 57.89 | 88.14 | 59.86 | 62.16 | 84.24 | 30.77 |
| 5 | stella_en_1.5B_v5 | 1,543 | 5.75 | 8,192 | 131,072 | 71.19 | 87.63 | 57.69 | 88.07 | 61.21 | 61.01 | 84.51 | 31.49 |
| 6 | SFR-Embedding-2_R | 7,111 | 26.49 | 4,096 | 32,768 | 70.31 | 89.05 | 56.17 | 88.07 | 60.14 | 60.18 | 81.26 | 30.71 |
| 7 | gte-Qwen2-7B-instruct | 7,613 | 28.36 | 3,584 | 131,072 | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
| 8 | gte-Qwen2-7B-instruct-Q4_K_M-GGUF | N/A | N/A | N/A | N/A | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
| 9 | gte-Qwen2-7B-instruct-fp16 | 7,069 | 26.33 | 3,584 | 131,072 | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
| 10 | gte-Qwen2-7B-instruct | 7,613 | 28.36 | 3,584 | 131,072 | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
| 11 | gte-Qwen2-7B-instruct-Q4-mlx | 1,190 | 4.43 | 3,584 | 131,072 | 70.24 | 86.58 | 56.92 | 85.79 | 61.42 | 60.25 | 83.04 | 31.35 |
| 12 | stella_en_400M_v5 | 435 | 1.62 | 8,192 | 8,192 | 70.11 | 86.67 | 56.70 | 87.74 | 60.16 | 58.97 | 84.22 | 31.66 |
| 13 | bge-multilingual-gemma2 | 9,242 | 34.43 | 3,584 | 8,192 | 69.88 | 88.08 | 54.65 | 85.84 | 59.72 | 59.24 | 83.88 | 31.20 |
| 14 | NV-Embed-v1 | 7,851 | 29.25 | 4,096 | 32,768 | 69.32 | 87.35 | 52.80 | 86.91 | 60.54 | 59.36 | 82.84 | 31.20 |
| 15 | voyage-large-2-instruct | N/A | N/A | 1,024 | 16,000 | 68.23 | 81.49 | 53.35 | 89.25 | 60.09 | 58.28 | 84.31 | 30.84 |
| 16 | Linq-Embed-Mistral | 7,111 | 26.49 | 4,096 | 32,768 | 68.17 | 80.20 | 51.42 | 88.35 | 60.29 | 60.19 | 84.97 | 30.98 |
| 17 | hui-embedding | N/A | N/A | N/A | N/A | 68.02 | 85.71 | 54.36 | 87.28 | 60.25 | 54.14 | 83.74 | 30.17 |
| 18 | KaLM-embedding-multilingual-max-instruct-v1 | N/A | N/A | N/A | N/A | 67.60 | 88.46 | 50.71 | 86.37 | 57.87 | 54.62 | 82.86 | 27.86 |
| 19 | SFR-Embedding-Mistral | 7,111 | 26.49 | 4,096 | 32,768 | 67.56 | 78.33 | 51.67 | 88.54 | 60.64 | 59.00 | 85.05 | 31.16 |
| 20 | Zeta-Alpha-E5-Mistral | 7,111 | 26.49 | 4,096 | 32,768 | 67.55 | 78.77 | 51.89 | 88.19 | 59.02 | 59.50 | 84.24 | 31.42 |
| 21 | gte-Qwen1.5-7B-instruct | 7,099 | 26.45 | 4,096 | 32,768 | 67.35 | 79.60 | 55.83 | 87.41 | 60.13 | 56.24 | 82.42 | 31.46 |
| 22 | gte-Qwen2-1.5B-instruct-Q4_K_M-GGUF | N/A | N/A | N/A | N/A | 67.16 | 82.47 | 48.75 | 87.51 | 59.98 | 58.29 | 82.73 | 31.17 |
| 23 | gte-Qwen2-1.5B-instruct | 1,776 | 6.62 | 1,536 | 131,072 | 67.16 | 82.47 | 48.75 | 87.51 | 59.98 | 58.29 | 82.73 | 31.17 |
| 24 | voyage-lite-02-instruct | 1,220 | 4.54 | 1,024 | 4,000 | 67.13 | 79.25 | 52.42 | 86.87 | 58.24 | 56.60 | 85.79 | 31.01 |
| 25 | GritLM-7B-vllm | 7,242 | 26.98 | 4,096 | 32,768 | 66.76 | 79.46 | 50.61 | 87.16 | 60.49 | 57.41 | 83.35 | 30.37 |
| 26 | e5-mistral-7b-instruct | 7,111 | 26.49 | 4,096 | 32,768 | 66.63 | 78.47 | 50.26 | 88.34 | 60.21 | 56.89 | 84.63 | 31.40 |
| 27 | GritLM-7B | 7,240 | 26.97 | 4,096 | N/A | 66.58 | 78.65 | 50.61 | 87.29 | 60.48 | 57.36 | 83.35 | 30.39 |
| 28 | speed-embedding-7b-instruct | 7,111 | 26.49 | 4,096 | 32,768 | 66.51 | 78.44 | 49.28 | 88.19 | 60.76 | 56.50 | 85.48 | 31.06 |
| 29 | e5-mistral-7b-instruct | 7,111 | 26.49 | 4,096 | 32,768 | 66.40 | 77.37 | 50.26 | 88.42 | 60.21 | 56.87 | 84.62 | 31.53 |
| 30 | text-embedding-004 | 1,200 | 4.47 | 768 | 2,048 | 66.31 | 81.17 | 47.48 | 87.61 | 58.90 | 55.70 | 85.07 | 32.63 |
| 31 | TDTE | N/A | N/A | N/A | N/A | 65.96 | 77.17 | 47.86 | 88.27 | 60.46 | 57.05 | 84.82 | 30.83 |
| 32 | BinGSE-Meta-Llama-3-8B-Instruct | N/A | N/A | N/A | N/A | 65.72 | 76.05 | 47.18 | 88.84 | 59.28 | 57.38 | 85.31 | 31.19 |
| 33 | GritLM-8x7B | 46,703 | 173.98 | 4,096 | 32,768 | 65.66 | 78.53 | 50.14 | 84.97 | 59.80 | 55.09 | 83.26 | 29.82 |
| 34 | jina-embeddings-v3 | 572 | 2.13 | 1,024 | 8,194 | 65.51 | 82.58 | 45.21 | 84.01 | 58.13 | 53.88 | 85.81 | 29.71 |
| 35 | gte-large-en-v1.5 | 434 | 1.62 | 1,024 | 8,192 | 65.39 | 77.75 | 47.96 | 84.53 | 58.50 | 57.91 | 81.43 | 30.91 |
| 36 | learning2_model | 434 | 1.62 | 1,024 | 8,192 | 65.39 | 77.75 | 47.96 | 84.53 | 58.50 | 57.91 | 81.43 | 30.91 |
| 37 | LLM2Vec-Meta-Llama-3-supervised | 7,505 | 27.96 | 4,096 | 8,192 | 65.01 | 75.92 | 46.45 | 87.79 | 59.68 | 56.63 | 83.58 | 30.94 |
| 38 | cde-small-v1 | 143 | 0.53 | 768 | 512 | 65.00 | 81.71 | 48.32 | 84.69 | 56.75 | 53.27 | 81.63 | 31.23 |
| 39 | KaLM-embedding-multilingual-mini-instruct-v1.5 | 494 | 1.84 | 896 | 131,072 | 64.94 | 84.74 | 47.82 | 83.26 | 55.41 | 51.65 | 82.24 | 25.23 |
| 40 | LLM2Vec-Mistral-supervised | 7,111 | 26.49 | 4,096 | 32,768 | 64.80 | 76.63 | 45.54 | 87.99 | 58.42 | 55.99 | 84.09 | 29.96 |
| 41 | KaLM-embedding-multilingual-mini-instruct-v1 | 494 | 1.84 | 896 | 131,072 | 64.74 | 85.10 | 45.46 | 83.92 | 55.58 | 51.88 | 82.40 | 27.88 |
| 42 | echo-mistral-7b-instruct-lasttoken | 7,111 | 26.49 | 4,096 | 32,768 | 64.68 | 77.43 | 46.32 | 87.34 | 58.14 | 55.52 | 82.56 | 30.73 |
| 43 | mxbai-embed-large-v1 | 335 | 1.25 | 1,024 | 512 | 64.68 | 75.64 | 46.71 | 87.20 | 60.11 | 54.39 | 85.00 | 32.71 |
| 44 | UAE-Large-V1 | 335 | 1.25 | 1,024 | 512 | 64.64 | 75.58 | 46.73 | 87.25 | 59.88 | 54.66 | 84.54 | 32.03 |
| 45 | text-embedding-3-large | N/A | N/A | 3,072 | 8,191 | 64.59 | 75.45 | 49.01 | 85.72 | 59.16 | 55.44 | 81.73 | 29.92 |
| 46 | voyage-lite-01-instruct | N/A | N/A | 1,024 | 4,000 | 64.49 | 74.79 | 47.40 | 86.57 | 59.74 | 55.58 | 82.93 | 30.97 |
| 47 | Cohere-embed-english-v3.0 | N/A | N/A | N/A | N/A | 64.47 | 76.49 | 47.43 | 85.84 | 58.01 | 55.00 | 82.62 | 30.18 |
| 48 | text-embedding-004-256 | 1,200 | 4.47 | 256 | 2,048 | 64.37 | 79.00 | 45.07 | 87.29 | 57.78 | 52.44 | 84.93 | 32.36 |
| 49 | GIST-large-Embedding-v0 | 335 | 1.25 | 1,024 | 512 | 64.34 | 76.01 | 46.55 | 86.70 | 60.05 | 53.44 | 84.59 | 30.96 |
| 50 | bge-large-en-v1.5 | 335 | 1.25 | 1,024 | 512 | 64.23 | 75.97 | 46.08 | 87.12 | 60.03 | 54.29 | 83.11 | 31.61 |
| 51 | b1ade-embed | 335 | 1.25 | 1,024 | 512 | 64.21 | 75.16 | 46.46 | 87.07 | 60.00 | 53.30 | 85.04 | 31.93 |
| 52 | MUG-B-1.6 | 335 | 1.25 | 1,024 | 512 | 64.20 | 74.50 | 46.93 | 87.14 | 59.97 | 53.46 | 84.99 | 32.12 |
| 53 | LLM2Vec-Llama-2-7b-chat-hf-mntp-supervised | 6,607 | 24.61 | 4,096 | 4,096 | 64.14 | 76.33 | 45.24 | 88.03 | 57.38 | 54.60 | 83.73 | 28.49 |
| 54 | gte-base-en-v1.5 | 137 | 0.51 | 768 | 8,192 | 64.11 | 77.17 | 46.82 | 85.33 | 57.66 | 54.09 | 81.97 | 31.17 |
| 55 | Cohere-embed-multilingual-v3.0 | N/A | N/A | 1,024 | 512 | 64.01 | 76.01 | 46.60 | 86.15 | 57.86 | 53.84 | 83.15 | 30.99 |
| 56 | GIST-Embedding-v0 | 109 | 0.41 | 768 | 512 | 63.71 | 76.03 | 46.21 | 86.32 | 59.37 | 52.31 | 83.51 | 30.87 |
| 57 | multilingual-e5-large-instruct | 560 | 2.09 | 1,024 | 514 | 63.61 | 73.32 | 47.07 | 86.24 | 58.92 | 52.64 | 85.03 | 30.46 |
| 58 | bge-base-en-v1.5 | 109 | 0.41 | 768 | 512 | 63.56 | 75.53 | 45.81 | 86.55 | 58.86 | 53.25 | 82.40 | 31.07 |
| 59 | ember-v1 | 335 | 1.25 | 1,024 | 512 | 63.54 | 75.99 | 45.58 | 87.37 | 60.04 | 51.92 | 83.34 | 30.82 |
| 60 | sf_model_e5 | 335 | 1.25 | 1,024 | 512 | 63.34 | 73.96 | 46.61 | 86.85 | 59.86 | 51.80 | 83.85 | 31.61 |
| 61 | mxbai-embed-2d-large-v1 | 335 | 1.25 | 1,024 | 512 | 63.25 | 74.14 | 46.07 | 85.89 | 58.94 | 51.42 | 84.90 | 31.55 |
| 62 | gte-large | 335 | 1.25 | 1,024 | 512 | 63.13 | 73.33 | 46.84 | 85.00 | 59.13 | 52.22 | 83.35 | 31.66 |
| 63 | NoInstruct-small-Embedding-v0 | 33 | 0.12 | 384 | 512 | 63.12 | 75.97 | 44.95 | 84.99 | 58.30 | 51.99 | 83.00 | 30.60 |
| 64 | GIST-small-Embedding-v0 | 33 | 0.12 | 384 | 512 | 62.72 | 76.11 | 44.82 | 84.68 | 58.56 | 50.43 | 83.03 | 31.14 |
| 65 | stella-base-en-v2 | 55 | 0.20 | 768 | 512 | 62.61 | 75.28 | 44.90 | 86.45 | 58.78 | 50.10 | 83.02 | 32.52 |
| 66 | gte-base | 109 | 0.41 | 768 | 512 | 62.39 | 73.01 | 46.20 | 84.57 | 58.61 | 51.14 | 82.30 | 31.17 |
| 67 | nomic-embed-text-v1 | 137 | 0.51 | 768 | 8,192 | 62.39 | 74.12 | 43.91 | 85.15 | 55.69 | 52.81 | 82.06 | 30.08 |
| 68 | nomic-embed-text-v1.5 | 137 | 0.51 | 768 | 8,192 | 62.28 | 73.55 | 43.93 | 84.61 | 55.78 | 53.01 | 81.94 | 30.40 |
| 69 | text-embedding-3-small | N/A | N/A | 1,536 | 8,191 | 62.26 | 73.21 | 46.65 | 85.04 | 56.72 | 51.08 | 81.58 | 31.12 |
| 70 | e5-large-v2 | 335 | 1.25 | 1,024 | 512 | 62.20 | 75.24 | 44.26 | 86.03 | 56.61 | 50.56 | 82.05 | 30.19 |
| 71 | concat-e5-small-bge-small-01 | N/A | N/A | N/A | N/A | 62.18 | 73.33 | 44.36 | 86.09 | 58.14 | 51.40 | 82.24 | 30.07 |
| 72 | bge-small-retail-finetuned | 33 | 0.12 | 384 | 512 | 62.17 | 74.14 | 43.82 | 84.92 | 58.36 | 51.68 | 81.59 | 30.12 |
| 73 | bge-small-en-v1.5 | 33 | 0.12 | 384 | 512 | 62.17 | 74.14 | 43.82 | 84.92 | 58.36 | 51.68 | 81.59 | 30.12 |
| 74 | bge-small-en | N/A | N/A | N/A | N/A | 62.11 | 74.37 | 44.31 | 83.78 | 57.97 | 51.82 | 80.72 | 30.53 |
| 75 | Cohere-embed-english-light-v3.0 | N/A | N/A | N/A | N/A | 62.01 | 74.31 | 44.64 | 85.05 | 56.09 | 51.34 | 80.92 | 31.29 |
| 76 | text-embedding-3-large-256 | N/A | N/A | 256 | 8,191 | 62.00 | 71.97 | 46.23 | 84.22 | 57.99 | 51.66 | 81.04 | 29.92 |
| 77 | nomic-embed-text-v1.5-512 | 138 | 0.51 | 512 | 8,192 | 61.96 | 73.24 | 43.71 | 84.59 | 55.65 | 52.40 | 81.70 | 30.47 |
| 78 | KaLM-embedding-multilingual-mini-v1 | 494 | 1.84 | 896 | 131,072 | 61.87 | 72.88 | 45.37 | 84.09 | 55.86 | 50.81 | 82.42 | 29.15 |
| 79 | LLM2Vec-Sheared-Llama-supervised | 1,280 | 4.77 | 2,048 | 4,096 | 61.85 | 72.21 | 43.57 | 86.21 | 55.38 | 51.44 | 83.58 | 30.01 |
| 80 | instructor-xl | 1,241 | 4.62 | 768 | 512 | 61.79 | 73.12 | 44.74 | 86.62 | 57.29 | 49.26 | 83.06 | 32.32 |
| 81 | MedEmbed-small-v0.1 | 33 | 0.12 | 384 | 512 | 61.79 | 71.83 | 43.45 | 84.71 | 58.38 | 52.49 | 81.48 | 30.49 |
| 82 | instructor-large | 335 | 1.25 | 768 | 512 | 61.59 | 73.86 | 45.29 | 85.89 | 57.54 | 47.57 | 83.15 | 31.84 |
| 83 | BinGSE-Sheared-LLaMA | N/A | N/A | N/A | N/A | 61.58 | 70.56 | 43.59 | 86.61 | 54.16 | 52.28 | 83.10 | 30.79 |
| 84 | e5-base-v2 | 110 | 0.41 | 768 | 512 | 61.56 | 73.84 | 44.10 | 85.73 | 55.91 | 50.29 | 81.05 | 30.28 |
| 85 | e5-base-4k | 112 | 0.42 | 768 | 4,096 | 61.50 | 73.84 | 43.80 | 85.73 | 55.91 | 50.29 | 81.05 | 30.28 |
| 86 | e5-large | 335 | 1.25 | 1,024 | 512 | 61.42 | 73.14 | 43.33 | 85.94 | 56.53 | 49.99 | 82.06 | 30.97 |
| 87 | onnx-gte-multilingual-base | N/A | N/A | N/A | N/A | 61.40 | 70.89 | 44.31 | 84.24 | 57.47 | 51.08 | 82.11 | 30.58 |
| 88 | gte-multilingual-base | 305 | 1.14 | 768 | 8,192 | 61.40 | 70.89 | 44.31 | 84.24 | 57.47 | 51.08 | 82.11 | 30.58 |
| 89 | nomic-embed-text-v1-ablated | 137 | 0.51 | 768 | 8,192 | 61.36 | 73.65 | 43.70 | 84.59 | 53.32 | 51.43 | 80.22 | 31.28 |
| 90 | gte-small | 33 | 0.12 | 384 | 512 | 61.36 | 72.31 | 44.89 | 83.54 | 57.70 | 49.46 | 82.07 | 30.42 |
| 91 | nomic-embed-text-v1.5-256 | 138 | 0.51 | 256 | 8,192 | 61.04 | 72.10 | 43.16 | 84.09 | 55.18 | 50.81 | 81.34 | 30.05 |
| 92 | text-embedding-ada-002 | N/A | N/A | 1,536 | 8,191 | 60.99 | 70.93 | 45.90 | 84.89 | 56.32 | 49.25 | 80.97 | 30.80 |
| 93 | multilingual-e5-large | 560 | 2.09 | 1,024 | 514 | 60.89 | 71.77 | 41.23 | 84.75 | 55.96 | 51.40 | 81.62 | 29.64 |
| 94 | udever-bloom-7b1 | 7,069 | 26.33 | 4,096 | 2,048 | 60.63 | 72.13 | 40.81 | 85.40 | 55.91 | 49.34 | 83.01 | 30.97 |
| 95 | snowflake-arctic-embed-l-v2.0 | 568 | 2.12 | 1,024 | 8,194 | 60.50 | 67.18 | 41.65 | 83.00 | 55.78 | 55.65 | 78.62 | 30.51 |
| 96 | e5-base | 109 | 0.41 | 768 | 512 | 60.44 | 72.63 | 42.11 | 85.09 | 55.70 | 48.75 | 80.96 | 31.01 |
| 97 | jina-embeddings-v2-base-en | 137 | 0.51 | 768 | 8,192 | 60.38 | 73.45 | 41.73 | 85.38 | 56.98 | 47.87 | 80.70 | 31.60 |
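For the open-weight rows, the Memory (GB) figures appear consistent with the fp32 footprint of the weights alone: 4 bytes per parameter, reported in binary gigabytes (GiB). A quick sanity check against two rows from the table (the helper name below is ours, not part of the leaderboard):

```python
def fp32_weight_gib(params_millions: float) -> float:
    """Estimated fp32 weight memory in GiB: parameters x 4 bytes / 2**30."""
    return params_millions * 1e6 * 4 / 2**30

# NV-Embed-v2: 7,851 M parameters -> leaderboard lists 29.25 GB
assert round(fp32_weight_gib(7_851), 2) == 29.25
# stella_en_400M_v5: 435 M parameters -> leaderboard lists 1.62 GB
assert round(fp32_weight_gib(435), 2) == 1.62
```

Note this estimate covers weights only, at fp32 regardless of a model's native precision; activation memory and batching overhead come on top.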

About MTEB

The Massive Text Embedding Benchmark (MTEB) evaluates embedding models across 56 datasets spanning seven task categories: classification, clustering, pair classification, reranking, retrieval, semantic textual similarity (STS), and summarization.
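The Average (56) column is, up to rounding of the displayed per-category scores, the dataset-count-weighted mean of the seven category averages. A short check reproducing it from the NV-Embed-v2 row of the table:

```python
# Per-category (dataset count, average score) for NV-Embed-v2, from the table.
categories = {
    "Classification": (12, 90.37),
    "Clustering": (11, 58.46),
    "Pair Classification": (3, 88.67),
    "Reranking": (4, 60.65),
    "Retrieval": (15, 62.65),
    "STS": (10, 84.31),
    "Summarization": (1, 30.70),
}

n_datasets = sum(n for n, _ in categories.values())
weighted_avg = sum(n * score for n, score in categories.values()) / n_datasets

assert n_datasets == 56
# Matches the reported 72.31 to within rounding of the category scores.
assert abs(weighted_avg - 72.31) < 0.02
```

The small residual comes from the category averages themselves being shown to only two decimal places.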

Dataset Categories

  • Classification (12 datasets)
  • Clustering (11 datasets)
  • Pair Classification (3 datasets)
  • Reranking (4 datasets)
  • Retrieval (15 datasets)
  • STS (10 datasets)
  • Summarization (1 dataset)