Commit e81de54 by RichardErkhov (verified) · parent: a44a542

uploaded readme

Files changed (1): README.md added (+2678 lines)
Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


GritLM-7B - GGUF
- Model creator: https://huggingface.co/GritLM/
- Original model: https://huggingface.co/GritLM/GritLM-7B/


| Name | Quant method | Size |
| ---- | ---- | ---- |
| [GritLM-7B.Q2_K.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q2_K.gguf) | Q2_K | 2.53GB |
| [GritLM-7B.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.IQ3_XS.gguf) | IQ3_XS | 2.81GB |
| [GritLM-7B.IQ3_S.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.IQ3_S.gguf) | IQ3_S | 2.96GB |
| [GritLM-7B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [GritLM-7B.IQ3_M.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.IQ3_M.gguf) | IQ3_M | 3.06GB |
| [GritLM-7B.Q3_K.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q3_K.gguf) | Q3_K | 3.28GB |
| [GritLM-7B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [GritLM-7B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [GritLM-7B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [GritLM-7B.Q4_0.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q4_0.gguf) | Q4_0 | 3.83GB |
| [GritLM-7B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [GritLM-7B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [GritLM-7B.Q4_K.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q4_K.gguf) | Q4_K | 4.07GB |
| [GritLM-7B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [GritLM-7B.Q4_1.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q4_1.gguf) | Q4_1 | 4.24GB |
| [GritLM-7B.Q5_0.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q5_0.gguf) | Q5_0 | 4.65GB |
| [GritLM-7B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [GritLM-7B.Q5_K.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q5_K.gguf) | Q5_K | 4.78GB |
| [GritLM-7B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [GritLM-7B.Q5_1.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q5_1.gguf) | Q5_1 | 5.07GB |
| [GritLM-7B.Q6_K.gguf](https://huggingface.co/RichardErkhov/GritLM_-_GritLM-7B-gguf/blob/main/GritLM-7B.Q6_K.gguf) | Q6_K | 5.53GB |
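
Each table entry links directly to a raw GGUF file. As a minimal usage sketch (not part of the original card), the snippet below pulls one of the quants with `huggingface_hub` and runs it through the `llama-cpp-python` bindings; the repo id and file name come from the table above, while the prompt template and context size are assumptions to verify against the original GritLM card.

```python
# Hypothetical usage sketch (not part of the original card). Assumes the
# optional packages `huggingface_hub` and `llama-cpp-python` are installed
# (`pip install huggingface_hub llama-cpp-python`). Repo id and file name are
# taken from the table above; the prompt template and n_ctx are assumptions.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one of the quantized files listed above (Q4_K_M, ~4.07GB).
model_path = hf_hub_download(
    repo_id="RichardErkhov/GritLM_-_GritLM-7B-gguf",
    filename="GritLM-7B.Q4_K_M.gguf",
)

# Load the GGUF file through the llama.cpp Python bindings.
llm = Llama(model_path=model_path, n_ctx=2048, verbose=False)

# Assumed GritLM chat template; check the original model card if output looks off.
prompt = "<|user|>\nSummarize what GGUF quantization is in one sentence.\n<|assistant|>\n"
out = llm(prompt, max_tokens=128)
print(out["choices"][0]["text"])
```

In general the smaller quants trade answer quality for memory, so choosing a different file from the table only changes the `filename` argument.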


Original model description:
---
pipeline_tag: text-generation
inference: true
license: apache-2.0
datasets:
- GritLM/tulu2
tags:
- mteb
model-index:
- name: GritLM-7B
  results:
54
+ - task:
55
+ type: Classification
56
+ dataset:
57
+ type: mteb/amazon_counterfactual
58
+ name: MTEB AmazonCounterfactualClassification (en)
59
+ config: en
60
+ split: test
61
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
62
+ metrics:
63
+ - type: accuracy
64
+ value: 81.17910447761194
65
+ - type: ap
66
+ value: 46.26260671758199
67
+ - type: f1
68
+ value: 75.44565719934167
69
+ - task:
70
+ type: Classification
71
+ dataset:
72
+ type: mteb/amazon_polarity
73
+ name: MTEB AmazonPolarityClassification
74
+ config: default
75
+ split: test
76
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
77
+ metrics:
78
+ - type: accuracy
79
+ value: 96.5161
80
+ - type: ap
81
+ value: 94.79131981460425
82
+ - type: f1
83
+ value: 96.51506148413065
84
+ - task:
85
+ type: Classification
86
+ dataset:
87
+ type: mteb/amazon_reviews_multi
88
+ name: MTEB AmazonReviewsClassification (en)
89
+ config: en
90
+ split: test
91
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
92
+ metrics:
93
+ - type: accuracy
94
+ value: 57.806000000000004
95
+ - type: f1
96
+ value: 56.78350156257903
97
+ - task:
98
+ type: Retrieval
99
+ dataset:
100
+ type: arguana
101
+ name: MTEB ArguAna
102
+ config: default
103
+ split: test
104
+ revision: None
105
+ metrics:
106
+ - type: map_at_1
107
+ value: 38.478
108
+ - type: map_at_10
109
+ value: 54.955
110
+ - type: map_at_100
111
+ value: 54.955
112
+ - type: map_at_1000
113
+ value: 54.955
114
+ - type: map_at_3
115
+ value: 50.888999999999996
116
+ - type: map_at_5
117
+ value: 53.349999999999994
118
+ - type: mrr_at_1
119
+ value: 39.757999999999996
120
+ - type: mrr_at_10
121
+ value: 55.449000000000005
122
+ - type: mrr_at_100
123
+ value: 55.449000000000005
124
+ - type: mrr_at_1000
125
+ value: 55.449000000000005
126
+ - type: mrr_at_3
127
+ value: 51.37500000000001
128
+ - type: mrr_at_5
129
+ value: 53.822
130
+ - type: ndcg_at_1
131
+ value: 38.478
132
+ - type: ndcg_at_10
133
+ value: 63.239999999999995
134
+ - type: ndcg_at_100
135
+ value: 63.239999999999995
136
+ - type: ndcg_at_1000
137
+ value: 63.239999999999995
138
+ - type: ndcg_at_3
139
+ value: 54.935
140
+ - type: ndcg_at_5
141
+ value: 59.379000000000005
142
+ - type: precision_at_1
143
+ value: 38.478
144
+ - type: precision_at_10
145
+ value: 8.933
146
+ - type: precision_at_100
147
+ value: 0.893
148
+ - type: precision_at_1000
149
+ value: 0.089
150
+ - type: precision_at_3
151
+ value: 22.214
152
+ - type: precision_at_5
153
+ value: 15.491
154
+ - type: recall_at_1
155
+ value: 38.478
156
+ - type: recall_at_10
157
+ value: 89.331
158
+ - type: recall_at_100
159
+ value: 89.331
160
+ - type: recall_at_1000
161
+ value: 89.331
162
+ - type: recall_at_3
163
+ value: 66.643
164
+ - type: recall_at_5
165
+ value: 77.45400000000001
166
+ - task:
167
+ type: Clustering
168
+ dataset:
169
+ type: mteb/arxiv-clustering-p2p
170
+ name: MTEB ArxivClusteringP2P
171
+ config: default
172
+ split: test
173
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
174
+ metrics:
175
+ - type: v_measure
176
+ value: 51.67144081472449
177
+ - task:
178
+ type: Clustering
179
+ dataset:
180
+ type: mteb/arxiv-clustering-s2s
181
+ name: MTEB ArxivClusteringS2S
182
+ config: default
183
+ split: test
184
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
185
+ metrics:
186
+ - type: v_measure
187
+ value: 48.11256154264126
188
+ - task:
189
+ type: Reranking
190
+ dataset:
191
+ type: mteb/askubuntudupquestions-reranking
192
+ name: MTEB AskUbuntuDupQuestions
193
+ config: default
194
+ split: test
195
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
196
+ metrics:
197
+ - type: map
198
+ value: 67.33801955487878
199
+ - type: mrr
200
+ value: 80.71549487754474
201
+ - task:
202
+ type: STS
203
+ dataset:
204
+ type: mteb/biosses-sts
205
+ name: MTEB BIOSSES
206
+ config: default
207
+ split: test
208
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
209
+ metrics:
210
+ - type: cos_sim_pearson
211
+ value: 88.1935203751726
212
+ - type: cos_sim_spearman
213
+ value: 86.35497970498659
214
+ - type: euclidean_pearson
215
+ value: 85.46910708503744
216
+ - type: euclidean_spearman
217
+ value: 85.13928935405485
218
+ - type: manhattan_pearson
219
+ value: 85.68373836333303
220
+ - type: manhattan_spearman
221
+ value: 85.40013867117746
222
+ - task:
223
+ type: Classification
224
+ dataset:
225
+ type: mteb/banking77
226
+ name: MTEB Banking77Classification
227
+ config: default
228
+ split: test
229
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
230
+ metrics:
231
+ - type: accuracy
232
+ value: 88.46753246753248
233
+ - type: f1
234
+ value: 88.43006344981134
235
+ - task:
236
+ type: Clustering
237
+ dataset:
238
+ type: mteb/biorxiv-clustering-p2p
239
+ name: MTEB BiorxivClusteringP2P
240
+ config: default
241
+ split: test
242
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
243
+ metrics:
244
+ - type: v_measure
245
+ value: 40.86793640310432
246
+ - task:
247
+ type: Clustering
248
+ dataset:
249
+ type: mteb/biorxiv-clustering-s2s
250
+ name: MTEB BiorxivClusteringS2S
251
+ config: default
252
+ split: test
253
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
254
+ metrics:
255
+ - type: v_measure
256
+ value: 39.80291334130727
257
+ - task:
258
+ type: Retrieval
259
+ dataset:
260
+ type: BeIR/cqadupstack
261
+ name: MTEB CQADupstackAndroidRetrieval
262
+ config: default
263
+ split: test
264
+ revision: None
265
+ metrics:
266
+ - type: map_at_1
267
+ value: 38.421
268
+ - type: map_at_10
269
+ value: 52.349000000000004
270
+ - type: map_at_100
271
+ value: 52.349000000000004
272
+ - type: map_at_1000
273
+ value: 52.349000000000004
274
+ - type: map_at_3
275
+ value: 48.17
276
+ - type: map_at_5
277
+ value: 50.432
278
+ - type: mrr_at_1
279
+ value: 47.353
280
+ - type: mrr_at_10
281
+ value: 58.387
282
+ - type: mrr_at_100
283
+ value: 58.387
284
+ - type: mrr_at_1000
285
+ value: 58.387
286
+ - type: mrr_at_3
287
+ value: 56.199
288
+ - type: mrr_at_5
289
+ value: 57.487
290
+ - type: ndcg_at_1
291
+ value: 47.353
292
+ - type: ndcg_at_10
293
+ value: 59.202
294
+ - type: ndcg_at_100
295
+ value: 58.848
296
+ - type: ndcg_at_1000
297
+ value: 58.831999999999994
298
+ - type: ndcg_at_3
299
+ value: 54.112
300
+ - type: ndcg_at_5
301
+ value: 56.312
302
+ - type: precision_at_1
303
+ value: 47.353
304
+ - type: precision_at_10
305
+ value: 11.459
306
+ - type: precision_at_100
307
+ value: 1.146
308
+ - type: precision_at_1000
309
+ value: 0.11499999999999999
310
+ - type: precision_at_3
311
+ value: 26.133
312
+ - type: precision_at_5
313
+ value: 18.627
314
+ - type: recall_at_1
315
+ value: 38.421
316
+ - type: recall_at_10
317
+ value: 71.89
318
+ - type: recall_at_100
319
+ value: 71.89
320
+ - type: recall_at_1000
321
+ value: 71.89
322
+ - type: recall_at_3
323
+ value: 56.58
324
+ - type: recall_at_5
325
+ value: 63.125
326
+ - task:
327
+ type: Retrieval
328
+ dataset:
329
+ type: BeIR/cqadupstack
330
+ name: MTEB CQADupstackEnglishRetrieval
331
+ config: default
332
+ split: test
333
+ revision: None
334
+ metrics:
335
+ - type: map_at_1
336
+ value: 38.025999999999996
337
+ - type: map_at_10
338
+ value: 50.590999999999994
339
+ - type: map_at_100
340
+ value: 51.99700000000001
341
+ - type: map_at_1000
342
+ value: 52.11599999999999
343
+ - type: map_at_3
344
+ value: 47.435
345
+ - type: map_at_5
346
+ value: 49.236000000000004
347
+ - type: mrr_at_1
348
+ value: 48.28
349
+ - type: mrr_at_10
350
+ value: 56.814
351
+ - type: mrr_at_100
352
+ value: 57.446
353
+ - type: mrr_at_1000
354
+ value: 57.476000000000006
355
+ - type: mrr_at_3
356
+ value: 54.958
357
+ - type: mrr_at_5
358
+ value: 56.084999999999994
359
+ - type: ndcg_at_1
360
+ value: 48.28
361
+ - type: ndcg_at_10
362
+ value: 56.442
363
+ - type: ndcg_at_100
364
+ value: 60.651999999999994
365
+ - type: ndcg_at_1000
366
+ value: 62.187000000000005
367
+ - type: ndcg_at_3
368
+ value: 52.866
369
+ - type: ndcg_at_5
370
+ value: 54.515
371
+ - type: precision_at_1
372
+ value: 48.28
373
+ - type: precision_at_10
374
+ value: 10.586
375
+ - type: precision_at_100
376
+ value: 1.6310000000000002
377
+ - type: precision_at_1000
378
+ value: 0.20600000000000002
379
+ - type: precision_at_3
380
+ value: 25.945
381
+ - type: precision_at_5
382
+ value: 18.076
383
+ - type: recall_at_1
384
+ value: 38.025999999999996
385
+ - type: recall_at_10
386
+ value: 66.11399999999999
387
+ - type: recall_at_100
388
+ value: 83.339
389
+ - type: recall_at_1000
390
+ value: 92.413
391
+ - type: recall_at_3
392
+ value: 54.493
393
+ - type: recall_at_5
394
+ value: 59.64699999999999
395
+ - task:
396
+ type: Retrieval
397
+ dataset:
398
+ type: BeIR/cqadupstack
399
+ name: MTEB CQADupstackGamingRetrieval
400
+ config: default
401
+ split: test
402
+ revision: None
403
+ metrics:
404
+ - type: map_at_1
405
+ value: 47.905
406
+ - type: map_at_10
407
+ value: 61.58
408
+ - type: map_at_100
409
+ value: 62.605
410
+ - type: map_at_1000
411
+ value: 62.637
412
+ - type: map_at_3
413
+ value: 58.074000000000005
414
+ - type: map_at_5
415
+ value: 60.260000000000005
416
+ - type: mrr_at_1
417
+ value: 54.42
418
+ - type: mrr_at_10
419
+ value: 64.847
420
+ - type: mrr_at_100
421
+ value: 65.403
422
+ - type: mrr_at_1000
423
+ value: 65.41900000000001
424
+ - type: mrr_at_3
425
+ value: 62.675000000000004
426
+ - type: mrr_at_5
427
+ value: 64.101
428
+ - type: ndcg_at_1
429
+ value: 54.42
430
+ - type: ndcg_at_10
431
+ value: 67.394
432
+ - type: ndcg_at_100
433
+ value: 70.846
434
+ - type: ndcg_at_1000
435
+ value: 71.403
436
+ - type: ndcg_at_3
437
+ value: 62.025
438
+ - type: ndcg_at_5
439
+ value: 65.032
440
+ - type: precision_at_1
441
+ value: 54.42
442
+ - type: precision_at_10
443
+ value: 10.646
444
+ - type: precision_at_100
445
+ value: 1.325
446
+ - type: precision_at_1000
447
+ value: 0.13999999999999999
448
+ - type: precision_at_3
449
+ value: 27.398
450
+ - type: precision_at_5
451
+ value: 18.796
452
+ - type: recall_at_1
453
+ value: 47.905
454
+ - type: recall_at_10
455
+ value: 80.84599999999999
456
+ - type: recall_at_100
457
+ value: 95.078
458
+ - type: recall_at_1000
459
+ value: 98.878
460
+ - type: recall_at_3
461
+ value: 67.05600000000001
462
+ - type: recall_at_5
463
+ value: 74.261
464
+ - task:
465
+ type: Retrieval
466
+ dataset:
467
+ type: BeIR/cqadupstack
468
+ name: MTEB CQADupstackGisRetrieval
469
+ config: default
470
+ split: test
471
+ revision: None
472
+ metrics:
473
+ - type: map_at_1
474
+ value: 30.745
475
+ - type: map_at_10
476
+ value: 41.021
477
+ - type: map_at_100
478
+ value: 41.021
479
+ - type: map_at_1000
480
+ value: 41.021
481
+ - type: map_at_3
482
+ value: 37.714999999999996
483
+ - type: map_at_5
484
+ value: 39.766
485
+ - type: mrr_at_1
486
+ value: 33.559
487
+ - type: mrr_at_10
488
+ value: 43.537
489
+ - type: mrr_at_100
490
+ value: 43.537
491
+ - type: mrr_at_1000
492
+ value: 43.537
493
+ - type: mrr_at_3
494
+ value: 40.546
495
+ - type: mrr_at_5
496
+ value: 42.439
497
+ - type: ndcg_at_1
498
+ value: 33.559
499
+ - type: ndcg_at_10
500
+ value: 46.781
501
+ - type: ndcg_at_100
502
+ value: 46.781
503
+ - type: ndcg_at_1000
504
+ value: 46.781
505
+ - type: ndcg_at_3
506
+ value: 40.516000000000005
507
+ - type: ndcg_at_5
508
+ value: 43.957
509
+ - type: precision_at_1
510
+ value: 33.559
511
+ - type: precision_at_10
512
+ value: 7.198
513
+ - type: precision_at_100
514
+ value: 0.72
515
+ - type: precision_at_1000
516
+ value: 0.07200000000000001
517
+ - type: precision_at_3
518
+ value: 17.1
519
+ - type: precision_at_5
520
+ value: 12.316
521
+ - type: recall_at_1
522
+ value: 30.745
523
+ - type: recall_at_10
524
+ value: 62.038000000000004
525
+ - type: recall_at_100
526
+ value: 62.038000000000004
527
+ - type: recall_at_1000
528
+ value: 62.038000000000004
529
+ - type: recall_at_3
530
+ value: 45.378
531
+ - type: recall_at_5
532
+ value: 53.580000000000005
533
+ - task:
534
+ type: Retrieval
535
+ dataset:
536
+ type: BeIR/cqadupstack
537
+ name: MTEB CQADupstackMathematicaRetrieval
538
+ config: default
539
+ split: test
540
+ revision: None
541
+ metrics:
542
+ - type: map_at_1
543
+ value: 19.637999999999998
544
+ - type: map_at_10
545
+ value: 31.05
546
+ - type: map_at_100
547
+ value: 31.05
548
+ - type: map_at_1000
549
+ value: 31.05
550
+ - type: map_at_3
551
+ value: 27.628000000000004
552
+ - type: map_at_5
553
+ value: 29.767
554
+ - type: mrr_at_1
555
+ value: 25.0
556
+ - type: mrr_at_10
557
+ value: 36.131
558
+ - type: mrr_at_100
559
+ value: 36.131
560
+ - type: mrr_at_1000
561
+ value: 36.131
562
+ - type: mrr_at_3
563
+ value: 33.333
564
+ - type: mrr_at_5
565
+ value: 35.143
566
+ - type: ndcg_at_1
567
+ value: 25.0
568
+ - type: ndcg_at_10
569
+ value: 37.478
570
+ - type: ndcg_at_100
571
+ value: 37.469
572
+ - type: ndcg_at_1000
573
+ value: 37.469
574
+ - type: ndcg_at_3
575
+ value: 31.757999999999996
576
+ - type: ndcg_at_5
577
+ value: 34.821999999999996
578
+ - type: precision_at_1
579
+ value: 25.0
580
+ - type: precision_at_10
581
+ value: 7.188999999999999
582
+ - type: precision_at_100
583
+ value: 0.719
584
+ - type: precision_at_1000
585
+ value: 0.07200000000000001
586
+ - type: precision_at_3
587
+ value: 15.837000000000002
588
+ - type: precision_at_5
589
+ value: 11.841
590
+ - type: recall_at_1
591
+ value: 19.637999999999998
592
+ - type: recall_at_10
593
+ value: 51.836000000000006
594
+ - type: recall_at_100
595
+ value: 51.836000000000006
596
+ - type: recall_at_1000
597
+ value: 51.836000000000006
598
+ - type: recall_at_3
599
+ value: 36.384
600
+ - type: recall_at_5
601
+ value: 43.964
602
+ - task:
603
+ type: Retrieval
604
+ dataset:
605
+ type: BeIR/cqadupstack
606
+ name: MTEB CQADupstackPhysicsRetrieval
607
+ config: default
608
+ split: test
609
+ revision: None
610
+ metrics:
611
+ - type: map_at_1
612
+ value: 34.884
613
+ - type: map_at_10
614
+ value: 47.88
615
+ - type: map_at_100
616
+ value: 47.88
617
+ - type: map_at_1000
618
+ value: 47.88
619
+ - type: map_at_3
620
+ value: 43.85
621
+ - type: map_at_5
622
+ value: 46.414
623
+ - type: mrr_at_1
624
+ value: 43.022
625
+ - type: mrr_at_10
626
+ value: 53.569
627
+ - type: mrr_at_100
628
+ value: 53.569
629
+ - type: mrr_at_1000
630
+ value: 53.569
631
+ - type: mrr_at_3
632
+ value: 51.075
633
+ - type: mrr_at_5
634
+ value: 52.725
635
+ - type: ndcg_at_1
636
+ value: 43.022
637
+ - type: ndcg_at_10
638
+ value: 54.461000000000006
639
+ - type: ndcg_at_100
640
+ value: 54.388000000000005
641
+ - type: ndcg_at_1000
642
+ value: 54.388000000000005
643
+ - type: ndcg_at_3
644
+ value: 48.864999999999995
645
+ - type: ndcg_at_5
646
+ value: 52.032000000000004
647
+ - type: precision_at_1
648
+ value: 43.022
649
+ - type: precision_at_10
650
+ value: 9.885
651
+ - type: precision_at_100
652
+ value: 0.988
653
+ - type: precision_at_1000
654
+ value: 0.099
655
+ - type: precision_at_3
656
+ value: 23.612
657
+ - type: precision_at_5
658
+ value: 16.997
659
+ - type: recall_at_1
660
+ value: 34.884
661
+ - type: recall_at_10
662
+ value: 68.12899999999999
663
+ - type: recall_at_100
664
+ value: 68.12899999999999
665
+ - type: recall_at_1000
666
+ value: 68.12899999999999
667
+ - type: recall_at_3
668
+ value: 52.428
669
+ - type: recall_at_5
670
+ value: 60.662000000000006
671
+ - task:
672
+ type: Retrieval
673
+ dataset:
674
+ type: BeIR/cqadupstack
675
+ name: MTEB CQADupstackProgrammersRetrieval
676
+ config: default
677
+ split: test
678
+ revision: None
679
+ metrics:
680
+ - type: map_at_1
681
+ value: 31.588
682
+ - type: map_at_10
683
+ value: 43.85
684
+ - type: map_at_100
685
+ value: 45.317
686
+ - type: map_at_1000
687
+ value: 45.408
688
+ - type: map_at_3
689
+ value: 39.73
690
+ - type: map_at_5
691
+ value: 42.122
692
+ - type: mrr_at_1
693
+ value: 38.927
694
+ - type: mrr_at_10
695
+ value: 49.582
696
+ - type: mrr_at_100
697
+ value: 50.39
698
+ - type: mrr_at_1000
699
+ value: 50.426
700
+ - type: mrr_at_3
701
+ value: 46.518
702
+ - type: mrr_at_5
703
+ value: 48.271
704
+ - type: ndcg_at_1
705
+ value: 38.927
706
+ - type: ndcg_at_10
707
+ value: 50.605999999999995
708
+ - type: ndcg_at_100
709
+ value: 56.22200000000001
710
+ - type: ndcg_at_1000
711
+ value: 57.724
712
+ - type: ndcg_at_3
713
+ value: 44.232
714
+ - type: ndcg_at_5
715
+ value: 47.233999999999995
716
+ - type: precision_at_1
717
+ value: 38.927
718
+ - type: precision_at_10
719
+ value: 9.429
720
+ - type: precision_at_100
721
+ value: 1.435
722
+ - type: precision_at_1000
723
+ value: 0.172
724
+ - type: precision_at_3
725
+ value: 21.271
726
+ - type: precision_at_5
727
+ value: 15.434000000000001
728
+ - type: recall_at_1
729
+ value: 31.588
730
+ - type: recall_at_10
731
+ value: 64.836
732
+ - type: recall_at_100
733
+ value: 88.066
734
+ - type: recall_at_1000
735
+ value: 97.748
736
+ - type: recall_at_3
737
+ value: 47.128
738
+ - type: recall_at_5
739
+ value: 54.954
740
+ - task:
741
+ type: Retrieval
742
+ dataset:
743
+ type: BeIR/cqadupstack
744
+ name: MTEB CQADupstackRetrieval
745
+ config: default
746
+ split: test
747
+ revision: None
748
+ metrics:
749
+ - type: map_at_1
750
+ value: 31.956083333333336
751
+ - type: map_at_10
752
+ value: 43.33483333333333
753
+ - type: map_at_100
754
+ value: 44.64883333333333
755
+ - type: map_at_1000
756
+ value: 44.75
757
+ - type: map_at_3
758
+ value: 39.87741666666666
759
+ - type: map_at_5
760
+ value: 41.86766666666667
761
+ - type: mrr_at_1
762
+ value: 38.06341666666667
763
+ - type: mrr_at_10
764
+ value: 47.839666666666666
765
+ - type: mrr_at_100
766
+ value: 48.644000000000005
767
+ - type: mrr_at_1000
768
+ value: 48.68566666666667
769
+ - type: mrr_at_3
770
+ value: 45.26358333333334
771
+ - type: mrr_at_5
772
+ value: 46.790000000000006
773
+ - type: ndcg_at_1
774
+ value: 38.06341666666667
775
+ - type: ndcg_at_10
776
+ value: 49.419333333333334
777
+ - type: ndcg_at_100
778
+ value: 54.50166666666667
779
+ - type: ndcg_at_1000
780
+ value: 56.161166666666674
781
+ - type: ndcg_at_3
782
+ value: 43.982416666666666
783
+ - type: ndcg_at_5
784
+ value: 46.638083333333334
785
+ - type: precision_at_1
786
+ value: 38.06341666666667
787
+ - type: precision_at_10
788
+ value: 8.70858333333333
789
+ - type: precision_at_100
790
+ value: 1.327
791
+ - type: precision_at_1000
792
+ value: 0.165
793
+ - type: precision_at_3
794
+ value: 20.37816666666667
795
+ - type: precision_at_5
796
+ value: 14.516333333333334
797
+ - type: recall_at_1
798
+ value: 31.956083333333336
799
+ - type: recall_at_10
800
+ value: 62.69458333333334
801
+ - type: recall_at_100
802
+ value: 84.46433333333334
803
+ - type: recall_at_1000
804
+ value: 95.58449999999999
805
+ - type: recall_at_3
806
+ value: 47.52016666666666
807
+ - type: recall_at_5
808
+ value: 54.36066666666666
809
+ - task:
810
+ type: Retrieval
811
+ dataset:
812
+ type: BeIR/cqadupstack
813
+ name: MTEB CQADupstackStatsRetrieval
814
+ config: default
815
+ split: test
816
+ revision: None
817
+ metrics:
818
+ - type: map_at_1
819
+ value: 28.912
820
+ - type: map_at_10
821
+ value: 38.291
822
+ - type: map_at_100
823
+ value: 39.44
824
+ - type: map_at_1000
825
+ value: 39.528
826
+ - type: map_at_3
827
+ value: 35.638
828
+ - type: map_at_5
829
+ value: 37.218
830
+ - type: mrr_at_1
831
+ value: 32.822
832
+ - type: mrr_at_10
833
+ value: 41.661
834
+ - type: mrr_at_100
835
+ value: 42.546
836
+ - type: mrr_at_1000
837
+ value: 42.603
838
+ - type: mrr_at_3
839
+ value: 39.238
840
+ - type: mrr_at_5
841
+ value: 40.726
842
+ - type: ndcg_at_1
843
+ value: 32.822
844
+ - type: ndcg_at_10
845
+ value: 43.373
846
+ - type: ndcg_at_100
847
+ value: 48.638
848
+ - type: ndcg_at_1000
849
+ value: 50.654999999999994
850
+ - type: ndcg_at_3
851
+ value: 38.643
852
+ - type: ndcg_at_5
853
+ value: 41.126000000000005
854
+ - type: precision_at_1
855
+ value: 32.822
856
+ - type: precision_at_10
857
+ value: 6.8709999999999996
858
+ - type: precision_at_100
859
+ value: 1.032
860
+ - type: precision_at_1000
861
+ value: 0.128
862
+ - type: precision_at_3
863
+ value: 16.82
864
+ - type: precision_at_5
865
+ value: 11.718
866
+ - type: recall_at_1
867
+ value: 28.912
868
+ - type: recall_at_10
869
+ value: 55.376999999999995
870
+ - type: recall_at_100
871
+ value: 79.066
872
+ - type: recall_at_1000
873
+ value: 93.664
874
+ - type: recall_at_3
875
+ value: 42.569
876
+ - type: recall_at_5
877
+ value: 48.719
878
+ - task:
879
+ type: Retrieval
880
+ dataset:
881
+ type: BeIR/cqadupstack
882
+ name: MTEB CQADupstackTexRetrieval
883
+ config: default
884
+ split: test
885
+ revision: None
886
+ metrics:
887
+ - type: map_at_1
888
+ value: 22.181
889
+ - type: map_at_10
890
+ value: 31.462
891
+ - type: map_at_100
892
+ value: 32.73
893
+ - type: map_at_1000
894
+ value: 32.848
895
+ - type: map_at_3
896
+ value: 28.57
897
+ - type: map_at_5
898
+ value: 30.182
899
+ - type: mrr_at_1
900
+ value: 27.185
901
+ - type: mrr_at_10
902
+ value: 35.846000000000004
903
+ - type: mrr_at_100
904
+ value: 36.811
905
+ - type: mrr_at_1000
906
+ value: 36.873
907
+ - type: mrr_at_3
908
+ value: 33.437
909
+ - type: mrr_at_5
910
+ value: 34.813
911
+ - type: ndcg_at_1
912
+ value: 27.185
913
+ - type: ndcg_at_10
914
+ value: 36.858000000000004
915
+ - type: ndcg_at_100
916
+ value: 42.501
917
+ - type: ndcg_at_1000
918
+ value: 44.945
919
+ - type: ndcg_at_3
920
+ value: 32.066
921
+ - type: ndcg_at_5
922
+ value: 34.29
923
+ - type: precision_at_1
924
+ value: 27.185
925
+ - type: precision_at_10
926
+ value: 6.752
927
+ - type: precision_at_100
928
+ value: 1.111
929
+ - type: precision_at_1000
930
+ value: 0.151
931
+ - type: precision_at_3
932
+ value: 15.290000000000001
933
+ - type: precision_at_5
934
+ value: 11.004999999999999
935
+ - type: recall_at_1
936
+ value: 22.181
937
+ - type: recall_at_10
938
+ value: 48.513
939
+ - type: recall_at_100
940
+ value: 73.418
941
+ - type: recall_at_1000
942
+ value: 90.306
943
+ - type: recall_at_3
944
+ value: 35.003
945
+ - type: recall_at_5
946
+ value: 40.876000000000005
947
+ - task:
948
+ type: Retrieval
949
+ dataset:
950
+ type: BeIR/cqadupstack
951
+ name: MTEB CQADupstackUnixRetrieval
952
+ config: default
953
+ split: test
954
+ revision: None
955
+ metrics:
956
+ - type: map_at_1
957
+ value: 33.934999999999995
958
+ - type: map_at_10
959
+ value: 44.727
960
+ - type: map_at_100
961
+ value: 44.727
962
+ - type: map_at_1000
963
+ value: 44.727
964
+ - type: map_at_3
965
+ value: 40.918
966
+ - type: map_at_5
967
+ value: 42.961
968
+ - type: mrr_at_1
969
+ value: 39.646
970
+ - type: mrr_at_10
971
+ value: 48.898
972
+ - type: mrr_at_100
973
+ value: 48.898
974
+ - type: mrr_at_1000
975
+ value: 48.898
976
+ - type: mrr_at_3
977
+ value: 45.896
978
+ - type: mrr_at_5
979
+ value: 47.514
980
+ - type: ndcg_at_1
981
+ value: 39.646
982
+ - type: ndcg_at_10
983
+ value: 50.817
984
+ - type: ndcg_at_100
985
+ value: 50.803
986
+ - type: ndcg_at_1000
987
+ value: 50.803
988
+ - type: ndcg_at_3
989
+ value: 44.507999999999996
990
+ - type: ndcg_at_5
991
+ value: 47.259
992
+ - type: precision_at_1
993
+ value: 39.646
994
+ - type: precision_at_10
995
+ value: 8.759
996
+ - type: precision_at_100
997
+ value: 0.876
998
+ - type: precision_at_1000
999
+ value: 0.08800000000000001
1000
+ - type: precision_at_3
1001
+ value: 20.274
1002
+ - type: precision_at_5
1003
+ value: 14.366000000000001
1004
+ - type: recall_at_1
1005
+ value: 33.934999999999995
1006
+ - type: recall_at_10
1007
+ value: 65.037
1008
+ - type: recall_at_100
1009
+ value: 65.037
1010
+ - type: recall_at_1000
1011
+ value: 65.037
1012
+ - type: recall_at_3
1013
+ value: 47.439
1014
+ - type: recall_at_5
1015
+ value: 54.567
1016
+ - task:
1017
+ type: Retrieval
1018
+ dataset:
1019
+ type: BeIR/cqadupstack
1020
+ name: MTEB CQADupstackWebmastersRetrieval
1021
+ config: default
1022
+ split: test
1023
+ revision: None
1024
+ metrics:
1025
+ - type: map_at_1
1026
+ value: 32.058
1027
+ - type: map_at_10
1028
+ value: 43.137
1029
+ - type: map_at_100
1030
+ value: 43.137
1031
+ - type: map_at_1000
1032
+ value: 43.137
1033
+ - type: map_at_3
1034
+ value: 39.882
1035
+ - type: map_at_5
1036
+ value: 41.379
1037
+ - type: mrr_at_1
1038
+ value: 38.933
1039
+ - type: mrr_at_10
1040
+ value: 48.344
1041
+ - type: mrr_at_100
1042
+ value: 48.344
1043
+ - type: mrr_at_1000
1044
+ value: 48.344
1045
+ - type: mrr_at_3
1046
+ value: 45.652
1047
+ - type: mrr_at_5
1048
+ value: 46.877
1049
+ - type: ndcg_at_1
1050
+ value: 38.933
1051
+ - type: ndcg_at_10
1052
+ value: 49.964
1053
+ - type: ndcg_at_100
1054
+ value: 49.242000000000004
1055
+ - type: ndcg_at_1000
1056
+ value: 49.222
1057
+ - type: ndcg_at_3
1058
+ value: 44.605
1059
+ - type: ndcg_at_5
1060
+ value: 46.501999999999995
1061
+ - type: precision_at_1
1062
+ value: 38.933
1063
+ - type: precision_at_10
1064
+ value: 9.427000000000001
1065
+ - type: precision_at_100
1066
+ value: 0.943
1067
+ - type: precision_at_1000
1068
+ value: 0.094
1069
+ - type: precision_at_3
1070
+ value: 20.685000000000002
1071
+ - type: precision_at_5
1072
+ value: 14.585
1073
+ - type: recall_at_1
1074
+ value: 32.058
1075
+ - type: recall_at_10
1076
+ value: 63.074
1077
+ - type: recall_at_100
1078
+ value: 63.074
1079
+ - type: recall_at_1000
1080
+ value: 63.074
1081
+ - type: recall_at_3
1082
+ value: 47.509
1083
+ - type: recall_at_5
1084
+ value: 52.455
1085
+ - task:
1086
+ type: Retrieval
1087
+ dataset:
1088
+ type: BeIR/cqadupstack
1089
+ name: MTEB CQADupstackWordpressRetrieval
1090
+ config: default
1091
+ split: test
1092
+ revision: None
1093
+ metrics:
1094
+ - type: map_at_1
1095
+ value: 26.029000000000003
1096
+ - type: map_at_10
1097
+ value: 34.646
1098
+ - type: map_at_100
1099
+ value: 34.646
1100
+ - type: map_at_1000
1101
+ value: 34.646
1102
+ - type: map_at_3
1103
+ value: 31.456
1104
+ - type: map_at_5
1105
+ value: 33.138
1106
+ - type: mrr_at_1
1107
+ value: 28.281
1108
+ - type: mrr_at_10
1109
+ value: 36.905
1110
+ - type: mrr_at_100
1111
+ value: 36.905
1112
+ - type: mrr_at_1000
1113
+ value: 36.905
1114
+ - type: mrr_at_3
1115
+ value: 34.011
1116
+ - type: mrr_at_5
1117
+ value: 35.638
1118
+ - type: ndcg_at_1
1119
+ value: 28.281
1120
+ - type: ndcg_at_10
1121
+ value: 40.159
1122
+ - type: ndcg_at_100
1123
+ value: 40.159
1124
+ - type: ndcg_at_1000
1125
+ value: 40.159
1126
+ - type: ndcg_at_3
1127
+ value: 33.995
1128
+ - type: ndcg_at_5
1129
+ value: 36.836999999999996
1130
+ - type: precision_at_1
1131
+ value: 28.281
1132
+ - type: precision_at_10
1133
+ value: 6.358999999999999
1134
+ - type: precision_at_100
1135
+ value: 0.636
1136
+ - type: precision_at_1000
1137
+ value: 0.064
1138
+ - type: precision_at_3
1139
+ value: 14.233
1140
+ - type: precision_at_5
1141
+ value: 10.314
1142
+ - type: recall_at_1
1143
+ value: 26.029000000000003
1144
+ - type: recall_at_10
1145
+ value: 55.08
1146
+ - type: recall_at_100
1147
+ value: 55.08
1148
+ - type: recall_at_1000
1149
+ value: 55.08
1150
+ - type: recall_at_3
1151
+ value: 38.487
1152
+ - type: recall_at_5
1153
+ value: 45.308
1154
+ - task:
1155
+ type: Retrieval
1156
+ dataset:
1157
+ type: climate-fever
1158
+ name: MTEB ClimateFEVER
1159
+ config: default
1160
+ split: test
1161
+ revision: None
1162
+ metrics:
1163
+ - type: map_at_1
1164
+ value: 12.842999999999998
1165
+ - type: map_at_10
1166
+ value: 22.101000000000003
1167
+ - type: map_at_100
1168
+ value: 24.319
1169
+ - type: map_at_1000
1170
+ value: 24.51
1171
+ - type: map_at_3
1172
+ value: 18.372
1173
+ - type: map_at_5
1174
+ value: 20.323
1175
+ - type: mrr_at_1
1176
+ value: 27.948
1177
+ - type: mrr_at_10
1178
+ value: 40.321
1179
+ - type: mrr_at_100
1180
+ value: 41.262
1181
+ - type: mrr_at_1000
1182
+ value: 41.297
1183
+ - type: mrr_at_3
1184
+ value: 36.558
1185
+ - type: mrr_at_5
1186
+ value: 38.824999999999996
1187
+ - type: ndcg_at_1
1188
+ value: 27.948
1189
+ - type: ndcg_at_10
1190
+ value: 30.906
1191
+ - type: ndcg_at_100
1192
+ value: 38.986
1193
+ - type: ndcg_at_1000
1194
+ value: 42.136
1195
+ - type: ndcg_at_3
1196
+ value: 24.911
1197
+ - type: ndcg_at_5
1198
+ value: 27.168999999999997
1199
+ - type: precision_at_1
1200
+ value: 27.948
1201
+ - type: precision_at_10
1202
+ value: 9.798
1203
+ - type: precision_at_100
1204
+ value: 1.8399999999999999
1205
+ - type: precision_at_1000
1206
+ value: 0.243
1207
+ - type: precision_at_3
1208
+ value: 18.328
1209
+ - type: precision_at_5
1210
+ value: 14.502
1211
+ - type: recall_at_1
1212
+ value: 12.842999999999998
1213
+ - type: recall_at_10
1214
+ value: 37.245
1215
+ - type: recall_at_100
1216
+ value: 64.769
1217
+ - type: recall_at_1000
1218
+ value: 82.055
1219
+ - type: recall_at_3
1220
+ value: 23.159
1221
+ - type: recall_at_5
1222
+ value: 29.113
1223
+ - task:
1224
+ type: Retrieval
1225
+ dataset:
1226
+ type: dbpedia-entity
1227
+ name: MTEB DBPedia
1228
+ config: default
1229
+ split: test
1230
+ revision: None
1231
+ metrics:
1232
+ - type: map_at_1
1233
+ value: 8.934000000000001
1234
+ - type: map_at_10
1235
+ value: 21.915000000000003
1236
+ - type: map_at_100
1237
+ value: 21.915000000000003
1238
+ - type: map_at_1000
1239
+ value: 21.915000000000003
1240
+ - type: map_at_3
1241
+ value: 14.623
1242
+ - type: map_at_5
1243
+ value: 17.841
1244
+ - type: mrr_at_1
1245
+ value: 71.25
1246
+ - type: mrr_at_10
1247
+ value: 78.994
1248
+ - type: mrr_at_100
1249
+ value: 78.994
1250
+ - type: mrr_at_1000
1251
+ value: 78.994
1252
+ - type: mrr_at_3
1253
+ value: 77.208
1254
+ - type: mrr_at_5
1255
+ value: 78.55799999999999
1256
+ - type: ndcg_at_1
1257
+ value: 60.62499999999999
1258
+ - type: ndcg_at_10
1259
+ value: 46.604
1260
+ - type: ndcg_at_100
1261
+ value: 35.653
1262
+ - type: ndcg_at_1000
1263
+ value: 35.531
1264
+ - type: ndcg_at_3
1265
+ value: 50.605
1266
+ - type: ndcg_at_5
1267
+ value: 48.730000000000004
1268
+ - type: precision_at_1
1269
+ value: 71.25
1270
+ - type: precision_at_10
1271
+ value: 37.75
1272
+ - type: precision_at_100
1273
+ value: 3.775
1274
+ - type: precision_at_1000
1275
+ value: 0.377
1276
+ - type: precision_at_3
1277
+ value: 54.417
1278
+ - type: precision_at_5
1279
+ value: 48.15
1280
+ - type: recall_at_1
1281
+ value: 8.934000000000001
1282
+ - type: recall_at_10
1283
+ value: 28.471000000000004
1284
+ - type: recall_at_100
1285
+ value: 28.471000000000004
1286
+ - type: recall_at_1000
1287
+ value: 28.471000000000004
1288
+ - type: recall_at_3
1289
+ value: 16.019
1290
+ - type: recall_at_5
1291
+ value: 21.410999999999998
1292
+ - task:
1293
+ type: Classification
1294
+ dataset:
1295
+ type: mteb/emotion
1296
+ name: MTEB EmotionClassification
1297
+ config: default
1298
+ split: test
1299
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
1300
+ metrics:
1301
+ - type: accuracy
1302
+ value: 52.81
1303
+ - type: f1
1304
+ value: 47.987573380720114
1305
+ - task:
1306
+ type: Retrieval
1307
+ dataset:
1308
+ type: fever
1309
+ name: MTEB FEVER
1310
+ config: default
1311
+ split: test
1312
+ revision: None
1313
+ metrics:
1314
+ - type: map_at_1
1315
+ value: 66.81899999999999
1316
+ - type: map_at_10
1317
+ value: 78.034
1318
+ - type: map_at_100
1319
+ value: 78.034
1320
+ - type: map_at_1000
1321
+ value: 78.034
1322
+ - type: map_at_3
1323
+ value: 76.43100000000001
1324
+ - type: map_at_5
1325
+ value: 77.515
1326
+ - type: mrr_at_1
1327
+ value: 71.542
1328
+ - type: mrr_at_10
1329
+ value: 81.638
1330
+ - type: mrr_at_100
1331
+ value: 81.638
1332
+ - type: mrr_at_1000
1333
+ value: 81.638
1334
+ - type: mrr_at_3
1335
+ value: 80.403
1336
+ - type: mrr_at_5
1337
+ value: 81.256
1338
+ - type: ndcg_at_1
1339
+ value: 71.542
1340
+ - type: ndcg_at_10
1341
+ value: 82.742
1342
+ - type: ndcg_at_100
1343
+ value: 82.741
1344
+ - type: ndcg_at_1000
1345
+ value: 82.741
1346
+ - type: ndcg_at_3
1347
+ value: 80.039
1348
+ - type: ndcg_at_5
1349
+ value: 81.695
1350
+ - type: precision_at_1
1351
+ value: 71.542
1352
+ - type: precision_at_10
1353
+ value: 10.387
1354
+ - type: precision_at_100
1355
+ value: 1.039
1356
+ - type: precision_at_1000
1357
+ value: 0.104
1358
+ - type: precision_at_3
1359
+ value: 31.447999999999997
1360
+ - type: precision_at_5
1361
+ value: 19.91
1362
+ - type: recall_at_1
1363
+ value: 66.81899999999999
1364
+ - type: recall_at_10
1365
+ value: 93.372
1366
+ - type: recall_at_100
1367
+ value: 93.372
1368
+ - type: recall_at_1000
1369
+ value: 93.372
1370
+ - type: recall_at_3
1371
+ value: 86.33
1372
+ - type: recall_at_5
1373
+ value: 90.347
1374
+ - task:
1375
+ type: Retrieval
1376
+ dataset:
1377
+ type: fiqa
1378
+ name: MTEB FiQA2018
1379
+ config: default
1380
+ split: test
1381
+ revision: None
1382
+ metrics:
1383
+ - type: map_at_1
1384
+ value: 31.158
1385
+ - type: map_at_10
1386
+ value: 52.017
1387
+ - type: map_at_100
1388
+ value: 54.259
1389
+ - type: map_at_1000
1390
+ value: 54.367
1391
+ - type: map_at_3
1392
+ value: 45.738
1393
+ - type: map_at_5
1394
+ value: 49.283
1395
+ - type: mrr_at_1
1396
+ value: 57.87
1397
+ - type: mrr_at_10
1398
+ value: 66.215
1399
+ - type: mrr_at_100
1400
+ value: 66.735
1401
+ - type: mrr_at_1000
1402
+ value: 66.75
1403
+ - type: mrr_at_3
1404
+ value: 64.043
1405
+ - type: mrr_at_5
1406
+ value: 65.116
1407
+ - type: ndcg_at_1
1408
+ value: 57.87
1409
+ - type: ndcg_at_10
1410
+ value: 59.946999999999996
1411
+ - type: ndcg_at_100
1412
+ value: 66.31099999999999
1413
+ - type: ndcg_at_1000
1414
+ value: 67.75999999999999
1415
+ - type: ndcg_at_3
1416
+ value: 55.483000000000004
1417
+ - type: ndcg_at_5
1418
+ value: 56.891000000000005
1419
+ - type: precision_at_1
1420
+ value: 57.87
1421
+ - type: precision_at_10
1422
+ value: 16.497
1423
+ - type: precision_at_100
1424
+ value: 2.321
1425
+ - type: precision_at_1000
1426
+ value: 0.258
1427
+ - type: precision_at_3
1428
+ value: 37.14
1429
+ - type: precision_at_5
1430
+ value: 27.067999999999998
1431
+ - type: recall_at_1
1432
+ value: 31.158
1433
+ - type: recall_at_10
1434
+ value: 67.381
1435
+ - type: recall_at_100
1436
+ value: 89.464
1437
+ - type: recall_at_1000
1438
+ value: 97.989
1439
+ - type: recall_at_3
1440
+ value: 50.553000000000004
1441
+ - type: recall_at_5
1442
+ value: 57.824
1443
+ - task:
1444
+ type: Retrieval
1445
+ dataset:
1446
+ type: hotpotqa
1447
+ name: MTEB HotpotQA
1448
+ config: default
1449
+ split: test
1450
+ revision: None
1451
+ metrics:
1452
+ - type: map_at_1
1453
+ value: 42.073
1454
+ - type: map_at_10
1455
+ value: 72.418
1456
+ - type: map_at_100
1457
+ value: 73.175
1458
+ - type: map_at_1000
1459
+ value: 73.215
1460
+ - type: map_at_3
1461
+ value: 68.791
1462
+ - type: map_at_5
1463
+ value: 71.19
1464
+ - type: mrr_at_1
1465
+ value: 84.146
1466
+ - type: mrr_at_10
1467
+ value: 88.994
1468
+ - type: mrr_at_100
1469
+ value: 89.116
1470
+ - type: mrr_at_1000
1471
+ value: 89.12
1472
+ - type: mrr_at_3
1473
+ value: 88.373
1474
+ - type: mrr_at_5
1475
+ value: 88.82
1476
+ - type: ndcg_at_1
1477
+ value: 84.146
1478
+ - type: ndcg_at_10
1479
+ value: 79.404
1480
+ - type: ndcg_at_100
1481
+ value: 81.83200000000001
1482
+ - type: ndcg_at_1000
1483
+ value: 82.524
1484
+ - type: ndcg_at_3
1485
+ value: 74.595
1486
+ - type: ndcg_at_5
1487
+ value: 77.474
1488
+ - type: precision_at_1
1489
+ value: 84.146
1490
+ - type: precision_at_10
1491
+ value: 16.753999999999998
1492
+ - type: precision_at_100
1493
+ value: 1.8599999999999999
1494
+ - type: precision_at_1000
1495
+ value: 0.19499999999999998
1496
+ - type: precision_at_3
1497
+ value: 48.854
1498
+ - type: precision_at_5
1499
+ value: 31.579
1500
+ - type: recall_at_1
1501
+ value: 42.073
1502
+ - type: recall_at_10
1503
+ value: 83.768
1504
+ - type: recall_at_100
1505
+ value: 93.018
1506
+ - type: recall_at_1000
1507
+ value: 97.481
1508
+ - type: recall_at_3
1509
+ value: 73.282
1510
+ - type: recall_at_5
1511
+ value: 78.947
1512
+ - task:
1513
+ type: Classification
1514
+ dataset:
1515
+ type: mteb/imdb
1516
+ name: MTEB ImdbClassification
1517
+ config: default
1518
+ split: test
1519
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1520
+ metrics:
1521
+ - type: accuracy
1522
+ value: 94.9968
1523
+ - type: ap
1524
+ value: 92.93892195862824
1525
+ - type: f1
1526
+ value: 94.99327998213761
1527
+ - task:
1528
+ type: Retrieval
1529
+ dataset:
1530
+ type: msmarco
1531
+ name: MTEB MSMARCO
1532
+ config: default
1533
+ split: dev
1534
+ revision: None
1535
+ metrics:
1536
+ - type: map_at_1
1537
+ value: 21.698
1538
+ - type: map_at_10
1539
+ value: 34.585
1540
+ - type: map_at_100
1541
+ value: 35.782000000000004
1542
+ - type: map_at_1000
1543
+ value: 35.825
1544
+ - type: map_at_3
1545
+ value: 30.397999999999996
1546
+ - type: map_at_5
1547
+ value: 32.72
1548
+ - type: mrr_at_1
1549
+ value: 22.192
1550
+ - type: mrr_at_10
1551
+ value: 35.085
1552
+ - type: mrr_at_100
1553
+ value: 36.218
1554
+ - type: mrr_at_1000
1555
+ value: 36.256
1556
+ - type: mrr_at_3
1557
+ value: 30.986000000000004
1558
+ - type: mrr_at_5
1559
+ value: 33.268
1560
+ - type: ndcg_at_1
1561
+ value: 22.192
1562
+ - type: ndcg_at_10
1563
+ value: 41.957
1564
+ - type: ndcg_at_100
1565
+ value: 47.658
1566
+ - type: ndcg_at_1000
1567
+ value: 48.697
1568
+ - type: ndcg_at_3
1569
+ value: 33.433
1570
+ - type: ndcg_at_5
1571
+ value: 37.551
1572
+ - type: precision_at_1
1573
+ value: 22.192
1574
+ - type: precision_at_10
1575
+ value: 6.781
1576
+ - type: precision_at_100
1577
+ value: 0.963
1578
+ - type: precision_at_1000
1579
+ value: 0.105
1580
+ - type: precision_at_3
1581
+ value: 14.365
1582
+ - type: precision_at_5
1583
+ value: 10.713000000000001
1584
+ - type: recall_at_1
1585
+ value: 21.698
1586
+ - type: recall_at_10
1587
+ value: 64.79
1588
+ - type: recall_at_100
1589
+ value: 91.071
1590
+ - type: recall_at_1000
1591
+ value: 98.883
1592
+ - type: recall_at_3
1593
+ value: 41.611
1594
+ - type: recall_at_5
1595
+ value: 51.459999999999994
1596
+ - task:
1597
+ type: Classification
1598
+ dataset:
1599
+ type: mteb/mtop_domain
1600
+ name: MTEB MTOPDomainClassification (en)
1601
+ config: en
1602
+ split: test
1603
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1604
+ metrics:
1605
+ - type: accuracy
1606
+ value: 96.15823073415413
1607
+ - type: f1
1608
+ value: 96.00362034963248
1609
+ - task:
1610
+ type: Classification
1611
+ dataset:
1612
+ type: mteb/mtop_intent
1613
+ name: MTEB MTOPIntentClassification (en)
1614
+ config: en
1615
+ split: test
1616
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1617
+ metrics:
1618
+ - type: accuracy
1619
+ value: 87.12722298221614
1620
+ - type: f1
1621
+ value: 70.46888967516227
1622
+ - task:
1623
+ type: Classification
1624
+ dataset:
1625
+ type: mteb/amazon_massive_intent
1626
+ name: MTEB MassiveIntentClassification (en)
1627
+ config: en
1628
+ split: test
1629
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1630
+ metrics:
1631
+ - type: accuracy
1632
+ value: 80.77673167451245
1633
+ - type: f1
1634
+ value: 77.60202561132175
1635
+ - task:
1636
+ type: Classification
1637
+ dataset:
1638
+ type: mteb/amazon_massive_scenario
1639
+ name: MTEB MassiveScenarioClassification (en)
1640
+ config: en
1641
+ split: test
1642
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1643
+ metrics:
1644
+ - type: accuracy
1645
+ value: 82.09145931405514
1646
+ - type: f1
1647
+ value: 81.7701921473406
1648
+ - task:
1649
+ type: Clustering
1650
+ dataset:
1651
+ type: mteb/medrxiv-clustering-p2p
1652
+ name: MTEB MedrxivClusteringP2P
1653
+ config: default
1654
+ split: test
1655
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1656
+ metrics:
1657
+ - type: v_measure
1658
+ value: 36.52153488185864
1659
+ - task:
1660
+ type: Clustering
1661
+ dataset:
1662
+ type: mteb/medrxiv-clustering-s2s
1663
+ name: MTEB MedrxivClusteringS2S
1664
+ config: default
1665
+ split: test
1666
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1667
+ metrics:
1668
+ - type: v_measure
1669
+ value: 36.80090398444147
1670
+ - task:
1671
+ type: Reranking
1672
+ dataset:
1673
+ type: mteb/mind_small
1674
+ name: MTEB MindSmallReranking
1675
+ config: default
1676
+ split: test
1677
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1678
+ metrics:
1679
+ - type: map
1680
+ value: 31.807141746058605
1681
+ - type: mrr
1682
+ value: 32.85025611455029
1683
+ - task:
1684
+ type: Retrieval
1685
+ dataset:
1686
+ type: nfcorpus
1687
+ name: MTEB NFCorpus
1688
+ config: default
1689
+ split: test
1690
+ revision: None
1691
+ metrics:
1692
+ - type: map_at_1
1693
+ value: 6.920999999999999
1694
+ - type: map_at_10
1695
+ value: 16.049
1696
+ - type: map_at_100
1697
+ value: 16.049
1698
+ - type: map_at_1000
1699
+ value: 16.049
1700
+ - type: map_at_3
1701
+ value: 11.865
1702
+ - type: map_at_5
1703
+ value: 13.657
1704
+ - type: mrr_at_1
1705
+ value: 53.87
1706
+ - type: mrr_at_10
1707
+ value: 62.291
1708
+ - type: mrr_at_100
1709
+ value: 62.291
1710
+ - type: mrr_at_1000
1711
+ value: 62.291
1712
+ - type: mrr_at_3
1713
+ value: 60.681
1714
+ - type: mrr_at_5
1715
+ value: 61.61
1716
+ - type: ndcg_at_1
1717
+ value: 51.23799999999999
1718
+ - type: ndcg_at_10
1719
+ value: 40.892
1720
+ - type: ndcg_at_100
1721
+ value: 26.951999999999998
1722
+ - type: ndcg_at_1000
1723
+ value: 26.474999999999998
1724
+ - type: ndcg_at_3
1725
+ value: 46.821
1726
+ - type: ndcg_at_5
1727
+ value: 44.333
1728
+ - type: precision_at_1
1729
+ value: 53.251000000000005
1730
+ - type: precision_at_10
1731
+ value: 30.124000000000002
1732
+ - type: precision_at_100
1733
+ value: 3.012
1734
+ - type: precision_at_1000
1735
+ value: 0.301
1736
+ - type: precision_at_3
1737
+ value: 43.55
1738
+ - type: precision_at_5
1739
+ value: 38.266
1740
+ - type: recall_at_1
1741
+ value: 6.920999999999999
1742
+ - type: recall_at_10
1743
+ value: 20.852
1744
+ - type: recall_at_100
1745
+ value: 20.852
1746
+ - type: recall_at_1000
1747
+ value: 20.852
1748
+ - type: recall_at_3
1749
+ value: 13.628000000000002
1750
+ - type: recall_at_5
1751
+ value: 16.273
1752
+ - task:
1753
+ type: Retrieval
1754
+ dataset:
1755
+ type: nq
1756
+ name: MTEB NQ
1757
+ config: default
1758
+ split: test
1759
+ revision: None
1760
+ metrics:
1761
+ - type: map_at_1
1762
+ value: 46.827999999999996
1763
+ - type: map_at_10
1764
+ value: 63.434000000000005
1765
+ - type: map_at_100
1766
+ value: 63.434000000000005
1767
+ - type: map_at_1000
1768
+ value: 63.434000000000005
1769
+ - type: map_at_3
1770
+ value: 59.794000000000004
1771
+ - type: map_at_5
1772
+ value: 62.08
1773
+ - type: mrr_at_1
1774
+ value: 52.288999999999994
1775
+ - type: mrr_at_10
1776
+ value: 65.95
1777
+ - type: mrr_at_100
1778
+ value: 65.95
1779
+ - type: mrr_at_1000
1780
+ value: 65.95
1781
+ - type: mrr_at_3
1782
+ value: 63.413
1783
+ - type: mrr_at_5
1784
+ value: 65.08
1785
+ - type: ndcg_at_1
1786
+ value: 52.288999999999994
1787
+ - type: ndcg_at_10
1788
+ value: 70.301
1789
+ - type: ndcg_at_100
1790
+ value: 70.301
1791
+ - type: ndcg_at_1000
1792
+ value: 70.301
1793
+ - type: ndcg_at_3
1794
+ value: 63.979
1795
+ - type: ndcg_at_5
1796
+ value: 67.582
1797
+ - type: precision_at_1
1798
+ value: 52.288999999999994
1799
+ - type: precision_at_10
1800
+ value: 10.576
1801
+ - type: precision_at_100
1802
+ value: 1.058
1803
+ - type: precision_at_1000
1804
+ value: 0.106
1805
+ - type: precision_at_3
1806
+ value: 28.177000000000003
1807
+ - type: precision_at_5
1808
+ value: 19.073
1809
+ - type: recall_at_1
1810
+ value: 46.827999999999996
1811
+ - type: recall_at_10
1812
+ value: 88.236
1813
+ - type: recall_at_100
1814
+ value: 88.236
1815
+ - type: recall_at_1000
1816
+ value: 88.236
1817
+ - type: recall_at_3
1818
+ value: 72.371
1819
+ - type: recall_at_5
1820
+ value: 80.56
1821
+ - task:
1822
+ type: Retrieval
1823
+ dataset:
1824
+ type: quora
1825
+ name: MTEB QuoraRetrieval
1826
+ config: default
1827
+ split: test
1828
+ revision: None
1829
+ metrics:
1830
+ - type: map_at_1
1831
+ value: 71.652
1832
+ - type: map_at_10
1833
+ value: 85.953
+ - type: map_at_100
+ value: 85.953
+ - type: map_at_1000
+ value: 85.953
+ - type: map_at_3
+ value: 83.05399999999999
+ - type: map_at_5
+ value: 84.89
+ - type: mrr_at_1
+ value: 82.42
+ - type: mrr_at_10
+ value: 88.473
+ - type: mrr_at_100
+ value: 88.473
+ - type: mrr_at_1000
+ value: 88.473
+ - type: mrr_at_3
+ value: 87.592
+ - type: mrr_at_5
+ value: 88.211
+ - type: ndcg_at_1
+ value: 82.44
+ - type: ndcg_at_10
+ value: 89.467
+ - type: ndcg_at_100
+ value: 89.33
+ - type: ndcg_at_1000
+ value: 89.33
+ - type: ndcg_at_3
+ value: 86.822
+ - type: ndcg_at_5
+ value: 88.307
+ - type: precision_at_1
+ value: 82.44
+ - type: precision_at_10
+ value: 13.616
+ - type: precision_at_100
+ value: 1.362
+ - type: precision_at_1000
+ value: 0.136
+ - type: precision_at_3
+ value: 38.117000000000004
+ - type: precision_at_5
+ value: 25.05
+ - type: recall_at_1
+ value: 71.652
+ - type: recall_at_10
+ value: 96.224
+ - type: recall_at_100
+ value: 96.224
+ - type: recall_at_1000
+ value: 96.224
+ - type: recall_at_3
+ value: 88.571
+ - type: recall_at_5
+ value: 92.812
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/reddit-clustering
+ name: MTEB RedditClustering
+ config: default
+ split: test
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
+ metrics:
+ - type: v_measure
+ value: 61.295010338050474
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/reddit-clustering-p2p
+ name: MTEB RedditClusteringP2P
+ config: default
+ split: test
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
+ metrics:
+ - type: v_measure
+ value: 67.26380819328142
+ - task:
+ type: Retrieval
+ dataset:
+ type: scidocs
+ name: MTEB SCIDOCS
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 5.683
+ - type: map_at_10
+ value: 14.924999999999999
+ - type: map_at_100
+ value: 17.532
+ - type: map_at_1000
+ value: 17.875
+ - type: map_at_3
+ value: 10.392
+ - type: map_at_5
+ value: 12.592
+ - type: mrr_at_1
+ value: 28.000000000000004
+ - type: mrr_at_10
+ value: 39.951
+ - type: mrr_at_100
+ value: 41.025
+ - type: mrr_at_1000
+ value: 41.056
+ - type: mrr_at_3
+ value: 36.317
+ - type: mrr_at_5
+ value: 38.412
+ - type: ndcg_at_1
+ value: 28.000000000000004
+ - type: ndcg_at_10
+ value: 24.410999999999998
+ - type: ndcg_at_100
+ value: 33.79
+ - type: ndcg_at_1000
+ value: 39.035
+ - type: ndcg_at_3
+ value: 22.845
+ - type: ndcg_at_5
+ value: 20.080000000000002
+ - type: precision_at_1
+ value: 28.000000000000004
+ - type: precision_at_10
+ value: 12.790000000000001
+ - type: precision_at_100
+ value: 2.633
+ - type: precision_at_1000
+ value: 0.388
+ - type: precision_at_3
+ value: 21.367
+ - type: precision_at_5
+ value: 17.7
+ - type: recall_at_1
+ value: 5.683
+ - type: recall_at_10
+ value: 25.91
+ - type: recall_at_100
+ value: 53.443
+ - type: recall_at_1000
+ value: 78.73
+ - type: recall_at_3
+ value: 13.003
+ - type: recall_at_5
+ value: 17.932000000000002
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sickr-sts
+ name: MTEB SICK-R
+ config: default
+ split: test
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
+ metrics:
+ - type: cos_sim_pearson
+ value: 84.677978681023
+ - type: cos_sim_spearman
+ value: 83.13093441058189
+ - type: euclidean_pearson
+ value: 83.35535759341572
+ - type: euclidean_spearman
+ value: 83.42583744219611
+ - type: manhattan_pearson
+ value: 83.2243124045889
+ - type: manhattan_spearman
+ value: 83.39801618652632
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts12-sts
+ name: MTEB STS12
+ config: default
+ split: test
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
+ metrics:
+ - type: cos_sim_pearson
+ value: 81.68960206569666
+ - type: cos_sim_spearman
+ value: 77.3368966488535
+ - type: euclidean_pearson
+ value: 77.62828980560303
+ - type: euclidean_spearman
+ value: 76.77951481444651
+ - type: manhattan_pearson
+ value: 77.88637240839041
+ - type: manhattan_spearman
+ value: 77.22157841466188
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts13-sts
+ name: MTEB STS13
+ config: default
+ split: test
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
+ metrics:
+ - type: cos_sim_pearson
+ value: 84.18745821650724
+ - type: cos_sim_spearman
+ value: 85.04423285574542
+ - type: euclidean_pearson
+ value: 85.46604816931023
+ - type: euclidean_spearman
+ value: 85.5230593932974
+ - type: manhattan_pearson
+ value: 85.57912805986261
+ - type: manhattan_spearman
+ value: 85.65955905111873
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts14-sts
+ name: MTEB STS14
+ config: default
+ split: test
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
+ metrics:
+ - type: cos_sim_pearson
+ value: 83.6715333300355
+ - type: cos_sim_spearman
+ value: 82.9058522514908
+ - type: euclidean_pearson
+ value: 83.9640357424214
+ - type: euclidean_spearman
+ value: 83.60415457472637
+ - type: manhattan_pearson
+ value: 84.05621005853469
+ - type: manhattan_spearman
+ value: 83.87077724707746
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts15-sts
+ name: MTEB STS15
+ config: default
+ split: test
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
+ metrics:
+ - type: cos_sim_pearson
+ value: 87.82422928098886
+ - type: cos_sim_spearman
+ value: 88.12660311894628
+ - type: euclidean_pearson
+ value: 87.50974805056555
+ - type: euclidean_spearman
+ value: 87.91957275596677
+ - type: manhattan_pearson
+ value: 87.74119404878883
+ - type: manhattan_spearman
+ value: 88.2808922165719
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts16-sts
+ name: MTEB STS16
+ config: default
+ split: test
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
+ metrics:
+ - type: cos_sim_pearson
+ value: 84.80605838552093
+ - type: cos_sim_spearman
+ value: 86.24123388765678
+ - type: euclidean_pearson
+ value: 85.32648347339814
+ - type: euclidean_spearman
+ value: 85.60046671950158
+ - type: manhattan_pearson
+ value: 85.53800168487811
+ - type: manhattan_spearman
+ value: 85.89542420480763
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (en-en)
+ config: en-en
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 89.87540978988132
+ - type: cos_sim_spearman
+ value: 90.12715295099461
+ - type: euclidean_pearson
+ value: 91.61085993525275
+ - type: euclidean_spearman
+ value: 91.31835942311758
+ - type: manhattan_pearson
+ value: 91.57500202032934
+ - type: manhattan_spearman
+ value: 91.1790925526635
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (en)
+ config: en
+ split: test
+ revision: eea2b4fe26a775864c896887d910b76a8098ad3f
+ metrics:
+ - type: cos_sim_pearson
+ value: 69.87136205329556
+ - type: cos_sim_spearman
+ value: 68.6253154635078
+ - type: euclidean_pearson
+ value: 68.91536015034222
+ - type: euclidean_spearman
+ value: 67.63744649352542
+ - type: manhattan_pearson
+ value: 69.2000713045275
+ - type: manhattan_spearman
+ value: 68.16002901587316
+ - task:
+ type: STS
+ dataset:
+ type: mteb/stsbenchmark-sts
+ name: MTEB STSBenchmark
+ config: default
+ split: test
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
+ metrics:
+ - type: cos_sim_pearson
+ value: 85.21849551039082
+ - type: cos_sim_spearman
+ value: 85.6392959372461
+ - type: euclidean_pearson
+ value: 85.92050852609488
+ - type: euclidean_spearman
+ value: 85.97205649009734
+ - type: manhattan_pearson
+ value: 86.1031154802254
+ - type: manhattan_spearman
+ value: 86.26791155517466
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/scidocs-reranking
+ name: MTEB SciDocsRR
+ config: default
+ split: test
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
+ metrics:
+ - type: map
+ value: 86.83953958636627
+ - type: mrr
+ value: 96.71167612344082
+ - task:
+ type: Retrieval
+ dataset:
+ type: scifact
+ name: MTEB SciFact
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 64.994
+ - type: map_at_10
+ value: 74.763
+ - type: map_at_100
+ value: 75.127
+ - type: map_at_1000
+ value: 75.143
+ - type: map_at_3
+ value: 71.824
+ - type: map_at_5
+ value: 73.71
+ - type: mrr_at_1
+ value: 68.333
+ - type: mrr_at_10
+ value: 75.749
+ - type: mrr_at_100
+ value: 75.922
+ - type: mrr_at_1000
+ value: 75.938
+ - type: mrr_at_3
+ value: 73.556
+ - type: mrr_at_5
+ value: 74.739
+ - type: ndcg_at_1
+ value: 68.333
+ - type: ndcg_at_10
+ value: 79.174
+ - type: ndcg_at_100
+ value: 80.41
+ - type: ndcg_at_1000
+ value: 80.804
+ - type: ndcg_at_3
+ value: 74.361
+ - type: ndcg_at_5
+ value: 76.861
+ - type: precision_at_1
+ value: 68.333
+ - type: precision_at_10
+ value: 10.333
+ - type: precision_at_100
+ value: 1.0999999999999999
+ - type: precision_at_1000
+ value: 0.11299999999999999
+ - type: precision_at_3
+ value: 28.778
+ - type: precision_at_5
+ value: 19.067
+ - type: recall_at_1
+ value: 64.994
+ - type: recall_at_10
+ value: 91.822
+ - type: recall_at_100
+ value: 97.0
+ - type: recall_at_1000
+ value: 100.0
+ - type: recall_at_3
+ value: 78.878
+ - type: recall_at_5
+ value: 85.172
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/sprintduplicatequestions-pairclassification
+ name: MTEB SprintDuplicateQuestions
+ config: default
+ split: test
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
+ metrics:
+ - type: cos_sim_accuracy
+ value: 99.72079207920792
+ - type: cos_sim_ap
+ value: 93.00265215525152
+ - type: cos_sim_f1
+ value: 85.06596306068602
+ - type: cos_sim_precision
+ value: 90.05586592178771
+ - type: cos_sim_recall
+ value: 80.60000000000001
+ - type: dot_accuracy
+ value: 99.66039603960397
+ - type: dot_ap
+ value: 91.22371407479089
+ - type: dot_f1
+ value: 82.34693877551021
+ - type: dot_precision
+ value: 84.0625
+ - type: dot_recall
+ value: 80.7
+ - type: euclidean_accuracy
+ value: 99.71881188118812
+ - type: euclidean_ap
+ value: 92.88449963304728
+ - type: euclidean_f1
+ value: 85.19480519480518
+ - type: euclidean_precision
+ value: 88.64864864864866
+ - type: euclidean_recall
+ value: 82.0
+ - type: manhattan_accuracy
+ value: 99.73267326732673
+ - type: manhattan_ap
+ value: 93.23055393056883
+ - type: manhattan_f1
+ value: 85.88957055214725
+ - type: manhattan_precision
+ value: 87.86610878661088
+ - type: manhattan_recall
+ value: 84.0
+ - type: max_accuracy
+ value: 99.73267326732673
+ - type: max_ap
+ value: 93.23055393056883
+ - type: max_f1
+ value: 85.88957055214725
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering
+ name: MTEB StackExchangeClustering
+ config: default
+ split: test
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
+ metrics:
+ - type: v_measure
+ value: 77.3305735900358
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering-p2p
+ name: MTEB StackExchangeClusteringP2P
+ config: default
+ split: test
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
+ metrics:
+ - type: v_measure
+ value: 41.32967136540674
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/stackoverflowdupquestions-reranking
+ name: MTEB StackOverflowDupQuestions
+ config: default
+ split: test
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
+ metrics:
+ - type: map
+ value: 55.95514866379359
+ - type: mrr
+ value: 56.95423245055598
+ - task:
+ type: Summarization
+ dataset:
+ type: mteb/summeval
+ name: MTEB SummEval
+ config: default
+ split: test
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
+ metrics:
+ - type: cos_sim_pearson
+ value: 30.783007208997144
+ - type: cos_sim_spearman
+ value: 30.373444721540533
+ - type: dot_pearson
+ value: 29.210604111143905
+ - type: dot_spearman
+ value: 29.98809758085659
+ - task:
+ type: Retrieval
+ dataset:
+ type: trec-covid
+ name: MTEB TRECCOVID
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 0.234
+ - type: map_at_10
+ value: 1.894
+ - type: map_at_100
+ value: 1.894
+ - type: map_at_1000
+ value: 1.894
+ - type: map_at_3
+ value: 0.636
+ - type: map_at_5
+ value: 1.0
+ - type: mrr_at_1
+ value: 88.0
+ - type: mrr_at_10
+ value: 93.667
+ - type: mrr_at_100
+ value: 93.667
+ - type: mrr_at_1000
+ value: 93.667
+ - type: mrr_at_3
+ value: 93.667
+ - type: mrr_at_5
+ value: 93.667
+ - type: ndcg_at_1
+ value: 85.0
+ - type: ndcg_at_10
+ value: 74.798
+ - type: ndcg_at_100
+ value: 16.462
+ - type: ndcg_at_1000
+ value: 7.0889999999999995
+ - type: ndcg_at_3
+ value: 80.754
+ - type: ndcg_at_5
+ value: 77.319
+ - type: precision_at_1
+ value: 88.0
+ - type: precision_at_10
+ value: 78.0
+ - type: precision_at_100
+ value: 7.8
+ - type: precision_at_1000
+ value: 0.7799999999999999
+ - type: precision_at_3
+ value: 83.333
+ - type: precision_at_5
+ value: 80.80000000000001
+ - type: recall_at_1
+ value: 0.234
+ - type: recall_at_10
+ value: 2.093
+ - type: recall_at_100
+ value: 2.093
+ - type: recall_at_1000
+ value: 2.093
+ - type: recall_at_3
+ value: 0.662
+ - type: recall_at_5
+ value: 1.0739999999999998
+ - task:
+ type: Retrieval
+ dataset:
+ type: webis-touche2020
+ name: MTEB Touche2020
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 2.703
+ - type: map_at_10
+ value: 10.866000000000001
+ - type: map_at_100
+ value: 10.866000000000001
+ - type: map_at_1000
+ value: 10.866000000000001
+ - type: map_at_3
+ value: 5.909
+ - type: map_at_5
+ value: 7.35
+ - type: mrr_at_1
+ value: 36.735
+ - type: mrr_at_10
+ value: 53.583000000000006
+ - type: mrr_at_100
+ value: 53.583000000000006
+ - type: mrr_at_1000
+ value: 53.583000000000006
+ - type: mrr_at_3
+ value: 49.32
+ - type: mrr_at_5
+ value: 51.769
+ - type: ndcg_at_1
+ value: 34.694
+ - type: ndcg_at_10
+ value: 27.926000000000002
+ - type: ndcg_at_100
+ value: 22.701
+ - type: ndcg_at_1000
+ value: 22.701
+ - type: ndcg_at_3
+ value: 32.073
+ - type: ndcg_at_5
+ value: 28.327999999999996
+ - type: precision_at_1
+ value: 36.735
+ - type: precision_at_10
+ value: 24.694
+ - type: precision_at_100
+ value: 2.469
+ - type: precision_at_1000
+ value: 0.247
+ - type: precision_at_3
+ value: 31.973000000000003
+ - type: precision_at_5
+ value: 26.939
+ - type: recall_at_1
+ value: 2.703
+ - type: recall_at_10
+ value: 17.702
+ - type: recall_at_100
+ value: 17.702
+ - type: recall_at_1000
+ value: 17.702
+ - type: recall_at_3
+ value: 7.208
+ - type: recall_at_5
+ value: 9.748999999999999
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/toxic_conversations_50k
+ name: MTEB ToxicConversationsClassification
+ config: default
+ split: test
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
+ metrics:
+ - type: accuracy
+ value: 70.79960000000001
+ - type: ap
+ value: 15.467565415565815
+ - type: f1
+ value: 55.28639823443618
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/tweet_sentiment_extraction
+ name: MTEB TweetSentimentExtractionClassification
+ config: default
+ split: test
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
+ metrics:
+ - type: accuracy
+ value: 64.7792869269949
+ - type: f1
+ value: 65.08597154774318
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/twentynewsgroups-clustering
+ name: MTEB TwentyNewsgroupsClustering
+ config: default
+ split: test
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
+ metrics:
+ - type: v_measure
+ value: 55.70352297774293
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twittersemeval2015-pairclassification
+ name: MTEB TwitterSemEval2015
+ config: default
+ split: test
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
+ metrics:
+ - type: cos_sim_accuracy
+ value: 88.27561542588067
+ - type: cos_sim_ap
+ value: 81.08262141256193
+ - type: cos_sim_f1
+ value: 73.82341501361338
+ - type: cos_sim_precision
+ value: 72.5720112159062
+ - type: cos_sim_recall
+ value: 75.11873350923483
+ - type: dot_accuracy
+ value: 86.66030875603504
+ - type: dot_ap
+ value: 76.6052349228621
+ - type: dot_f1
+ value: 70.13897280966768
+ - type: dot_precision
+ value: 64.70457079152732
+ - type: dot_recall
+ value: 76.56992084432717
+ - type: euclidean_accuracy
+ value: 88.37098408535495
+ - type: euclidean_ap
+ value: 81.12515230092113
+ - type: euclidean_f1
+ value: 74.10338225909379
+ - type: euclidean_precision
+ value: 71.76761433868974
+ - type: euclidean_recall
+ value: 76.59630606860158
+ - type: manhattan_accuracy
+ value: 88.34118137926924
+ - type: manhattan_ap
+ value: 80.95751834536561
+ - type: manhattan_f1
+ value: 73.9119496855346
+ - type: manhattan_precision
+ value: 70.625
+ - type: manhattan_recall
+ value: 77.5197889182058
+ - type: max_accuracy
+ value: 88.37098408535495
+ - type: max_ap
+ value: 81.12515230092113
+ - type: max_f1
+ value: 74.10338225909379
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twitterurlcorpus-pairclassification
+ name: MTEB TwitterURLCorpus
+ config: default
+ split: test
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
+ metrics:
+ - type: cos_sim_accuracy
+ value: 89.79896767182831
+ - type: cos_sim_ap
+ value: 87.40071784061065
+ - type: cos_sim_f1
+ value: 79.87753144712087
+ - type: cos_sim_precision
+ value: 76.67304015296367
+ - type: cos_sim_recall
+ value: 83.3615645210964
+ - type: dot_accuracy
+ value: 88.95486474948578
+ - type: dot_ap
+ value: 86.00227979119943
+ - type: dot_f1
+ value: 78.54601474525914
+ - type: dot_precision
+ value: 75.00525394045535
+ - type: dot_recall
+ value: 82.43763473975977
+ - type: euclidean_accuracy
+ value: 89.7892653393876
+ - type: euclidean_ap
+ value: 87.42174706480819
+ - type: euclidean_f1
+ value: 80.07283321194465
+ - type: euclidean_precision
+ value: 75.96738529574351
+ - type: euclidean_recall
+ value: 84.6473668001232
+ - type: manhattan_accuracy
+ value: 89.8474793340319
+ - type: manhattan_ap
+ value: 87.47814292587448
+ - type: manhattan_f1
+ value: 80.15461150280949
+ - type: manhattan_precision
+ value: 74.88798234468
+ - type: manhattan_recall
+ value: 86.21804742839544
+ - type: max_accuracy
+ value: 89.8474793340319
+ - type: max_ap
+ value: 87.47814292587448
+ - type: max_f1
+ value: 80.15461150280949
+ ---
+
+ # Model Summary
+
+ > GritLM is a generative representational instruction-tuned language model. It unifies text representation (embedding) and text generation into a single model, achieving state-of-the-art performance on both types of tasks.
+
+ - **Repository:** [ContextualAI/gritlm](https://github.com/ContextualAI/gritlm)
+ - **Paper:** https://arxiv.org/abs/2402.09906
+ - **Logs:** https://wandb.ai/muennighoff/gritlm/runs/0uui712t/overview
+ - **Script:** https://github.com/ContextualAI/gritlm/blob/main/scripts/training/train_gritlm_7b.sh
+
+ | Model | Description |
+ |-------|-------------|
+ | [GritLM 7B](https://hf.co/GritLM/GritLM-7B) | Mistral 7B finetuned using GRIT |
+ | [GritLM 8x7B](https://hf.co/GritLM/GritLM-8x7B) | Mixtral 8x7B finetuned using GRIT |
+
+ # Use
+
+ Model usage is documented [here](https://github.com/ContextualAI/gritlm?tab=readme-ov-file#inference); a minimal sketch is shown below.
+
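+ The sketch below is a hedged illustration of that documented dual embedding/generation workflow. It assumes the `gritlm` Python package is installed (`pip install gritlm`) and loads the original `GritLM/GritLM-7B` checkpoint; the query, document, instruction, and generation settings are illustrative placeholders, and the linked repository remains the authoritative reference for the exact interface.
+
+ ```python
+ # Minimal sketch of GritLM's unified embedding + generation use (assumes `pip install gritlm`).
+ from gritlm import GritLM
+
+ # A single model instance serves both embedding and generation.
+ model = GritLM("GritLM/GritLM-7B", torch_dtype="auto")
+
+ # Embedding mode: inputs are wrapped with an optional natural-language instruction.
+ def gritlm_instruction(instruction):
+     return "<|user|>\n" + instruction + "\n<|embed|>\n" if instruction else "<|embed|>\n"
+
+ queries = ["What is generative representational instruction tuning?"]  # illustrative query
+ documents = ["GRIT trains one model to handle both embedding and generation tasks."]
+ q_rep = model.encode(queries, instruction=gritlm_instruction("Retrieve relevant passages"))
+ d_rep = model.encode(documents, instruction=gritlm_instruction(""))
+
+ # Generation mode: standard chat-style prompting through the underlying tokenizer.
+ messages = [{"role": "user", "content": "Summarize GRIT in one sentence."}]
+ inputs = model.tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
+ outputs = model.generate(inputs.to(model.device), max_new_tokens=64, do_sample=False)
+ print(model.tokenizer.batch_decode(outputs)[0])
+ ```
+
+ Cosine similarity between `q_rep` and `d_rep` can then be used as a retrieval score, as in the repository's examples.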
+ # Citation
+
+ ```bibtex
+ @misc{muennighoff2024generative,
+ title={Generative Representational Instruction Tuning},
+ author={Niklas Muennighoff and Hongjin Su and Liang Wang and Nan Yang and Furu Wei and Tao Yu and Amanpreet Singh and Douwe Kiela},
+ year={2024},
+ eprint={2402.09906},
+ archivePrefix={arXiv},
+ primaryClass={cs.CL}
+ }
+ ```
+