consciousAI committed
Commit 068c374
1 Parent(s): 4f835e8

Upload README.md

Files changed (1)
  1. README.md +1388 -1
README.md CHANGED
@@ -9,7 +9,1394 @@ tags:
 - feature-extraction
 - sentence-similarity
 - transformers
-
 ---
 
 # {MODEL_NAME}
+ - mteb
+ model-index:
+ - name: cai-lunaris-text-embeddings
+ results:
+ - task:
+ type: Retrieval
+ dataset:
+ type: arguana
+ name: MTEB ArguAna
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 17.07
+ - type: map_at_10
+ value: 29.372999999999998
+ - type: map_at_100
+ value: 30.79
+ - type: map_at_1000
+ value: 30.819999999999997
+ - type: map_at_3
+ value: 24.395
+ - type: map_at_5
+ value: 27.137
+ - type: mrr_at_1
+ value: 17.923000000000002
+ - type: mrr_at_10
+ value: 29.695
+ - type: mrr_at_100
+ value: 31.098
+ - type: mrr_at_1000
+ value: 31.128
+ - type: mrr_at_3
+ value: 24.704
+ - type: mrr_at_5
+ value: 27.449
+ - type: ndcg_at_1
+ value: 17.07
+ - type: ndcg_at_10
+ value: 37.269000000000005
+ - type: ndcg_at_100
+ value: 43.716
+ - type: ndcg_at_1000
+ value: 44.531
+ - type: ndcg_at_3
+ value: 26.839000000000002
+ - type: ndcg_at_5
+ value: 31.845000000000002
+ - type: precision_at_1
+ value: 17.07
+ - type: precision_at_10
+ value: 6.3020000000000005
+ - type: precision_at_100
+ value: 0.922
+ - type: precision_at_1000
+ value: 0.099
+ - type: precision_at_3
+ value: 11.309
+ - type: precision_at_5
+ value: 9.246
+ - type: recall_at_1
+ value: 17.07
+ - type: recall_at_10
+ value: 63.016000000000005
+ - type: recall_at_100
+ value: 92.24799999999999
+ - type: recall_at_1000
+ value: 98.72
+ - type: recall_at_3
+ value: 33.926
+ - type: recall_at_5
+ value: 46.23
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/askubuntudupquestions-reranking
+ name: MTEB AskUbuntuDupQuestions
+ config: default
+ split: test
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
+ metrics:
+ - type: map
+ value: 53.44266265900711
+ - type: mrr
+ value: 66.54695950402322
+ - task:
+ type: STS
+ dataset:
+ type: mteb/biosses-sts
+ name: MTEB BIOSSES
+ config: default
+ split: test
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
+ metrics:
+ - type: cos_sim_pearson
+ value: 75.9652953730204
+ - type: cos_sim_spearman
+ value: 73.96554077670989
+ - type: euclidean_pearson
+ value: 75.68477255792381
+ - type: euclidean_spearman
+ value: 74.59447076995703
+ - type: manhattan_pearson
+ value: 75.94984623881341
+ - type: manhattan_spearman
+ value: 74.72218452337502
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackAndroidRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 14.119000000000002
+ - type: map_at_10
+ value: 19.661
+ - type: map_at_100
+ value: 20.706
+ - type: map_at_1000
+ value: 20.848
+ - type: map_at_3
+ value: 17.759
+ - type: map_at_5
+ value: 18.645
+ - type: mrr_at_1
+ value: 17.166999999999998
+ - type: mrr_at_10
+ value: 23.313
+ - type: mrr_at_100
+ value: 24.263
+ - type: mrr_at_1000
+ value: 24.352999999999998
+ - type: mrr_at_3
+ value: 21.412
+ - type: mrr_at_5
+ value: 22.313
+ - type: ndcg_at_1
+ value: 17.166999999999998
+ - type: ndcg_at_10
+ value: 23.631
+ - type: ndcg_at_100
+ value: 28.427000000000003
+ - type: ndcg_at_1000
+ value: 31.862000000000002
+ - type: ndcg_at_3
+ value: 20.175
+ - type: ndcg_at_5
+ value: 21.397
+ - type: precision_at_1
+ value: 17.166999999999998
+ - type: precision_at_10
+ value: 4.549
+ - type: precision_at_100
+ value: 0.8370000000000001
+ - type: precision_at_1000
+ value: 0.136
+ - type: precision_at_3
+ value: 9.68
+ - type: precision_at_5
+ value: 6.981
+ - type: recall_at_1
+ value: 14.119000000000002
+ - type: recall_at_10
+ value: 32.147999999999996
+ - type: recall_at_100
+ value: 52.739999999999995
+ - type: recall_at_1000
+ value: 76.67
+ - type: recall_at_3
+ value: 22.019
+ - type: recall_at_5
+ value: 25.361
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackEnglishRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 16.576
+ - type: map_at_10
+ value: 22.281000000000002
+ - type: map_at_100
+ value: 23.066
+ - type: map_at_1000
+ value: 23.166
+ - type: map_at_3
+ value: 20.385
+ - type: map_at_5
+ value: 21.557000000000002
+ - type: mrr_at_1
+ value: 20.892
+ - type: mrr_at_10
+ value: 26.605
+ - type: mrr_at_100
+ value: 27.229
+ - type: mrr_at_1000
+ value: 27.296
+ - type: mrr_at_3
+ value: 24.809
+ - type: mrr_at_5
+ value: 25.927
+ - type: ndcg_at_1
+ value: 20.892
+ - type: ndcg_at_10
+ value: 26.092
+ - type: ndcg_at_100
+ value: 29.398999999999997
+ - type: ndcg_at_1000
+ value: 31.884
+ - type: ndcg_at_3
+ value: 23.032
+ - type: ndcg_at_5
+ value: 24.634
+ - type: precision_at_1
+ value: 20.892
+ - type: precision_at_10
+ value: 4.885
+ - type: precision_at_100
+ value: 0.818
+ - type: precision_at_1000
+ value: 0.126
+ - type: precision_at_3
+ value: 10.977
+ - type: precision_at_5
+ value: 8.013
+ - type: recall_at_1
+ value: 16.576
+ - type: recall_at_10
+ value: 32.945
+ - type: recall_at_100
+ value: 47.337
+ - type: recall_at_1000
+ value: 64.592
+ - type: recall_at_3
+ value: 24.053
+ - type: recall_at_5
+ value: 28.465
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackGamingRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 20.604
+ - type: map_at_10
+ value: 28.754999999999995
+ - type: map_at_100
+ value: 29.767
+ - type: map_at_1000
+ value: 29.852
+ - type: map_at_3
+ value: 26.268
+ - type: map_at_5
+ value: 27.559
+ - type: mrr_at_1
+ value: 24.326
+ - type: mrr_at_10
+ value: 31.602000000000004
+ - type: mrr_at_100
+ value: 32.46
+ - type: mrr_at_1000
+ value: 32.521
+ - type: mrr_at_3
+ value: 29.415000000000003
+ - type: mrr_at_5
+ value: 30.581000000000003
+ - type: ndcg_at_1
+ value: 24.326
+ - type: ndcg_at_10
+ value: 33.335
+ - type: ndcg_at_100
+ value: 38.086
+ - type: ndcg_at_1000
+ value: 40.319
+ - type: ndcg_at_3
+ value: 28.796
+ - type: ndcg_at_5
+ value: 30.758999999999997
+ - type: precision_at_1
+ value: 24.326
+ - type: precision_at_10
+ value: 5.712
+ - type: precision_at_100
+ value: 0.893
+ - type: precision_at_1000
+ value: 0.11499999999999999
+ - type: precision_at_3
+ value: 13.208
+ - type: precision_at_5
+ value: 9.329
+ - type: recall_at_1
+ value: 20.604
+ - type: recall_at_10
+ value: 44.505
+ - type: recall_at_100
+ value: 65.866
+ - type: recall_at_1000
+ value: 82.61800000000001
+ - type: recall_at_3
+ value: 31.794
+ - type: recall_at_5
+ value: 36.831
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackGisRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 8.280999999999999
+ - type: map_at_10
+ value: 11.636000000000001
+ - type: map_at_100
+ value: 12.363
+ - type: map_at_1000
+ value: 12.469
+ - type: map_at_3
+ value: 10.415000000000001
+ - type: map_at_5
+ value: 11.144
+ - type: mrr_at_1
+ value: 9.266
+ - type: mrr_at_10
+ value: 12.838
+ - type: mrr_at_100
+ value: 13.608999999999998
+ - type: mrr_at_1000
+ value: 13.700999999999999
+ - type: mrr_at_3
+ value: 11.507000000000001
+ - type: mrr_at_5
+ value: 12.343
+ - type: ndcg_at_1
+ value: 9.266
+ - type: ndcg_at_10
+ value: 13.877
+ - type: ndcg_at_100
+ value: 18.119
+ - type: ndcg_at_1000
+ value: 21.247
+ - type: ndcg_at_3
+ value: 11.376999999999999
+ - type: ndcg_at_5
+ value: 12.675
+ - type: precision_at_1
+ value: 9.266
+ - type: precision_at_10
+ value: 2.226
+ - type: precision_at_100
+ value: 0.47200000000000003
+ - type: precision_at_1000
+ value: 0.077
+ - type: precision_at_3
+ value: 4.859
+ - type: precision_at_5
+ value: 3.6380000000000003
+ - type: recall_at_1
+ value: 8.280999999999999
+ - type: recall_at_10
+ value: 19.872999999999998
+ - type: recall_at_100
+ value: 40.585
+ - type: recall_at_1000
+ value: 65.225
+ - type: recall_at_3
+ value: 13.014000000000001
+ - type: recall_at_5
+ value: 16.147
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackMathematicaRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 4.1209999999999996
+ - type: map_at_10
+ value: 7.272
+ - type: map_at_100
+ value: 8.079
+ - type: map_at_1000
+ value: 8.199
+ - type: map_at_3
+ value: 6.212
+ - type: map_at_5
+ value: 6.736000000000001
+ - type: mrr_at_1
+ value: 5.721
+ - type: mrr_at_10
+ value: 9.418
+ - type: mrr_at_100
+ value: 10.281
+ - type: mrr_at_1000
+ value: 10.385
+ - type: mrr_at_3
+ value: 8.126
+ - type: mrr_at_5
+ value: 8.779
+ - type: ndcg_at_1
+ value: 5.721
+ - type: ndcg_at_10
+ value: 9.673
+ - type: ndcg_at_100
+ value: 13.852999999999998
+ - type: ndcg_at_1000
+ value: 17.546999999999997
+ - type: ndcg_at_3
+ value: 7.509
+ - type: ndcg_at_5
+ value: 8.373
+ - type: precision_at_1
+ value: 5.721
+ - type: precision_at_10
+ value: 2.04
+ - type: precision_at_100
+ value: 0.48
+ - type: precision_at_1000
+ value: 0.093
+ - type: precision_at_3
+ value: 4.022
+ - type: precision_at_5
+ value: 3.06
+ - type: recall_at_1
+ value: 4.1209999999999996
+ - type: recall_at_10
+ value: 15.201
+ - type: recall_at_100
+ value: 33.922999999999995
+ - type: recall_at_1000
+ value: 61.529999999999994
+ - type: recall_at_3
+ value: 8.869
+ - type: recall_at_5
+ value: 11.257
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackPhysicsRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 14.09
+ - type: map_at_10
+ value: 19.573999999999998
+ - type: map_at_100
+ value: 20.580000000000002
+ - type: map_at_1000
+ value: 20.704
+ - type: map_at_3
+ value: 17.68
+ - type: map_at_5
+ value: 18.64
+ - type: mrr_at_1
+ value: 17.227999999999998
+ - type: mrr_at_10
+ value: 23.152
+ - type: mrr_at_100
+ value: 24.056
+ - type: mrr_at_1000
+ value: 24.141000000000002
+ - type: mrr_at_3
+ value: 21.142
+ - type: mrr_at_5
+ value: 22.201
+ - type: ndcg_at_1
+ value: 17.227999999999998
+ - type: ndcg_at_10
+ value: 23.39
+ - type: ndcg_at_100
+ value: 28.483999999999998
+ - type: ndcg_at_1000
+ value: 31.709
+ - type: ndcg_at_3
+ value: 19.883
+ - type: ndcg_at_5
+ value: 21.34
+ - type: precision_at_1
+ value: 17.227999999999998
+ - type: precision_at_10
+ value: 4.3790000000000004
+ - type: precision_at_100
+ value: 0.826
+ - type: precision_at_1000
+ value: 0.128
+ - type: precision_at_3
+ value: 9.496
+ - type: precision_at_5
+ value: 6.872
+ - type: recall_at_1
+ value: 14.09
+ - type: recall_at_10
+ value: 31.580000000000002
+ - type: recall_at_100
+ value: 54.074
+ - type: recall_at_1000
+ value: 77.092
+ - type: recall_at_3
+ value: 21.601
+ - type: recall_at_5
+ value: 25.333
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackProgrammersRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 10.538
+ - type: map_at_10
+ value: 15.75
+ - type: map_at_100
+ value: 16.71
+ - type: map_at_1000
+ value: 16.838
+ - type: map_at_3
+ value: 13.488
+ - type: map_at_5
+ value: 14.712
+ - type: mrr_at_1
+ value: 13.813
+ - type: mrr_at_10
+ value: 19.08
+ - type: mrr_at_100
+ value: 19.946
+ - type: mrr_at_1000
+ value: 20.044
+ - type: mrr_at_3
+ value: 16.838
+ - type: mrr_at_5
+ value: 17.951
+ - type: ndcg_at_1
+ value: 13.813
+ - type: ndcg_at_10
+ value: 19.669
+ - type: ndcg_at_100
+ value: 24.488
+ - type: ndcg_at_1000
+ value: 27.87
+ - type: ndcg_at_3
+ value: 15.479000000000001
+ - type: ndcg_at_5
+ value: 17.229
+ - type: precision_at_1
+ value: 13.813
+ - type: precision_at_10
+ value: 3.916
+ - type: precision_at_100
+ value: 0.743
+ - type: precision_at_1000
+ value: 0.122
+ - type: precision_at_3
+ value: 7.534000000000001
+ - type: precision_at_5
+ value: 5.822
+ - type: recall_at_1
+ value: 10.538
+ - type: recall_at_10
+ value: 28.693
+ - type: recall_at_100
+ value: 50.308
+ - type: recall_at_1000
+ value: 74.44
+ - type: recall_at_3
+ value: 16.866999999999997
+ - type: recall_at_5
+ value: 21.404999999999998
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 11.044583333333332
+ - type: map_at_10
+ value: 15.682833333333335
+ - type: map_at_100
+ value: 16.506500000000003
+ - type: map_at_1000
+ value: 16.623833333333334
+ - type: map_at_3
+ value: 14.130833333333333
+ - type: map_at_5
+ value: 14.963583333333332
+ - type: mrr_at_1
+ value: 13.482833333333332
+ - type: mrr_at_10
+ value: 18.328500000000002
+ - type: mrr_at_100
+ value: 19.095416666666665
+ - type: mrr_at_1000
+ value: 19.18241666666666
+ - type: mrr_at_3
+ value: 16.754749999999998
+ - type: mrr_at_5
+ value: 17.614749999999997
+ - type: ndcg_at_1
+ value: 13.482833333333332
+ - type: ndcg_at_10
+ value: 18.81491666666667
+ - type: ndcg_at_100
+ value: 22.946833333333334
+ - type: ndcg_at_1000
+ value: 26.061083333333336
+ - type: ndcg_at_3
+ value: 15.949333333333332
+ - type: ndcg_at_5
+ value: 17.218333333333334
+ - type: precision_at_1
+ value: 13.482833333333332
+ - type: precision_at_10
+ value: 3.456583333333333
+ - type: precision_at_100
+ value: 0.6599166666666666
+ - type: precision_at_1000
+ value: 0.109
+ - type: precision_at_3
+ value: 7.498833333333332
+ - type: precision_at_5
+ value: 5.477166666666667
+ - type: recall_at_1
+ value: 11.044583333333332
+ - type: recall_at_10
+ value: 25.737750000000005
+ - type: recall_at_100
+ value: 44.617916666666666
+ - type: recall_at_1000
+ value: 67.56524999999999
+ - type: recall_at_3
+ value: 17.598249999999997
+ - type: recall_at_5
+ value: 20.9035
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackStatsRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 9.362
+ - type: map_at_10
+ value: 13.414000000000001
+ - type: map_at_100
+ value: 14.083000000000002
+ - type: map_at_1000
+ value: 14.168
+ - type: map_at_3
+ value: 12.098
+ - type: map_at_5
+ value: 12.803999999999998
+ - type: mrr_at_1
+ value: 11.043
+ - type: mrr_at_10
+ value: 15.158
+ - type: mrr_at_100
+ value: 15.845999999999998
+ - type: mrr_at_1000
+ value: 15.916
+ - type: mrr_at_3
+ value: 13.88
+ - type: mrr_at_5
+ value: 14.601
+ - type: ndcg_at_1
+ value: 11.043
+ - type: ndcg_at_10
+ value: 16.034000000000002
+ - type: ndcg_at_100
+ value: 19.686
+ - type: ndcg_at_1000
+ value: 22.188
+ - type: ndcg_at_3
+ value: 13.530000000000001
+ - type: ndcg_at_5
+ value: 14.704
+ - type: precision_at_1
+ value: 11.043
+ - type: precision_at_10
+ value: 2.791
+ - type: precision_at_100
+ value: 0.5
+ - type: precision_at_1000
+ value: 0.077
+ - type: precision_at_3
+ value: 6.237
+ - type: precision_at_5
+ value: 4.5089999999999995
+ - type: recall_at_1
+ value: 9.362
+ - type: recall_at_10
+ value: 22.396
+ - type: recall_at_100
+ value: 39.528999999999996
+ - type: recall_at_1000
+ value: 58.809
+ - type: recall_at_3
+ value: 15.553
+ - type: recall_at_5
+ value: 18.512
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackTexRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 5.657
+ - type: map_at_10
+ value: 8.273
+ - type: map_at_100
+ value: 8.875
+ - type: map_at_1000
+ value: 8.977
+ - type: map_at_3
+ value: 7.32
+ - type: map_at_5
+ value: 7.792000000000001
+ - type: mrr_at_1
+ value: 7.02
+ - type: mrr_at_10
+ value: 9.966999999999999
+ - type: mrr_at_100
+ value: 10.636
+ - type: mrr_at_1000
+ value: 10.724
+ - type: mrr_at_3
+ value: 8.872
+ - type: mrr_at_5
+ value: 9.461
+ - type: ndcg_at_1
+ value: 7.02
+ - type: ndcg_at_10
+ value: 10.199
+ - type: ndcg_at_100
+ value: 13.642000000000001
+ - type: ndcg_at_1000
+ value: 16.643
+ - type: ndcg_at_3
+ value: 8.333
+ - type: ndcg_at_5
+ value: 9.103
+ - type: precision_at_1
+ value: 7.02
+ - type: precision_at_10
+ value: 1.8929999999999998
+ - type: precision_at_100
+ value: 0.43
+ - type: precision_at_1000
+ value: 0.08099999999999999
+ - type: precision_at_3
+ value: 3.843
+ - type: precision_at_5
+ value: 2.884
+ - type: recall_at_1
+ value: 5.657
+ - type: recall_at_10
+ value: 14.563
+ - type: recall_at_100
+ value: 30.807000000000002
+ - type: recall_at_1000
+ value: 53.251000000000005
+ - type: recall_at_3
+ value: 9.272
+ - type: recall_at_5
+ value: 11.202
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackUnixRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 10.671999999999999
+ - type: map_at_10
+ value: 14.651
+ - type: map_at_100
+ value: 15.406
+ - type: map_at_1000
+ value: 15.525
+ - type: map_at_3
+ value: 13.461
+ - type: map_at_5
+ value: 14.163
+ - type: mrr_at_1
+ value: 12.407
+ - type: mrr_at_10
+ value: 16.782
+ - type: mrr_at_100
+ value: 17.562
+ - type: mrr_at_1000
+ value: 17.653
+ - type: mrr_at_3
+ value: 15.47
+ - type: mrr_at_5
+ value: 16.262
+ - type: ndcg_at_1
+ value: 12.407
+ - type: ndcg_at_10
+ value: 17.251
+ - type: ndcg_at_100
+ value: 21.378
+ - type: ndcg_at_1000
+ value: 24.689
+ - type: ndcg_at_3
+ value: 14.915000000000001
+ - type: ndcg_at_5
+ value: 16.1
+ - type: precision_at_1
+ value: 12.407
+ - type: precision_at_10
+ value: 2.91
+ - type: precision_at_100
+ value: 0.573
+ - type: precision_at_1000
+ value: 0.096
+ - type: precision_at_3
+ value: 6.779
+ - type: precision_at_5
+ value: 4.888
+ - type: recall_at_1
+ value: 10.671999999999999
+ - type: recall_at_10
+ value: 23.099
+ - type: recall_at_100
+ value: 41.937999999999995
+ - type: recall_at_1000
+ value: 66.495
+ - type: recall_at_3
+ value: 16.901
+ - type: recall_at_5
+ value: 19.807
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackWebmastersRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 13.364
+ - type: map_at_10
+ value: 17.772
+ - type: map_at_100
+ value: 18.659
+ - type: map_at_1000
+ value: 18.861
+ - type: map_at_3
+ value: 16.659
+ - type: map_at_5
+ value: 17.174
+ - type: mrr_at_1
+ value: 16.996
+ - type: mrr_at_10
+ value: 21.687
+ - type: mrr_at_100
+ value: 22.313
+ - type: mrr_at_1000
+ value: 22.422
+ - type: mrr_at_3
+ value: 20.652
+ - type: mrr_at_5
+ value: 21.146
+ - type: ndcg_at_1
+ value: 16.996
+ - type: ndcg_at_10
+ value: 21.067
+ - type: ndcg_at_100
+ value: 24.829
+ - type: ndcg_at_1000
+ value: 28.866999999999997
+ - type: ndcg_at_3
+ value: 19.466
+ - type: ndcg_at_5
+ value: 19.993
+ - type: precision_at_1
+ value: 16.996
+ - type: precision_at_10
+ value: 4.071000000000001
+ - type: precision_at_100
+ value: 0.9329999999999999
+ - type: precision_at_1000
+ value: 0.183
+ - type: precision_at_3
+ value: 9.223
+ - type: precision_at_5
+ value: 6.4030000000000005
+ - type: recall_at_1
+ value: 13.364
+ - type: recall_at_10
+ value: 25.976
+ - type: recall_at_100
+ value: 44.134
+ - type: recall_at_1000
+ value: 73.181
+ - type: recall_at_3
+ value: 20.503
+ - type: recall_at_5
+ value: 22.409000000000002
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackWordpressRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 5.151
+ - type: map_at_10
+ value: 9.155000000000001
+ - type: map_at_100
+ value: 9.783999999999999
+ - type: map_at_1000
+ value: 9.879
+ - type: map_at_3
+ value: 7.825
+ - type: map_at_5
+ value: 8.637
+ - type: mrr_at_1
+ value: 5.915
+ - type: mrr_at_10
+ value: 10.34
+ - type: mrr_at_100
+ value: 10.943999999999999
+ - type: mrr_at_1000
+ value: 11.033
+ - type: mrr_at_3
+ value: 8.934000000000001
+ - type: mrr_at_5
+ value: 9.812
+ - type: ndcg_at_1
+ value: 5.915
+ - type: ndcg_at_10
+ value: 11.561
+ - type: ndcg_at_100
+ value: 14.971
+ - type: ndcg_at_1000
+ value: 17.907999999999998
+ - type: ndcg_at_3
+ value: 8.896999999999998
+ - type: ndcg_at_5
+ value: 10.313
+ - type: precision_at_1
+ value: 5.915
+ - type: precision_at_10
+ value: 2.1069999999999998
+ - type: precision_at_100
+ value: 0.414
+ - type: precision_at_1000
+ value: 0.074
+ - type: precision_at_3
+ value: 4.128
+ - type: precision_at_5
+ value: 3.327
+ - type: recall_at_1
+ value: 5.151
+ - type: recall_at_10
+ value: 17.874000000000002
+ - type: recall_at_100
+ value: 34.174
+ - type: recall_at_1000
+ value: 56.879999999999995
+ - type: recall_at_3
+ value: 10.732999999999999
+ - type: recall_at_5
+ value: 14.113000000000001
+ - task:
+ type: Retrieval
+ dataset:
+ type: climate-fever
+ name: MTEB ClimateFEVER
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 3.101
+ - type: map_at_10
+ value: 5.434
+ - type: map_at_100
+ value: 6.267
+ - type: map_at_1000
+ value: 6.418
+ - type: map_at_3
+ value: 4.377000000000001
+ - type: map_at_5
+ value: 4.841
+ - type: mrr_at_1
+ value: 7.166
+ - type: mrr_at_10
+ value: 12.012
+ - type: mrr_at_100
+ value: 13.144
+ - type: mrr_at_1000
+ value: 13.229
+ - type: mrr_at_3
+ value: 9.826
+ - type: mrr_at_5
+ value: 10.921
+ - type: ndcg_at_1
+ value: 7.166
+ - type: ndcg_at_10
+ value: 8.687000000000001
+ - type: ndcg_at_100
+ value: 13.345
+ - type: ndcg_at_1000
+ value: 16.915
+ - type: ndcg_at_3
+ value: 6.276
+ - type: ndcg_at_5
+ value: 7.013
+ - type: precision_at_1
+ value: 7.166
+ - type: precision_at_10
+ value: 2.9250000000000003
+ - type: precision_at_100
+ value: 0.771
+ - type: precision_at_1000
+ value: 0.13999999999999999
+ - type: precision_at_3
+ value: 4.734
+ - type: precision_at_5
+ value: 3.8830000000000005
+ - type: recall_at_1
+ value: 3.101
+ - type: recall_at_10
+ value: 11.774999999999999
+ - type: recall_at_100
+ value: 28.819
+ - type: recall_at_1000
+ value: 49.886
+ - type: recall_at_3
+ value: 5.783
+ - type: recall_at_5
+ value: 7.692
+ - task:
+ type: Retrieval
+ dataset:
+ type: dbpedia-entity
+ name: MTEB DBPedia
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 2.758
+ - type: map_at_10
+ value: 5.507
+ - type: map_at_100
+ value: 7.1819999999999995
+ - type: map_at_1000
+ value: 7.652
+ - type: map_at_3
+ value: 4.131
+ - type: map_at_5
+ value: 4.702
+ - type: mrr_at_1
+ value: 28.499999999999996
+ - type: mrr_at_10
+ value: 37.693
+ - type: mrr_at_100
+ value: 38.657000000000004
+ - type: mrr_at_1000
+ value: 38.704
+ - type: mrr_at_3
+ value: 34.792
+ - type: mrr_at_5
+ value: 36.417
+ - type: ndcg_at_1
+ value: 20.625
+ - type: ndcg_at_10
+ value: 14.771999999999998
+ - type: ndcg_at_100
+ value: 16.821
+ - type: ndcg_at_1000
+ value: 21.546000000000003
+ - type: ndcg_at_3
+ value: 16.528000000000002
+ - type: ndcg_at_5
+ value: 15.573
+ - type: precision_at_1
+ value: 28.499999999999996
+ - type: precision_at_10
+ value: 12.25
+ - type: precision_at_100
+ value: 3.7600000000000002
+ - type: precision_at_1000
+ value: 0.86
+ - type: precision_at_3
+ value: 19.167
+ - type: precision_at_5
+ value: 16.25
+ - type: recall_at_1
+ value: 2.758
+ - type: recall_at_10
+ value: 9.164
+ - type: recall_at_100
+ value: 21.022
+ - type: recall_at_1000
+ value: 37.053999999999995
+ - type: recall_at_3
+ value: 5.112
+ - type: recall_at_5
+ value: 6.413
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/mind_small
+ name: MTEB MindSmallReranking
+ config: default
+ split: test
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
+ metrics:
+ - type: map
+ value: 28.53554681148413
+ - type: mrr
+ value: 29.290078704990325
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sickr-sts
+ name: MTEB SICK-R
+ config: default
+ split: test
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
+ metrics:
+ - type: cos_sim_pearson
+ value: 76.52926207453477
+ - type: cos_sim_spearman
+ value: 68.98528351149498
+ - type: euclidean_pearson
+ value: 73.7744559091218
+ - type: euclidean_spearman
+ value: 69.03481995814735
+ - type: manhattan_pearson
+ value: 73.72818267270651
+ - type: manhattan_spearman
+ value: 69.00576442086793
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts12-sts
+ name: MTEB STS12
+ config: default
+ split: test
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
+ metrics:
+ - type: cos_sim_pearson
+ value: 61.71540153163407
+ - type: cos_sim_spearman
+ value: 58.502746406116614
+ - type: euclidean_pearson
+ value: 60.82817999438477
+ - type: euclidean_spearman
+ value: 58.988494433752756
+ - type: manhattan_pearson
+ value: 60.87147859170236
+ - type: manhattan_spearman
+ value: 59.03527382025516
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts13-sts
+ name: MTEB STS13
+ config: default
+ split: test
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
+ metrics:
+ - type: cos_sim_pearson
+ value: 72.89990498692094
+ - type: cos_sim_spearman
+ value: 74.03028513377879
+ - type: euclidean_pearson
+ value: 73.8252088833803
+ - type: euclidean_spearman
+ value: 74.15554246478399
+ - type: manhattan_pearson
+ value: 73.80947397334666
+ - type: manhattan_spearman
+ value: 74.13117958176566
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts14-sts
+ name: MTEB STS14
+ config: default
+ split: test
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
+ metrics:
+ - type: cos_sim_pearson
+ value: 70.67974206005906
+ - type: cos_sim_spearman
+ value: 66.18263558486296
+ - type: euclidean_pearson
+ value: 69.5048876024341
+ - type: euclidean_spearman
+ value: 66.36380457878391
+ - type: manhattan_pearson
+ value: 69.4895372451589
+ - type: manhattan_spearman
+ value: 66.36941569935124
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts15-sts
+ name: MTEB STS15
+ config: default
+ split: test
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
+ metrics:
+ - type: cos_sim_pearson
+ value: 73.99856913569187
+ - type: cos_sim_spearman
1263
+ value: 75.54712054246464
1264
+ - type: euclidean_pearson
1265
+ value: 74.55692573876115
1266
+ - type: euclidean_spearman
1267
+ value: 75.34499056740096
1268
+ - type: manhattan_pearson
1269
+ value: 74.59342318869683
1270
+ - type: manhattan_spearman
1271
+ value: 75.35708317926819
1272
+ - task:
1273
+ type: STS
1274
+ dataset:
1275
+ type: mteb/sts16-sts
1276
+ name: MTEB STS16
1277
+ config: default
1278
+ split: test
1279
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
1280
+ metrics:
1281
+ - type: cos_sim_pearson
1282
+ value: 72.3343670787494
1283
+ - type: cos_sim_spearman
1284
+ value: 73.7136650302399
1285
+ - type: euclidean_pearson
1286
+ value: 73.86004257913046
1287
+ - type: euclidean_spearman
1288
+ value: 73.9557418048638
1289
+ - type: manhattan_pearson
1290
+ value: 73.78919091538661
1291
+ - type: manhattan_spearman
1292
+ value: 73.86316425954108
1293
+ - task:
1294
+ type: STS
1295
+ dataset:
1296
+ type: mteb/sts17-crosslingual-sts
1297
+ name: MTEB STS17 (en-en)
1298
+ config: en-en
1299
+ split: test
1300
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
1301
+ metrics:
1302
+ - type: cos_sim_pearson
1303
+ value: 79.08159601556619
1304
+ - type: cos_sim_spearman
1305
+ value: 80.13910828685532
1306
+ - type: euclidean_pearson
1307
+ value: 79.39197806617453
1308
+ - type: euclidean_spearman
1309
+ value: 79.85692277871196
1310
+ - type: manhattan_pearson
1311
+ value: 79.32452246324705
1312
+ - type: manhattan_spearman
1313
+ value: 79.70120373587193
1314
+ - task:
1315
+ type: STS
1316
+ dataset:
1317
+ type: mteb/sts22-crosslingual-sts
1318
+ name: MTEB STS22 (en)
1319
+ config: en
1320
+ split: test
1321
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
1322
+ metrics:
1323
+ - type: cos_sim_pearson
1324
+ value: 62.29720207747786
1325
+ - type: cos_sim_spearman
1326
+ value: 65.65260681394685
1327
+ - type: euclidean_pearson
1328
+ value: 64.49002165983158
1329
+ - type: euclidean_spearman
1330
+ value: 65.25917651158736
1331
+ - type: manhattan_pearson
1332
+ value: 64.49981108236335
1333
+ - type: manhattan_spearman
1334
+ value: 65.20426825202405
1335
+ - task:
1336
+ type: STS
1337
+ dataset:
1338
+ type: mteb/stsbenchmark-sts
1339
+ name: MTEB STSBenchmark
1340
+ config: default
1341
+ split: test
1342
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
1343
+ metrics:
1344
+ - type: cos_sim_pearson
1345
+ value: 71.1871068550574
1346
+ - type: cos_sim_spearman
1347
+ value: 71.40167034949341
1348
+ - type: euclidean_pearson
1349
+ value: 72.2373684855404
1350
+ - type: euclidean_spearman
1351
+ value: 71.90255429812984
1352
+ - type: manhattan_pearson
1353
+ value: 72.23173532049509
1354
+ - type: manhattan_spearman
1355
+ value: 71.87843489689064
1356
+ - task:
1357
+ type: Reranking
1358
+ dataset:
1359
+ type: mteb/scidocs-reranking
1360
+ name: MTEB SciDocsRR
1361
+ config: default
1362
+ split: test
1363
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
1364
+ metrics:
1365
+ - type: map
1366
+ value: 68.65000574464773
1367
+ - type: mrr
1368
+ value: 88.29363084265044
1369
+ - task:
1370
+ type: Reranking
1371
+ dataset:
1372
+ type: mteb/stackoverflowdupquestions-reranking
1373
+ name: MTEB StackOverflowDupQuestions
1374
+ config: default
1375
+ split: test
1376
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
1377
+ metrics:
1378
+ - type: map
1379
+ value: 40.76107749144358
1380
+ - type: mrr
1381
+ value: 41.03689202953908
1382
+ - task:
1383
+ type: Summarization
1384
+ dataset:
1385
+ type: mteb/summeval
1386
+ name: MTEB SummEval
1387
+ config: default
1388
+ split: test
1389
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
1390
+ metrics:
1391
+ - type: cos_sim_pearson
1392
+ value: 28.68520527813894
1393
+ - type: cos_sim_spearman
1394
+ value: 29.017620841627433
1395
+ - type: dot_pearson
1396
+ value: 29.25380949876322
1397
+ - type: dot_spearman
1398
+ value: 29.33885250837327
1399
+ ---
1400
  ---
1401
 
1402
  # {MODEL_NAME}