tomaarsen HF staff committed on
Commit
4e00b0e
1 Parent(s): 1743762

Add new SentenceTransformer model

0_StaticEmbedding/model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e6d170a3b9af143f9e4debbf54966f17acafbd5780b6bf0662abc875e91cda4b
size 125018208
0_StaticEmbedding/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
README.md ADDED
@@ -0,0 +1,1307 @@
---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:3012496
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: how to sign legal documents as power of attorney?
  sentences:
  - 'After the principal''s name, write “by” and then sign your own name. Under or
    after the signature line, indicate your status as POA by including any of the
    following identifiers: as POA, as Agent, as Attorney in Fact or as Power of Attorney.'
  - '[''From the Home screen, swipe left to Apps.'', ''Tap Transfer my Data.'', ''Tap
    Menu (...).'', ''Tap Export to SD card.'']'
  - Ginger Dank Nugs (Grape) - 350mg. Feast your eyes on these unique and striking
    gourmet chocolates; Coco Nugs created by Ginger Dank. Crafted to resemble perfect
    nugs of cannabis, each of the 10 buds contains 35mg of THC. ... This is a perfect
    product for both cannabis and chocolate lovers, who appreciate a little twist.
- source_sentence: how to delete vdom in fortigate?
  sentences:
  - Go to System -> VDOM -> VDOM2 and select 'Delete'. This VDOM is now successfully
    removed from the configuration.
  - 'Both combination birth control pills and progestin-only pills may cause headaches
    as a side effect. Additional side effects of birth control pills may include:
    breast tenderness. nausea.'
  - White cheese tends to show imperfections more readily and as consumers got more
    used to yellow-orange cheese, it became an expected option. Today, many cheddars
    are yellow. While most cheesemakers use annatto, some use an artificial coloring
    agent instead, according to Sachs.
- source_sentence: where are earthquakes most likely to occur on earth?
  sentences:
  - Zelle in the Bank of the America app is a fast, safe, and easy way to send and
    receive money with family and friends who have a bank account in the U.S., all
    with no fees. Money moves in minutes directly between accounts that are already
    enrolled with Zelle.
  - It takes about 3 days for a spacecraft to reach the Moon. During that time a spacecraft
    travels at least 240,000 miles (386,400 kilometers) which is the distance between
    Earth and the Moon.
  - Most earthquakes occur along the edge of the oceanic and continental plates. The
    earth's crust (the outer layer of the planet) is made up of several pieces, called
    plates. The plates under the oceans are called oceanic plates and the rest are
    continental plates.
- source_sentence: fix iphone is disabled connect to itunes without itunes?
  sentences:
  - To fix a disabled iPhone or iPad without iTunes, you have to erase your device.
    Click on the "Erase iPhone" option and confirm your selection. Wait for a while
    as the "Find My iPhone" feature will remotely erase your iOS device. Needless
    to say, it will also disable its lock.
  - How Māui brought fire to the world. One evening, after eating a hearty meal, Māui
    lay beside his fire staring into the flames. ... In the middle of the night, while
    everyone was sleeping, Māui went from village to village and extinguished all
    the fires until not a single fire burned in the world.
  - Angry Orchard makes a variety of year-round craft cider styles, including Angry
    Orchard Crisp Apple, a fruit-forward hard cider that balances the sweetness of
    culinary apples with dryness and bright acidity of bittersweet apples for a complex,
    refreshing taste.
- source_sentence: how to reverse a video on tiktok that's not yours?
  sentences:
  - '[''Tap "Effects" at the bottom of your screen — it\''s an icon that looks like
    a clock. Open the Effects menu. ... '', ''At the end of the new list that appears,
    tap "Time." Select "Time" at the end. ... '', ''Select "Reverse" — you\''ll then
    see a preview of your new, reversed video appear on the screen.'']'
  - Franchise Facts Poke Bar has a franchise fee of up to $30,000, with a total initial
    investment range of $157,800 to $438,000. The initial cost of a franchise includes
    several fees -- Unlock this franchise to better understand the costs such as training
    and territory fees.
  - Relative age is the age of a rock layer (or the fossils it contains) compared
    to other layers. It can be determined by looking at the position of rock layers.
    Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can
    be determined by using radiometric dating.
datasets:
- sentence-transformers/gooaq
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
co2_eq_emissions:
  emissions: 6.483463467240631
  energy_consumed: 0.01667977902671103
  source: codecarbon
  training_type: fine-tuning
  on_cloud: false
  cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
  ram_total_size: 31.777088165283203
  hours_used: 0.112
  hardware_used: 1 x NVIDIA GeForce RTX 3090
model-index:
- name: Static Embeddings with BERT uncased tokenizer finetuned on GooAQ pairs
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoClimateFEVER
      type: NanoClimateFEVER
    metrics:
    - type: cosine_accuracy@1
      value: 0.2
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.42
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.58
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.76
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.2
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.16666666666666663
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.148
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.10399999999999998
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.10566666666666666
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.22233333333333336
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.30566666666666664
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.40399999999999997
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.3021857757296797
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.35745238095238085
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.23166090256020686
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoDBPedia
      type: NanoDBPedia
    metrics:
    - type: cosine_accuracy@1
      value: 0.52
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.7
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.82
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.52
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.5133333333333333
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.48
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.43800000000000006
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.04048260039152364
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.10679067052991392
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.16517406885695451
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.29331552217012935
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5008496215473859
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.6488571428571429
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.3752676117852694
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoFEVER
      type: NanoFEVER
    metrics:
    - type: cosine_accuracy@1
      value: 0.42
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.68
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.68
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.82
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.42
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.2333333333333333
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.14
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.08599999999999998
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.3966666666666667
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.6466666666666667
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.6466666666666667
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.7766666666666667
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5890710274148659
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.546047619047619
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.5325906780111076
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoFiQA2018
      type: NanoFiQA2018
    metrics:
    - type: cosine_accuracy@1
      value: 0.36
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.48
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.54
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.64
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.36
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.22
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.16
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.106
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.1734126984126984
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.32126984126984126
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.3737936507936508
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.47868253968253976
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.384612736899094
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.44405555555555554
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.32183898737919203
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoHotpotQA
      type: NanoHotpotQA
    metrics:
    - type: cosine_accuracy@1
      value: 0.58
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.74
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.78
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.86
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.58
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.3133333333333333
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.21600000000000003
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.126
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.29
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.47
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.54
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.63
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5630232180814766
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.675079365079365
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.48992202928149226
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoMSMARCO
      type: NanoMSMARCO
    metrics:
    - type: cosine_accuracy@1
      value: 0.2
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.52
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.6
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.2
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.16666666666666663
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.10400000000000002
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06000000000000001
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.2
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.5
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.52
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.6
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.41343867686046815
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.3524603174603175
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.3712333972436779
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoNFCorpus
      type: NanoNFCorpus
    metrics:
    - type: cosine_accuracy@1
      value: 0.38
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.54
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.68
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.68
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.38
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.36
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.3440000000000001
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.264
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.019665573227317924
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.07420738619382097
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.09536630802985016
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.11619353053313819
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.3204704228749859
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.48633333333333334
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.12237170785886863
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoNQ
      type: NanoNQ
    metrics:
    - type: cosine_accuracy@1
      value: 0.2
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.36
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.48
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.64
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.2
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.12
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.09600000000000002
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.068
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.19
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.34
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.45
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.62
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.3932776776815765
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.33038888888888884
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.33440957968177043
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoQuoraRetrieval
      type: NanoQuoraRetrieval
    metrics:
    - type: cosine_accuracy@1
      value: 0.9
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.98
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.98
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 1.0
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.9
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.38666666666666655
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.23599999999999993
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.13399999999999998
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.8106666666666666
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.922
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.9259999999999999
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.99
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.9424143419536263
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.9395238095238095
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.9189180735930736
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoSCIDOCS
      type: NanoSCIDOCS
    metrics:
    - type: cosine_accuracy@1
      value: 0.28
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.44
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.56
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.76
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.28
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.21333333333333332
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.16799999999999998
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.13
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.059666666666666666
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.13166666666666668
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.17266666666666666
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.26666666666666666
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.24548416934230666
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.3969603174603174
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.1751060490177909
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoArguAna
      type: NanoArguAna
    metrics:
    - type: cosine_accuracy@1
      value: 0.12
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.4
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.48
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.64
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.12
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.13333333333333333
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.09600000000000002
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06400000000000002
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.12
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.4
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.48
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.64
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.3703136948358056
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.2856587301587301
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.2982488157827007
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoSciFact
      type: NanoSciFact
    metrics:
    - type: cosine_accuracy@1
      value: 0.46
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.7
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.46
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.1733333333333333
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.128
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.076
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.425
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.47
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.58
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.685
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5455895863246394
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.5181349206349206
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.5100938735556383
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoTouche2020
      type: NanoTouche2020
    metrics:
    - type: cosine_accuracy@1
      value: 0.6326530612244898
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.9591836734693877
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 1.0
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 1.0
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.6326530612244898
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.6326530612244897
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.5877551020408164
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.5163265306122449
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.04221140303473122
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.12126049151597706
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.18889590300402684
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.3304256352667907
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.563482462405376
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7857142857142857
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.42991471736271825
      name: Cosine Map@100
  - task:
      type: nano-beir
      name: Nano BEIR
    dataset:
      name: NanoBEIR mean
      type: NanoBEIR_mean
    metrics:
    - type: cosine_accuracy@1
      value: 0.4040502354788069
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5922448979591837
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6692307692307693
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.7692307692307693
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.4040502354788069
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.2794348508634223
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.2233657770800628
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.1671020408163265
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.22103376474868755
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.3635534658597092
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.41878691774496013
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.5254577354604563
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.47186257015009897
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.5205128205128206
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.39319818639334664
      name: Cosine Map@100
---

# Static Embeddings with BERT uncased tokenizer finetuned on GooAQ pairs

This is a [sentence-transformers](https://www.SBERT.net) model trained on the [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
<!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
- **Maximum Sequence Length:** inf tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq)
- **Language:** en
- **License:** apache-2.0

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): StaticEmbedding(
    (embedding): EmbeddingBag(30522, 1024, mode='mean')
  )
)
```
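The `EmbeddingBag` in `mode='mean'` above means inference is just a table lookup plus an average: a sentence embedding is the mean of its token embeddings, with no attention or positional information. A minimal numpy sketch of that computation (toy sizes and token ids, not the trained weights; the real model uses a 30522-token vocabulary and 1024 dimensions):

```python
import numpy as np

# Toy stand-in for the StaticEmbedding module: each row of the table is one
# token's embedding, and a sentence embedding is the mean over its tokens.
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(100, 8))  # vocab_size x dim (toy sizes)

token_ids = [4, 17, 3, 99]  # one tokenized sentence (hypothetical ids)
sentence_embedding = embedding_table[token_ids].mean(axis=0)
print(sentence_embedding.shape)  # (8,)
```

Because there is no contextualization step, encoding is orders of magnitude cheaper than a transformer forward pass, at the cost of losing word-order sensitivity.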
872
+
873
+ ## Usage
874
+
875
+ ### Direct Usage (Sentence Transformers)
876
+
877
+ First install the Sentence Transformers library:
878
+
879
+ ```bash
880
+ pip install -U sentence-transformers
881
+ ```
882
+
883
+ Then you can load this model and run inference.
884
+ ```python
885
+ from sentence_transformers import SentenceTransformer
886
+
887
+ # Download from the 🤗 Hub
888
+ model = SentenceTransformer("tomaarsen/static-bert-uncased-gooaq-beir-4")
889
+ # Run inference
890
+ sentences = [
891
+ "how to reverse a video on tiktok that's not yours?",
892
+ '[\'Tap "Effects" at the bottom of your screen — it\\\'s an icon that looks like a clock. Open the Effects menu. ... \', \'At the end of the new list that appears, tap "Time." Select "Time" at the end. ... \', \'Select "Reverse" — you\\\'ll then see a preview of your new, reversed video appear on the screen.\']',
893
+ 'Relative age is the age of a rock layer (or the fossils it contains) compared to other layers. It can be determined by looking at the position of rock layers. Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can be determined by using radiometric dating.',
894
+ ]
895
+ embeddings = model.encode(sentences)
896
+ print(embeddings.shape)
897
+ # [3, 1024]
898
+
899
+ # Get the similarity scores for the embeddings
900
+ similarities = model.similarity(embeddings, embeddings)
901
+ print(similarities.shape)
902
+ # torch.Size([3, 3])
903
+ ```
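Because the model was trained with `MatryoshkaLoss` over the dimensions `[1024, 512, 256, 128, 64, 32]`, embeddings can be truncated to a leading prefix of dimensions with limited quality loss; in Sentence Transformers this is done by passing `truncate_dim` to the `SentenceTransformer` constructor (e.g. `truncate_dim=256`, a hypothetical choice). The underlying operation is just slicing and re-normalizing, sketched here in plain Python on a made-up vector:

```python
import math

def truncate_and_normalize(embedding, dim):
    """Keep the first `dim` dimensions and rescale to unit length,
    which cosine similarity on truncated embeddings assumes."""
    head = embedding[:dim]
    norm = math.sqrt(sum(x * x for x in head))
    return [x / norm for x in head]

# Toy 4-dimensional "embedding" truncated to 2 dimensions:
vec = [3.0, 4.0, 1.0, 2.0]
print(truncate_and_normalize(vec, 2))  # [0.6, 0.8]
```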
904
+
905
+ <!--
906
+ ### Direct Usage (Transformers)
907
+
908
+ <details><summary>Click to see the direct usage in Transformers</summary>
909
+
910
+ </details>
911
+ -->
912
+
913
+ <!--
914
+ ### Downstream Usage (Sentence Transformers)
915
+
916
+ You can finetune this model on your own dataset.
917
+
918
+ <details><summary>Click to expand</summary>
919
+
920
+ </details>
921
+ -->
922
+
923
+ <!--
924
+ ### Out-of-Scope Use
925
+
926
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
927
+ -->
928
+
929
+ ## Evaluation
930
+
931
+ ### Metrics
932
+
933
+ #### Information Retrieval
934
+
935
+ * Datasets: `NanoClimateFEVER`, `NanoDBPedia`, `NanoFEVER`, `NanoFiQA2018`, `NanoHotpotQA`, `NanoMSMARCO`, `NanoNFCorpus`, `NanoNQ`, `NanoQuoraRetrieval`, `NanoSCIDOCS`, `NanoArguAna`, `NanoSciFact` and `NanoTouche2020`
936
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
937
+
938
+ | Metric | NanoClimateFEVER | NanoDBPedia | NanoFEVER | NanoFiQA2018 | NanoHotpotQA | NanoMSMARCO | NanoNFCorpus | NanoNQ | NanoQuoraRetrieval | NanoSCIDOCS | NanoArguAna | NanoSciFact | NanoTouche2020 |
939
+ |:--------------------|:-----------------|:------------|:-----------|:-------------|:-------------|:------------|:-------------|:-----------|:-------------------|:------------|:------------|:------------|:---------------|
940
+ | cosine_accuracy@1 | 0.2 | 0.52 | 0.42 | 0.36 | 0.58 | 0.2 | 0.38 | 0.2 | 0.9 | 0.28 | 0.12 | 0.46 | 0.6327 |
941
+ | cosine_accuracy@3 | 0.42 | 0.7 | 0.68 | 0.48 | 0.74 | 0.5 | 0.54 | 0.36 | 0.98 | 0.44 | 0.4 | 0.5 | 0.9592 |
942
+ | cosine_accuracy@5 | 0.58 | 0.82 | 0.68 | 0.54 | 0.78 | 0.52 | 0.68 | 0.48 | 0.98 | 0.56 | 0.48 | 0.6 | 1.0 |
943
+ | cosine_accuracy@10 | 0.76 | 0.9 | 0.82 | 0.64 | 0.86 | 0.6 | 0.68 | 0.64 | 1.0 | 0.76 | 0.64 | 0.7 | 1.0 |
944
+ | cosine_precision@1 | 0.2 | 0.52 | 0.42 | 0.36 | 0.58 | 0.2 | 0.38 | 0.2 | 0.9 | 0.28 | 0.12 | 0.46 | 0.6327 |
945
+ | cosine_precision@3 | 0.1667 | 0.5133 | 0.2333 | 0.22 | 0.3133 | 0.1667 | 0.36 | 0.12 | 0.3867 | 0.2133 | 0.1333 | 0.1733 | 0.6327 |
946
+ | cosine_precision@5 | 0.148 | 0.48 | 0.14 | 0.16 | 0.216 | 0.104 | 0.344 | 0.096 | 0.236 | 0.168 | 0.096 | 0.128 | 0.5878 |
947
+ | cosine_precision@10 | 0.104 | 0.438 | 0.086 | 0.106 | 0.126 | 0.06 | 0.264 | 0.068 | 0.134 | 0.13 | 0.064 | 0.076 | 0.5163 |
948
+ | cosine_recall@1 | 0.1057 | 0.0405 | 0.3967 | 0.1734 | 0.29 | 0.2 | 0.0197 | 0.19 | 0.8107 | 0.0597 | 0.12 | 0.425 | 0.0422 |
949
+ | cosine_recall@3 | 0.2223 | 0.1068 | 0.6467 | 0.3213 | 0.47 | 0.5 | 0.0742 | 0.34 | 0.922 | 0.1317 | 0.4 | 0.47 | 0.1213 |
950
+ | cosine_recall@5 | 0.3057 | 0.1652 | 0.6467 | 0.3738 | 0.54 | 0.52 | 0.0954 | 0.45 | 0.926 | 0.1727 | 0.48 | 0.58 | 0.1889 |
951
+ | cosine_recall@10 | 0.404 | 0.2933 | 0.7767 | 0.4787 | 0.63 | 0.6 | 0.1162 | 0.62 | 0.99 | 0.2667 | 0.64 | 0.685 | 0.3304 |
952
+ | **cosine_ndcg@10** | **0.3022** | **0.5008** | **0.5891** | **0.3846** | **0.563** | **0.4134** | **0.3205** | **0.3933** | **0.9424** | **0.2455** | **0.3703** | **0.5456** | **0.5635** |
953
+ | cosine_mrr@10 | 0.3575 | 0.6489 | 0.546 | 0.4441 | 0.6751 | 0.3525 | 0.4863 | 0.3304 | 0.9395 | 0.397 | 0.2857 | 0.5181 | 0.7857 |
954
+ | cosine_map@100 | 0.2317 | 0.3753 | 0.5326 | 0.3218 | 0.4899 | 0.3712 | 0.1224 | 0.3344 | 0.9189 | 0.1751 | 0.2982 | 0.5101 | 0.4299 |
955
+
956
+ #### Nano BEIR
957
+
958
+ * Dataset: `NanoBEIR_mean`
959
+ * Evaluated with [<code>NanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.NanoBEIREvaluator)
960
+
961
+ | Metric | Value |
962
+ |:--------------------|:-----------|
963
+ | cosine_accuracy@1 | 0.4041 |
964
+ | cosine_accuracy@3 | 0.5922 |
965
+ | cosine_accuracy@5 | 0.6692 |
966
+ | cosine_accuracy@10 | 0.7692 |
967
+ | cosine_precision@1 | 0.4041 |
968
+ | cosine_precision@3 | 0.2794 |
969
+ | cosine_precision@5 | 0.2234 |
970
+ | cosine_precision@10 | 0.1671 |
971
+ | cosine_recall@1 | 0.221 |
972
+ | cosine_recall@3 | 0.3636 |
973
+ | cosine_recall@5 | 0.4188 |
974
+ | cosine_recall@10 | 0.5255 |
975
+ | **cosine_ndcg@10** | **0.4719** |
976
+ | cosine_mrr@10 | 0.5205 |
977
+ | cosine_map@100 | 0.3932 |
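The `NanoBEIR_mean` results are the unweighted average of the thirteen per-dataset scores; for instance, the aggregate NDCG@10 of 0.4719 can be reproduced directly from the per-dataset `cosine_ndcg@10` values in the table above:

```python
# Per-dataset cosine_ndcg@10 values from the Information Retrieval table,
# in the same order as its columns.
ndcg_at_10 = [
    0.3022,  # NanoClimateFEVER
    0.5008,  # NanoDBPedia
    0.5891,  # NanoFEVER
    0.3846,  # NanoFiQA2018
    0.5630,  # NanoHotpotQA
    0.4134,  # NanoMSMARCO
    0.3205,  # NanoNFCorpus
    0.3933,  # NanoNQ
    0.9424,  # NanoQuoraRetrieval
    0.2455,  # NanoSCIDOCS
    0.3703,  # NanoArguAna
    0.5456,  # NanoSciFact
    0.5635,  # NanoTouche2020
]
mean_ndcg = sum(ndcg_at_10) / len(ndcg_at_10)
print(round(mean_ndcg, 4))  # 0.4719
```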
978
+
979
+ <!--
980
+ ## Bias, Risks and Limitations
981
+
982
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
983
+ -->
984
+
985
+ <!--
986
+ ### Recommendations
987
+
988
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
989
+ -->
990
+
991
+ ## Training Details
992
+
993
+ ### Training Dataset
994
+
995
+ #### gooaq
996
+
997
+ * Dataset: [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
998
+ * Size: 3,012,496 training samples
999
+ * Columns: <code>question</code> and <code>answer</code>
1000
+ * Approximate statistics based on the first 1000 samples:
1001
+ | | question | answer |
1002
+ |:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|
1003
+ | type | string | string |
1004
+ | details | <ul><li>min: 18 characters</li><li>mean: 43.23 characters</li><li>max: 96 characters</li></ul> | <ul><li>min: 55 characters</li><li>mean: 253.36 characters</li><li>max: 371 characters</li></ul> |
1005
+ * Samples:
1006
+ | question | answer |
1007
+ |:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
1008
+ | <code>what is the difference between broilers and layers?</code> | <code>An egg laying poultry is called egger or layer whereas broilers are reared for obtaining meat. So a layer should be able to produce more number of large sized eggs, without growing too much. On the other hand, a broiler should yield more meat and hence should be able to grow well.</code> |
1009
+ | <code>what is the difference between chronological order and spatial order?</code> | <code>As a writer, you should always remember that unlike chronological order and the other organizational methods for data, spatial order does not take into account the time. Spatial order is primarily focused on the location. All it does is take into account the location of objects and not the time.</code> |
1010
+ | <code>is kamagra same as viagra?</code> | <code>Kamagra is thought to contain the same active ingredient as Viagra, sildenafil citrate. In theory, it should work in much the same way as Viagra, taking about 45 minutes to take effect, and lasting for around 4-6 hours. However, this will vary from person to person.</code> |
1011
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
1012
+ ```json
1013
+ {
1014
+ "loss": "MultipleNegativesRankingLoss",
1015
+ "matryoshka_dims": [
1016
+ 1024,
1017
+ 512,
1018
+ 256,
1019
+ 128,
1020
+ 64,
1021
+ 32
1022
+ ],
1023
+ "matryoshka_weights": [
1024
+ 1,
1025
+ 1,
1026
+ 1,
1027
+ 1,
1028
+ 1,
1029
+ 1
1030
+ ],
1031
+ "n_dims_per_step": -1
1032
+ }
1033
+ ```
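The inner `MultipleNegativesRankingLoss` treats, for each question in a batch, its paired answer as the positive and every other answer in the batch as a negative, then applies cross-entropy over the resulting similarity scores. A minimal sketch of that computation on toy 2-dimensional embeddings (pure Python with raw dot products; the library implementation works on scaled cosine similarities instead):

```python
import math

def mnrl(question_embs, answer_embs):
    """In-batch negatives loss: for each question i, answer i is the positive
    and all other answers are negatives; average -log softmax(score[i][i])."""
    losses = []
    for i, q in enumerate(question_embs):
        scores = [sum(a * b for a, b in zip(q, ans)) for ans in answer_embs]
        log_z = math.log(sum(math.exp(s) for s in scores))
        losses.append(log_z - scores[i])  # -log p(correct answer)
    return sum(losses) / len(losses)

# Toy batch of two (question, answer) pairs with perfectly matching embeddings:
questions = [[1.0, 0.0], [0.0, 1.0]]
answers = [[1.0, 0.0], [0.0, 1.0]]
print(mnrl(questions, answers))  # ≈ 0.3133
```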
1034
+
1035
+ ### Evaluation Dataset
1036
+
1037
+ #### gooaq
1038
+
1039
+ * Dataset: [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
1040
+ * Size: 3,012,496 evaluation samples
1041
+ * Columns: <code>question</code> and <code>answer</code>
1042
+ * Approximate statistics based on the first 1000 samples:
1043
+ | | question | answer |
1044
+ |:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|
1045
+ | type | string | string |
1046
+ | details | <ul><li>min: 18 characters</li><li>mean: 43.17 characters</li><li>max: 98 characters</li></ul> | <ul><li>min: 51 characters</li><li>mean: 254.12 characters</li><li>max: 360 characters</li></ul> |
1047
+ * Samples:
1048
+ | question | answer |
1049
+ |:-----------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
1050
+ | <code>how do i program my directv remote with my tv?</code> | <code>['Press MENU on your remote.', 'Select Settings & Help > Settings > Remote Control > Program Remote.', 'Choose the device (TV, audio, DVD) you wish to program. ... ', 'Follow the on-screen prompts to complete programming.']</code> |
1051
+ | <code>are rodrigues fruit bats nocturnal?</code> | <code>Before its numbers were threatened by habitat destruction, storms, and hunting, some of those groups could number 500 or more members. Sunrise, sunset. Rodrigues fruit bats are most active at dawn, at dusk, and at night.</code> |
1052
+ | <code>why does your heart rate increase during exercise bbc bitesize?</code> | <code>During exercise there is an increase in physical activity and muscle cells respire more than they do when the body is at rest. The heart rate increases during exercise. The rate and depth of breathing increases - this makes sure that more oxygen is absorbed into the blood, and more carbon dioxide is removed from it.</code> |
1053
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
1054
+ ```json
1055
+ {
1056
+ "loss": "MultipleNegativesRankingLoss",
1057
+ "matryoshka_dims": [
1058
+ 1024,
1059
+ 512,
1060
+ 256,
1061
+ 128,
1062
+ 64,
1063
+ 32
1064
+ ],
1065
+ "matryoshka_weights": [
1066
+ 1,
1067
+ 1,
1068
+ 1,
1069
+ 1,
1070
+ 1,
1071
+ 1
1072
+ ],
1073
+ "n_dims_per_step": -1
1074
+ }
1075
+ ```
1076
+
1077
+ ### Training Hyperparameters
1078
+ #### Non-Default Hyperparameters
1079
+
1080
+ - `eval_strategy`: steps
1081
+ - `per_device_train_batch_size`: 2048
1082
+ - `per_device_eval_batch_size`: 2048
1083
+ - `learning_rate`: 0.2
1084
+ - `num_train_epochs`: 1
1085
+ - `warmup_ratio`: 0.1
1086
+ - `bf16`: True
1087
+ - `batch_sampler`: no_duplicates
1088
+
1089
+ #### All Hyperparameters
1090
+ <details><summary>Click to expand</summary>
1091
+
1092
+ - `overwrite_output_dir`: False
1093
+ - `do_predict`: False
1094
+ - `eval_strategy`: steps
1095
+ - `prediction_loss_only`: True
1096
+ - `per_device_train_batch_size`: 2048
1097
+ - `per_device_eval_batch_size`: 2048
1098
+ - `per_gpu_train_batch_size`: None
1099
+ - `per_gpu_eval_batch_size`: None
1100
+ - `gradient_accumulation_steps`: 1
1101
+ - `eval_accumulation_steps`: None
1102
+ - `torch_empty_cache_steps`: None
1103
+ - `learning_rate`: 0.2
1104
+ - `weight_decay`: 0.0
1105
+ - `adam_beta1`: 0.9
1106
+ - `adam_beta2`: 0.999
1107
+ - `adam_epsilon`: 1e-08
1108
+ - `max_grad_norm`: 1.0
1109
+ - `num_train_epochs`: 1
1110
+ - `max_steps`: -1
1111
+ - `lr_scheduler_type`: linear
1112
+ - `lr_scheduler_kwargs`: {}
1113
+ - `warmup_ratio`: 0.1
1114
+ - `warmup_steps`: 0
1115
+ - `log_level`: passive
1116
+ - `log_level_replica`: warning
1117
+ - `log_on_each_node`: True
1118
+ - `logging_nan_inf_filter`: True
1119
+ - `save_safetensors`: True
1120
+ - `save_on_each_node`: False
1121
+ - `save_only_model`: False
1122
+ - `restore_callback_states_from_checkpoint`: False
1123
+ - `no_cuda`: False
1124
+ - `use_cpu`: False
1125
+ - `use_mps_device`: False
1126
+ - `seed`: 42
1127
+ - `data_seed`: None
1128
+ - `jit_mode_eval`: False
1129
+ - `use_ipex`: False
1130
+ - `bf16`: True
1131
+ - `fp16`: False
1132
+ - `fp16_opt_level`: O1
1133
+ - `half_precision_backend`: auto
1134
+ - `bf16_full_eval`: False
1135
+ - `fp16_full_eval`: False
1136
+ - `tf32`: None
1137
+ - `local_rank`: 0
1138
+ - `ddp_backend`: None
1139
+ - `tpu_num_cores`: None
1140
+ - `tpu_metrics_debug`: False
1141
+ - `debug`: []
1142
+ - `dataloader_drop_last`: False
1143
+ - `dataloader_num_workers`: 0
1144
+ - `dataloader_prefetch_factor`: None
1145
+ - `past_index`: -1
1146
+ - `disable_tqdm`: False
1147
+ - `remove_unused_columns`: True
1148
+ - `label_names`: None
1149
+ - `load_best_model_at_end`: False
1150
+ - `ignore_data_skip`: False
1151
+ - `fsdp`: []
1152
+ - `fsdp_min_num_params`: 0
1153
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
1154
+ - `fsdp_transformer_layer_cls_to_wrap`: None
1155
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
1156
+ - `deepspeed`: None
1157
+ - `label_smoothing_factor`: 0.0
1158
+ - `optim`: adamw_torch
1159
+ - `optim_args`: None
1160
+ - `adafactor`: False
1161
+ - `group_by_length`: False
1162
+ - `length_column_name`: length
1163
+ - `ddp_find_unused_parameters`: None
1164
+ - `ddp_bucket_cap_mb`: None
1165
+ - `ddp_broadcast_buffers`: False
1166
+ - `dataloader_pin_memory`: True
1167
+ - `dataloader_persistent_workers`: False
1168
+ - `skip_memory_metrics`: True
1169
+ - `use_legacy_prediction_loop`: False
1170
+ - `push_to_hub`: False
1171
+ - `resume_from_checkpoint`: None
1172
+ - `hub_model_id`: None
1173
+ - `hub_strategy`: every_save
1174
+ - `hub_private_repo`: False
1175
+ - `hub_always_push`: False
1176
+ - `gradient_checkpointing`: False
1177
+ - `gradient_checkpointing_kwargs`: None
1178
+ - `include_inputs_for_metrics`: False
1179
+ - `eval_do_concat_batches`: True
1180
+ - `fp16_backend`: auto
1181
+ - `push_to_hub_model_id`: None
1182
+ - `push_to_hub_organization`: None
1183
+ - `mp_parameters`:
1184
+ - `auto_find_batch_size`: False
1185
+ - `full_determinism`: False
1186
+ - `torchdynamo`: None
1187
+ - `ray_scope`: last
1188
+ - `ddp_timeout`: 1800
1189
+ - `torch_compile`: False
1190
+ - `torch_compile_backend`: None
1191
+ - `torch_compile_mode`: None
1192
+ - `dispatch_batches`: None
1193
+ - `split_batches`: None
1194
+ - `include_tokens_per_second`: False
1195
+ - `include_num_input_tokens_seen`: False
1196
+ - `neftune_noise_alpha`: None
1197
+ - `optim_target_modules`: None
1198
+ - `batch_eval_metrics`: False
1199
+ - `eval_on_start`: False
1200
+ - `use_liger_kernel`: False
1201
+ - `eval_use_gather_object`: False
1202
+ - `batch_sampler`: no_duplicates
1203
+ - `multi_dataset_batch_sampler`: proportional
1204
+
1205
+ </details>
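With `lr_scheduler_type: linear`, `warmup_ratio: 0.1`, a peak `learning_rate` of 0.2, and 1467 total steps, the learning rate ramps up linearly over roughly the first 10% of steps and then decays linearly to zero. A sketch of that schedule (the Trainer's exact warmup-step rounding may differ):

```python
def linear_schedule_with_warmup(step, total_steps=1467, warmup_ratio=0.1, peak_lr=0.2):
    """Linear warmup to peak_lr, then linear decay to 0 (the 'linear' scheduler)."""
    warmup_steps = int(total_steps * warmup_ratio)  # 146 in this sketch
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_schedule_with_warmup(73))    # halfway through warmup → 0.1
print(linear_schedule_with_warmup(1467))  # final step → 0.0
```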
1206
+
1207
+ ### Training Logs
1208
+ | Epoch | Step | Training Loss | Validation Loss | NanoClimateFEVER_cosine_ndcg@10 | NanoDBPedia_cosine_ndcg@10 | NanoFEVER_cosine_ndcg@10 | NanoFiQA2018_cosine_ndcg@10 | NanoHotpotQA_cosine_ndcg@10 | NanoMSMARCO_cosine_ndcg@10 | NanoNFCorpus_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoQuoraRetrieval_cosine_ndcg@10 | NanoSCIDOCS_cosine_ndcg@10 | NanoArguAna_cosine_ndcg@10 | NanoSciFact_cosine_ndcg@10 | NanoTouche2020_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
1209
+ |:------:|:----:|:-------------:|:---------------:|:-------------------------------:|:--------------------------:|:------------------------:|:---------------------------:|:---------------------------:|:--------------------------:|:---------------------------:|:---------------------:|:---------------------------------:|:--------------------------:|:--------------------------:|:--------------------------:|:-----------------------------:|:----------------------------:|
1210
+ | 0 | 0 | - | - | 0.0726 | 0.3715 | 0.2100 | 0.1058 | 0.3196 | 0.3109 | 0.2221 | 0.1401 | 0.6737 | 0.1618 | 0.1183 | 0.4337 | 0.1331 | 0.2518 |
1211
+ | 0.0007 | 1 | 35.3437 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
1212
+ | 0.0682 | 100 | 16.3878 | 2.4139 | 0.2927 | 0.4729 | 0.5725 | 0.3235 | 0.5905 | 0.3674 | 0.2994 | 0.3324 | 0.9123 | 0.2326 | 0.3407 | 0.5618 | 0.5352 | 0.4488 |
1213
+ | 0.1363 | 200 | 5.94 | 1.8298 | 0.2897 | 0.4880 | 0.5624 | 0.3447 | 0.5683 | 0.4311 | 0.3066 | 0.3502 | 0.9129 | 0.2533 | 0.3335 | 0.5696 | 0.5365 | 0.4575 |
1214
+ | 0.2045 | 300 | 4.8307 | 1.5955 | 0.2780 | 0.4896 | 0.5746 | 0.3513 | 0.5815 | 0.4040 | 0.3125 | 0.3897 | 0.9190 | 0.2578 | 0.3556 | 0.5461 | 0.5401 | 0.4615 |
1215
+ | 0.2727 | 400 | 4.33 | 1.4696 | 0.3113 | 0.4909 | 0.5920 | 0.3795 | 0.5836 | 0.3919 | 0.3201 | 0.4023 | 0.9355 | 0.2535 | 0.3419 | 0.5236 | 0.5524 | 0.4676 |
1216
+ | 0.3408 | 500 | 4.0423 | 1.3887 | 0.3085 | 0.4966 | 0.5986 | 0.3794 | 0.5914 | 0.3914 | 0.3174 | 0.3590 | 0.9309 | 0.2441 | 0.3537 | 0.5311 | 0.5534 | 0.4658 |
1217
+ | 0.4090 | 600 | 3.8422 | 1.3120 | 0.3034 | 0.5052 | 0.6075 | 0.3680 | 0.5834 | 0.4136 | 0.3122 | 0.3725 | 0.9257 | 0.2477 | 0.3583 | 0.5309 | 0.5646 | 0.4687 |
1218
+ | 0.4772 | 700 | 3.6795 | 1.2693 | 0.2975 | 0.4988 | 0.5954 | 0.3785 | 0.5811 | 0.4160 | 0.3142 | 0.3908 | 0.9362 | 0.2471 | 0.3479 | 0.5520 | 0.5601 | 0.4704 |
1219
+ | 0.5453 | 800 | 3.5367 | 1.2285 | 0.3011 | 0.4947 | 0.5829 | 0.3463 | 0.5689 | 0.4369 | 0.3224 | 0.3791 | 0.9310 | 0.2430 | 0.3663 | 0.5577 | 0.5585 | 0.4684 |
1220
+ | 0.6135 | 900 | 3.4279 | 1.1963 | 0.3059 | 0.5027 | 0.5894 | 0.3674 | 0.5758 | 0.4126 | 0.3186 | 0.4066 | 0.9349 | 0.2456 | 0.3672 | 0.5560 | 0.5624 | 0.4727 |
1221
+ | 0.6817 | 1000 | 3.3637 | 1.1652 | 0.3056 | 0.5022 | 0.5849 | 0.3702 | 0.5714 | 0.4238 | 0.3161 | 0.4007 | 0.9373 | 0.2430 | 0.3699 | 0.5618 | 0.5657 | 0.4733 |
1222
+ | 0.7498 | 1100 | 3.2336 | 1.1312 | 0.3006 | 0.5038 | 0.5920 | 0.3884 | 0.5733 | 0.4241 | 0.3247 | 0.3974 | 0.9369 | 0.2431 | 0.3670 | 0.5644 | 0.5608 | 0.4751 |
1223
+ | 0.8180 | 1200 | 3.1952 | 1.1132 | 0.3044 | 0.4987 | 0.5770 | 0.3630 | 0.5735 | 0.4259 | 0.3279 | 0.3955 | 0.9428 | 0.2416 | 0.3798 | 0.5659 | 0.5641 | 0.4739 |
1224
+ | 0.8862 | 1300 | 3.1535 | 1.0926 | 0.2983 | 0.4968 | 0.5753 | 0.3812 | 0.5684 | 0.4108 | 0.3203 | 0.3965 | 0.9421 | 0.2428 | 0.3685 | 0.5608 | 0.5628 | 0.4711 |
1225
+ | 0.9543 | 1400 | 3.0691 | 1.0862 | 0.3109 | 0.5008 | 0.5870 | 0.3761 | 0.5612 | 0.4121 | 0.3204 | 0.3947 | 0.9426 | 0.2414 | 0.3708 | 0.5456 | 0.5588 | 0.4709 |
1226
+ | 1.0 | 1467 | - | - | 0.3022 | 0.5008 | 0.5891 | 0.3846 | 0.5630 | 0.4134 | 0.3205 | 0.3933 | 0.9424 | 0.2455 | 0.3703 | 0.5456 | 0.5635 | 0.4719 |
1227
+
1228
+
1229
+ ### Environmental Impact
1230
+ Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
1231
+ - **Energy Consumed**: 0.017 kWh
1232
+ - **Carbon Emitted**: 0.006 kg of CO2
1233
+ - **Hours Used**: 0.112 hours
1234
+
1235
+ ### Training Hardware
1236
+ - **On Cloud**: No
1237
+ - **GPU Model**: 1 x NVIDIA GeForce RTX 3090
1238
+ - **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K
1239
+ - **RAM Size**: 31.78 GB
1240
+
1241
+ ### Framework Versions
1242
+ - Python: 3.11.6
1243
+ - Sentence Transformers: 3.3.0.dev0
1244
+ - Transformers: 4.45.2
1245
+ - PyTorch: 2.5.0.dev20240807+cu121
1246
+ - Accelerate: 1.0.0
1247
+ - Datasets: 2.20.0
1248
+ - Tokenizers: 0.20.1-dev.0
1249
+
1250
+ ## Citation
1251
+
1252
+ ### BibTeX
1253
+
1254
+ #### Sentence Transformers
1255
+ ```bibtex
1256
+ @inproceedings{reimers-2019-sentence-bert,
1257
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1258
+ author = "Reimers, Nils and Gurevych, Iryna",
1259
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1260
+ month = "11",
1261
+ year = "2019",
1262
+ publisher = "Association for Computational Linguistics",
1263
+ url = "https://arxiv.org/abs/1908.10084",
1264
+ }
1265
+ ```
1266
+
1267
+ #### MatryoshkaLoss
1268
+ ```bibtex
1269
+ @misc{kusupati2024matryoshka,
1270
+ title={Matryoshka Representation Learning},
1271
+ author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
1272
+ year={2024},
1273
+ eprint={2205.13147},
1274
+ archivePrefix={arXiv},
1275
+ primaryClass={cs.LG}
1276
+ }
1277
+ ```
1278
+
1279
+ #### MultipleNegativesRankingLoss
1280
+ ```bibtex
1281
+ @misc{henderson2017efficient,
1282
+ title={Efficient Natural Language Response Suggestion for Smart Reply},
1283
+ author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
1284
+ year={2017},
1285
+ eprint={1705.00652},
1286
+ archivePrefix={arXiv},
1287
+ primaryClass={cs.CL}
1288
+ }
1289
+ ```
1290
+
1291
+ <!--
1292
+ ## Glossary
1293
+
1294
+ *Clearly define terms in order to be accessible across audiences.*
1295
+ -->
1296
+
1297
+ <!--
1298
+ ## Model Card Authors
1299
+
1300
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1301
+ -->
1302
+
1303
+ <!--
1304
+ ## Model Card Contact
1305
+
1306
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1307
+ -->
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "3.3.0.dev0",
4
+ "transformers": "4.45.2",
5
+ "pytorch": "2.5.0.dev20240807+cu121"
6
+ },
7
+ "prompts": {},
8
+ "default_prompt_name": null,
9
+ "similarity_fn_name": "cosine"
10
+ }
modules.json ADDED
@@ -0,0 +1,8 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "0_StaticEmbedding",
6
+ "type": "sentence_transformers.models.StaticEmbedding"
7
+ }
8
+ ]