SentenceTransformer based on FacebookAI/xlm-roberta-base

This is a sentence-transformers model finetuned from FacebookAI/xlm-roberta-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: FacebookAI/xlm-roberta-base
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
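
The Pooling module applies mean pooling: the token embeddings produced by XLMRobertaModel are averaged, with padding positions masked out, into a single 768-dimensional sentence vector. A minimal sketch of the equivalent computation with plain transformers, assuming this repository's weights load directly into AutoModel:

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("anonymous202501/xlm-roberta-base-msmarco")
model = AutoModel.from_pretrained("anonymous202501/xlm-roberta-base-msmarco")

encoded = tokenizer(["Most Common Apple Varieties"], padding=True,
                    truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**encoded).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling: average the token embeddings, ignoring padding tokens.
mask = encoded["attention_mask"].unsqueeze(-1).float()     # (batch, seq_len, 1)
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])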

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'Most Common Apple Varieties',
    'The most popular apple varieties are Cortland, Red Delicious, Golden Delicious, Empire, Fuji, Gala, Ida Red, Macoun, McIntosh, Northern Spy, and Winesap. Olwen Woodier also offers descriptions for an additional 20 varieties of apples in this very useful and informative cookbook. Cortland.',
    'Well, rest easy, because this condensed list of the 18 most popular apple varieties breaks down the information every apple eater should know: how to cook them, best recipes, and when they are in season. Red Delicious: A popular eating apple that looks just how we all imagine an apple should.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
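
Since the first sentence above is a query and the other two are candidate passages, the same embeddings can be used for ranking. A small follow-on sketch, continuing from the snippet above:

# Rank the two passages against the query by embedding similarity.
query_embedding = model.encode([sentences[0]])
passage_embeddings = model.encode(sentences[1:])

scores = model.similarity(query_embedding, passage_embeddings)  # shape [1, 2]
best = int(scores.argmax())
print(f"Best passage (score {float(scores[0][best]):.4f}): {sentences[1 + best][:60]}...")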

Training Details

Training Dataset

Unnamed Dataset

  • Size: 502,912 training samples
  • Columns: sentence_0, sentence_1, sentence_2, and label
  • Approximate statistics based on the first 1000 samples:
    Column      Type    Min        Mean          Max
    sentence_0  string  4 tokens   9.88 tokens   59 tokens
    sentence_1  string  19 tokens  88.5 tokens   232 tokens
    sentence_2  string  17 tokens  87.87 tokens  282 tokens
    label       float   -16.56     0.96          20.84
  • Samples:
    sentence_0: how long are bank issued checks good for
    sentence_1: Your mom is correct....most checks are good for anywhere between 180 days up to 1 year. Sorry, but you probably won't be able to cash those checks, although it never hurts to check with your bank on the issue. DH · 9 years ago.
    sentence_2: Non-local personal and business checks. If the check is from a bank in a different federal reserve district than the depositing bank, it can be held for 5 business days under normal circumstances. Exceptions for new customers during the first 30 days. Banks are not required to give next day ability on the first $100 of deposits, and both local and non-local personal and business checks can be held for a maximum of 11 business days.
    label: 2.6526598930358887

    sentence_0: 11:11 meaning
    sentence_1: 11-11-11 11:11:11 example. 11-11 11:11 example. Numerologists believe that events linked to the time 11:11 appear more often than can be explained by chance or coincidence. This belief is related to the concept of synchronicity. Some authors claim that seeing 11:11 on a clock is an auspicious sign.
    sentence_2: Sometimes it's difficult to describe what seeing the 11:11 means, because it is a personal experience for everyone. If you feel you are having these experiences for a reason, then it might be that only you will know what these number prompts and wake-up calls mean.
    label: -1.3284940719604492

    sentence_0: did someone from pawn stars die
    sentence_1: Did someone from pawn stars on history channel die? kgb answers » Arts & Entertainment » Actors and Actresses » Did someone from pawn stars on history channel die? None from the actors & cast of Pawn Stars died. There was a rumor that Leonard Shaffer, a coin expert, died but it is not true. He is alive & well. Tags: pawn stars, lists of actors.
    sentence_2: Austin Russell, also known as Chumlee, star of History's reality series Pawn Stars, has died from an apparent heart attack, sources confirm to eBuzzd.
    label: 1.7131614685058594
  • Loss: MarginMSELoss
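
MarginMSELoss interprets the label as a teacher margin: typically the difference between a cross-encoder's relevance scores for the (query, positive) and (query, negative) pairs, which is why the labels above range from about -16.6 to 20.8 rather than lying in [0, 1]. The bi-encoder is trained so that its own score margin matches the teacher's. A conceptual sketch (the library's implementation scores pairs with a configurable similarity function, dot product by default):

import torch
import torch.nn.functional as F

def margin_mse(sim_pos: torch.Tensor, sim_neg: torch.Tensor,
               teacher_margin: torch.Tensor) -> torch.Tensor:
    # Student margin: query-positive score minus query-negative score,
    # regressed onto the teacher (cross-encoder) margin with MSE.
    return F.mse_loss(sim_pos - sim_neg, teacher_margin)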

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • num_train_epochs: 30
  • fp16: True
  • multi_dataset_batch_sampler: round_robin
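
A hedged sketch of how a comparable run could be launched with the Sentence Transformers v3 trainer API; the single-row dataset below is a hypothetical stand-in, since this card leaves the training data unnamed:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MarginMSELoss

# Start from the base checkpoint; a mean-pooling head is added automatically.
model = SentenceTransformer("FacebookAI/xlm-roberta-base")

# Hypothetical stand-in for the unnamed (query, positive, negative, margin) data.
train_dataset = Dataset.from_dict({
    "sentence_0": ["how long are bank issued checks good for"],
    "sentence_1": ["Your mom is correct....most checks are good for anywhere between 180 days up to 1 year."],
    "sentence_2": ["Non-local personal and business checks can be held for a maximum of 11 business days."],
    "label": [2.6526598930358887],
})

args = SentenceTransformerTrainingArguments(
    output_dir="xlm-roberta-base-msmarco",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    num_train_epochs=30,
    fp16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MarginMSELoss(model),
)
trainer.train()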

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 64
  • per_device_eval_batch_size: 64
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 30
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch Step Training Loss
0.0636 500 92.5416
0.1273 1000 20.6659
0.1909 1500 14.7631
0.2545 2000 14.3025
0.3181 2500 13.5257
0.3818 3000 12.8666
0.4454 3500 12.397
0.5090 4000 12.2718
0.5727 4500 11.539
0.6363 5000 11.1145
0.6999 5500 11.1232
0.7636 6000 10.6021
0.8272 6500 10.4115
0.8908 7000 10.4529
0.9544 7500 10.1329
1.0181 8000 10.1367
1.0817 8500 9.5914
1.1453 9000 9.2799
1.2090 9500 9.266
1.2726 10000 9.1661
1.3362 10500 8.954
1.3998 11000 8.9562
1.4635 11500 9.4717
1.5271 12000 8.6758
1.5907 12500 8.87
1.6544 13000 8.5826
1.7180 13500 8.4827
1.7816 14000 8.5306
1.8453 14500 8.182
1.9089 15000 8.3592
1.9725 15500 8.3879
2.0361 16000 7.4399
2.0998 16500 7.0406
2.1634 17000 6.89
2.2270 17500 6.8651
2.2907 18000 6.8461
2.3543 18500 6.7663
2.4179 19000 6.9313
2.4815 19500 6.9688
2.5452 20000 6.7821
2.6088 20500 6.9468
2.6724 21000 6.731
2.7361 21500 6.649
2.7997 22000 6.7055
2.8633 22500 6.7744
2.9270 23000 6.9481
2.9906 23500 6.5967
3.0542 24000 5.7351
3.1178 24500 5.4125
3.1815 25000 5.4095
3.2451 25500 5.4253
3.3087 26000 5.3774
3.3724 26500 5.5277
3.4360 27000 5.4516
3.4996 27500 5.322
3.5632 28000 5.5531
3.6269 28500 5.5238
3.6905 29000 5.5992
3.7541 29500 5.5351
3.8178 30000 5.3985
3.8814 30500 5.4313
3.9450 31000 5.4173
4.0087 31500 5.2333
4.0723 32000 4.3352
4.1359 32500 4.3442
4.1995 33000 4.3288
4.2632 33500 4.367
4.3268 34000 4.4607
4.3904 34500 4.4461
4.4541 35000 4.6218
4.5177 35500 4.4249
4.5813 36000 4.4129
4.6449 36500 4.4065
4.7086 37000 4.5452
4.7722 37500 4.5411
4.8358 38000 4.5423
4.8995 38500 4.4942
4.9631 39000 4.5332
5.0267 39500 4.0759
5.0904 40000 3.6274
5.1540 40500 3.6795
5.2176 41000 3.6741
5.2812 41500 3.7396
5.3449 42000 3.6839
5.4085 42500 3.732
5.4721 43000 3.6557
5.5358 43500 3.6925
5.5994 44000 3.7149
5.6630 44500 3.6744
5.7266 45000 3.7669
5.7903 45500 3.651
5.8539 46000 3.721
5.9175 46500 3.7012
5.9812 47000 3.7294
6.0448 47500 3.2432
6.1084 48000 3.0295
6.1721 48500 3.0364
6.2357 49000 3.0687
6.2993 49500 3.064
6.3629 50000 3.112
6.4266 50500 3.1438
6.4902 51000 3.0733
6.5538 51500 3.1719
6.6175 52000 3.1355
6.6811 52500 3.1612
6.7447 53000 3.1938
6.8083 53500 3.1375
6.8720 54000 3.1969
6.9356 54500 3.2214
6.9992 55000 3.1364
7.0629 55500 2.63
7.1265 56000 2.5451
7.1901 56500 2.644
7.2538 57000 2.6482
7.3174 57500 2.6017
7.3810 58000 2.6626
7.4446 58500 2.6698
7.5083 59000 2.6595
7.5719 59500 2.6683
7.6355 60000 2.7187
7.6992 60500 2.6213
7.7628 61000 2.7119
7.8264 61500 2.739
7.8900 62000 2.686
7.9537 62500 2.7295
8.0173 63000 2.6062
8.0809 63500 2.2272
8.1446 64000 2.2692
8.2082 64500 2.3135
8.2718 65000 2.2546
8.3355 65500 2.2882
8.3991 66000 2.2749
8.4627 66500 2.363
8.5263 67000 2.2923
8.5900 67500 2.3275
8.6536 68000 2.3738
8.7172 68500 2.3416
8.7809 69000 2.3851
8.8445 69500 2.3356
8.9081 70000 2.3598
8.9717 70500 2.4272
9.0354 71000 2.141
9.0990 71500 2.001
9.1626 72000 2.014
9.2263 72500 1.9826
9.2899 73000 1.995
9.3535 73500 2.0097
9.4172 74000 2.0412
9.4808 74500 2.0144
9.5444 75000 2.0653
9.6080 75500 2.022
9.6717 76000 2.0327
9.7353 76500 2.0596
9.7989 77000 2.0761
9.8626 77500 2.1245
9.9262 78000 2.1062
9.9898 78500 2.1186
10.0534 79000 1.8283
10.1171 79500 1.7627
10.1807 80000 1.7775
10.2443 80500 1.7865
10.3080 81000 1.8018
10.3716 81500 1.7851
10.4352 82000 1.8085
10.4989 82500 1.8293
10.5625 83000 1.8549
10.6261 83500 1.8531
10.6897 84000 1.8538
10.7534 84500 1.8814
10.8170 85000 1.8576
10.8806 85500 1.8516
10.9443 86000 1.8555
11.0079 86500 1.8631
11.0715 87000 1.6189
11.1351 87500 1.6143
11.1988 88000 1.6246
11.2624 88500 1.5997
11.3260 89000 1.646
11.3897 89500 1.6323
11.4533 90000 1.6623
11.5169 90500 1.6544
11.5806 91000 1.6671
11.6442 91500 1.6742
11.7078 92000 1.6409
11.7714 92500 1.6504
11.8351 93000 1.6791
11.8987 93500 1.6923
11.9623 94000 1.697
12.0260 94500 1.6136
12.0896 95000 1.4437
12.1532 95500 1.49
12.2168 96000 1.4567
12.2805 96500 1.5007
12.3441 97000 1.4826
12.4077 97500 1.4668
12.4714 98000 1.5009
12.5350 98500 1.5008
12.5986 99000 1.5336
12.6623 99500 1.5057
12.7259 100000 1.5081
12.7895 100500 1.5402
12.8531 101000 1.5519
12.9168 101500 1.5171
12.9804 102000 1.5249
13.0440 102500 1.4117
13.1077 103000 1.3524
13.1713 103500 1.3564
13.2349 104000 1.3483
13.2985 104500 1.386
13.3622 105000 1.3723
13.4258 105500 1.3933
13.4894 106000 1.3672
13.5531 106500 1.3796
13.6167 107000 1.3637
13.6803 107500 1.4061
13.7440 108000 1.3897
13.8076 108500 1.4342
13.8712 109000 1.3821
13.9348 109500 1.411
13.9985 110000 1.4214
14.0621 110500 1.2551
14.1257 111000 1.2366
14.1894 111500 1.2553
14.2530 112000 1.2553
14.3166 112500 1.2624
14.3802 113000 1.2771
14.4439 113500 1.2744
14.5075 114000 1.2616
14.5711 114500 1.2744
14.6348 115000 1.2705
14.6984 115500 1.3005
14.7620 116000 1.3013
14.8257 116500 1.298
14.8893 117000 1.2972
14.9529 117500 1.277
15.0165 118000 1.2718
15.0802 118500 1.1697
15.1438 119000 1.1819
15.2074 119500 1.1916
15.2711 120000 1.1829
15.3347 120500 1.1632
15.3983 121000 1.1809
15.4619 121500 1.1913
15.5256 122000 1.1916
15.5892 122500 1.1969
15.6528 123000 1.1929
15.7165 123500 1.2086
15.7801 124000 1.1864
15.8437 124500 1.2068
15.9074 125000 1.2253
15.9710 125500 1.1963
16.0346 126000 1.1585
16.0982 126500 1.0834
16.1619 127000 1.0937
16.2255 127500 1.0995
16.2891 128000 1.0787
16.3528 128500 1.1217
16.4164 129000 1.1185
16.4800 129500 1.1203
16.5436 130000 1.1201
16.6073 130500 1.125
16.6709 131000 1.1214
16.7345 131500 1.1228
16.7982 132000 1.1381
16.8618 132500 1.1414
16.9254 133000 1.123
16.9891 133500 1.1003
17.0527 134000 1.0447
17.1163 134500 1.036
17.1799 135000 1.0264
17.2436 135500 1.0375
17.3072 136000 1.0509
17.3708 136500 1.0452
17.4345 137000 1.0519
17.4981 137500 1.0498
17.5617 138000 1.0514
17.6253 138500 1.054
17.6890 139000 1.0457
17.7526 139500 1.0582
17.8162 140000 1.0566
17.8799 140500 1.0644
17.9435 141000 1.0579
18.0071 141500 1.0647
18.0708 142000 0.9704
18.1344 142500 0.9787
18.1980 143000 0.9875
18.2616 143500 0.987
18.3253 144000 0.9834
18.3889 144500 0.999
18.4525 145000 0.9872
18.5162 145500 0.9851
18.5798 146000 0.9986
18.6434 146500 0.9853
18.7071 147000 0.9973
18.7707 147500 0.988
18.8343 148000 0.999
18.8979 148500 0.9899
18.9616 149000 1.0053
19.0252 149500 0.9802
19.0888 150000 0.9301
19.1525 150500 0.9295
19.2161 151000 0.9334
19.2797 151500 0.9503
19.3433 152000 0.9161
19.4070 152500 0.9433
19.4706 153000 0.9376
19.5342 153500 0.9274
19.5979 154000 0.9414
19.6615 154500 0.94
19.7251 155000 0.9344
19.7888 155500 0.9464
19.8524 156000 0.9583
19.9160 156500 0.953
19.9796 157000 0.9481
20.0433 157500 0.8982
20.1069 158000 0.8974
20.1705 158500 0.9022
20.2342 159000 0.8923
20.2978 159500 0.8935
20.3614 160000 0.8917
20.4250 160500 0.9021
20.4887 161000 0.8978
20.5523 161500 0.9078
20.6159 162000 0.903
20.6796 162500 0.8989
20.7432 163000 0.9023
20.8068 163500 0.8918
20.8705 164000 0.8968
20.9341 164500 0.8977
20.9977 165000 0.9035
21.0613 165500 0.8347
21.1250 166000 0.8415
21.1886 166500 0.8472
21.2522 167000 0.8663
21.3159 167500 0.8633
21.3795 168000 0.8569
21.4431 168500 0.8529
21.5067 169000 0.8485
21.5704 169500 0.8759
21.6340 170000 0.8667
21.6976 170500 0.8615
21.7613 171000 0.8623
21.8249 171500 0.8613
21.8885 172000 0.8515
21.9522 172500 0.8615
22.0158 173000 0.8457
22.0794 173500 0.8106
22.1430 174000 0.8109
22.2067 174500 0.8108
22.2703 175000 0.8197
22.3339 175500 0.8165
22.3976 176000 0.8289
22.4612 176500 0.8288
22.5248 177000 0.8145
22.5884 177500 0.8249
22.6521 178000 0.8218
22.7157 178500 0.8284
22.7793 179000 0.833
22.8430 179500 0.8176
22.9066 180000 0.8431
22.9702 180500 0.8234
23.0339 181000 0.7998
23.0975 181500 0.7821
23.1611 182000 0.7914
23.2247 182500 0.7851
23.2884 183000 0.7797
23.3520 183500 0.7931
23.4156 184000 0.7912
23.4793 184500 0.7876
23.5429 185000 0.7954
23.6065 185500 0.7946
23.6701 186000 0.7782
23.7338 186500 0.7952
23.7974 187000 0.8015
23.8610 187500 0.7977
23.9247 188000 0.7875
23.9883 188500 0.7935
24.0519 189000 0.7617
24.1156 189500 0.7625
24.1792 190000 0.7514
24.2428 190500 0.7662
24.3064 191000 0.7692
24.3701 191500 0.7733
24.4337 192000 0.7561
24.4973 192500 0.7577
24.5610 193000 0.7687
24.6246 193500 0.7647
24.6882 194000 0.7717
24.7518 194500 0.761
24.8155 195000 0.7661
24.8791 195500 0.7446
24.9427 196000 0.7659
25.0064 196500 0.7559
25.0700 197000 0.7183
25.1336 197500 0.7399
25.1973 198000 0.7308
25.2609 198500 0.733
25.3245 199000 0.746
25.3881 199500 0.7274
25.4518 200000 0.7358
25.5154 200500 0.7468
25.5790 201000 0.734
25.6427 201500 0.7493
25.7063 202000 0.7263
25.7699 202500 0.7355
25.8335 203000 0.745
25.8972 203500 0.7301
25.9608 204000 0.7457
26.0244 204500 0.7072
26.0881 205000 0.7212
26.1517 205500 0.7186
26.2153 206000 0.7225
26.2790 206500 0.7065
26.3426 207000 0.7153
26.4062 207500 0.72
26.4698 208000 0.7074
26.5335 208500 0.7117
26.5971 209000 0.7206
26.6607 209500 0.7132
26.7244 210000 0.7199
26.7880 210500 0.7102
26.8516 211000 0.7155
26.9152 211500 0.7057
26.9789 212000 0.7191
27.0425 212500 0.6942
27.1061 213000 0.6924
27.1698 213500 0.7025
27.2334 214000 0.6911
27.2970 214500 0.6955
27.3607 215000 0.6875
27.4243 215500 0.698
27.4879 216000 0.7054
27.5515 216500 0.6968
27.6152 217000 0.7044
27.6788 217500 0.6946
27.7424 218000 0.6865
27.8061 218500 0.6974
27.8697 219000 0.698
27.9333 219500 0.6943
27.9969 220000 0.6985
28.0606 220500 0.6785
28.1242 221000 0.6842
28.1878 221500 0.6832
28.2515 222000 0.6863
28.3151 222500 0.6806
28.3787 223000 0.6897
28.4424 223500 0.6975
28.5060 224000 0.6802
28.5696 224500 0.6836
28.6332 225000 0.6849
28.6969 225500 0.6781
28.7605 226000 0.6761
28.8241 226500 0.6762
28.8878 227000 0.6781
28.9514 227500 0.682
29.0150 228000 0.6742
29.0786 228500 0.6595
29.1423 229000 0.683
29.2059 229500 0.6721
29.2695 230000 0.669
29.3332 230500 0.683
29.3968 231000 0.6652
29.4604 231500 0.671
29.5241 232000 0.6662
29.5877 232500 0.6665
29.6513 233000 0.6718
29.7149 233500 0.6657
29.7786 234000 0.6677
29.8422 234500 0.6732
29.9058 235000 0.6687
29.9695 235500 0.6732

Framework Versions

  • Python: 3.11.5
  • Sentence Transformers: 3.4.0
  • Transformers: 4.48.0
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.2.1
  • Datasets: 2.21.0
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MarginMSELoss

@misc{hofstätter2021improving,
    title={Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation},
    author={Sebastian Hofstätter and Sophia Althammer and Michael Schröder and Mete Sertkan and Allan Hanbury},
    year={2021},
    eprint={2010.02666},
    archivePrefix={arXiv},
    primaryClass={cs.IR}
}