---
base_model: BAAI/bge-base-en-v1.5
datasets: []
language:
  - en
library_name: sentence-transformers
license: apache-2.0
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:6300
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
widget:
  - source_sentence: The net interest income for the first quarter of 2023 was $14,448 million.
    sentences:
      - >-
        What was the fair value of investments in fixed maturity securities at
        the end of 2023 after a hypothetical 100 basis point increase in
        interest rates?
      - What was the net interest income for the first quarter of 2023?
      - >-
        What are the expected consequences of the EMIR 3.0 proposals for ICE
        Futures Europe and ICE Clear Europe?
  - source_sentence: >-
      The consolidated financial statements and accompanying notes are listed in
      Part IV, Item 15(a)(1) of the Annual Report on Form 10-K
    sentences:
      - >-
        What was the total amount invested in purchases from Vebu during the
        year ended December 31, 2023?
      - >-
        What section of the Annual Report on Form 10-K includes the consolidated
        financial statements and accompanying notes?
      - >-
        What is the purpose of using constant currency to measure financial
        performance?
  - source_sentence: >-
      Cash provided by operating activities was impacted by the provision from
      the Tax Cuts and Jobs Act of 2017 which became effective in fiscal 2023
      and requires the capitalization and amortization of research and
      development costs. The change increased our cash taxes paid in fiscal
      2023.
    sentences:
      - >-
        How much did the provision from the Tax Cuts and Jobs Act increase the
        cash taxes paid in fiscal 2023?
      - What is the principal amount of debt maturing in fiscal year 2023?
      - >-
        What is the projected increase in effective tax rate starting from
        fiscal 2024?
  - source_sentence: Item 8. Financial Statements and Supplementary Data.
    sentences:
      - How does FedEx Express primarily fulfill its jet fuel needs?
      - >-
        What legislative act in the United States established a new corporate
        alternative minimum tax of 15% on large corporations?
      - What is the title of Item 8 that covers financial data in the report?
  - source_sentence: >-
      Electronic Arts paid cash dividends totaling $210 million during the
      fiscal year ended March 31, 2023.
    sentences:
      - >-
        What was the total cash dividend paid by Electronic Arts in the fiscal
        year ended March 31, 2023?
      - >-
        What was the SRO's accrued amount as a receivable for CAT implementation
        expenses as of December 31, 2023?
      - >-
        What percentage of our total U.S. dialysis patients in 2023 was covered
        under some form of government-based program?
model-index:
  - name: BGE base Financial Matryoshka
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.6842857142857143
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8128571428571428
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.86
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.8985714285714286
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.6842857142857143
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.27095238095238094
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.172
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08985714285714284
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.6842857142857143
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8128571428571428
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.86
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.8985714285714286
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.7929325221389678
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7588820861678003
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7629563080276819
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.6857142857142857
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.82
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8585714285714285
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9057142857142857
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.6857142857142857
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2733333333333333
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1717142857142857
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09057142857142857
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.6857142857142857
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.82
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8585714285714285
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9057142857142857
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.7963845502294126
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7614115646258502
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7648837754793252
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.6771428571428572
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8042857142857143
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8571428571428571
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.89
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.6771428571428572
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2680952380952381
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.17142857142857137
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08899999999999998
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.6771428571428572
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8042857142857143
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8571428571428571
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.89
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.784627431591255
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7506218820861676
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7549970210504993
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.6614285714285715
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.7957142857142857
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8271428571428572
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.88
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.6614285714285715
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2652380952380952
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1654285714285714
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.088
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.6614285714285715
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.7957142857142857
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8271428571428572
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.88
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.7728766261768507
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7384614512471652
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.74301468254304
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.6128571428571429
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.7628571428571429
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.7957142857142857
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.8471428571428572
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.6128571428571429
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2542857142857143
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.15914285714285714
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0847142857142857
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.6128571428571429
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.7628571428571429
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.7957142857142857
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.8471428571428572
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.7315764159717033
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.6946094104308389
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7001749041654559
            name: Cosine Map@100

BGE base Financial Matryoshka

This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-base-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
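
Because the final Normalize() module L2-normalizes every embedding, cosine similarity and dot product produce identical scores. A quick sanity check (a minimal sketch against this repository):

from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("elsayovita/bge-base-financial-matryoshka-testing")
# CLS pooling yields one 768-dim vector per input, which Normalize() scales to unit length
emb = model.encode("Item 8. Financial Statements and Supplementary Data.")
print(np.linalg.norm(emb))  # ~1.0, so dot product == cosine similarity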

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("elsayovita/bge-base-financial-matryoshka-testing")
# Run inference
sentences = [
    'Electronic Arts paid cash dividends totaling $210 million during the fiscal year ended March 31, 2023.',
    'What was the total cash dividend paid by Electronic Arts in the fiscal year ended March 31, 2023?',
    "What was the SRO's accrued amount as a receivable for CAT implementation expenses as of December 31, 2023?",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
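
Because the model was trained with MatryoshkaLoss over dimensions 768/512/256/128/64, its embeddings can be truncated for cheaper storage and faster search at a modest quality cost (see the per-dimension results under Evaluation). A minimal sketch using the truncate_dim argument available in recent sentence-transformers releases:

from sentence_transformers import SentenceTransformer

# Ask encode() to return 256-dimensional embeddings instead of the full 768
model = SentenceTransformer("elsayovita/bge-base-financial-matryoshka-testing", truncate_dim=256)
embeddings = model.encode(["What was the net interest income for the first quarter of 2023?"])
print(embeddings.shape)
# (1, 256)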

Evaluation

Metrics

Information Retrieval (dim_768)

| Metric              |  Value |
|:--------------------|-------:|
| cosine_accuracy@1   | 0.6843 |
| cosine_accuracy@3   | 0.8129 |
| cosine_accuracy@5   | 0.86   |
| cosine_accuracy@10  | 0.8986 |
| cosine_precision@1  | 0.6843 |
| cosine_precision@3  | 0.271  |
| cosine_precision@5  | 0.172  |
| cosine_precision@10 | 0.0899 |
| cosine_recall@1     | 0.6843 |
| cosine_recall@3     | 0.8129 |
| cosine_recall@5     | 0.86   |
| cosine_recall@10    | 0.8986 |
| cosine_ndcg@10      | 0.7929 |
| cosine_mrr@10       | 0.7589 |
| cosine_map@100      | 0.763  |

Information Retrieval (dim_512)

| Metric              |  Value |
|:--------------------|-------:|
| cosine_accuracy@1   | 0.6857 |
| cosine_accuracy@3   | 0.82   |
| cosine_accuracy@5   | 0.8586 |
| cosine_accuracy@10  | 0.9057 |
| cosine_precision@1  | 0.6857 |
| cosine_precision@3  | 0.2733 |
| cosine_precision@5  | 0.1717 |
| cosine_precision@10 | 0.0906 |
| cosine_recall@1     | 0.6857 |
| cosine_recall@3     | 0.82   |
| cosine_recall@5     | 0.8586 |
| cosine_recall@10    | 0.9057 |
| cosine_ndcg@10      | 0.7964 |
| cosine_mrr@10       | 0.7614 |
| cosine_map@100      | 0.7649 |

Information Retrieval (dim_256)

| Metric              |  Value |
|:--------------------|-------:|
| cosine_accuracy@1   | 0.6771 |
| cosine_accuracy@3   | 0.8043 |
| cosine_accuracy@5   | 0.8571 |
| cosine_accuracy@10  | 0.89   |
| cosine_precision@1  | 0.6771 |
| cosine_precision@3  | 0.2681 |
| cosine_precision@5  | 0.1714 |
| cosine_precision@10 | 0.089  |
| cosine_recall@1     | 0.6771 |
| cosine_recall@3     | 0.8043 |
| cosine_recall@5     | 0.8571 |
| cosine_recall@10    | 0.89   |
| cosine_ndcg@10      | 0.7846 |
| cosine_mrr@10       | 0.7506 |
| cosine_map@100      | 0.755  |

Information Retrieval (dim_128)

| Metric              |  Value |
|:--------------------|-------:|
| cosine_accuracy@1   | 0.6614 |
| cosine_accuracy@3   | 0.7957 |
| cosine_accuracy@5   | 0.8271 |
| cosine_accuracy@10  | 0.88   |
| cosine_precision@1  | 0.6614 |
| cosine_precision@3  | 0.2652 |
| cosine_precision@5  | 0.1654 |
| cosine_precision@10 | 0.088  |
| cosine_recall@1     | 0.6614 |
| cosine_recall@3     | 0.7957 |
| cosine_recall@5     | 0.8271 |
| cosine_recall@10    | 0.88   |
| cosine_ndcg@10      | 0.7729 |
| cosine_mrr@10       | 0.7385 |
| cosine_map@100      | 0.743  |

Information Retrieval (dim_64)

| Metric              |  Value |
|:--------------------|-------:|
| cosine_accuracy@1   | 0.6129 |
| cosine_accuracy@3   | 0.7629 |
| cosine_accuracy@5   | 0.7957 |
| cosine_accuracy@10  | 0.8471 |
| cosine_precision@1  | 0.6129 |
| cosine_precision@3  | 0.2543 |
| cosine_precision@5  | 0.1591 |
| cosine_precision@10 | 0.0847 |
| cosine_recall@1     | 0.6129 |
| cosine_recall@3     | 0.7629 |
| cosine_recall@5     | 0.7957 |
| cosine_recall@10    | 0.8471 |
| cosine_ndcg@10      | 0.7316 |
| cosine_mrr@10       | 0.6946 |
| cosine_map@100      | 0.7002 |
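
Each table above corresponds to one truncation dimension, from dim_768 down to dim_64, evaluated with the InformationRetrievalEvaluator. A hedged sketch of how such an evaluation can be reproduced; the one-pair queries/corpus below are illustrative stand-ins for the actual held-out split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("elsayovita/bge-base-financial-matryoshka-testing", truncate_dim=256)

# Illustrative stand-ins; the real evaluation uses the held-out query/corpus pairs
queries = {"q1": "What was the net interest income for the first quarter of 2023?"}
corpus = {"d1": "The net interest income for the first quarter of 2023 was $14,448 million."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="dim_256")
results = evaluator(model)  # dict with cosine accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100
print(results)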

Training Details

Training Dataset

Unnamed Dataset

  • Size: 6,300 training samples
  • Columns: positive and anchor
  • Approximate statistics based on the first 1000 samples:

    |         | positive                                           | anchor                                           |
    |:--------|:---------------------------------------------------|:-------------------------------------------------|
    | type    | string                                              | string                                           |
    | details | min: 6 tokens, mean: 46.86 tokens, max: 252 tokens  | min: 7 tokens, mean: 20.5 tokens, max: 51 tokens |

  • Samples:

    | positive | anchor |
    |:---------|:-------|
    | For the year ended December 31, 2023, the average balance for savings and transaction accounts was $86,102 and the interest expense for these accounts was $3,357. | What was the average balance and interest expense for savings and transaction accounts in the year 2023? |
    | Limits are used at various levels and types to manage the size of liquidity exposures, relative to acceptable risk levels according the the organization's liquidity risk tolerance. | What is the purpose of the liquidity risk limits used by the organization? |
    | Value-Based Care refers to the goal of incentivizing healthcare providers to simultaneously increase quality while lowering the cost of care for patients. | What is the primary goal of value-based care according to the company? |
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
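
These parameters correspond to wrapping MultipleNegativesRankingLoss in MatryoshkaLoss so the inner loss is applied at every listed dimension with equal weight. A minimal construction sketch:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")
inner_loss = MultipleNegativesRankingLoss(model)
# Apply the inner loss at each truncation dimension, each weighted 1
loss = MatryoshkaLoss(model, inner_loss, matryoshka_dims=[768, 512, 256, 128, 64])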
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 16
  • learning_rate: 2e-05
  • num_train_epochs: 2
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • bf16: True
  • tf32: False
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
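
Under the sentence-transformers v3 training API, these non-default settings map onto SentenceTransformerTrainingArguments roughly as sketched below (output_dir is a hypothetical path; save_strategy="epoch" is an assumption, since load_best_model_at_end requires matching save and eval strategies):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # hypothetical output path
    num_train_epochs=2,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=False,
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed: must match eval_strategy when load_best_model_at_end=True
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoids duplicate texts within a batch
)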

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 2
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: False
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
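
Putting the pieces together, the v3 API wires the dataset, loss, and arguments into a SentenceTransformerTrainer. A hedged end-to-end sketch; the two rows are copied from the samples above, and in practice the full 6,300-pair dataset and the training arguments sketched earlier are passed in:

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")
loss = MatryoshkaLoss(model, MultipleNegativesRankingLoss(model),
                      matryoshka_dims=[768, 512, 256, 128, 64])

# Two rows copied from the samples above; the real dataset has 6,300 pairs
train_dataset = Dataset.from_dict({
    "positive": [
        "For the year ended December 31, 2023, the average balance for savings and transaction accounts was $86,102 and the interest expense for these accounts was $3,357.",
        "Value-Based Care refers to the goal of incentivizing healthcare providers to simultaneously increase quality while lowering the cost of care for patients.",
    ],
    "anchor": [
        "What was the average balance and interest expense for savings and transaction accounts in the year 2023?",
        "What is the primary goal of value-based care according to the company?",
    ],
})

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_dataset,
    loss=loss,
    # in practice also pass args=SentenceTransformerTrainingArguments(...) as sketched above
)
trainer.train()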

Training Logs

| Epoch      | Step   | Training Loss | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_512_cosine_map@100 | dim_64_cosine_map@100 | dim_768_cosine_map@100 |
|:----------:|:------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|:----------------------:|
| 0.8122     | 10     | 1.4746        | -                      | -                      | -                      | -                     | -                      |
| 0.9746     | 12     | -             | 0.7378                 | 0.7470                 | 0.7589                 | 0.6941                | 0.7563                 |
| 1.6244     | 20     | 0.6694        | -                      | -                      | -                      | -                     | -                      |
| **1.9492** | **24** | **-**         | **0.743**              | **0.755**              | **0.7649**             | **0.7002**            | **0.763**              |
  • The bold row denotes the saved checkpoint.

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.42.4
  • PyTorch: 2.4.0+cu121
  • Accelerate: 0.32.1
  • Datasets: 2.21.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning}, 
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}