|
--- |
|
base_model: BAAI/bge-small-en |
|
datasets: |
|
- sentence-transformers/hotpotqa |
|
language: |
|
- en |
|
library_name: sentence-transformers |
|
license: apache-2.0 |
|
metrics: |
|
- cosine_accuracy |
|
- dot_accuracy |
|
- manhattan_accuracy |
|
- euclidean_accuracy |
|
- max_accuracy |
|
pipeline_tag: sentence-similarity |
|
tags: |
|
- sentence-transformers |
|
- sentence-similarity |
|
- feature-extraction |
|
- generated_from_trainer |
|
- dataset_size:76064 |
|
- loss:MatryoshkaLoss |
|
- loss:TripletLoss |
|
widget: |
|
- source_sentence: The person who released "Sun Arise" was born in what year? |
|
sentences: |
|
- Peter Frampton Peter Kenneth Frampton (born 22 April 1950) is an English rock |
|
musician, singer, songwriter, producer, and guitarist. He was previously associated |
|
with the bands Humble Pie and The Herd. At the end of his 'group' career was Frampton's |
|
international breakthrough album his live release, "Frampton Comes Alive!" The |
|
album sold in the United States more than 8 million copies and spawned several |
|
single hits. Since then he has released several major albums. He has also worked |
|
with David Bowie and both Matt Cameron and Mike McCready from Pearl Jam, among |
|
others. |
|
- Sun Arise "Sun Arise" is the fourth single released by the Australian singer-songwriter |
|
Rolf Harris. Released in January 1961 in Australia and October 1962 in the UK, |
|
it was Harris' third charting hit in Australia (following "The Big Black Hat" |
|
in 1960) and second in the UK (following "Tie Me Kangaroo Down, Sport" also 1960). |
|
Unlike his early chart hits, "Sun Arise" was not a comedy record, but came within |
|
the genre of world music with its didgeridoo-inspired sound. |
|
- Circa Survive Circa Survive is an American rock band from the Philadelphia suburb |
|
of Doylestown, formed in 2004. The band, led by Anthony Green, consists of former |
|
members from Saosin, This Day Forward, and Taken. |
|
- source_sentence: What year was Chuang Chia-jung's partner in the 2010 MPS Group |
|
Championships – Doubles born? |
|
sentences: |
|
- Ko Olina Station and Center Ko Olina Station and Ko Olina Center make up a lifestyle |
|
center in the resort town of Ko Olina, a neighborhood in Kapolei, Hawaii. The |
|
shopping mall opened in 2009 and consists of two centers located across a street |
|
from each other. Ko Olina Station debuted in 2009, while the more recent Ko Olina |
|
Center finished construction in 2010. The centers contain a total of approximately |
|
31 retail tenants, with the majority of them being native Hawaiian businesses, |
|
such as ABC Stores and Peter Merriman's MonkeyPod Kitchen. |
|
- 2010 MPS Group Championships – Doubles Chuang Chia-jung and Sania Mirza were the |
|
defenders of championship title, but Mirza chose not to compete. |
|
- Lu Chia-hung Lu Chia-hung (; born 4 March 1997) is a Taiwanese male badminton |
|
player. |
|
- source_sentence: What son of Zeus in Greek mythology was said to have fatheres an |
|
Argonaut seer? |
|
sentences: |
|
- All Net Resort and Arena All Net Resort and Arena is a planned entertainment complex |
|
in Las Vegas. A project of businessman and former basketball player Jackie Robinson, |
|
the complex would encompass a resort hotel, a retail and restaurant streetscape, |
|
and a multi-purpose indoor arena with a retractable roof. Its location is set |
|
on the Strip at the former site of a Wet 'n Wild waterpark, next to the SLS Las |
|
Vegas in Winchester, Nevada. Designed by the Cuningham Group, it was planned to |
|
open in 2017, but is delayed until 2018 or 2019. |
|
- 'Piras (mythology) In Greek mythology, Piras (Ancient Greek: Πείραντα) was a king |
|
of Argos, otherwise also known as Piren, Peiren, Peiras, Peirasus and Piranthus.' |
|
- Idmon In Greek mythology, Idmon was an Argonaut seer. Allegedly a son of Apollo, |
|
he had Abas (or Ampycus) as his mortal father. His mother was Asteria, daughter |
|
of Coronus, or Cyrene, or else Antianeira, daughter of Pheres. By Laothoe he had |
|
a son Thestor. Idmon foresaw his own death in the Argonaut expedition, but joined |
|
anyway. During the outbound voyage of "Argo", a boar killed him in the land of |
|
the Mariandyni, in Bithynia. |
|
- source_sentence: In what year was the drama film in which Dorothy Duffy played Rose |
|
/ Patricia released? |
|
sentences: |
|
- Keith Davis (safety) Keith Lamont Davis (born December 30, 1978) is a former American |
|
football safety in the National Football League for the Dallas Cowboys. He played |
|
college football at Sam Houston State University. |
|
- Dorothy Duffy Dorothy Duffy (born in Douglas Bridge, Northern Ireland) is an Irish |
|
actress. She is best known for her performance as Rose / Patricia in "The Magdalene |
|
Sisters". |
|
- The Franchise Affair (film) The Franchise Affair is a 1951 British thriller film |
|
directed by Lawrence Huntington and starring Michael Denison, Dulcie Gray, Anthony |
|
Nicholls and Marjorie Fielding. It is a faithful adaptation of the novel "The |
|
Franchise Affair" by Josephine Tey. |
|
- source_sentence: Was McDull, Kung Fu Kindergarten or Pettson and Findus created |
|
first? |
|
sentences: |
|
- 'Tabaluga Tabaluga is a media franchise featuring a fictional little green Dragon |
|
of the same name, created by German Rock musician Peter Maffay, children''s songwriter |
|
and the author . The artist Helme Heine drew the image of Tabaluga as it is currently |
|
known. The character Tabaluga was first introduced by Peter Maffay in a musical |
|
fairy tale "Tabaluga ... oder die Reise zur Vernunft" (Tabaluga or... The Journey |
|
to Reason) in 1983. This first studio album was the step to success: within the |
|
next years some Helme Heine books, four sequel concept studio albums, two resounding |
|
tours, a stage musical, "Tabaluga und Lilli" ("Tabaluga and Lilli"), based on |
|
the third concept album and many TV Cartoons which have been broadcasting in over |
|
100 countries round the world followed and a children''s game show. Over 100 kindergartens |
|
and child care groups carry the word "Tabaluga" in their names.' |
|
- 2005–06 FC Bayern Munich season FC Bayern Munich won the domestic double, beating |
|
Werder Bremen by five points in Bundesliga, and defeating Eintracht Frankfurt |
|
1–0 in the DFB-Pokal final, thanks to a goal from Claudio Pizarro. The season |
|
was in spite of that tainted due to a big defeat to Milan in the UEFA Champions |
|
League, losing out 5–2 on aggregate in the Last 16. At the end of the season, |
|
Bayern signed German football's wonderkid Lukas Podolski from Köln. |
|
- 'Pettson and Findus Pettson and Findus (Swedish: "Pettson och Findus" ) is a series |
|
of children''s books written and illustrated by Swedish author Sven Nordqvist. |
|
The books feature an old farmer (Pettson) and his cat (Findus) who live in a small |
|
ramshackle farmhouse in the countryside. The first of the Pettson och Findus book |
|
to be published was "Pannkakstårtan" in 1984 (first published in English in 1985 |
|
as "Pancake Pie").' |
|
model-index: |
|
- name: BGE-base-en-v1.5-Hotpotqa |
|
results: |
|
- task: |
|
type: triplet |
|
name: Triplet |
|
dataset: |
|
name: dim 384 |
|
type: dim_384 |
|
metrics: |
|
- type: cosine_accuracy |
|
value: 0.8853525792711784 |
|
name: Cosine Accuracy |
|
- type: dot_accuracy |
|
value: 0.11464742072882159 |
|
name: Dot Accuracy |
|
- type: manhattan_accuracy |
|
value: 0.8862991008045433 |
|
name: Manhattan Accuracy |
|
- type: euclidean_accuracy |
|
value: 0.8853525792711784 |
|
name: Euclidean Accuracy |
|
- type: max_accuracy |
|
value: 0.8862991008045433 |
|
name: Max Accuracy |
|
- task: |
|
type: triplet |
|
name: Triplet |
|
dataset: |
|
name: dim 256 |
|
type: dim_256 |
|
metrics: |
|
- type: cosine_accuracy |
|
value: 0.8840511121628017 |
|
name: Cosine Accuracy |
|
- type: dot_accuracy |
|
value: 0.11571225745385708 |
|
name: Dot Accuracy |
|
- type: manhattan_accuracy |
|
value: 0.8851159488878372 |
|
name: Manhattan Accuracy |
|
- type: euclidean_accuracy |
|
value: 0.8841694273544723 |
|
name: Euclidean Accuracy |
|
- type: max_accuracy |
|
value: 0.8851159488878372 |
|
name: Max Accuracy |
|
- task: |
|
type: triplet |
|
name: Triplet |
|
dataset: |
|
name: dim 128 |
|
type: dim_128 |
|
metrics: |
|
- type: cosine_accuracy |
|
value: 0.8829862754377662 |
|
name: Cosine Accuracy |
|
- type: dot_accuracy |
|
value: 0.11831519167061051 |
|
name: Dot Accuracy |
|
- type: manhattan_accuracy |
|
value: 0.8823946994794132 |
|
name: Manhattan Accuracy |
|
- type: euclidean_accuracy |
|
value: 0.8836961665877898 |
|
name: Euclidean Accuracy |
|
- type: max_accuracy |
|
value: 0.8836961665877898 |
|
name: Max Accuracy |
|
- task: |
|
type: triplet |
|
name: Triplet |
|
dataset: |
|
name: dim 64 |
|
type: dim_64 |
|
metrics: |
|
- type: cosine_accuracy |
|
value: 0.8815664931377188 |
|
name: Cosine Accuracy |
|
- type: dot_accuracy |
|
value: 0.12434926644581164 |
|
name: Dot Accuracy |
|
- type: manhattan_accuracy |
|
value: 0.88180312352106 |
|
name: Manhattan Accuracy |
|
- type: euclidean_accuracy |
|
value: 0.88180312352106 |
|
name: Euclidean Accuracy |
|
- type: max_accuracy |
|
value: 0.88180312352106 |
|
name: Max Accuracy |
|
--- |
|
|
|
# BGE-base-en-v1.5-Hotpotqa |
|
|
|
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) on the [sentence-transformers/hotpotqa](https://huggingface.co/datasets/sentence-transformers/hotpotqa) dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. |
|
|
|
## Model Details |
|
|
|
### Model Description |
|
- **Model Type:** Sentence Transformer |
|
- **Base model:** [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) <!-- at revision 2275a7bdee235e9b4f01fa73aa60d3311983cfea --> |
|
- **Maximum Sequence Length:** 512 tokens |
|
- **Output Dimensionality:** 384 tokens |
|
- **Similarity Function:** Cosine Similarity |
|
- **Training Dataset:** |
|
- [sentence-transformers/hotpotqa](https://huggingface.co/datasets/sentence-transformers/hotpotqa) |
|
- **Language:** en |
|
- **License:** apache-2.0 |
|
|
|
### Model Sources |
|
|
|
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net) |
|
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) |
|
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) |
|
|
|
### Full Model Architecture |
|
|
|
``` |
|
SentenceTransformer( |
|
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel |
|
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) |
|
(2): Normalize() |
|
) |
|
``` |
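
The stack above corresponds to CLS-token pooling over a lowercased BERT encoder followed by L2 normalization. A minimal sketch of the equivalent raw `transformers` forward pass, assuming the repository id below is replaced with the actual one:

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_id = "sentence_transformers_model_id"  # placeholder: replace with the actual repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

texts = ['The person who released "Sun Arise" was born in what year?']
inputs = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# CLS pooling (pooling_mode_cls_token=True) followed by Normalize()
embeddings = F.normalize(outputs.last_hidden_state[:, 0], p=2, dim=1)
print(embeddings.shape)  # torch.Size([1, 384])
```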
|
|
|
## Usage |
|
|
|
### Direct Usage (Sentence Transformers) |
|
|
|
First install the Sentence Transformers library: |
|
|
|
```bash |
|
pip install -U sentence-transformers |
|
``` |
|
|
|
Then you can load this model and run inference. |
|
```python |
|
from sentence_transformers import SentenceTransformer |
|
|
|
# Download from the 🤗 Hub |
|
model = SentenceTransformer("sentence_transformers_model_id") |
|
# Run inference |
|
sentences = [ |
|
'Was McDull, Kung Fu Kindergarten or Pettson and Findus created first?', |
|
'Pettson and Findus Pettson and Findus (Swedish: "Pettson och Findus" ) is a series of children\'s books written and illustrated by Swedish author Sven Nordqvist. The books feature an old farmer (Pettson) and his cat (Findus) who live in a small ramshackle farmhouse in the countryside. The first of the Pettson och Findus book to be published was "Pannkakstårtan" in 1984 (first published in English in 1985 as "Pancake Pie").', |
|
'Tabaluga Tabaluga is a media franchise featuring a fictional little green Dragon of the same name, created by German Rock musician Peter Maffay, children\'s songwriter and the author . The artist Helme Heine drew the image of Tabaluga as it is currently known. The character Tabaluga was first introduced by Peter Maffay in a musical fairy tale "Tabaluga ... oder die Reise zur Vernunft" (Tabaluga or... The Journey to Reason) in 1983. This first studio album was the step to success: within the next years some Helme Heine books, four sequel concept studio albums, two resounding tours, a stage musical, "Tabaluga und Lilli" ("Tabaluga and Lilli"), based on the third concept album and many TV Cartoons which have been broadcasting in over 100 countries round the world followed and a children\'s game show. Over 100 kindergartens and child care groups carry the word "Tabaluga" in their names.', |
|
] |
|
embeddings = model.encode(sentences) |
|
print(embeddings.shape) |
|
# [3, 384] |
|
|
|
# Get the similarity scores for the embeddings |
|
similarities = model.similarity(embeddings, embeddings) |
|
print(similarities.shape) |
|
# [3, 3] |
|
``` |
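
Because this model was trained with `MatryoshkaLoss` over the dimensions 384, 256, 128 and 64, its embeddings can also be truncated to a smaller size with only a modest accuracy drop (see the Evaluation section below). A minimal sketch using the `truncate_dim` argument available in recent Sentence Transformers releases; the model id is again a placeholder:

```python
from sentence_transformers import SentenceTransformer

# Load the model with embeddings truncated to 128 dimensions (Matryoshka)
model = SentenceTransformer("sentence_transformers_model_id", truncate_dim=128)

embeddings = model.encode([
    "Was McDull, Kung Fu Kindergarten or Pettson and Findus created first?",
])
print(embeddings.shape)
# (1, 128)
```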
|
|
|
<!-- |
|
### Direct Usage (Transformers) |
|
|
|
<details><summary>Click to see the direct usage in Transformers</summary> |
|
|
|
</details> |
|
--> |
|
|
|
<!-- |
|
### Downstream Usage (Sentence Transformers) |
|
|
|
You can finetune this model on your own dataset. |
|
|
|
<details><summary>Click to expand</summary> |
|
|
|
</details> |
|
--> |
|
|
|
<!-- |
|
### Out-of-Scope Use |
|
|
|
*List how the model may foreseeably be misused and address what users ought not to do with the model.* |
|
--> |
|
|
|
## Evaluation |
|
|
|
### Metrics |
|
|
|
#### Triplet |
|
* Dataset: `dim_384` |
|
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) |
|
|
|
| Metric | Value | |
|
|:--------------------|:-----------| |
|
| **cosine_accuracy** | **0.8854** | |
|
| dot_accuracy | 0.1146 | |
|
| manhattan_accuracy | 0.8863 | |
|
| euclidean_accuracy | 0.8854 | |
|
| max_accuracy | 0.8863 | |
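
A minimal sketch of running this evaluator yourself; the triplets below are illustrative placeholders rather than the held-out HotpotQA split, and the model id is a placeholder:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("sentence_transformers_model_id")  # placeholder id

# Illustrative (anchor, positive, negative) triplets
anchors = ['The person who released "Sun Arise" was born in what year?']
positives = ['"Sun Arise" is the fourth single released by the Australian singer-songwriter Rolf Harris.']
negatives = ['Circa Survive is an American rock band from the Philadelphia suburb of Doylestown, formed in 2004.']

evaluator = TripletEvaluator(
    anchors=anchors,
    positives=positives,
    negatives=negatives,
    name="dim_384",
)
print(evaluator(model))
```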
|
|
|
#### Triplet |
|
* Dataset: `dim_256` |
|
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) |
|
|
|
| Metric | Value | |
|
|:--------------------|:-----------| |
|
| **cosine_accuracy** | **0.8841** | |
|
| dot_accuracy | 0.1157 | |
|
| manhattan_accuracy | 0.8851 | |
|
| euclidean_accuracy | 0.8842 | |
|
| max_accuracy | 0.8851 | |
|
|
|
#### Triplet |
|
* Dataset: `dim_128` |
|
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) |
|
|
|
| Metric | Value | |
|
|:--------------------|:----------| |
|
| **cosine_accuracy** | **0.883** | |
|
| dot_accuracy | 0.1183 | |
|
| manhattan_accuracy | 0.8824 | |
|
| euclidean_accuracy | 0.8837 | |
|
| max_accuracy | 0.8837 | |
|
|
|
#### Triplet |
|
* Dataset: `dim_64` |
|
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator) |
|
|
|
| Metric | Value | |
|
|:--------------------|:-----------| |
|
| **cosine_accuracy** | **0.8816** | |
|
| dot_accuracy | 0.1243 | |
|
| manhattan_accuracy | 0.8818 | |
|
| euclidean_accuracy | 0.8818 | |
|
| max_accuracy | 0.8818 | |
|
|
|
<!-- |
|
## Bias, Risks and Limitations |
|
|
|
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* |
|
--> |
|
|
|
<!-- |
|
### Recommendations |
|
|
|
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* |
|
--> |
|
|
|
## Training Details |
|
|
|
### Training Dataset |
|
|
|
#### sentence-transformers/hotpotqa |
|
|
|
* Dataset: [sentence-transformers/hotpotqa](https://huggingface.co/datasets/sentence-transformers/hotpotqa) at [f07d3cd](https://huggingface.co/datasets/sentence-transformers/hotpotqa/tree/f07d3cd2d290ea2e83ed35e33d67d6a4658b8786) |
|
* Size: 76,064 training samples |
|
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> |
|
* Approximate statistics based on the first 1000 samples: |
|
| | anchor | positive | negative | |
|
|:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| |
|
| type | string | string | string | |
|
| details | <ul><li>min: 7 tokens</li><li>mean: 25.02 tokens</li><li>max: 103 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 100.08 tokens</li><li>max: 315 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 89.42 tokens</li><li>max: 375 tokens</li></ul> | |
|
* Samples: |
|
| anchor | positive | negative | |
|
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| |
|
| <code>What type of songs is the singer of Saahore Baahubali best known for?</code> | <code>Saahore Baahubali "Saahore Baahubali" (English: Glory be to Baahubali) is a Telugu song from the 2017 film . Sung by Daler Mehndi, the song is composed by M. M. Keeravani, with lyrics penned by his father Siva Shakti Datta and Kodi Ramakrishna. Most of the lyrics were composed in Sanskrit.</code> | <code>Anupama Deshpande Anupama Deshapande is a Bollywood playback singer who has won the Filmfare Award for Best Female Playback Singer for her folk song "Sohni Chinab Di" in the film "Sohni Mahiwal" (1984). This song was originally meant for Asha Bhonsle who since was busy those days. Therefore, Annu Malik recorded this song in the voice of Anupama Deshpande so that it could later on dubbed by Asha Bhonsle. But on listening the song, Asha Bhonsle sportingly advised to retain the song as it was, in the voice of Anupama Deshpande by giving full credit to the anupama's singing talent. She has sung a total of 124 songs in 92 films.</code> | |
|
| <code>'Dot TV' was owned and operated by a Pan-European satellite broadcasting, on-demand internet streaming media, broadband and telephone services company with headquarters where?</code> | <code>.tv (TV channel) .tv (Pronounced as 'Dot TV', referred to onscreen as .tv - the technology channel) was a British television channel dedicated to technology. .tv was owned and operated by British Sky Broadcasting. The channel began broadcasting on 1 September 1996 as "The Computer Channel" and broadcast between 18:00 and 20:00. The broadcasting hours were increased to midday-midnight when "The Computer Channel" (later .tv) started broadcasting on British Sky Broadcasting's digital satellite platform, Sky Digital in 1998. In 1999 the channel interviewed then Microsoft CEO Bill Gates.</code> | <code>Movistar TV Movistar TV is an IPTV service operated by Telefónica. The service was started as a commercial test pilot in the city of Alicante in 2001 and later extended to some major cities such as Madrid and Barcelona in April 2004. In 2013, Movistar Imagenio was rebranded to Movistar TV.</code> | |
|
| <code>Elvira Madigan's father was born in what year?</code> | <code>Gisela Brož Gisela Antonia Brož (Brosch) (also sometimes referred to as Gisela Madigan), (4 April 1865 - 1945) was an Austrian-American circus performer, tight rope dancer, and clown. Her parents were shoemaker Joseph Brož and his wife Maria. She went to convent school in Siebenbürgen and at the age of 15 she got to know the circus family Madigans with John and Laura who at that time toured with circus Krembser in Vienna. Gisela became their foster child and got to learn tight rope dancing, this along with the couple's two year younger daughter Elvira Madigan.</code> | <code>Elvira Casazza Elvira Casazza (15 November 1887 – 24 January 1965) was an Italian mezzo-soprano opera singer (also known as Elvira Mari-Casazza). One of Toscanini's favourite singers, she was considered an outstanding interpreter of Mistress Quickly in Verdi's "Falstaff" during the 1920s and created several roles in Italian operas of the early 20th century.</code> | |
|
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters: |
|
```json |
|
{ |
|
"loss": "TripletLoss", |
|
"matryoshka_dims": [ |
|
384, |
|
256, |
|
128, |
|
64 |
|
], |
|
"matryoshka_weights": [ |
|
1, |
|
1, |
|
1, |
|
1 |
|
], |
|
"n_dims_per_step": -1 |
|
} |
|
``` |
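
A sketch of how a loss with these parameters can be constructed; the dataset subset name and the train/eval split below are illustrative assumptions, not a record of the exact preprocessing used for this model:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, TripletLoss

model = SentenceTransformer("BAAI/bge-small-en")

# Assumed subset name; the dataset provides (anchor, positive, negative) columns
dataset = load_dataset("sentence-transformers/hotpotqa", "triplet", split="train")
split = dataset.train_test_split(test_size=0.1, seed=42)  # illustrative split
train_dataset, eval_dataset = split["train"], split["test"]

inner_loss = TripletLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[384, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1],
    n_dims_per_step=-1,
)
```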
|
|
|
### Evaluation Dataset |
|
|
|
#### sentence-transformers/hotpotqa |
|
|
|
* Dataset: [sentence-transformers/hotpotqa](https://huggingface.co/datasets/sentence-transformers/hotpotqa) at [f07d3cd](https://huggingface.co/datasets/sentence-transformers/hotpotqa/tree/f07d3cd2d290ea2e83ed35e33d67d6a4658b8786) |
|
* Size: 8,452 evaluation samples |
|
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> |
|
* Approximate statistics based on the first 1000 samples: |
|
| | anchor | positive | negative | |
|
|:--------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| |
|
| type | string | string | string | |
|
| details | <ul><li>min: 10 tokens</li><li>mean: 25.14 tokens</li><li>max: 130 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 102.4 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 88.09 tokens</li><li>max: 358 tokens</li></ul> | |
|
* Samples: |
|
| anchor | positive | negative | |
|
|:--------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| |
|
| <code>When was the English former professional footballer which Tslil Sela has an alledged relationship with born?</code> | <code>Tslil Sela Tslil Sela (Hebrew: צליל סלע , born 26 October 1987) is an Israeli model, most known for her modeling work and for her alleged relationship with English footballer Rio Ferdinand. Sela is leading the campaign for KOOI fashion 2010, and Sanyang Motorcycles (SYM Motors) in Israel.</code> | <code>Sam Collins (English footballer) Samuel Jason Collins (born 5 June 1977) is an English football manager and former footballer who played as a defender. His brother, Simon, is also a former professional footballer and manager.</code> | |
|
| <code>Gebhard Leberecht von Blucher the Prussian Generalfieldmarschall led his army against this famous commander in the Battle of Lepzig?</code> | <code>Gebhard Leberecht von Blücher Gebhard Leberecht von Blücher, Fürst von Wahlstatt (] ; 16 December 1742 – 12 September 1819), "Graf" (count), later elevated to "Fürst" (sovereign prince) von Wahlstatt, was a Prussian "Generalfeldmarschall" (field marshal). He earned his greatest recognition after leading his army against Napoleon I at the Battle of the Nations at Leipzig in 1813 and the Battle of Waterloo in 1815.</code> | <code>Karl Freiherr von Müffling Friedrich Karl Ferdinand Freiherr von Müffling, called Weiss (12 June 177510 January 1851) was a Prussian "Generalfeldmarschall" and military theorist. He served as Blücher's liaison officer in Wellington's headquarters during the Battle of Waterloo and was one of the organizers of the final victory over Napoleon. After the wars he served a diplomatic role at the Congress of Aix-la-Chappelle and was a major contributor to the development of the Prussian General Staff as Chief. Müffling also specialized in military topography and cartography.</code> | |
|
| <code>The Platonia Dilemma was introduced in the book "Metamagical Themas" which was written by an author born in what year?</code> | <code>Platonia dilemma In the platonia dilemma introduced in Douglas Hofstadter's book "Metamagical Themas", an eccentric trillionaire gathers 20 people together, and tells them that if one and only one of them sends him a telegram (reverse charges) by noon the next day, that person will receive a billion dollars. If he receives more than one telegram, or none at all, no one will get any money, and cooperation between players is forbidden. In this situation, the superrational thing to do is to send a telegram with probability 1/20.</code> | <code>John Alexander Stewart (philosopher) John Alexander Stewart (19 October 1846 – 27 December 1933) was a Scottish writer, educator and philosopher. He was a university professor and classical lecturer at Christ Church, Oxford from 1875 to 1883, White's Professor of Moral Philosophy at Oxford, and professorial fellow of Corpus Christi College, from 1897 to his retirement in 1927. Throughout his academic career, he was an editor and author of works on Aristotle and considered one of the foremost experts on the subject. His best known books were "Notes on the Nicomachean Ethics of Aristotle" (1892) and "The Myths of Plato" (1905).</code> | |
|
* Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters: |
|
```json |
|
{ |
|
"loss": "TripletLoss", |
|
"matryoshka_dims": [ |
|
384, |
|
256, |
|
128, |
|
64 |
|
], |
|
"matryoshka_weights": [ |
|
1, |
|
1, |
|
1, |
|
1 |
|
], |
|
"n_dims_per_step": -1 |
|
} |
|
``` |
|
|
|
### Training Hyperparameters |
|
#### Non-Default Hyperparameters |
|
|
|
- `eval_strategy`: steps |
|
- `per_device_train_batch_size`: 32 |
|
- `per_device_eval_batch_size`: 32 |
|
- `gradient_accumulation_steps`: 16 |
|
- `learning_rate`: 2e-05 |
|
- `num_train_epochs`: 20 |
|
- `lr_scheduler_type`: cosine |
|
- `warmup_ratio`: 0.1 |
|
- `bf16`: True |
|
- `tf32`: True |
|
- `load_best_model_at_end`: True |
|
- `optim`: adamw_torch_fused |
|
- `resume_from_checkpoint`: bge-small-hotpotwa-matryoshka |
|
- `batch_sampler`: no_duplicates |
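
A sketch of how these non-default values map onto `SentenceTransformerTrainingArguments` and the trainer; the output directory is an illustrative assumption, and `model`, the datasets and `loss` are as defined in the Training Dataset sketch above:

```python
from sentence_transformers import (
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-small-hotpotqa-matryoshka",  # illustrative output directory
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    num_train_epochs=20,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```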
|
|
|
#### All Hyperparameters |
|
<details><summary>Click to expand</summary> |
|
|
|
- `overwrite_output_dir`: False |
|
- `do_predict`: False |
|
- `eval_strategy`: steps |
|
- `prediction_loss_only`: True |
|
- `per_device_train_batch_size`: 32 |
|
- `per_device_eval_batch_size`: 32 |
|
- `per_gpu_train_batch_size`: None |
|
- `per_gpu_eval_batch_size`: None |
|
- `gradient_accumulation_steps`: 16 |
|
- `eval_accumulation_steps`: None |
|
- `learning_rate`: 2e-05 |
|
- `weight_decay`: 0.0 |
|
- `adam_beta1`: 0.9 |
|
- `adam_beta2`: 0.999 |
|
- `adam_epsilon`: 1e-08 |
|
- `max_grad_norm`: 1.0 |
|
- `num_train_epochs`: 20 |
|
- `max_steps`: -1 |
|
- `lr_scheduler_type`: cosine |
|
- `lr_scheduler_kwargs`: {} |
|
- `warmup_ratio`: 0.1 |
|
- `warmup_steps`: 0 |
|
- `log_level`: passive |
|
- `log_level_replica`: warning |
|
- `log_on_each_node`: True |
|
- `logging_nan_inf_filter`: True |
|
- `save_safetensors`: True |
|
- `save_on_each_node`: False |
|
- `save_only_model`: False |
|
- `restore_callback_states_from_checkpoint`: False |
|
- `no_cuda`: False |
|
- `use_cpu`: False |
|
- `use_mps_device`: False |
|
- `seed`: 42 |
|
- `data_seed`: None |
|
- `jit_mode_eval`: False |
|
- `use_ipex`: False |
|
- `bf16`: True |
|
- `fp16`: False |
|
- `fp16_opt_level`: O1 |
|
- `half_precision_backend`: auto |
|
- `bf16_full_eval`: False |
|
- `fp16_full_eval`: False |
|
- `tf32`: True |
|
- `local_rank`: 0 |
|
- `ddp_backend`: None |
|
- `tpu_num_cores`: None |
|
- `tpu_metrics_debug`: False |
|
- `debug`: [] |
|
- `dataloader_drop_last`: False |
|
- `dataloader_num_workers`: 0 |
|
- `dataloader_prefetch_factor`: None |
|
- `past_index`: -1 |
|
- `disable_tqdm`: False |
|
- `remove_unused_columns`: True |
|
- `label_names`: None |
|
- `load_best_model_at_end`: True |
|
- `ignore_data_skip`: False |
|
- `fsdp`: [] |
|
- `fsdp_min_num_params`: 0 |
|
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} |
|
- `fsdp_transformer_layer_cls_to_wrap`: None |
|
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} |
|
- `deepspeed`: None |
|
- `label_smoothing_factor`: 0.0 |
|
- `optim`: adamw_torch_fused |
|
- `optim_args`: None |
|
- `adafactor`: False |
|
- `group_by_length`: False |
|
- `length_column_name`: length |
|
- `ddp_find_unused_parameters`: None |
|
- `ddp_bucket_cap_mb`: None |
|
- `ddp_broadcast_buffers`: False |
|
- `dataloader_pin_memory`: True |
|
- `dataloader_persistent_workers`: False |
|
- `skip_memory_metrics`: True |
|
- `use_legacy_prediction_loop`: False |
|
- `push_to_hub`: False |
|
- `resume_from_checkpoint`: bge-small-hotpotwa-matryoshka |
|
- `hub_model_id`: None |
|
- `hub_strategy`: every_save |
|
- `hub_private_repo`: False |
|
- `hub_always_push`: False |
|
- `gradient_checkpointing`: False |
|
- `gradient_checkpointing_kwargs`: None |
|
- `include_inputs_for_metrics`: False |
|
- `eval_do_concat_batches`: True |
|
- `fp16_backend`: auto |
|
- `push_to_hub_model_id`: None |
|
- `push_to_hub_organization`: None |
|
- `mp_parameters`: |
|
- `auto_find_batch_size`: False |
|
- `full_determinism`: False |
|
- `torchdynamo`: None |
|
- `ray_scope`: last |
|
- `ddp_timeout`: 1800 |
|
- `torch_compile`: False |
|
- `torch_compile_backend`: None |
|
- `torch_compile_mode`: None |
|
- `dispatch_batches`: None |
|
- `split_batches`: None |
|
- `include_tokens_per_second`: False |
|
- `include_num_input_tokens_seen`: False |
|
- `neftune_noise_alpha`: None |
|
- `optim_target_modules`: None |
|
- `batch_eval_metrics`: False |
|
- `batch_sampler`: no_duplicates |
|
- `multi_dataset_batch_sampler`: proportional |
|
|
|
</details> |
|
|
|
### Training Logs |
|
| Epoch  | Step | Training Loss | Validation Loss | dim_128_cosine_accuracy | dim_256_cosine_accuracy | dim_384_cosine_accuracy | dim_64_cosine_accuracy |
|
|:------:|:----:|:-------------:|:-------:|:-----------------------:|:-----------------------:|:-----------------------:|:----------------------:| |
|
| 0.3366 | 50 | 19.5492 | 19.2604 | 0.9585 | 0.9657 | 0.9663 | 0.9432 | |
|
| 0.6731 | 100 | 19.1976 | 18.2958 | 0.9359 | 0.9392 | 0.9425 | 0.9276 | |
|
| 1.0097 | 150 | 18.4746 | 16.9846 | 0.9053 | 0.9075 | 0.9085 | 0.8996 | |
|
| 1.3462 | 200 | 18.0684 | 16.6869 | 0.9030 | 0.9051 | 0.9049 | 0.8959 | |
|
| 1.6828 | 250 | 17.8979 | 16.5780 | 0.9017 | 0.9030 | 0.9016 | 0.8954 | |
|
| 2.0194 | 300 | 17.7545 | 16.5135 | 0.8977 | 0.8991 | 0.8984 | 0.8925 | |
|
| 2.3559 | 350 | 17.6046 | 16.4917 | 0.8894 | 0.8894 | 0.8907 | 0.8862 | |
|
| 2.6925 | 400 | 17.4434 | 16.4926 | 0.8874 | 0.8862 | 0.8875 | 0.8858 | |
|
| 3.0290 | 450 | 17.3278 | 16.4757 | 0.8854 | 0.8861 | 0.8869 | 0.8859 | |
|
| 3.3656 | 500 | 17.247 | 16.4735 | 0.8830 | 0.8841 | 0.8854 | 0.8816 | |
|
|
|
|
|
### Framework Versions |
|
- Python: 3.10.10 |
|
- Sentence Transformers: 3.0.1 |
|
- Transformers: 4.41.2 |
|
- PyTorch: 2.1.2+cu121 |
|
- Accelerate: 0.33.0 |
|
- Datasets: 2.19.1 |
|
- Tokenizers: 0.19.1 |
|
|
|
## Citation |
|
|
|
### BibTeX |
|
|
|
#### Sentence Transformers |
|
```bibtex |
|
@inproceedings{reimers-2019-sentence-bert, |
|
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", |
|
author = "Reimers, Nils and Gurevych, Iryna", |
|
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", |
|
month = "11", |
|
year = "2019", |
|
publisher = "Association for Computational Linguistics", |
|
url = "https://arxiv.org/abs/1908.10084", |
|
} |
|
``` |
|
|
|
#### MatryoshkaLoss |
|
```bibtex |
|
@misc{kusupati2024matryoshka, |
|
title={Matryoshka Representation Learning}, |
|
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi}, |
|
year={2024}, |
|
eprint={2205.13147}, |
|
archivePrefix={arXiv}, |
|
primaryClass={cs.LG} |
|
} |
|
``` |
|
|
|
#### TripletLoss |
|
```bibtex |
|
@misc{hermans2017defense, |
|
title={In Defense of the Triplet Loss for Person Re-Identification}, |
|
author={Alexander Hermans and Lucas Beyer and Bastian Leibe}, |
|
year={2017}, |
|
eprint={1703.07737}, |
|
archivePrefix={arXiv}, |
|
primaryClass={cs.CV} |
|
} |
|
``` |
|
|
|
<!-- |
|
## Glossary |
|
|
|
*Clearly define terms in order to be accessible across audiences.* |
|
--> |
|
|
|
<!-- |
|
## Model Card Authors |
|
|
|
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* |
|
--> |
|
|
|
<!-- |
|
## Model Card Contact |
|
|
|
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* |
|
--> |