---
base_model: Alibaba-NLP/gte-base-en-v1.5
datasets: []
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:32833
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Anonymity in online interactions can lead to a disinhibition effect,
where individuals feel free to express hostile or aggressive opinions they might
otherwise suppress.
sentences:
- What are the implications of anonymity in online interactions?
- How does creativity function as a form of costly signalling in personal expressions
such as invitations?
- Why is conflict considered essential in a creative organization?
- source_sentence: The author decides to release their novel into the world despite
its imperfections, and finds that this allows them to move on to new projects
and experiences, and to focus on the value of the work itself rather than its
flaws.
sentences:
- How does the author's experience with their novel illustrate the concept of 'embracing
imperfection' in creative work?
- What does the author mean by 'ambitious programmers are better off doing their
own thing'?
- What is the role of 'show me' in the design process?
- source_sentence: Tokens become more valuable as more users adopt them, creating
a positive feedback loop that enhances their utility and encourages further adoption
across various applications.
sentences:
- In what ways do tokens exhibit network effects?
- What can sometimes be found when considering a startup with a lame-sounding idea?
- How do social norms influence decision-making in the context of airport choices?
- source_sentence: Philosophers are often viewed as the guardians of critical thinking;
however, their reliance on bureaucratic structures and abstract discussions can
become problematic. Instead of fostering open-mindedness, they may perpetuate
dogmatic thinking and limit the exploration of diverse perspectives, thereby failing
to fulfill their duty of promoting genuine critical engagement.
sentences:
- In what ways can the role of philosophers be seen as essential or problematic
within the context of critical thinking?
- How does the evolution of pair-bonding facilitate cultural exchange between groups?
- What is the role of autonomy in the success of acquired startups?
- source_sentence: Society tends to admire those who despair when others hope, viewing
them as sages or wise figures.
sentences:
- What is often the societal perception of those who express pessimism about the
future?
- How did the realization about user engagement influence the app development strategy?
- What lessons can be learned from the historical context of employee relations
in large corporations?
model-index:
- name: Custom Embedding Test - Anudit Nagar
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.7683027145599123
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8755141211955032
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.9097888675623801
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9465313956676721
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7683027145599123
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.29183804039850103
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.18195777351247602
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09465313956676721
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7683027145599123
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8755141211955032
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.9097888675623801
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9465313956676721
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8566925927271383
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8279207524340517
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.8302321946792381
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.762818755141212
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8700301617768028
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.9062242939402249
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.946257197696737
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.762818755141212
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2900100539256009
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.18124485878804497
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09462571976967371
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.762818755141212
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8700301617768028
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.9062242939402249
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.946257197696737
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8529743473843932
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8231949721667308
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.825407004380477
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.762818755141212
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8683849739511927
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.9015629284343296
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9418700301617768
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.762818755141212
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.28946165798373086
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.18031258568686592
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09418700301617768
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.762818755141212
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8683849739511927
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.9015629284343296
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9418700301617768
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.850685453111757
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8215859088357048
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.8239714751253995
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.7573347957225116
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8634494104743625
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8952563751028242
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9347408829174664
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7573347957225116
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2878164701581208
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17905127502056484
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09347408829174664
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7573347957225116
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8634494104743625
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8952563751028242
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9347408829174664
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8445055968214926
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8157123053956075
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.8184088689781863
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.7419797093501508
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8530298875788319
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8859336440910337
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9284343295859611
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7419797093501508
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.28434329585961066
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17718672881820677
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09284343295859611
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7419797093501508
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8530298875788319
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8859336440910337
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9284343295859611
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8334906130922063
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8032139919307455
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.8057146368194794
name: Cosine Map@100
---
# Custom Embedding Test - Anudit Nagar
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5)
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
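The `Pooling` module above uses CLS-token pooling (`pooling_mode_cls_token: True`): the sentence embedding is simply the transformer's output vector for the first (`[CLS]`) token, with all other token vectors ignored. A minimal dependency-free sketch of that step (the token vectors below are made up, not real model outputs):

```python
# Toy illustration of CLS-token pooling: the sentence embedding is the
# hidden state of the first token; the remaining token vectors are ignored.
def cls_pool(token_embeddings: list[list[float]]) -> list[float]:
    """Return the first (CLS) token vector as the sentence embedding."""
    return token_embeddings[0]

tokens = [
    [0.1, 0.2, 0.3],  # [CLS] hidden state (hypothetical values)
    [0.9, 0.8, 0.7],  # "hello"
    [0.4, 0.5, 0.6],  # "world"
]
print(cls_pool(tokens))  # [0.1, 0.2, 0.3]
```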
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'Society tends to admire those who despair when others hope, viewing them as sages or wise figures.',
'What is often the societal perception of those who express pessimism about the future?',
'How did the realization about user engagement influence the app development strategy?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
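`model.similarity` applies the similarity function declared in the Model Description above (cosine similarity), so `similarities[i][j]` is the cosine of the angle between embeddings `i` and `j`. A dependency-free sketch of what each entry of that matrix contains (toy 2-d vectors, not real embeddings):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: dot product of the two vectors divided by the
    product of their Euclidean norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# The similarity matrix is symmetric with 1.0 on the diagonal.
vecs = [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
sims = [[cosine(u, v) for v in vecs] for u in vecs]
print(round(sims[0][1], 4))  # 0.7071
```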
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `dim_768`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.7683 |
| cosine_accuracy@3 | 0.8755 |
| cosine_accuracy@5 | 0.9098 |
| cosine_accuracy@10 | 0.9465 |
| cosine_precision@1 | 0.7683 |
| cosine_precision@3 | 0.2918 |
| cosine_precision@5 | 0.182 |
| cosine_precision@10 | 0.0947 |
| cosine_recall@1 | 0.7683 |
| cosine_recall@3 | 0.8755 |
| cosine_recall@5 | 0.9098 |
| cosine_recall@10 | 0.9465 |
| cosine_ndcg@10 | 0.8567 |
| cosine_mrr@10 | 0.8279 |
| **cosine_map@100** | **0.8302** |
#### Information Retrieval
* Dataset: `dim_512`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.7628 |
| cosine_accuracy@3 | 0.87 |
| cosine_accuracy@5 | 0.9062 |
| cosine_accuracy@10 | 0.9463 |
| cosine_precision@1 | 0.7628 |
| cosine_precision@3 | 0.29 |
| cosine_precision@5 | 0.1812 |
| cosine_precision@10 | 0.0946 |
| cosine_recall@1 | 0.7628 |
| cosine_recall@3 | 0.87 |
| cosine_recall@5 | 0.9062 |
| cosine_recall@10 | 0.9463 |
| cosine_ndcg@10 | 0.853 |
| cosine_mrr@10 | 0.8232 |
| **cosine_map@100** | **0.8254** |
#### Information Retrieval
* Dataset: `dim_256`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:----------|
| cosine_accuracy@1 | 0.7628 |
| cosine_accuracy@3 | 0.8684 |
| cosine_accuracy@5 | 0.9016 |
| cosine_accuracy@10 | 0.9419 |
| cosine_precision@1 | 0.7628 |
| cosine_precision@3 | 0.2895 |
| cosine_precision@5 | 0.1803 |
| cosine_precision@10 | 0.0942 |
| cosine_recall@1 | 0.7628 |
| cosine_recall@3 | 0.8684 |
| cosine_recall@5 | 0.9016 |
| cosine_recall@10 | 0.9419 |
| cosine_ndcg@10 | 0.8507 |
| cosine_mrr@10 | 0.8216 |
| **cosine_map@100** | **0.824** |
#### Information Retrieval
* Dataset: `dim_128`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.7573 |
| cosine_accuracy@3 | 0.8634 |
| cosine_accuracy@5 | 0.8953 |
| cosine_accuracy@10 | 0.9347 |
| cosine_precision@1 | 0.7573 |
| cosine_precision@3 | 0.2878 |
| cosine_precision@5 | 0.1791 |
| cosine_precision@10 | 0.0935 |
| cosine_recall@1 | 0.7573 |
| cosine_recall@3 | 0.8634 |
| cosine_recall@5 | 0.8953 |
| cosine_recall@10 | 0.9347 |
| cosine_ndcg@10 | 0.8445 |
| cosine_mrr@10 | 0.8157 |
| **cosine_map@100** | **0.8184** |
#### Information Retrieval
* Dataset: `dim_64`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.742 |
| cosine_accuracy@3 | 0.853 |
| cosine_accuracy@5 | 0.8859 |
| cosine_accuracy@10 | 0.9284 |
| cosine_precision@1 | 0.742 |
| cosine_precision@3 | 0.2843 |
| cosine_precision@5 | 0.1772 |
| cosine_precision@10 | 0.0928 |
| cosine_recall@1 | 0.742 |
| cosine_recall@3 | 0.853 |
| cosine_recall@5 | 0.8859 |
| cosine_recall@10 | 0.9284 |
| cosine_ndcg@10 | 0.8335 |
| cosine_mrr@10 | 0.8032 |
| **cosine_map@100** | **0.8057** |
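A pattern worth noting in the tables above: recall@k always equals accuracy@k, and precision@k is accuracy@k divided by k. That happens because each evaluation query has exactly one relevant passage. A stdlib sketch of the @k metrics under that single-relevant-document assumption (the ranks below are hypothetical, not the real evaluation data):

```python
def metrics_at_k(rank_of_relevant: list[int], k: int) -> dict[str, float]:
    """Compute accuracy/precision/recall@k and MRR@k for queries that each
    have exactly one relevant document. rank_of_relevant[i] is the 1-based
    rank at which query i's relevant document was retrieved."""
    n = len(rank_of_relevant)
    hits = sum(1 for r in rank_of_relevant if r <= k)
    return {
        f"accuracy@{k}": hits / n,         # relevant doc appears in the top k
        f"precision@{k}": hits / (n * k),  # at most 1 relevant among k retrieved
        f"recall@{k}": hits / n,           # equals accuracy@k with 1 relevant doc
        f"mrr@{k}": sum(1 / r for r in rank_of_relevant if r <= k) / n,
    }

# Hypothetical ranks: query 4's relevant doc fell outside the top 10.
ranks = [1, 3, 2, 11, 1]
print(metrics_at_k(ranks, 10))
```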
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 32,833 training samples
* Columns: `positive` and `anchor`
* Approximate statistics based on the first 1000 samples:
  |      | positive | anchor |
  |:-----|:---------|:-------|
  | type | string   | string |
* Samples:
  | positive | anchor |
  |:---------|:-------|
  | The author saw taking risks as a necessary part of the creative process, and was willing to take risks in order to explore new ideas and themes. | What was the author's perspective on the importance of taking risks in creative work? |
  | Recognizing that older users are less likely to invite new users led to a strategic focus on younger demographics, prompting a shift in development efforts toward creating products that resonate with teens. | How did the realization about user engagement influence the app development strategy? |
  | The phrase emphasizes the fragility of Earth and our collective responsibility to protect it and ensure sustainable resource management for future generations. | What is the significance of the phrase 'pale blue dot' in relation to environmental responsibility? |
* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
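MatryoshkaLoss evaluates the inner loss on each leading-slice truncation of the embedding and sums the weighted results; with equal weights, as configured here, every truncation dimension contributes equally. A dependency-free sketch of that control flow with a stand-in loss function (the real implementation inside sentence-transformers operates on batched tensors with MultipleNegativesRankingLoss; this shows only the weighting mechanism):

```python
MATRYOSHKA_DIMS = [768, 512, 256, 128, 64]
MATRYOSHKA_WEIGHTS = [1, 1, 1, 1, 1]

def matryoshka_loss(embedding: list[float],
                    base_loss,  # callable: truncated embedding -> float
                    dims=MATRYOSHKA_DIMS,
                    weights=MATRYOSHKA_WEIGHTS) -> float:
    """Sum the base loss over leading-slice truncations of the embedding.
    Training this way pushes the first 64/128/... dimensions to be useful
    on their own, so the model can be used with truncated embeddings."""
    total = 0.0
    for dim, w in zip(dims, weights):
        total += w * base_loss(embedding[:dim])
    return total

# Stand-in base loss (NOT MultipleNegativesRankingLoss): just the slice
# length, to show that each truncation contributes one weighted term.
emb = [0.0] * 768
print(matryoshka_loss(emb, lambda e: float(len(e))))  # 1728.0 = 768+512+256+128+64
```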
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 0.0002
- `num_train_epochs`: 5
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `load_best_model_at_end`: True
- `batch_sampler`: no_duplicates
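With gradient accumulation, the effective batch size per optimizer step is `per_device_train_batch_size × gradient_accumulation_steps` (times the number of devices). A small sketch deriving the effective batch and approximate warmup steps from the values above (assumes a single device, so the step counts are estimates rather than logged values):

```python
import math

# Values from the hyperparameters and dataset size above.
TRAIN_SAMPLES = 32_833
PER_DEVICE_BATCH = 32
GRAD_ACCUM = 16
EPOCHS = 5
WARMUP_RATIO = 0.1

effective_batch = PER_DEVICE_BATCH * GRAD_ACCUM              # samples per optimizer step
steps_per_epoch = math.ceil(TRAIN_SAMPLES / effective_batch)
total_steps = steps_per_epoch * EPOCHS
warmup_steps = math.ceil(total_steps * WARMUP_RATIO)         # linear warmup, then cosine decay

print(effective_batch, steps_per_epoch, total_steps, warmup_steps)
# 512 65 325 33
```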
#### All Hyperparameters