SentenceTransformer based on BAAI/bge-small-en-v1.5
This is a sentence-transformers model finetuned from BAAI/bge-small-en-v1.5. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: BAAI/bge-small-en-v1.5
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 384 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
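For reference, the module stack above is a BERT encoder followed by CLS-token pooling and L2 normalization. Below is a minimal sketch (not needed for normal use; tokenizer settings such as do_lower_case are omitted) of how an equivalent stack could be assembled by hand with the Sentence Transformers models API. Loading the fine-tuned checkpoint directly, as shown in the Usage section, is the recommended path.
from sentence_transformers import SentenceTransformer, models

# Encoder: BERT backbone from the base model, truncating inputs at 512 tokens
word_embedding = models.Transformer("BAAI/bge-small-en-v1.5", max_seq_length=512)
# Pooling: use the [CLS] token embedding (384 dimensions), as in the listing above
pooling = models.Pooling(word_embedding.get_word_embedding_dimension(), pooling_mode="cls")
# Final L2 normalization so cosine similarity equals the dot product
normalize = models.Normalize()

model = SentenceTransformer(modules=[word_embedding, pooling, normalize])
print(model)  # prints a module stack mirroring the one shown above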
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("mavihsrr/bgeEmbeddingsRetailedFT")
# Run inference
sentences = [
"Coffee Filter Papers - Size 02, White. Description :Hario brings in Cone-shaped natural paper filter for Pour-over brewing experience for a great cup of Coffee. Hario's V60, size 02 White, give you a perfect brew in comparison to mesh filters. These paper filters are of great quality and they produce a clean, flavorful, sediment-free cup. They are disposable, and thus it makes it convenient and easier to use for brewing and cleanup. Perfect choice for coffee enthusiasts who like to grind their coffee at home. These papers are safe to use and eco-friendly. The Box comes with 100 disposable 02 paper filters.!",
'Steel Rice Serving Spoon - Medium, Classic Diana Series, BBST37. Description :BB Home provides fine and classy cooking and serving tools that can make difference to your kitchen experience. These cooking/serving tools are made from 100% food grade stainless steel. The handle is designed in a way so it does not feel heavy while cooking/serving. It is easy to store as it has a bottom hole on the handle to hang it on the wall.!',
'Tomato Disc 70 g + Cheese Balls 70 g',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
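Since the training data consists of retail product descriptions, a natural follow-on use is semantic search over a product catalogue. The sketch below (the query and catalogue strings are illustrative, not taken from the training set) ranks items against a free-text query using the same encode and similarity calls as above.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("mavihsrr/bgeEmbeddingsRetailedFT")

query = "paper filters for pour-over coffee"
catalogue = [
    "Coffee Filter Papers - Size 02, White",
    "Steel Rice Serving Spoon - Medium, Classic Diana Series",
    "Intense 75% Dark Chocolate",
]

# Encode the query and the catalogue, then compare with cosine similarity
query_embedding = model.encode([query])
catalogue_embeddings = model.encode(catalogue)
scores = model.similarity(query_embedding, catalogue_embeddings)  # shape [1, 3]

best = scores.argmax().item()
print(catalogue[best], scores[0, best].item())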
Evaluation
Metrics
Semantic Similarity
- Dataset: bge-eval
- Evaluated with: EmbeddingSimilarityEvaluator
Metric | Value |
---|---|
pearson_cosine | 0.9791 |
spearman_cosine | 0.158 |
Semantic Similarity
- Dataset: bge-eval
- Evaluated with: EmbeddingSimilarityEvaluator
Metric | Value |
---|---|
pearson_cosine | 0.9798 |
spearman_cosine | 0.1633 |
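The numbers above come from EmbeddingSimilarityEvaluator, which encodes each sentence pair, computes cosine similarities, and reports their Pearson and Spearman correlation with the gold scores. A minimal sketch of running such an evaluator yourself is shown below; the sentence pairs and scores are made up for illustration, so the resulting values will not match the table.
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("mavihsrr/bgeEmbeddingsRetailedFT")

# Illustrative pairs with gold similarity scores in [0, 1]
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["Quinoa Flakes", "Joy Round Kids Glass", "Mexican Seasoning"],
    sentences2=["Breakfast Mix - Masala Idli", "Plastic Lunch Box/Tiffin Box", "Rainbow Strands"],
    scores=[0.95, 0.93, 0.96],
    name="bge-eval",
)

results = evaluator(model)
# Recent Sentence Transformers versions return a dict such as
# {'bge-eval_pearson_cosine': ..., 'bge-eval_spearman_cosine': ...}
print(results)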
Training Details
Training Dataset
Unnamed Dataset
- Size: 69,227 training samples
- Columns: sentence1, sentence2, and score
- Approximate statistics based on the first 1000 samples:
 | sentence1 | sentence2 | score |
---|---|---|---|
type | string | string | float |
details | min: 4 tokens, mean: 114.97 tokens, max: 512 tokens | min: 4 tokens, mean: 101.87 tokens, max: 512 tokens | min: 0.18, mean: 0.88, max: 0.96 |
- Samples:
sentence1 | sentence2 | score |
---|---|---|
Breakfast Mix - Masala Idli. Description :Established in 1924, MTR is the contemporary way to authentic tasting food, Our products are backed by culinary expertise honed, over 8 decades of serving wholesome, tasty and high quality vegetarian food, Using authentic Indian recipes, the purest and best quality natural ingredients and traditional methods of preparation, We brings you a range of products of unmatched flavour and taste, to delight your family at every meal and every occasion, MTR Daily Favourites is your dependable partner in the Kitchen that helps you make your family's everyday meals tasty and wholesome, So bring home the confidence of great tasting food everyday with MTR..! | Quinoa Flakes. Description :Keep a good balance of satisfying your taste buds and satiating your hunger pangs. Nutriwish Quinoa Flakes are a “complete” protein containing all eight essential amino acids. The perfect antidote to all that sugar, Nutriwish Quinoa Flakes are delicious cold in a salad, served warm as a side dish or even combined with vegetables and dairy to make a spectacular and filling vegetarian main course. Curb food cravings and start your day yummy with the starchy Nutriwish Quinoa Flakes.! | 0.9524586385560029 |
1 To 1 Baking Flour - Gluten Free. Description :Bob Red Mill gluten-free 1-to-1 baking flour makes it easy to transform traditional recipes to gluten-free. Simply follow your favourite baking recipe, replacing the wheat flour with this blend. It is formulated for baked goods with terrific taste and texture, no additional speciality ingredients or recipes required. It is suitable for cookies, cakes, brownies, muffins, and more.! | Chocolate - Drink Powder. Description :Hintz cocoa powder is not just ideal for making biscuits, ice cream and deserts. It is also dissolved in hot milk - a delicious chocolate beverage.! | 0.8764388983469142 |
Joy Round Kids Glass. Description :This glass, made of plastic material, is specially designed for your kid. It is lightweight and easy to use. This glass is ideal for drinking water, milk, juices, health drinks etc.! | Plastic Lunch Box/Tiffin Box - Disney Mickey Mouse, BPA Free, HMHILB 199-MK. Description :HMI brings this 4 side lock and lock style. This is airtight, leak-proof and microwave safe. It comes with a small container, fork & spoon.! | 0.9289614489097255 |
- Loss: CosineSimilarityLoss with these parameters: { "loss_fct": "torch.nn.modules.loss.MSELoss" }
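In other words, CosineSimilarityLoss encodes both sentences, takes the cosine similarity of their embeddings, and penalizes its squared distance from the target score via MSELoss. A minimal sketch of that computation for a single pair (the sentences and target score below are illustrative) looks like this:
import torch
import torch.nn.functional as F
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("mavihsrr/bgeEmbeddingsRetailedFT")

sentence1 = "Quinoa Flakes"
sentence2 = "Breakfast Mix - Masala Idli"

# Embed both sentences and compare them, which is exactly what the loss does per pair
embeddings = model.encode([sentence1, sentence2], convert_to_tensor=True)
cosine = F.cosine_similarity(embeddings[0], embeddings[1], dim=0)
target_score = torch.tensor(0.95, device=cosine.device)  # illustrative gold score
loss = F.mse_loss(cosine, target_score)  # the quantity minimized during fine-tuning
print(cosine.item(), loss.item())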
Evaluation Dataset
Unnamed Dataset
- Size: 8,654 evaluation samples
- Columns: sentence1, sentence2, and score
- Approximate statistics based on the first 1000 samples:
 | sentence1 | sentence2 | score |
---|---|---|---|
type | string | string | float |
details | min: 4 tokens, mean: 110.58 tokens, max: 512 tokens | min: 4 tokens, mean: 97.13 tokens, max: 512 tokens | min: 0.19, mean: 0.87, max: 0.96 |
- Samples:
sentence1 | sentence2 | score |
---|---|---|
1947 Flora Natural Aroma Incense Sticks - Economy Pack. Description :A Traditional formula that is handed over by the founder, incense sticks is made the traditional way with a ‘masala’ or mixture of 100% natural aromatic botanicals. During your rituals, these incense sticks will bring about a fresh and fragrant breath of conscious soothing bliss.! | Designer Jyot - Green. Description :This is made in India Initiative and create a meditative and peaceful ambience in your puja room with the handmade Brass Mandir Jyot. It extremely durable and crack-resistant, which allows you to use it with ease on a daily basis. This Jyot is very attractive and worth purchasing for personal use or for gifting purpose. Easy to Use and Clean. This Glass brass diya is designed for ease in inserting whip, refilling oil and cleaning. It emits brighter light due to the increased clarity provided by the superior quality glass. The flame of this brass diya does not go off or cause any danger even when the fan is on as the diya comes with a lid.! | 0.9030882765047124 |
Mexican Seasoning. Description :The rich tapestry of sweet and spicy flavours that Mexican cuisine is loved for - now captured in a magic blend. This international seasoning product is inbuilt with unique 2-way flip cap to sprinkle it or scoop it. On1y is a new way of rediscovering the power of herbs and spices. On1y can conveniently become a part of your daily diet for the irresistible benefits that it brings.! | Rainbow Strands. Description :Colourful jimmies/sprinkles make decorating your cakes, cupcakes and cookies fun and easy. Great as an ice cream topping too.! | 0.9584305870004965 |
Intense 75% Dark Chocolate. Description :This pack has 100gm 75% Luxury Intense Dark Chocolate. With meticulous culinary skills the exotic intense bitterness of cacao beans emerges in this bar. Chocolate was invented in 1900 BC by the Aztecs in Central America. We at Didier & Frank bring you those exotic flavours and hand crafted chocolates that the Aztecs enjoyed secretly. Today, Didier & Frank makes the best chocolates in the world.! | Puff Pastry Sticks With Butter. Description :The unique and timeless original Classic Millefoglie by Matilde Vicenzi: crumbly sticks of delicate pastry typical of the Italian tradition, with all the flavour of butter. With 192 crispy and delicate layers of puff pastry and just a light layer of premium butter, our inimitable Millefoglie d’Italia are among the most popular desserts in Italy.! | 0.9553127949715517 |
- Loss: CosineSimilarityLoss with these parameters: { "loss_fct": "torch.nn.modules.loss.MSELoss" }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- learning_rate: 2e-05
- num_train_epochs: 1
- warmup_ratio: 0.1
- bf16: True
- batch_sampler: no_duplicates
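A sketch of how these non-default values map onto the Sentence Transformers v3 training API is shown below. The output directory and the tiny placeholder dataset are assumptions for illustration only; the actual run used the 69,227-sample dataset described above.
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CosineSimilarityLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("BAAI/bge-small-en-v1.5")
loss = CosineSimilarityLoss(model)  # loss_fct defaults to torch.nn.MSELoss()

# Placeholder rows with the (sentence1, sentence2, score) schema described above
rows = {
    "sentence1": ["Quinoa Flakes", "Joy Round Kids Glass"],
    "sentence2": ["Breakfast Mix - Masala Idli", "Plastic Lunch Box/Tiffin Box"],
    "score": [0.95, 0.93],
}
train_dataset = Dataset.from_dict(rows)
eval_dataset = Dataset.from_dict(rows)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-retail-ft",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()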
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss | Validation Loss | bge-eval_spearman_cosine |
---|---|---|---|---|
0 | 0 | - | - | 0.0923 |
0.0231 | 100 | 0.0657 | 0.0386 | 0.1450 |
0.0462 | 200 | 0.0248 | 0.0133 | 0.1661 |
0.0693 | 300 | 0.0118 | - | - |
0.0231 | 100 | 0.0069 | 0.0070 | 0.1644 |
0.0462 | 200 | 0.0037 | 0.0040 | 0.1634 |
0.0693 | 300 | 0.0016 | 0.0038 | 0.1619 |
0.0924 | 400 | 0.0013 | 0.0042 | 0.1603 |
0.1156 | 500 | 0.0011 | 0.0049 | 0.1579 |
0.1387 | 600 | 0.0012 | 0.0052 | 0.1593 |
0.1618 | 700 | 0.0011 | 0.0053 | 0.1608 |
0.1849 | 800 | 0.0011 | 0.0055 | 0.1612 |
0.2080 | 900 | 0.0011 | 0.0063 | 0.1606 |
0.2311 | 1000 | 0.0011 | 0.0061 | 0.1585 |
0.2542 | 1100 | 0.0012 | 0.0061 | 0.1566 |
0.2773 | 1200 | 0.0011 | 0.0062 | 0.1557 |
0.3004 | 1300 | 0.0012 | 0.0062 | 0.1570 |
0.3235 | 1400 | 0.001 | 0.0058 | 0.1557 |
0.3467 | 1500 | 0.001 | 0.0063 | 0.1554 |
0.3698 | 1600 | 0.0011 | 0.0062 | 0.1572 |
0.3929 | 1700 | 0.0011 | 0.0061 | 0.1580 |
0.4160 | 1800 | 0.001 | - | 0.1598 |
0.2311 | 1000 | 0.0008 | 0.0063 | 0.1532 |
0.4622 | 2000 | 0.0008 | 0.0064 | 0.1651 |
0.6933 | 3000 | 0.001 | 0.0067 | 0.1627 |
0.9244 | 4000 | 0.001 | 0.0067 | 0.1633 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.1.0+cu118
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}