linhphanff/stella_en_1.5B_v5_clone
Tags: Sentence Similarity · sentence-transformers · PyTorch · Safetensors · Transformers · qwen2 · text-generation · mteb · custom_code · Eval Results · text-embeddings-inference · Inference Endpoints · arxiv:2205.13147 · License: mit
Branch: main
stella_en_1.5B_v5_clone · 1 contributor · History: 9 commits
Latest commit: Update modules.json (60fee87, verified) · linhphanff · about 2 months ago
1_Pooling/                           Upload 25 files                            about 2 months ago
2_Dense/                             Upload 25 files                            about 2 months ago
2_Dense_1024/                        Upload 25 files                            about 2 months ago
2_Dense_2048/                        Upload 25 files                            about 2 months ago
2_Dense_256/                         Upload 25 files                            about 2 months ago
2_Dense_4096/                        Upload 25 files                            about 2 months ago
2_Dense_6144/                        Upload 25 files                            about 2 months ago
2_Dense_768/                         Upload 25 files                            about 2 months ago
2_Dense_8192/                        Upload 25 files                            about 2 months ago
.gitattributes                       1.55 kB        Upload 16 files                            about 2 months ago
README.md                            174 kB         Upload 16 files                            about 2 months ago
added_tokens.json                    85 Bytes       Upload 16 files                            about 2 months ago
config.json                          875 Bytes      Upload 16 files                            about 2 months ago
config_sentence_transformers.json    497 Bytes      Update config_sentence_transformers.json  about 2 months ago
merges.txt                           1.82 MB        Upload 16 files                            about 2 months ago
model.safetensors                    6.17 GB (LFS)  Upload 16 files                            about 2 months ago
modeling_qwen.py                     66.6 kB        Upload 16 files                            about 2 months ago
modules.json                         316 Bytes      Update modules.json                        about 2 months ago
pytorch_model.bin                    6.17 GB (LFS)  Upload 16 files                            about 2 months ago
sentence_bert_config.json            54 Bytes       Upload 16 files                            about 2 months ago
special_tokens_map.json              390 Bytes      Upload 16 files                            about 2 months ago
tokenization_qwen.py                 11.1 kB        Upload 16 files                            about 2 months ago
tokenizer.json                       7.33 MB        Upload 16 files                            about 2 months ago
tokenizer_config.json                1.36 kB        Upload 16 files                            about 2 months ago
vocab.json                           2.78 MB        Upload 16 files                            about 2 months ago

Note: pytorch_model.bin is a pickle file. Detected pickle imports: collections.OrderedDict, torch._utils._rebuild_tensor_v2, torch.FloatStorage.