lamos38667/krx_Qwen2.5-7B-Instruct-d20241105
Tags: Text Generation · Transformers · PyTorch · English · qwen2 · text-generation-inference · unsloth · trl · krx · sft · conversational · Inference Endpoints
License: apache-2.0
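Given the Transformers/PyTorch tags, the repository should load with the standard `transformers` API. A minimal sketch, assuming `transformers` with Qwen2 support, `accelerate`, and a GPU with roughly 16 GB of memory for the half-precision 7B weights; the chat-template usage is an assumption based on the "conversational" tag:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lamos38667/krx_Qwen2.5-7B-Instruct-d20241105"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick up the half-precision weights as stored
    device_map="auto",    # requires `accelerate`
)

# The "conversational" tag suggests the tokenizer ships a chat template.
messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```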
Files and versions
Branch: main · 1 contributor · History: 5 commits
Latest commit: "Trained with Unsloth" (c893a4c, verified) by lamos38667, about 24 hours ago
File                                Size        Last commit             When
.gitattributes                      1.52 kB     initial commit          about 24 hours ago
README.md                           606 Bytes   Trained with Unsloth    about 24 hours ago
added_tokens.json                   632 Bytes   Upload tokenizer        about 24 hours ago
config.json                         803 Bytes   Trained with Unsloth    about 24 hours ago
generation_config.json              266 Bytes   Trained with Unsloth    about 24 hours ago
merges.txt                          1.67 MB     Upload tokenizer        about 24 hours ago
pytorch_model-00001-of-00004.bin    4.88 GB     Trained with Unsloth    about 24 hours ago   (LFS, pickle)
pytorch_model-00002-of-00004.bin    4.93 GB     Trained with Unsloth    about 24 hours ago   (LFS, pickle)
pytorch_model-00003-of-00004.bin    4.33 GB     Trained with Unsloth    about 24 hours ago   (LFS, pickle)
pytorch_model-00004-of-00004.bin    1.09 GB     Trained with Unsloth    about 24 hours ago   (LFS, pickle)
pytorch_model.bin.index.json        27.8 kB     Trained with Unsloth    about 24 hours ago
special_tokens_map.json             613 Bytes   Upload tokenizer        about 24 hours ago
tokenizer.json                      7.03 MB     Upload tokenizer        about 24 hours ago
tokenizer_config.json               7.51 kB     Upload tokenizer        about 24 hours ago
vocab.json                          2.78 MB     Upload tokenizer        about 24 hours ago

Each pytorch_model-*.bin shard is stored via Git LFS and is a pickle file; detected pickle imports (3): collections.OrderedDict, torch.HalfStorage, torch._utils._rebuild_tensor_v2.
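To mirror the repository locally, `huggingface_hub` can fetch everything in one call; a sketch, assuming `huggingface_hub` is installed. Because the weights are sharded pickle-based .bin files stitched together by pytorch_model.bin.index.json, `from_pretrained` resolves the shards for you, but if you open a shard directly, prefer `torch.load(..., weights_only=True)`:

```python
from huggingface_hub import snapshot_download

# Download the tokenizer files, configs, and the four weight shards.
local_dir = snapshot_download(
    repo_id="lamos38667/krx_Qwen2.5-7B-Instruct-d20241105",
    allow_patterns=["*.json", "*.txt", "*.bin", "README.md"],
)
print("Downloaded to:", local_dir)
```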