kaizen9/danube_24ksteps
Tags: Text Generation · Transformers · Safetensors · llama · text-generation-inference · Inference Endpoints
arXiv: 1910.09700
Branch: main · 1 contributor · History: 3 commits
Latest commit: 56a04a9 (verified) by kaizen9 — "Upload tokenizer", 9 days ago
| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 9 days ago |
| README.md | 5.17 kB | Upload LlamaForCausalLM | 9 days ago |
| config.json | 772 Bytes | Upload LlamaForCausalLM | 9 days ago |
| generation_config.json | 132 Bytes | Upload LlamaForCausalLM | 9 days ago |
| model.safetensors | 1.86 GB (LFS) | Upload LlamaForCausalLM | 9 days ago |
| special_tokens_map.json | 414 Bytes | Upload tokenizer | 9 days ago |
| tokenizer.json | 3.51 MB | Upload tokenizer | 9 days ago |
| tokenizer.model | 493 kB (LFS) | Upload tokenizer | 9 days ago |
| tokenizer_config.json | 1.03 kB | Upload tokenizer | 9 days ago |
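The listing above is a standard `transformers` Llama checkpoint layout (`config.json`, `model.safetensors`, plus tokenizer files), so it should load with the usual Auto classes. A minimal sketch, assuming the `transformers` and `torch` packages are installed; only the repo id comes from this page, the helper name `load_danube` is ours:

```python
# Repo id taken from the model page above.
REPO_ID = "kaizen9/danube_24ksteps"

def load_danube(repo_id: str = REPO_ID):
    """Fetch the tokenizer and Llama weights from the Hub and return both.

    The heavy `transformers`/`torch` imports are deferred so this module
    can be imported without those dependencies present.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    return tokenizer, model
```

Typical usage (downloads ~1.86 GB of weights on first call): `tok, model = load_danube()`, then `model.generate(**tok("Hello", return_tensors="pt"), max_new_tokens=20)`.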