joshhu1123/DPO-llama2-no8
Tags: PEFT · Safetensors · arxiv:1910.09700
Branch: main
DPO-llama2-no8 · 201 MB · 2 contributors · 2 commits
Latest commit: Upload model (222731c) by joshhu1123, almost 2 years ago
.gitattributes             1.52 kB    initial commit    almost 2 years ago
README.md                  5.51 kB    Upload model      almost 2 years ago
adapter_config.json        566 Bytes  Upload model      almost 2 years ago
adapter_model.safetensors  201 MB     Upload model      almost 2 years ago
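The repository ships a PEFT adapter (adapter_config.json plus adapter_model.safetensors) rather than a full model, so it has to be attached to the base checkpoint it was trained against. The page itself does not document usage, so the following is only a minimal loading sketch, assuming the standard peft/transformers APIs and that the base model recorded in adapter_config.json is accessible to you (for a Llama-2 base that may require accepting the license on the Hub):

```python
# Minimal sketch: load the DPO-trained PEFT adapter from this repo.
# Assumption: adapter_config.json points at an accessible base checkpoint.
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

repo_id = "joshhu1123/DPO-llama2-no8"

# AutoPeftModelForCausalLM reads adapter_config.json, downloads the base model
# it references, and attaches adapter_model.safetensors on top of it.
model = AutoPeftModelForCausalLM.from_pretrained(repo_id)

# The adapter repo may not include a tokenizer, so load it from the base model
# name recorded in the adapter config.
base_name = model.peft_config["default"].base_model_name_or_path
tokenizer = AutoTokenizer.from_pretrained(base_name)

# Quick smoke test of the merged (base + adapter) model.
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```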