Ren-Wei/Safe-RLHF-DPO-helpless-opt-1b
Tags: Safetensors · opt
Safe-RLHF-DPO-helpless-opt-1b / merges.txt (branch: main)
Commit History
Initial commit · ea33e13 · AAAhWei committed 27 days ago