treasure4l's Collections: SFT-IC, DPO, SFT

DPO
Updated Jan 28
treasure4l/Llama3.2_3B-DPO (updated Feb 8)
treasure4l/Mistral_7B-DPO (updated Jan 28)
treasure4l/Gemma2_9B-DPO (updated Jan 28)