mcding-org/CorrectDPO-Model-SFT_Pm3B_U0

1.52 kB
  • 1 contributor
History: 1 commit
mcding
initial commit
5e518ac verified over 1 year ago
  • .gitattributes
    1.52 kB
    initial commit over 1 year ago
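
The repository currently lists only the .gitattributes file shown above, so there are no model weights to load directly. As a minimal sketch, assuming the repository is publicly downloadable and the huggingface_hub Python client is installed, its contents can be inspected and fetched like this:

# Minimal sketch, assuming a public repo and huggingface_hub installed
# (pip install huggingface_hub).
from huggingface_hub import HfApi, hf_hub_download

REPO_ID = "mcding-org/CorrectDPO-Model-SFT_Pm3B_U0"

api = HfApi()

# List every file tracked in the repository; per the listing above,
# only ".gitattributes" is present at this point.
files = api.list_repo_files(repo_id=REPO_ID)
print(files)

# Download one file (the .gitattributes from the listing) into the
# local Hugging Face cache and print its resolved path.
path = hf_hub_download(repo_id=REPO_ID, filename=".gitattributes")
print(path)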