NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO-adapter
Format: Safetensors
Dataset: teknium/OpenHermes-2.5
Language: English
Tags: Mixtral, instruct, finetune, chatml, DPO, RLHF, gpt4, synthetic data, distillation
License: apache-2.0
Commit History (branch: main)
Update README.md · d7dccf5 (verified) · teknium committed on Feb 20
Update README.md · 9225072 (verified) · teknium committed on Jan 15
Update README.md · 519da25 (verified) · teknium committed on Jan 15
Upload model · 9c1395d (verified) · emozilla committed on Jan 11
initial commit · 2d6c16d (verified) · emozilla committed on Jan 11