NousResearch / Nous-Hermes-2-Mixtral-8x7B-DPO-adapter
Format: Safetensors · Dataset: teknium/OpenHermes-2.5 · Language: English
Tags: Mixtral, instruct, finetune, chatml, DPO, RLHF, gpt4, synthetic data, distillation
License: apache-2.0
Nous-Hermes-2-Mixtral-8x7B-DPO-adapter / README.md (branch: main)
Commit History
Update README.md · d7dccf5 (verified) · teknium committed on Feb 20, 2024
Update README.md · 9225072 (verified) · teknium committed on Jan 15, 2024
Update README.md · 519da25 (verified) · teknium committed on Jan 15, 2024
Upload model · 9c1395d (verified) · emozilla committed on Jan 11, 2024