rAIfle/Nous-Hermes-2-Mixtral-8x7B-DPO-exl2-rpcal
Tags: English, Mixtral, instruct, finetune, chatml, DPO, RLHF, gpt4, synthetic data, distillation
License: apache-2.0
Branch: main
1 contributor, 4 commits
Latest commit: rAIfle, "Upload measurement.json with huggingface_hub" (fc338da, verified), 11 months ago
.gitattributes    1.52 kB   initial commit                                 11 months ago
README.md         12.7 kB   Update README.md                               11 months ago
measurement.json  1.78 MB   Upload measurement.json with huggingface_hub   11 months ago