---
library_name: transformers
license: cc-by-nc-4.0
tags:
- merge
- automerger
---
# UltraMerge-7B
This model is an experimental DPO fine-tune of [automerger/YamShadow-7B](https://huggingface.co/automerger/YamShadow-7B) on the following datasets:
- mlabonne/truthy-dpo-v0.1
- mlabonne/distilabel-intel-orca-dpo-pairs
- mlabonne/chatml-OpenHermes2.5-dpo-binarized-alpha
- mlabonne/ultrafeedback-binarized-preferences-cleaned
I'm not sure which chat template works best; probably Mistral-Instruct or ChatML. A usage sketch follows below.
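
## Usage

A minimal inference sketch with `transformers`, assuming the repo id is `mlabonne/UltraMerge-7B` (adjust if the model lives under a different path) and relying on whatever chat template the tokenizer ships with (ChatML or Mistral-Instruct):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/UltraMerge-7B"  # assumed repo path; replace if different

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype="auto",
)

# Format the conversation with the tokenizer's built-in chat template,
# so the prompt matches whichever format (ChatML/Mistral-Instruct) it defines.
messages = [{"role": "user", "content": "What is model merging?"}]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```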