---
license: apache-2.0
datasets:
- Open-Orca/SlimOrca
- argilla/distilabel-intel-orca-dpo-pairs
language:
- en
- de
- fr
- it
- es
library_name: adapter-transformers
pipeline_tag: text-generation
---
|
# Model Card for Swisslex/Mixtral-8x7b-DPO-v0.2
|
|
|
## Model Details
|
|
|
### Model Description
|
|
|
A finetuned version of [mistralai/Mixtral-8x7B-v0.2](https://huggingface.co/mistralai/Mixtral-8x7B-v0.2), trained with supervised fine-tuning (SFT) on Open-Orca/SlimOrca and direct preference optimization (DPO) on argilla/distilabel-intel-orca-dpo-pairs.
|
|
|
|
|
|
|
- **Developed by:** Swisslex
- **Language(s) (NLP):** English, German, French, Italian, Spanish
- **License:** apache-2.0
- **Finetuned from model:** [mistralai/Mixtral-8x7B-v0.2](https://huggingface.co/mistralai/Mixtral-8x7B-v0.2)
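
## How to Get Started

A minimal usage sketch, assuming the model is served through the standard `transformers` text-generation pipeline. The `[INST]` prompt format below is an assumption borrowed from Mistral-style instruct models; check the repository's chat template before relying on it.

```python
# Minimal usage sketch for Swisslex/Mixtral-8x7b-DPO-v0.2.
# ASSUMPTION: the model uses a Mistral-style [INST] instruction format.

def build_prompt(user_message: str) -> str:
    """Wrap a user message in a Mistral-style instruction prompt (assumed format)."""
    return f"<s>[INST] {user_message} [/INST]"

if __name__ == "__main__":
    # Requires `pip install transformers accelerate`; the 8x7B MoE
    # needs substantial GPU memory, hence device_map="auto".
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="Swisslex/Mixtral-8x7b-DPO-v0.2",
        device_map="auto",
        torch_dtype="auto",
    )
    prompt = build_prompt("Summarize the Apache-2.0 license in one sentence.")
    print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```

The same prompt format applies to the other supported languages (German, French, Italian, Spanish).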
|
|