Tags: Text Generation · Adapters · Safetensors · mixtral

Model Card for Swisslex/Mixtral-8x7b-DPO-v0.2

Model Details

Model Description

A finetuned version of mistralai/Mixtral-8x7B-v0.2, trained with supervised fine-tuning (SFT) followed by direct preference optimization (DPO).

  • Developed by: Swisslex
  • Language(s) (NLP): English, German, French, Italian, Spanish
  • License: apache-2.0
  • Finetuned from model: mistralai/Mixtral-8x7B-v0.2
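A minimal loading sketch, not part of the original card: it assumes the checkpoint can be loaded with the standard transformers text-generation API, and the prompt string is purely illustrative.

```python
# Hypothetical usage sketch -- assumes the repository's weights load with
# the standard transformers API; requires GPU memory for 46.7B BF16 params.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Swisslex/Mixtral-8x7b-DPO-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the card lists BF16 tensors
    device_map="auto",           # shard across available devices
)

# Illustrative prompt only; the card does not document a chat template.
inputs = tokenizer("The capital of Switzerland is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```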
  • Model size: 46.7B params (Safetensors)
  • Tensor type: BF16

Datasets used to train Swisslex/Mixtral-8x7b-DPO-v0.2