# mixtral megamerge 8x7b v2

The following models were merged with DARE (drop-and-rescale) using https://github.com/martyn/safetensors-merge-supermario.

## Mergelist

- mistralai/Mixtral-8x7B-v0.1
- mistralai/Mixtral-8x7B-Instruct-v0.1
- cognitivecomputations/dolphin-2.6-mixtral-8x7b
- Brillibits/Instruct_Mixtral-8x7B-v0.1_Dolly15K
- orangetin/OpenHermes-Mixtral-8x7B
- NeverSleep/Noromaid-v0.1-mixtral-8x7b-v3

## Merge command

```sh
python3 hf_merge.py to_merge_mixtral2.txt mixtral-2 -p 0.15 -lambda 1.95
```
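For context, DARE works by dropping a random fraction of each fine-tuned model's delta from the base model and rescaling the surviving entries. A minimal single-tensor sketch, assuming (as in the DARE paper) that `-p` is the drop probability and `-lambda` a scale applied to the merged delta; exactly how `hf_merge.py` interprets these flags is an assumption here:

```python
import torch

def dare_merge_tensor(base: torch.Tensor, finetuned: torch.Tensor,
                      p: float = 0.15, lam: float = 1.95) -> torch.Tensor:
    # Delta between the fine-tuned weights and the shared base weights.
    delta = finetuned - base
    # Drop each delta element with probability p (keep with probability 1 - p).
    keep = torch.bernoulli(torch.full_like(delta, 1.0 - p))
    # Rescale the survivors by 1 / (1 - p) so the expected delta is preserved,
    # then scale by lambda before adding the delta back onto the base.
    return base + lam * (delta * keep) / (1.0 - p)
```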

## Notes

- MoE gates were filtered for compatibility, then averaged as `(tensor1 + tensor2) / 2` (see the sketch below).
- The merged model seems to generalize across prompting formats and sampling settings.
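A minimal sketch of the gate handling described in the first note, assuming the gate weights are plain `torch` tensors; the shape check stands in for the compatibility filter, and the function name is illustrative:

```python
import torch

def merge_moe_gate(gate1: torch.Tensor, gate2: torch.Tensor) -> torch.Tensor:
    # Compatibility filter: only merge gates whose shapes match.
    if gate1.shape != gate2.shape:
        raise ValueError("incompatible MoE gate shapes; gate skipped")
    # Simple elementwise mean, i.e. (tensor1 + tensor2) / 2.
    return (gate1 + gate2) / 2
```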