---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
- jsfs11/MixtureofMerges-MoE-2x7b-v6
---
# MixtureofMerges-MoE-2x7b-v0.01b-DELLA

MixtureofMerges-MoE-2x7b-v0.01b-DELLA is a DELLA merge of the following models using [mergekit](https://github.com/arcee-ai/mergekit):
* [yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B](https://huggingface.co/yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B)
* [jsfs11/MixtureofMerges-MoE-2x7b-v6](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-v6)
## 🧩 Configuration

```yaml
models:
  - model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
    parameters:
      weight: 1.0
  - model: jsfs11/MixtureofMerges-MoE-2x7b-v6
    parameters:
      weight: 1.0
merge_method: della
base_model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
parameters:
  density: 0.6
  epsilon: 0.2
  lambda: 1.0
dtype: bfloat16
```
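In mergekit's DELLA method, `density` sets the fraction of each model's delta parameters (relative to the base model) that are retained, `epsilon` controls how much the drop probability varies around that density based on parameter magnitude, and `lambda` is a scaling factor applied to the merged deltas.

## 💻 Usage

A minimal inference sketch with 🤗 Transformers. The repository id below assumes the merged model is published as `jsfs11/MixtureofMerges-MoE-2x7b-v0.01b-DELLA`; adjust it to the actual location. `device_map="auto"` requires the `accelerate` package.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id -- replace with the actual repo for this merge.
model_id = "jsfs11/MixtureofMerges-MoE-2x7b-v0.01b-DELLA"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype the merge was saved in
    device_map="auto",           # requires `accelerate`
)

prompt = "Explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```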