# MixtureofMerges-MoE-2x7b-v0.01b-DELLA

MixtureofMerges-MoE-2x7b-v0.01b-DELLA is a merge of the following models, created with mergekit using the DELLA merge method:
## 🧩 Configuration
```yaml
models:
  - model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
    parameters:
      weight: 1.0
  - model: jsfs11/MixtureofMerges-MoE-2x7b-v6
    parameters:
      weight: 1.0
merge_method: della
base_model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
parameters:
  density: 0.6
  epsilon: 0.2
  lambda: 1.0
dtype: bfloat16
```
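In DELLA merging, each model's task vector (its delta from the base model) is pruned with magnitude-aware dropout: entries with smaller magnitude get higher drop probabilities, spread over a window of width `epsilon` centred on `1 - density`, and surviving entries are rescaled to preserve the expected value. The sketch below illustrates that pruning step on a toy NumPy array; `magprune` is a hypothetical helper written for this card, not mergekit's actual implementation.

```python
import numpy as np

def magprune(delta, density=0.6, epsilon=0.2, rng=None):
    """Toy sketch of DELLA-style magnitude-aware pruning.

    Entries of the task vector `delta` are ranked by magnitude; the
    smallest-magnitude entries receive the highest drop probability.
    Drop probabilities span [(1 - density) - epsilon/2,
    (1 - density) + epsilon/2]. Kept entries are rescaled by
    1 / (1 - p) so the merge is unbiased in expectation.
    Hypothetical illustration only, not mergekit's implementation.
    """
    rng = np.random.default_rng(rng)
    flat = delta.ravel()
    n = flat.size
    # Rank entries by magnitude: rank 0 = smallest magnitude.
    order = np.argsort(np.abs(flat))
    ranks = np.empty(n, dtype=int)
    ranks[order] = np.arange(n)
    # Smallest magnitudes sit at the top of the epsilon window.
    base_drop = 1.0 - density
    p_drop = base_drop + epsilon * (0.5 - ranks / max(n - 1, 1))
    p_drop = np.clip(p_drop, 0.0, 1.0 - 1e-6)
    keep = rng.random(n) >= p_drop
    # Rescale survivors; dropped entries become zero.
    pruned = np.where(keep, flat / (1.0 - p_drop), 0.0)
    return pruned.reshape(delta.shape)
```

With `density: 0.6` and `epsilon: 0.2` as in the configuration above, drop probabilities range from 0.3 (largest-magnitude entries) to 0.5 (smallest), so roughly 60% of each task vector survives before the pruned deltas are combined and scaled by `lambda`.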