---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- FelixChao/WestSeverus-7B-DPO-v2
- CultriX/Wernicke-7B-v9
- mlabonne/NeuralBeagle14-7B
---

# RandomMergeNoNormWEIGHTED-7B-DARETIES

RandomMergeNoNormWEIGHTED-7B-DARETIES is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [FelixChao/WestSeverus-7B-DPO-v2](https://huggingface.co/FelixChao/WestSeverus-7B-DPO-v2)
* [CultriX/Wernicke-7B-v9](https://huggingface.co/CultriX/Wernicke-7B-v9)
* [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)

## 🧩 Configuration

```yaml
models:
  - model: FelixChao/WestSeverus-7B-DPO-v2
    # No parameters necessary for base model
  - model: FelixChao/WestSeverus-7B-DPO-v2
    parameters:
      density: [1, 0.7, 0.1]
      weight: [0, 0.3, 0.7, 1]
  - model: CultriX/Wernicke-7B-v9
    parameters:
      density: [1, 0.7, 0.3]
      weight: [0, 0.25, 0.5, 1]
  - model: mlabonne/NeuralBeagle14-7B
    parameters:
      density: 0.25
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: ties
base_model: FelixChao/WestSeverus-7B-DPO-v2
parameters:
  int8_mask: true
  normalize: true
  sparsify:
    - filter: mlp
      value: 0.5
    - filter: self_attn
      value: 0.5
dtype: float16
```
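
## 💻 Usage

A minimal inference sketch using 🤗 Transformers. The `model_id` below is a placeholder, assuming the merged weights are published on the Hub; substitute the actual repo path.

```python
# Minimal inference sketch; model_id is an assumed placeholder for the
# Hub repo where the merged weights are published.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "CultriX/RandomMergeNoNormWEIGHTED-7B-DARETIES"  # placeholder

# Build a chat-formatted prompt with the model's chat template.
tokenizer = AutoTokenizer.from_pretrained(model_id)
messages = [{"role": "user", "content": "What is a large language model?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# float16 matches the dtype used in the merge config above.
generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
outputs = generator(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```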