---
base_model:
- TheDrummer/UnslopSmall-22B-v1
- NewEden-Staging/Kyne-22b
- nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, with [NewEden-Staging/Kyne-22b](https://huggingface.co/NewEden-Staging/Kyne-22b) as the base model.

### Models Merged

The following models were included in the merge:
* [TheDrummer/UnslopSmall-22B-v1](https://huggingface.co/TheDrummer/UnslopSmall-22B-v1)
* [nbeerbower/Mistral-Small-Gutenberg-Doppel-22B](https://huggingface.co/nbeerbower/Mistral-Small-Gutenberg-Doppel-22B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: TheDrummer/UnslopSmall-22B-v1
    parameters:
      density: 0.7
      weight: 0.7
  - model: nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
    parameters:
      density: 0.3
      weight: 0.3
merge_method: task_arithmetic
base_model: NewEden-Staging/Kyne-22b
parameters:
  normalize: false
  int8_mask: true
dtype: bfloat16
```
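For intuition, task arithmetic forms a "task vector" for each model (its weights minus the base model's weights) and adds a weighted sum of those vectors back onto the base. With `normalize: false`, the weights above (0.7 and 0.3) are applied as-is rather than rescaled to sum to 1. The following is a minimal toy sketch of that arithmetic using NumPy arrays as stand-ins for real weight tensors; it is an illustration, not mergekit's actual implementation:

```python
import numpy as np

def task_arithmetic(base, finetuned_models, weights):
    """Merge models by adding weighted task vectors to the base.

    base: base model weights
    finetuned_models: list of fine-tuned weight tensors (same shape as base)
    weights: per-model merge weights, applied without normalization
    """
    merged = base.copy()
    for ft, w in zip(finetuned_models, weights):
        merged += w * (ft - base)  # task vector = ft - base
    return merged

# Toy tensors standing in for model weights
base = np.array([1.0, 2.0])
m1 = np.array([2.0, 2.0])  # task vector (1, 0)
m2 = np.array([1.0, 4.0])  # task vector (0, 2)

merged = task_arithmetic(base, [m1, m2], [0.7, 0.3])
print(merged)  # base + 0.7*(1, 0) + 0.3*(0, 2) -> [1.7 2.6]
```

Note that the `density` parameters in the configuration are sparsification settings (used by methods such as TIES/DARE); plain task arithmetic as sketched here only uses the `weight` values.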