---
base_model:
- TheDrummer/Cydonia-22B-v1.2
- anthracite-org/magnum-v4-22b
library_name: transformers
tags:
- mergekit
- merge
license: other
license_name: mrl
inference: false
license_link: https://mistral.ai/licenses/MRL-0.1.md
---
![Not Horny Enough](Cydonia-v1.2-Magnum-v4-22B.png)

# The Drummer becomes hornier
Recipe based on [MarsupialAI/Monstral-123B](https://huggingface.co/MarsupialAI/Monstral-123B). It should work, since it's the same Mistral, the same TheDrummer, and the same MarsupialAI recipe, right?
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method

This model was merged using the SLERP merge method.
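SLERP (spherical linear interpolation) blends the two checkpoints along the great circle between their weight tensors rather than averaging them linearly, which preserves the magnitude of the interpolated weights better than a plain lerp. A minimal sketch of the operation on a single flattened tensor, using NumPy; this is an illustration of the formula, not mergekit's actual implementation:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two (flattened) tensors.
    """
    v0f, v1f = v0.ravel(), v1.ravel()
    # Cosine of the angle between the two tensors.
    dot = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f) + eps)
    dot = np.clip(dot, -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Near-parallel tensors: fall back to linear interpolation.
        return (1.0 - t) * v0 + t * v1
    so = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```

At `t = 0.5` between two orthogonal unit vectors, this returns a point on the unit circle halfway along the arc, whereas a linear average would shrink the norm to about 0.707.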
### Models Merged

The following models were included in the merge:
* [TheDrummer/Cydonia-22B-v1.2](https://huggingface.co/TheDrummer/Cydonia-22B-v1.2)
* [anthracite-org/magnum-v4-22b](https://huggingface.co/anthracite-org/magnum-v4-22b)

### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: TheDrummer/Cydonia-22B-v1.2
  - model: anthracite-org/magnum-v4-22b
merge_method: slerp
base_model: TheDrummer/Cydonia-22B-v1.2
parameters:
  t: [0.1, 0.3, 0.6, 0.3, 0.1]
dtype: bfloat16
```
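The five-value `t` list is a gradient: mergekit interpolates it across layer depth, so the earliest and latest layers stay close to Cydonia (`t` near 0.1) while the middle of the stack leans toward Magnum (`t` near 0.6). A rough sketch of how such a schedule maps to per-layer blend weights, assuming piecewise-linear interpolation and a hypothetical 56-layer model (both the interpolation scheme and the layer count are illustrative assumptions, not taken from mergekit's source):

```python
import numpy as np

# The five-point t gradient from the config above.
anchors = [0.1, 0.3, 0.6, 0.3, 0.1]

def layer_t(schedule, num_layers):
    """Piecewise-linearly interpolate an anchor schedule over layer depth."""
    xs = np.linspace(0.0, 1.0, num=len(schedule))       # anchor positions
    depths = np.linspace(0.0, 1.0, num=num_layers)      # relative layer depth
    return np.interp(depths, xs, schedule)

ts = layer_t(anchors, 56)  # 56 layers is an assumed count, for illustration
```

With this schedule, `ts[0]` and `ts[-1]` are 0.1 (mostly Cydonia at the embedding-adjacent ends) and the peak near mid-depth approaches 0.6 (Magnum-leaning).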