---
base_model:
- ToastyPigeon/new-ms-rp-test-v0-v2
- unsloth/Mistral-Small-24B-Instruct-2501
library_name: transformers
tags:
- mergekit
- mergekitty
- merge
---
# out

This is a merge of pre-trained language models created using [mergekitty](https://github.com/allura-org/mergekitty).

## Merge Details
### Merge Method

This model was merged using the [Model Breadcrumbs](https://arxiv.org/abs/2312.06795) merge method, with [unsloth/Mistral-Small-24B-Instruct-2501](https://huggingface.co/unsloth/Mistral-Small-24B-Instruct-2501) as the base.

### Models Merged

The following models were included in the merge:
* [ToastyPigeon/new-ms-rp-test-v0-v2](https://huggingface.co/ToastyPigeon/new-ms-rp-test-v0-v2)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: breadcrumbs
base_model: unsloth/Mistral-Small-24B-Instruct-2501
models:
  - model: ToastyPigeon/new-ms-rp-test-v0-v2
    parameters:
      weight: 1.0
parameters:
  density: 0.95
  gamma: 0.01
```
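For intuition on what `density` and `gamma` control, here is a minimal plain-Python sketch of the breadcrumbs masking idea: per parameter tensor, take the task vector (fine-tune minus base), drop the top `gamma` fraction of differences by magnitude (outliers) plus enough of the smallest-magnitude ones so that only a `density` fraction survives, and add the survivors back onto the base, scaled by `weight`. The `breadcrumbs_mask` helper below is hypothetical and illustrative only; it is not mergekitty's actual implementation, and real merges operate on full tensors, not short lists.

```python
def breadcrumbs_mask(delta, density=0.95, gamma=0.01):
    """Zero out the largest `gamma` fraction of deltas (outliers) and
    enough of the smallest so only `density` of the entries survive.
    Hypothetical sketch of the Model Breadcrumbs masking step."""
    n = len(delta)
    # Indices ranked by magnitude, largest first.
    order = sorted(range(n), key=lambda i: abs(delta[i]), reverse=True)
    n_outliers = int(n * gamma)   # largest-magnitude deltas to drop
    n_keep = int(n * density)     # deltas that survive in total
    kept = set(order[n_outliers:n_outliers + n_keep])
    return [d if i in kept else 0.0 for i, d in enumerate(delta)]

# Toy task vector (fine-tuned weights minus base weights); exaggerated
# density/gamma so the masking is visible on 10 entries.
delta = [5.0, -0.1, 0.2, -0.3, 0.4, -0.5, 0.6, -0.7, 0.8, 0.05]
masked = breadcrumbs_mask(delta, density=0.6, gamma=0.1)
# The 5.0 outlier and the two smallest deltas are zeroed; the merged
# weights would then be base[i] + weight * masked[i].
```

With the config above (`density: 0.95`, `gamma: 0.01`, `weight: 1.0`), 95% of each task vector survives, the top 1% of differences are treated as outliers and discarded, and the surviving deltas are added to the base at full strength.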