---
base_model:
- baconnier/Napoleon_24B_V0.0
- cognitivecomputations/Dolphin3.0-Mistral-24B
library_name: transformers
tags:
- mergekit
- merge
---
# Napoleon_24B_V0.1
Napoleon_24B_V0.1 is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [SLERP](https://en.wikipedia.org/wiki/Slerp) (spherical linear interpolation) merge method, which interpolates between the two models' weights along the shortest arc on a hypersphere rather than along a straight line, preserving the magnitude characteristics of each parent.
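For intuition, here is a minimal sketch of SLERP applied to a pair of weight tensors in plain PyTorch. It illustrates the interpolation formula only; the `slerp` function below is an illustrative assumption, not mergekit's actual implementation, which additionally handles per-tensor filters, normalization edge cases, and dtype management.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherically interpolate between weight tensors a (t=0) and b (t=1)."""
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    # Angle between the two weight vectors on the unit hypersphere.
    dot = torch.clamp(
        (a_flat / (a_flat.norm() + eps)) @ (b_flat / (b_flat.norm() + eps)),
        -1.0, 1.0,
    )
    omega = torch.arccos(dot)
    if omega.abs() < eps:
        # Nearly collinear vectors: plain linear interpolation is stable here.
        return (1 - t) * a + t * b
    sin_omega = torch.sin(omega)
    out = (torch.sin((1 - t) * omega) / sin_omega) * a_flat \
        + (torch.sin(t * omega) / sin_omega) * b_flat
    return out.reshape(a.shape).to(a.dtype)
```

With `t = 0.5`, as in the configuration below, every interpolated tensor sits halfway along the arc between the two parent models.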
### Models Merged
The following models were included in the merge:
* [baconnier/Napoleon_24B_V0.0](https://huggingface.co/baconnier/Napoleon_24B_V0.0)
* [cognitivecomputations/Dolphin3.0-Mistral-24B](https://huggingface.co/cognitivecomputations/Dolphin3.0-Mistral-24B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: cognitivecomputations/Dolphin3.0-Mistral-24B
        layer_range: [0, 40]  # all 40 layers; the end index is exclusive
      - model: baconnier/Napoleon_24B_V0.0
        layer_range: [0, 40]
merge_method: slerp
base_model: cognitivecomputations/Dolphin3.0-Mistral-24B  # one of the two source models
parameters:
  t:
    - filter: self_attn
      value: 0.5
    - filter: mlp
      value: 0.5
    - value: 0.5  # default for all remaining tensors
dtype: float16  # float16 for wider hardware compatibility
tokenizer_source: baconnier/Napoleon_24B_V0.0  # use Napoleon's tokenizer
```
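The merge can be reproduced by saving the configuration above to a file (e.g. `config.yaml`) and running mergekit's `mergekit-yaml` CLI on it. Once merged or downloaded from the Hub, the model loads like any other `transformers` causal LM. The snippet below is a usage sketch that assumes the repository id `baconnier/Napoleon_24B_V0.1` and an installed `accelerate` package for `device_map="auto"`.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baconnier/Napoleon_24B_V0.1"  # assumed Hub repo id for this merge

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # picks up the float16 weights produced by the merge
    device_map="auto",    # requires the `accelerate` package
)

prompt = "Bonjour, je suis Napoléon."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```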