---
base_model:
- MrRobotoAI/220
- MrRobotoAI/222
- MrRobotoAI/232
- Blackroot/Llama-3-LongStory-LORA
- MrRobotoAI/221
- MrRobotoAI/210
- MrRobotoAI/212
- MrRobotoAI/227
- MrRobotoAI/229
- nothingiisreal/llama3-8B-DWP-lora
- MrRobotoAI/211
- MrRobotoAI/231
library_name: transformers
tags:
- mergekit
- merge
---
# merge (233) 13,764 R
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [Linear](https://arxiv.org/abs/2203.05482) merge method.
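Conceptually, the linear method computes a weighted average of corresponding parameters across all input models (a "model soup"). A minimal sketch of that idea, assuming toy state dicts of plain Python lists rather than real model tensors (the `linear_merge` helper is illustrative, not part of mergekit):

```python
def linear_merge(state_dicts, weights):
    """Average matching parameters across models, scaled by per-model weights."""
    total = sum(weights)
    merged = {}
    for name in state_dicts[0]:
        merged[name] = [
            sum(w * sd[name][i] for w, sd in zip(weights, state_dicts)) / total
            for i in range(len(state_dicts[0][name]))
        ]
    return merged

# Two toy "models" with a single parameter vector each;
# equal weights (as in the config below) give a plain average.
a = {"w": [1.0, 3.0]}
b = {"w": [3.0, 5.0]}
out = linear_merge([a, b], [1.0, 1.0])
print(out["w"])  # [2.0, 4.0]
```

With every model assigned `weight: 1.0`, as in this card's configuration, the merge reduces to a uniform average of all ten models' parameters.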
### Models Merged
The following models were included in the merge:
* [MrRobotoAI/220](https://huggingface.co/MrRobotoAI/220)
* [MrRobotoAI/222](https://huggingface.co/MrRobotoAI/222)
* [MrRobotoAI/232](https://huggingface.co/MrRobotoAI/232) + [Blackroot/Llama-3-LongStory-LORA](https://huggingface.co/Blackroot/Llama-3-LongStory-LORA)
* [MrRobotoAI/221](https://huggingface.co/MrRobotoAI/221)
* [MrRobotoAI/210](https://huggingface.co/MrRobotoAI/210)
* [MrRobotoAI/212](https://huggingface.co/MrRobotoAI/212)
* [MrRobotoAI/227](https://huggingface.co/MrRobotoAI/227)
* [MrRobotoAI/229](https://huggingface.co/MrRobotoAI/229) + [nothingiisreal/llama3-8B-DWP-lora](https://huggingface.co/nothingiisreal/llama3-8B-DWP-lora)
* [MrRobotoAI/211](https://huggingface.co/MrRobotoAI/211)
* [MrRobotoAI/231](https://huggingface.co/MrRobotoAI/231) + [Blackroot/Llama-3-LongStory-LORA](https://huggingface.co/Blackroot/Llama-3-LongStory-LORA)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: MrRobotoAI/210
  - model: MrRobotoAI/211
  - model: MrRobotoAI/212
  - model: MrRobotoAI/220
  - model: MrRobotoAI/221
  - model: MrRobotoAI/222
  - model: MrRobotoAI/227
  - model: MrRobotoAI/229+nothingiisreal/llama3-8B-DWP-lora
  - model: MrRobotoAI/231+Blackroot/Llama-3-LongStory-LORA
  - model: MrRobotoAI/232+Blackroot/Llama-3-LongStory-LORA
parameters:
  weight: 1.0
merge_method: linear
dtype: float16
```
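
To reproduce a merge like this, the configuration above can be saved to a YAML file and passed to mergekit's command-line entry point (the file and output paths here are illustrative):

```shell
pip install mergekit

# Run the merge described by the config; writes the merged model to ./merged-model
mergekit-yaml merge-config.yml ./merged-model
```

Note the `model: base+adapter` syntax in the config: mergekit applies the named LoRA adapter to its base model before that model enters the linear average.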