---
language:
- en
library_name: transformers
tags:
- merge
- llm
- stablelm
inference: true
license: other
---
This model is a merge/fusion of [Aryanne/Astridboros-3B](https://huggingface.co/Aryanne/Astridboros-3B) and [stabilityai/stablelm-zephyr-3b](https://huggingface.co/stabilityai/stablelm-zephyr-3b): 28 layers of Zephyr plus 12 layers of Astridboros (see zephyr-3.43b.yml or the config below), for a total of 40 layers and about 3.43B parameters.

The license is the same as Zephyr's, since roughly 70% of the model comes from it.
```yaml
slices:
  - sources:
      - model: stabilityai/stablelm-zephyr-3b
        layer_range: [0, 14]
  - sources:
      - model: Aryanne/Astridboros-3B
        layer_range: [10, 22]
  - sources:
      - model: stabilityai/stablelm-zephyr-3b
        layer_range: [18, 32]
merge_method: passthrough
dtype: float16
```
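
If you want to reproduce the merge, the config above can be fed to [mergekit](https://github.com/arcee-ai/mergekit). Below is only a rough sketch assuming mergekit's Python interface (`MergeConfiguration`, `MergeOptions`, `run_merge`); the `mergekit-yaml` CLI on the same config file works just as well.

```python
# Rough reproduction sketch (assumes mergekit's Python API; the
# mergekit-yaml CLI is the more common way to run a config like this).
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the passthrough config shown above (zephyr-3.43b.yml).
with open("zephyr-3.43b.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Write the merged 40-layer model to ./Zephyr-3.43B.
run_merge(
    merge_config,
    out_path="./Zephyr-3.43B",
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)
```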
I recommend using the Zephyr prompt format:
```
<|user|>
List 3 synonyms for the word "tiny"<|endoftext|>
<|assistant|>
1. Dwarf
2. Little
3. Petite<|endoftext|>
```
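
A minimal way to try that format with `transformers` might look like the sketch below (the repo id `Aryanne/Zephyr-3.43B` is assumed; adjust it if this model lives under a different name).

```python
# Minimal usage sketch with transformers; the repo id below is assumed,
# adjust it to this model's actual Hugging Face id if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aryanne/Zephyr-3.43B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
    trust_remote_code=True,
)

# Build a prompt in the Zephyr format shown above.
prompt = '<|user|>\nList 3 synonyms for the word "tiny"<|endoftext|>\n<|assistant|>\n'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)

# Print only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```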
GGUF Quants: [afrideva/Zephyr-3.43B-GGUF](https://huggingface.co/afrideva/Zephyr-3.43B-GGUF)
A huge thanks to afrideva for the help. 🙌🤗