---
language:
- en
library_name: transformers
tags:
- gpt
- llm
- stablelm
inference: true
license: cc-by-sa-4.0
---
This model is a mix of [PAIXAI/Astrid-3B](https://huggingface.co/PAIXAI/Astrid-3B) + [jondurbin/airoboros-3b-3p0](https://huggingface.co/jondurbin/airoboros-3b-3p0) + [cxllin/StableHermes-3b](https://huggingface.co/cxllin/StableHermes-3b), merged with the YAML config shown below (also in Astrohermes.yml).

[Aryanne/Astridboros-3B](https://huggingface.co/Aryanne/Astridboros-3B) is itself a merge of PAIXAI/Astrid-3B and jondurbin/airoboros-3b-3p0.
```yaml
slices:
  - sources:
      - model: Aryanne/Astridboros-3B
        layer_range: [0, 15]
  - sources:
      - model: cxllin/StableHermes-3b
        layer_range: [15, 16]
  - sources:
      - model: Aryanne/Astridboros-3B
        layer_range: [16, 17]
  - sources:
      - model: cxllin/StableHermes-3b
        layer_range: [17, 18]
  - sources:
      - model: Aryanne/Astridboros-3B
        layer_range: [18, 19]
  - sources:
      - model: cxllin/StableHermes-3b
        layer_range: [19, 20]
  - sources:
      - model: Aryanne/Astridboros-3B
        layer_range: [20, 21]
  - sources:
      - model: cxllin/StableHermes-3b
        layer_range: [21, 22]
  - sources:
      - model: Aryanne/Astridboros-3B
        layer_range: [22, 23]
  - sources:
      - model: cxllin/StableHermes-3b
        layer_range: [23, 24]
  - sources:
      - model: Aryanne/Astridboros-3B
        layer_range: [24, 32]
merge_method: passthrough
dtype: float16
```
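The config interleaves single StableHermes-3b layers (15, 17, 19, 21, 23) into Astridboros-3B's layer stack and passes all 32 layers through unchanged. Below is a minimal usage sketch, assuming the merged checkpoint loads like any StableLM-based causal LM with `transformers`; the repo id `Aryanne/Astrohermes-3B`, the prompt, and the generation settings are illustrative assumptions, not part of the merge recipe.

```python
# Minimal usage sketch, assuming the merged checkpoint behaves like any
# StableLM-based causal LM. The repo id below is an illustrative assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aryanne/Astrohermes-3B"  # hypothetical id for this merge

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype used in the merge
    device_map="auto",
    trust_remote_code=True,     # older StableLM checkpoints may require this
)

prompt = "Explain what a passthrough layer merge does."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```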