---
language:
- en
library_name: transformers
tags:
- gpt
- llm
- large language model
inference: true
license: cc-by-sa-4.0
---
This model is a merge/fusion of PAIXAI/Astrid-3B and jondurbin/airoboros-3b-3p0: 16 layers of each glued together with a passthrough merge (see Astridboros.yml or the configuration below).
```yaml
slices:
  - sources:
      - model: PAIXAI/Astrid-3B
        layer_range: [0, 16]
  - sources:
      - model: jondurbin/airoboros-3b-3p0
        layer_range: [16, 32]
merge_method: passthrough
dtype: float16
```
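To load the merged model with transformers, something like the following should work. This is a minimal sketch: the repository ID is a placeholder for this model's Hub ID, and the dtype/device settings are assumptions rather than requirements.

```python
# Minimal sketch: load the merged model with transformers.
# "REPO_ID" is a placeholder; substitute this repository's actual Hub ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO_ID = "your-namespace/Astridboros-3B"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = AutoModelForCausalLM.from_pretrained(
    REPO_ID,
    torch_dtype=torch.float16,  # matches the dtype in the merge config
    device_map="auto",          # requires `accelerate`; drop it to load on CPU
)

# Passthrough merge of layers [0, 16) + [16, 32) -> 32 decoder layers in total.
print(model.config.num_hidden_layers)
```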
I recommend using the Alpaca prompt format.
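For reference, here is a minimal sketch of the standard Alpaca (no-input) template, reusing the tokenizer and model loaded in the snippet above; the sampling settings are illustrative assumptions, not tuned recommendations.

```python
# Standard Alpaca prompt template (no-input variant).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

prompt = ALPACA_TEMPLATE.format(instruction="Summarize what a passthrough merge does.")

# Reuses `tokenizer` and `model` from the loading snippet above.
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```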
GGUF Quants: afrideva/Astridboros-3B-GGUF
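If you would rather run the GGUF quants locally, one option is llama-cpp-python. This is only a sketch: the filename pattern below is an assumption, so check the quant repo's file list for the exact quant you want.

```python
# Sketch: loading a GGUF quant with llama-cpp-python (pip install llama-cpp-python).
# The filename pattern is an assumption; see afrideva/Astridboros-3B-GGUF for exact file names.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="afrideva/Astridboros-3B-GGUF",
    filename="*q4_k_m.gguf",  # pick whichever quant fits your hardware
    n_ctx=2048,
)

prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nSay hello.\n\n### Response:\n"
)
print(llm(prompt, max_tokens=64)["choices"][0]["text"])
```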