---
language:
- en
library_name: transformers
tags:
- merge
- llm
- stablelm
inference: true
license: other
---

This model is a merge/fusion of [Aryanne/Astridboros-3B](https://huggingface.co/Aryanne/Astridboros-3B) and [stabilityai/stablelm-zephyr-3b](https://huggingface.co/stabilityai/stablelm-zephyr-3b): 28 layers of Zephyr plus 12 layers of Astridboros (see `zephyr-3.43b.yml` or the config below), for a total of 40 layers and 3.43B parameters.

The license is the same as Zephyr's, since 70% of the layers (28 of 40) come from it.

```yaml
slices:
- sources:
  - model: stabilityai/stablelm-zephyr-3b
    layer_range: [0, 14]
- sources:
  - model: Aryanne/Astridboros-3B
    layer_range: [10, 22]
- sources:
  - model: stabilityai/stablelm-zephyr-3b
    layer_range: [18, 32]
merge_method: passthrough
dtype: float16
```

I recommend using the Zephyr prompt format:

```
<|user|>
List 3 synonyms for the word "tiny"<|endoftext|>
<|assistant|>
1. Dwarf
2. Little
3. Petite<|endoftext|>
```

GGUF quants: [afrideva/Zephyr-3.43B-GGUF](https://huggingface.co/afrideva/Zephyr-3.43B-GGUF)

A huge thanks to afrideva for the help. 🙌🤗
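The Zephyr prompt format above can also be assembled programmatically. Below is a minimal sketch; the helper name `format_zephyr_prompt` is illustrative and not part of any library, and it simply reproduces the `<|role|>` / `<|endoftext|>` layout shown in the example:

```python
def format_zephyr_prompt(messages):
    """Build a Zephyr-style prompt string from (role, content) pairs.

    Each turn is written as "<|role|>\n<content><|endoftext|>\n", and the
    string ends with "<|assistant|>\n" so the model continues as assistant.
    """
    parts = []
    for role, content in messages:
        parts.append(f"<|{role}|>\n{content}<|endoftext|>\n")
    parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = format_zephyr_prompt([("user", 'List 3 synonyms for the word "tiny"')])
print(prompt)
```

The resulting string can be passed directly to a text-generation pipeline; stop generation on `<|endoftext|>` to end the assistant turn.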