---
base_model:
- ABX-AI/Cerebral-Infinity-7B
- ABX-AI/Starfinite-Laymons-7B
library_name: transformers
tags:
- mergekit
- merge
- mistral
- not-for-all-audiences
---
# Starbral-Infinimons-9B
The concept behind this merge was to combine, into a 9B frankenmerge:
- The conversational abilities of the newly added Starling-LM-7B-beta
- The reasoning abilities of Cerebrum-1.0-7b
- The originality of LemonadeRP-4.5.3
- My previous well-performing merges based on InfinityRP, Layla v4, and Laydiculous, among others
Based on preliminary tests, I'm quite happy with the results: very original responses and essentially no alignment issues.
In my experience, it works well with ChatML and Alpaca, and likely other instruction formats; you can chat with it, or ask it to develop a story.
This model is intended for fictional storytelling and role-playing, and may not be suitable for all audiences.
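As a sketch of the ChatML format mentioned above (the system and user messages here are purely illustrative):

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-formatted prompt for a single chat turn.

    ChatML wraps each turn in <|im_start|>role ... <|im_end|> markers;
    the prompt ends with an open assistant turn for the model to complete.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Illustrative usage for a role-play / storytelling prompt:
prompt = chatml_prompt(
    "You are a creative storyteller.",
    "Write the opening line of a mystery story.",
)
print(prompt)
```

Alpaca-style prompts (`### Instruction:` / `### Response:`) should work the same way; only the wrapper text around the turns changes.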
## Merge Details
This is a merge of pre-trained language models created using mergekit.
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged

The following models were included in the merge:
- ABX-AI/Cerebral-Infinity-7B
- ABX-AI/Starfinite-Laymons-7B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: ABX-AI/Cerebral-Infinity-7B
        layer_range: [0, 20]
  - sources:
      - model: ABX-AI/Starfinite-Laymons-7B
        layer_range: [12, 32]
merge_method: passthrough
dtype: float16
```
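A quick sketch of the layer arithmetic behind a passthrough frankenmerge like this one, assuming the donor models follow Mistral-7B's 32-layer architecture:

```python
# Passthrough merging stacks the selected layer slices back to back;
# no weights are averaged. Slice bounds taken from the YAML config above.
slices = [
    ("ABX-AI/Cerebral-Infinity-7B", (0, 20)),
    ("ABX-AI/Starfinite-Laymons-7B", (12, 32)),
]

# Total depth of the merged stack: (20 - 0) + (32 - 12) = 40 layers,
# versus 32 layers in a standard Mistral-7B base.
total_layers = sum(end - start for _, (start, end) in slices)
print(total_layers)  # 40

# Layer positions 12-19 are taken from both donors, so they appear
# twice in the merged model; this duplication is what grows a 7B base
# into a roughly 9B model.
overlap = max(0, min(20, 32) - max(0, 12))
print(overlap)  # 8
```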