---
license: cc-by-nc-4.0
library_name: transformers
tags:
- mergekit
- merge
- alpaca
- mistral
- not-for-all-audiences
- nsfw
---
# IceCocoaRP-7b
This is a merge of pre-trained language models created using mergekit.
The rules-lorebook and settings I'm using can be found here.
## Merge Details
This is my best merge so far.
Thanks to mradermacher for the quants.
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with NeuralBeagleJaskier as the base.
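For intuition, here is a minimal sketch of the TIES idea for a single weight tensor: trim each model's delta against the base to its highest-magnitude entries (the `density` parameter), elect a per-entry majority sign, and sum only the entries that agree with it. The function name and the normalization step are illustrative simplifications, not mergekit's internals.

```python
import torch

def ties_merge_tensor(base: torch.Tensor,
                      tuned: list[torch.Tensor],
                      densities: list[float],
                      weights: list[float]) -> torch.Tensor:
    """Illustrative TIES merge of one weight tensor; not mergekit's actual code."""
    trimmed_deltas = []
    for t, density, w in zip(tuned, densities, weights):
        delta = t - base                                   # "task vector" vs. the base
        k = max(1, int(delta.numel() * density))           # number of entries to keep
        cutoff = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        keep = delta.abs() >= cutoff                       # trim: keep top-density by magnitude
        trimmed_deltas.append(w * delta * keep)
    stacked = torch.stack(trimmed_deltas)
    elected = torch.sign(stacked.sum(dim=0))               # elect a per-entry majority sign
    agreeing = stacked * (torch.sign(stacked) == elected)  # drop entries that disagree
    merged = agreeing.sum(dim=0) / sum(weights)            # rough analogue of normalize: true
    return base + merged
```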
### Models Merged
The following models were included in the merge:
- NeuralBeagleJaskier
- IceBlendedCoffeeRP-7b (a bfloat16 SLERP of IceCoffeeRP-7b over an IceBlendedLatteRP-7b base)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: NeuralBeagleJaskier
    parameters:
      density: 0.9
      weight: 0.5
  - model: IceBlendedCoffeeRP-7b
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: NeuralBeagleJaskier
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
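To reproduce the merge, this config can be saved as `config.yml` and passed to mergekit's CLI: `mergekit-yaml config.yml ./IceCocoaRP-7b`. Below is a minimal sketch of loading the result with transformers; the Hub repo id is an assumption based on the model name, and the Alpaca prompt format follows the `alpaca` tag above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id; point this at the actual Hub path or your local merge output.
model_id = "icefog72/IceCocoaRP-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype the merge was produced in
    device_map="auto",
)

# Alpaca-style prompt, per the `alpaca` tag on this card.
prompt = "### Instruction:\nIntroduce yourself in character.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```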