IceCocoaRP-7b-8bpw-exl2
8bpw-exl2 quant of icefog72/IceCocoaRP-7b
The rules-lorebook and settings I'm using can be found here.
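If you want to try the quant directly, below is a minimal sketch of loading the 8bpw exl2 weights with exllamav2's dynamic generator. The local directory path and prompt are placeholders, and the exact API surface can shift between exllamav2 versions, so treat this as a starting point rather than a supported recipe.

```python
# Minimal sketch: loading the 8bpw exl2 quant with exllamav2.
# The model directory is a placeholder; point it at a local clone of this repo.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

config = ExLlamaV2Config("./IceCocoaRP-7b-8bpw-exl2")
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)              # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="Hello,", max_new_tokens=64))
```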
Merge Details
This is the best merge so far for me.
Merge Method
This model was merged using the TIES merge method, with NeuralBeagleJaskier as the base.
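For reference, TIES keeps only the largest-magnitude changes each fine-tune makes to the base (controlled by `density`), elects a per-parameter sign across models, and averages the agreeing values on top of the base weights. The toy NumPy sketch below illustrates that idea on flat weight vectors; it is not mergekit's implementation, and the function name and normalization details are illustrative assumptions.

```python
import numpy as np

def ties_merge(base, finetuned, densities, weights):
    """Toy TIES merge over flat weight vectors (illustrative, not mergekit's code)."""
    # 1. Task vectors: each model's delta from the shared base.
    deltas = [w - base for w in finetuned]
    # 2. Trim: keep only the top-`density` fraction of entries by magnitude.
    trimmed = []
    for d, density in zip(deltas, densities):
        k = max(int(round(density * d.size)), 1)
        thresh = np.sort(np.abs(d))[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    # 3. Elect a per-parameter sign from the weighted sum of trimmed deltas.
    elected_sign = np.sign(sum(w * t for w, t in zip(weights, trimmed)))
    # 4. Disjoint merge: weighted average of the values agreeing with that sign.
    num = sum(np.where(np.sign(t) == elected_sign, w * t, 0.0)
              for w, t in zip(weights, trimmed))
    den = sum(np.where(np.sign(t) == elected_sign, w, 0.0)
              for w, t in zip(weights, trimmed))
    merged_delta = num / np.maximum(den, 1e-8)  # 'normalize: true' style averaging
    # 5. Add the merged task vector back onto the base weights.
    return base + merged_delta
```

With `density: 0.9` and `weight: 0.5` for NeuralBeagleJaskier in the config below, for example, 90% of its largest delta entries survive trimming and then enter the sign election and averaging with weight 0.5.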
Models Merged
The following models were included in the merge:
- NeuralBeagleJaskier
- IceBlendedCoffeeRP-7b (slerp bfloat16)
- IceCoffeeRP-7b
- IceBlendedLatteRP-7b base
Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: NeuralBeagleJaskier
    parameters:
      density: 0.9
      weight: 0.5
  - model: IceBlendedCoffeeRP-7b
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: NeuralBeagleJaskier
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
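To reproduce a merge from a config like this, mergekit's `mergekit-yaml` CLI takes the YAML file and an output directory. A rough Python equivalent, assuming mergekit's documented `run_merge` entry point, might look like the sketch below; the paths and output directory name are placeholders.

```python
# Rough sketch of running the merge programmatically with mergekit.
# The CLI `mergekit-yaml config.yml ./out-dir` does the same thing.
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./IceCocoaRP-7b",  # output directory (placeholder name)
    options=MergeOptions(copy_tokenizer=True, lazy_unpickle=True),
)
```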