---
base_model:
- cgato/L3-TheSpice-8b-v0.8.3
- Undi95/Llama-3-LewdPlay-8B-evo
- NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
- openlynn/Llama-3-Soliloquy-8B-v2
library_name: transformers
tags:
- mergekit
- merge
---
# It's either a lobotomy, or the best RP model merge :D

The original idea was to use Soliloquy-v2's 24k context length as the base and add a little lewdness by merging in Lumimaid, LewdPlay, and TheSpice. It probably isn't working as intended: I don't have enough compute to run the model from safetensors, so I suspect it lands somewhere between a lobotomy and a good merge.

### THE MODEL IS NOT CONVERTING TO GGUF

This is probably due to a tokenizer mismatch between the models: Soliloquy-v2 uses a different tokenizer than the others. Oof, someone solve this please.

## merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method with [openlynn/Llama-3-Soliloquy-8B-v2](https://huggingface.co/openlynn/Llama-3-Soliloquy-8B-v2) as the base.
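For context, the DARE step that `dare_ties` applies to each model's delta works roughly like this (a minimal, stdlib-only sketch; the `dare` helper is illustrative, not mergekit's actual API): each parameter delta from the base is randomly sparsified at the given `density`, and the survivors are rescaled by `1/density` so the delta's expected value is preserved before TIES combines the deltas.

```python
import random

def dare(delta, density, rng):
    # DARE (Drop And REscale): keep each delta parameter with probability
    # `density`, zero out the rest, and rescale survivors by 1/density so
    # the expected value of the delta stays the same.
    return [d / density if rng.random() < density else 0.0 for d in delta]

rng = random.Random(0)
delta = [rng.gauss(0.0, 1.0) for _ in range(100_000)]  # toy parameter delta
density = 0.8

sparse = dare(delta, density, rng)
kept = sum(1 for d in sparse if d != 0.0) / len(sparse)
mean_before = sum(delta) / len(delta)
mean_after = sum(sparse) / len(sparse)
print(f"kept ~{kept:.2f} of parameters; mean drift {abs(mean_after - mean_before):.4f}")
```

Roughly `density` of the parameters survive, and the rescaling keeps the mean of the delta (and hence the merged weights, in expectation) unchanged despite the sparsification.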
### Models Merged

The following models were included in the merge:
* [cgato/L3-TheSpice-8b-v0.8.3](https://huggingface.co/cgato/L3-TheSpice-8b-v0.8.3)
* [Undi95/Llama-3-LewdPlay-8B-evo](https://huggingface.co/Undi95/Llama-3-LewdPlay-8B-evo)
* [NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS](https://huggingface.co/NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: openlynn/Llama-3-Soliloquy-8B-v2
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 0.0
slices:
- sources:
  - layer_range: [0, 4]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      density: 0.83
      weight: 0.4
  - layer_range: [0, 4]
    model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.43
      weight: 0.15
  - layer_range: [0, 4]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.8
      weight: 0.4
  - layer_range: [0, 4]
    model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.8
      weight: 0.15
- sources:
  - layer_range: [4, 8]
    model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.45
      weight: 0.1
  - layer_range: [4, 8]
    model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.41
      weight: 0.15
  - layer_range: [4, 8]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      density: 0.78
      weight: 0.4
  - layer_range: [4, 8]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.83
      weight: 0.45
- sources:
  - layer_range: [8, 12]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      density: 0.8
      weight: 0.35
  - layer_range: [8, 12]
    model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.71
      weight: 0.2
  - layer_range: [8, 12]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.77
      weight: 0.35
  - layer_range: [8, 12]
    model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.45
      weight: 0.2
- sources:
  - layer_range: [12, 16]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      density: 0.46
      weight: 0.21
  - layer_range: [12, 16]
    model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.81
      weight: 0.3
  - layer_range: [12, 16]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.75
      weight: 0.25
  - layer_range: [12, 16]
    model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.8
      weight: 0.35
- sources:
  - layer_range: [16, 20]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      density: 0.62
      weight: 0.36
  - layer_range: [16, 20]
    model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.5
      weight: 0.2
  - layer_range: [16, 20]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.83
      weight: 0.3
  - layer_range: [16, 20]
    model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.74
      weight: 0.28
- sources:
  - layer_range: [20, 24]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      density: 0.52
      weight: 0.16
  - layer_range: [20, 24]
    model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.74
      weight: 0.28
  - layer_range: [20, 24]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.64
      weight: 0.2
  - layer_range: [20, 24]
    model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.83
      weight: 0.37
- sources:
  - layer_range: [24, 28]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      density: 0.85
      weight: 0.35
  - layer_range: [24, 28]
    model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.68
      weight: 0.3
  - layer_range: [24, 28]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.75
      weight: 0.35
  - layer_range: [24, 28]
    model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.5
      weight: 0.1
- sources:
  - layer_range: [28, 32]
    model: openlynn/Llama-3-Soliloquy-8B-v2
    parameters:
      density: 0.85
      weight: 0.35
  - layer_range: [28, 32]
    model: Undi95/Llama-3-LewdPlay-8B-evo
    parameters:
      density: 0.77
      weight: 0.15
  - layer_range: [28, 32]
    model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      density: 0.85
      weight: 0.25
  - layer_range: [28, 32]
    model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.88
      weight: 0.35
```
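One detail worth noting: with `normalize: 0.0`, the per-model weights in each slice are applied as given rather than rescaled to sum to 1, and in this config every slice's weights sum to slightly more than 1. A quick sanity check (plain arithmetic on values copied from the config, not mergekit code):

```python
# Per-model weights copied from the config, one list per slice, in config order.
slice_weights = [
    [0.4, 0.15, 0.4, 0.15],    # layers 0-4
    [0.1, 0.15, 0.4, 0.45],    # layers 4-8
    [0.35, 0.2, 0.35, 0.2],    # layers 8-12
    [0.21, 0.3, 0.25, 0.35],   # layers 12-16
    [0.36, 0.2, 0.3, 0.28],    # layers 16-20
    [0.16, 0.28, 0.2, 0.37],   # layers 20-24
    [0.35, 0.3, 0.35, 0.1],    # layers 24-28
    [0.35, 0.15, 0.25, 0.35],  # layers 28-32
]
sums = [round(sum(w), 2) for w in slice_weights]
print(sums)  # → [1.1, 1.1, 1.1, 1.11, 1.14, 1.01, 1.1, 1.1]
```

So each slice combines the deltas at a total weight of roughly 1.0-1.15; with normalization enabled, each slice's weights would instead be divided by these sums.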