Update README.md
README.md CHANGED
@@ -13,8 +13,11 @@ license: llama2
 <h2 style="text-align: center">Experimental Frankenmerge Model</h2>
 
 
-##
-[GGUF
+## Other Formats
+[GGUF](https://huggingface.co/TheBloke/Rose-20B-GGUF)
+[GPTQ](https://huggingface.co/TheBloke/Rose-20B-GPTQ)
+[AWQ](https://huggingface.co/TheBloke/Rose-20B-AWQ)
+[exl2](https://huggingface.co/royallab/Rose-20B-exl2)
 
 ## Model Details
 A Frankenmerge of [Thorns-13B](https://huggingface.co/CalderaAI/13B-Thorns-l2) by CalderaAI and [Noromaid-13B-v0.1.1](https://huggingface.co/NeverSleep/Noromaid-13b-v0.1.1) by NeverSleep (IkariDev and Undi). The recipe was proposed by Trappu, and the layer distribution was worked out by Undi; I thank them for sharing their knowledge with me. This model should be very good at any roleplay scenario. I called the model "Rose" because it was a fitting name for a "thorny maid".
@@ -31,6 +34,8 @@ Below is an instruction that describes a task. Write a response that appropriate
 
 Feel free to share any other prompts that work. This model is very robust.
 
+**Warning: This model uses significantly more VRAM, as the larger KV cache requires more memory for the context window.**
+
 ## Justification for its Existence
 Potential base model for finetune experiments using our dataset to create Pygmalion-20B. Due to the already high capabilities, adding our dataset will mesh well with how the model performs.
 Potential experimentation with merging with other 20B Frankenmerge models.
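
For the GGUF files linked above, a minimal llama-cpp-python sketch is below. The quant filename, context size, and sampling settings are assumptions rather than part of the card; substitute whichever quant you download from the GGUF repo. The prompt follows the Alpaca-style format the card's prompt section quotes.

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Assumed filename; pick any quant file from TheBloke/Rose-20B-GGUF
llm = Llama(model_path="rose-20b.Q4_K_M.gguf", n_ctx=4096, n_gpu_layers=-1)

prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a short greeting in character as Rose.\n\n"
    "### Response:\n"
)
out = llm(prompt, max_tokens=256, temperature=0.8, stop=["### Instruction:"])
print(out["choices"][0]["text"])
```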
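The GPTQ repo can also be loaded directly through transformers (with optimum and auto-gptq installed); a rough sketch, assuming the default-branch quant:

```python
# pip install transformers optimum auto-gptq
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Rose-20B-GPTQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "### Instruction:\nDescribe a rose garden at dusk.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
# Strip the prompt tokens before decoding so only the completion is printed
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```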
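As a rough back-of-the-envelope for the VRAM warning: the KV cache scales linearly with layer count, so a 20B layer-stack needs noticeably more cache memory than its 13B parents at the same context length. A sketch assuming standard Llama-2-13B attention geometry and an fp16 cache; the merged model's layer count is an assumption here and should be read from its config.json.

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, ctx_len, bytes_per_elem=2):
    # Keys and values each hold ctx_len * n_kv_heads * head_dim elements per layer
    return 2 * n_layers * n_kv_heads * head_dim * ctx_len * bytes_per_elem

# Llama-2-13B geometry: 40 layers, 40 KV heads, head_dim 128, fp16 cache
print(f"13B @ 4k ctx: {kv_cache_bytes(40, 40, 128, 4096) / 2**30:.2f} GiB")
# Assumed layer count for the 20B stack (check the model's config.json)
print(f"20B @ 4k ctx: {kv_cache_bytes(62, 40, 128, 4096) / 2**30:.2f} GiB")
```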