Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

# Maidphin-Kunoichi-7B - GGUF

- Model creator: https://huggingface.co/nbeerbower/
- Original model: https://huggingface.co/nbeerbower/Maidphin-Kunoichi-7B/

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Maidphin-Kunoichi-7B.Q2_K.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q2_K.gguf) | Q2_K | 2.53GB |
| [Maidphin-Kunoichi-7B.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.IQ3_XS.gguf) | IQ3_XS | 2.81GB |
| [Maidphin-Kunoichi-7B.IQ3_S.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.IQ3_S.gguf) | IQ3_S | 2.96GB |
| [Maidphin-Kunoichi-7B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [Maidphin-Kunoichi-7B.IQ3_M.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.IQ3_M.gguf) | IQ3_M | 3.06GB |
| [Maidphin-Kunoichi-7B.Q3_K.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q3_K.gguf) | Q3_K | 3.28GB |
| [Maidphin-Kunoichi-7B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [Maidphin-Kunoichi-7B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [Maidphin-Kunoichi-7B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [Maidphin-Kunoichi-7B.Q4_0.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q4_0.gguf) | Q4_0 | 3.83GB |
| [Maidphin-Kunoichi-7B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [Maidphin-Kunoichi-7B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [Maidphin-Kunoichi-7B.Q4_K.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q4_K.gguf) | Q4_K | 4.07GB |
| [Maidphin-Kunoichi-7B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [Maidphin-Kunoichi-7B.Q4_1.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q4_1.gguf) | Q4_1 | 4.24GB |
| [Maidphin-Kunoichi-7B.Q5_0.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q5_0.gguf) | Q5_0 | 4.65GB |
| [Maidphin-Kunoichi-7B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [Maidphin-Kunoichi-7B.Q5_K.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q5_K.gguf) | Q5_K | 4.78GB |
| [Maidphin-Kunoichi-7B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [Maidphin-Kunoichi-7B.Q5_1.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q5_1.gguf) | Q5_1 | 5.07GB |
| [Maidphin-Kunoichi-7B.Q6_K.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q6_K.gguf) | Q6_K | 5.53GB |
| [Maidphin-Kunoichi-7B.Q8_0.gguf](https://huggingface.co/RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf/blob/main/Maidphin-Kunoichi-7B.Q8_0.gguf) | Q8_0 | 7.17GB |
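The files above work with any GGUF-compatible runtime. Below is a minimal sketch of downloading and running one of the quants with `huggingface_hub` and `llama-cpp-python`; the chosen filename, context size, and sampling settings are examples only, and the prompt should follow whatever chat template the original model expects.

```python
# Minimal sketch: download one of the GGUF quants listed above and run it locally.
# Assumes `pip install huggingface_hub llama-cpp-python`; adjust the filename,
# context size, and generation parameters to your hardware and use case.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch a mid-sized quant (Q4_K_M is a common quality/size trade-off).
model_path = hf_hub_download(
    repo_id="RichardErkhov/nbeerbower_-_Maidphin-Kunoichi-7B-gguf",
    filename="Maidphin-Kunoichi-7B.Q4_K_M.gguf",
)

# Load the model; n_ctx is the combined prompt + generation context window.
llm = Llama(model_path=model_path, n_ctx=4096)

# Plain completion call; for best results, format the prompt with the
# original model's chat template rather than raw text like this.
output = llm("Write a short greeting.\n", max_tokens=128)
print(output["choices"][0]["text"])
```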
Original model description:

---
license: cc-by-nc-4.0
base_model:
- SanjiWatsuki/Kunoichi-DPO-v2-7B
- nbeerbower/maidphin
library_name: transformers
tags:
- mergekit
- merge
---

# Maidphin-Kunoichi-7B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.

### Models Merged

The following models were included in the merge:

* [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B)
* [nbeerbower/maidphin](https://huggingface.co/nbeerbower/maidphin)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: nbeerbower/maidphin
    layer_range: [0, 32]
  - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    layer_range: [0, 32]
merge_method: slerp
base_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
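For intuition about the merge method: SLERP (spherical linear interpolation) blends each pair of weight tensors along the arc between them rather than along a straight line, and the `parameters.t` schedule in the configuration above controls how strongly each layer group leans toward one parent or the other. The sketch below only illustrates the interpolation formula using NumPy; it is not mergekit's actual implementation.

```python
# Illustrative sketch of SLERP between two weight tensors (not mergekit's code).
# t=0 returns one endpoint, t=1 the other; intermediate values blend along the arc.
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight vectors."""
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    omega = np.arccos(dot)            # angle between the two directions
    if omega < eps:                   # near-parallel: fall back to linear interpolation
        return (1.0 - t) * v0 + t * v1
    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * v0 + (np.sin(t * omega) / sin_omega) * v1

# Toy example with random "weights"; in the real merge this is applied tensor by tensor,
# with per-layer t values taken from the `parameters.t` schedule in the YAML above.
a = np.random.randn(4096)
b = np.random.randn(4096)
print(slerp(0.5, a, b)[:5])
```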