Nitral-AI committed on
Commit a3e46ca · verified · 1 Parent(s): 145fb73

Update README.md

Files changed (1)
  1. README.md +39 -10
README.md CHANGED
@@ -1,12 +1,41 @@
  ---
- license: other
- language:
- - en
+ base_model:
+ - Epiculous/Violet_Twilight-v0.2
+ - Nitral-AI/Captain_BMO-12B
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
  ---
- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/umAoqWpJAhrpZbmzwiynH.png)
- # Quant's available from Bartowski <3: [GGUF](https://huggingface.co/bartowski/Captain_BMO-12B-GGUF) Exl2 Quant's from me: [4bpw Exl2](https://huggingface.co/Nitral-AI/Captain_BMO-12b-4bpw-exl2) [6bpw Exl2](https://huggingface.co/Nitral-AI/Captain_BMO-12b-6bpw-exl2)
- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/qmC-LKV6T93GhuqSjtSxN.png)
- # Uses Mistral Formatting, Text completion [preset here](https://huggingface.co/Nitral-AI/Captain_BMO-12B/tree/main/ST)
- Notes: One off train most likely, this was done purely for internal testing purposes but seemed ok enough to release. I do not plan to offer any kind of extended support for using this model, so your mileage may vary depending on use and context size.
- - (Nemo 12B instruct as base)
- - 200k randomized subset of GU_instruct-Remastered-1.1, with a splash of 25k hathor/poppy sauce, slow cooked for 3 epochs on medium heat.
+
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/V4OFnx7IBvXhYphdkUr5F.png)
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/Shj4ZhtUMgIpyhnfUsk98.png)
+ # Instruct/Context import + Textgen preset both available: [Here](https://huggingface.co/Nitral-AI/Captain_Violet-V0.420-12B/tree/main/ST)
+
+
+ ## Original Models used in the merge:
+ [Epiculous/Violet_Twilight-v0.2](https://huggingface.co/Epiculous/Violet_Twilight-v0.2)
+ [Nitral-AI/Captain_BMO-12B](https://huggingface.co/Nitral-AI/Captain_BMO-12B)
+
+
+ ### The following YAML configuration was used to produce this model:
+
+ ```yaml
+ slices:
+   - sources:
+       - model: Nitral-AI/Captain_BMO-12B
+         layer_range: [0, 40]
+       - model: Epiculous/Violet_Twilight-v0.2
+         layer_range: [0, 40]
+ merge_method: slerp
+ base_model: Nitral-AI/Captain_BMO-12B
+ parameters:
+   t:
+     - filter: self_attn
+       value: [0, 0.5, 0.3, 0.7, 1]
+     - filter: mlp
+       value: [1, 0.5, 0.7, 0.3, 0]
+     - value: 0.420
+ dtype: bfloat16
+ ```
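
For context (not part of the commit): the updated card sets `library_name: transformers`, so the merged model can be loaded with the standard transformers API. The sketch below is an assumption rather than anything from the card; the repo id `Nitral-AI/Captain_Violet-V0.420-12B` is inferred from the preset link above, and the chat-template call assumes the tokenizer ships the Mistral-style instruct formatting the bundled presets target.

```python
# Hypothetical usage sketch; repo id inferred from the ST preset link, not stated in the commit.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nitral-AI/Captain_Violet-V0.420-12B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

# Assumes a Mistral-style chat template is bundled with the tokenizer.
messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```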