Fischerboot committed
Commit 7b8e0e7
1 Parent(s): ed04ab7

Update README.md

Files changed (1)
  1. README.md +43 -31
README.md CHANGED
@@ -1,31 +1,43 @@
- ---
- base_model: []
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # output-model-directory
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the passthrough merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * ./3b + ./thinking-3b
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- models:
- - model: ./3b+./thinking-3b
- merge_method: passthrough
- ```
+ ---
+ base_model: []
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+ # Holy Fuck
+
+ This model was a proof of concept: it has thinking (and other) tags, which made the quality of the output really f*ckin' good.
+
+ (Tested as a Q8 GGUF.)
+
+ It does really well as a Q8: it's fast as fuck boi, and small.
+
+ This is just a LoRA checkpoint, so once the final product is done, expect something better.
+
+ A link to the final product will be here when it's done.
+
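+ If you want to poke at the Q8 quant yourself, here is a minimal sketch using llama-cpp-python. The GGUF file name is an assumption (use whatever the actual quant file is called); only the Q8_0 quantization level comes from the note above.
+
+ ```python
+ # Hedged sketch: loading a Q8_0 GGUF quant of this merge with llama-cpp-python.
+ # "output-model-directory-Q8_0.gguf" is a placeholder file name, not a real artifact.
+ from llama_cpp import Llama
+
+ llm = Llama(model_path="./output-model-directory-Q8_0.gguf", n_ctx=4096)
+ out = llm("Explain what a passthrough merge is.", max_tokens=128)
+ print(out["choices"][0]["text"])
+ ```
+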
+ # output-model-directory
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the passthrough merge method.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * ./3b + ./thinking-3b
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ models:
+ - model: ./3b+./thinking-3b
+ merge_method: passthrough
+ ```
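+
+ To reproduce the merge, save the YAML above to a file (assumed `config.yaml` here) and run it through mergekit. The sketch below uses mergekit's Python API; the paths and option values are assumptions, and the `mergekit-yaml` CLI is an equivalent route.
+
+ ```python
+ # Hedged sketch: running the passthrough merge from the config above via mergekit's Python API.
+ # Paths and MergeOptions values are assumptions, not taken from this repo.
+ import torch
+ import yaml
+ from mergekit.config import MergeConfiguration
+ from mergekit.merge import MergeOptions, run_merge
+
+ with open("config.yaml", "r", encoding="utf-8") as fp:
+     merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))
+
+ run_merge(
+     merge_config,
+     out_path="./output-model-directory",
+     options=MergeOptions(
+         cuda=torch.cuda.is_available(),
+         copy_tokenizer=True,
+         lazy_unpickle=False,
+         low_cpu_memory=False,
+     ),
+ )
+ ```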