jeiku committed
Commit 5fef9d9 · verified · 1 parent: 3555403

Update README.md

Files changed (1)
  1. README.md +7 -39
README.md CHANGED
@@ -2,46 +2,14 @@
 base_model:
 - ChaoticNeutrals/T-900-8B
 - ResplendentAI/Nymph_8B
-library_name: transformers
-tags:
-- mergekit
-- merge
-
+license: apache-2.0
+language:
+- en
 ---
-# merge
-
-This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
-## Merge Details
-### Merge Method
-
-This model was merged using the SLERP merge method.
-
-### Models Merged
-
-The following models were included in the merge:
-* [ChaoticNeutrals/T-900-8B](https://huggingface.co/ChaoticNeutrals/T-900-8B)
-* [ResplendentAI/Nymph_8B](https://huggingface.co/ResplendentAI/Nymph_8B)
-
-### Configuration
-
-The following YAML configuration was used to produce this model:
-
-```yaml
-slices:
-  - sources:
-      - model: ChaoticNeutrals/T-900-8B
-        layer_range: [0, 32]
-      - model: ResplendentAI/Nymph_8B
-        layer_range: [0, 32]
-merge_method: slerp
-base_model: ChaoticNeutrals/T-900-8B
-parameters:
-  t:
-    - filter: self_attn
-      value: [0, 0.3, 0.5, 0.7, 1]
-    - filter: mlp
-      value: [1, 0.7, 0.5, 0.3, 0]
-    - value: 0.4
-dtype: bfloat16
-```
+# Templar v1
+
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/-VhI9L4SJFQsM1cQX78DW.png)
+
+A SLERP merge of T-900 and Nymph, Templar shows some emergent properties that I was not expecting to see.
+
+This model is purpose-made for roleplaying and has seen a plethora of data. I assure you that it will serve that purpose very well.
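The removed README names SLERP as the merge method. As a rough illustrative sketch (not mergekit's actual implementation), spherical linear interpolation between two flattened weight tensors can be written as:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc on
    the hypersphere rather than the straight chord, which preserves the
    interpolated vector's magnitude better than plain averaging.
    Illustrative sketch only, not mergekit's code.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    n0 = math.sqrt(sum(a * a for a in v0))
    n1 = math.sqrt(sum(b * b for b in v1))
    # Angle between the two vectors, clamped for numerical safety
    cos_omega = max(-1.0, min(1.0, dot / (n0 * n1)))
    omega = math.acos(cos_omega)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

print(slerp(0.0, [1.0, 0.0], [0.0, 1.0]))  # -> [1.0, 0.0] (pure first model)
print(slerp(0.5, [1.0, 0.0], [0.0, 1.0]))  # midpoint on the unit circle
```

At the endpoints the original tensors are recovered exactly, which is why the gradient `t` values of 0 and 1 in the config pick one parent model's layers outright.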
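The `t` lists in the removed config (`[0, 0.3, 0.5, 0.7, 1]` for `self_attn`, the reverse for `mlp`) are gradients spread over the 32-layer range, so early layers stay close to the base model's attention while later layers favor the other parent. Assuming the anchor values are expanded piecewise-linearly across the layers (my reading of mergekit's gradient behavior; `gradient_t` is a hypothetical helper, not a mergekit API), the per-layer values could be computed as:

```python
def gradient_t(anchors, num_layers):
    """Expand a mergekit-style gradient list into one t per layer.

    Anchors are assumed evenly spaced over [0, 1] with piecewise-linear
    interpolation between them -- an illustrative assumption, not a
    guarantee about mergekit's internals.
    """
    ts = []
    for i in range(num_layers):
        # Position of this layer in [0, 1]
        x = i / (num_layers - 1) if num_layers > 1 else 0.0
        # Map onto the anchor list and interpolate between neighbors
        pos = x * (len(anchors) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(anchors) - 1)
        frac = pos - lo
        ts.append(anchors[lo] * (1 - frac) + anchors[hi] * frac)
    return ts

# With 5 layers the anchors land exactly on the layers:
print(gradient_t([0, 0.3, 0.5, 0.7, 1], 5))  # -> [0.0, 0.3, 0.5, 0.7, 1.0]
```

The bare `value: 0.4` entry is the fallback for all tensors not matched by the `self_attn` or `mlp` filters, blending them at a fixed 40% toward the second model.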