altomek committed (verified)
Commit 5938e33 · 1 Parent(s): 52e01ad

Create README.md
---
license: apache-2.0
library_name: transformers
tags:
- merge
base_model:
- 01-ai/Yi-1.5-34B-Chat
- 01-ai/Yi-1.5-34B
pipeline_tag: text-generation
---

<img src="https://huggingface.co/altomek/YiSM-34B-0rn/resolve/main/YiSM.png">
<a href="https://www.youtube.com/watch?v=a9dNpk9G5h0" title="P.T. Adamczyk - Never Looking Back | Cyberpunk 2077: Phantom Liberty (Original Score)" target="_blank">intro music...</a>

## YiSM-34B-0rn

This is a Yi self-merge. I wanted a model that follows most instructions yet preserves its base-model nature.

### Ingredients

- [Yi-1.5-34B-Chat](https://huggingface.co/01-ai/Yi-1.5-34B-Chat)

- [Yi-1.5-34B](https://huggingface.co/01-ai/Yi-1.5-34B)

### Settings

I use max_seq_len 8K with alpha_value 2.65.

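For reference, `alpha_value` in ExLlama-style loaders applies NTK-aware RoPE scaling by stretching the rotary base frequency. A minimal sketch of that adjustment, assuming the usual Yi/Llama defaults of `rope_theta = 10000` and `head_dim = 128` (illustrative values, not read from this model's config):

```python
def scaled_rope_base(alpha: float, base: float = 10000.0, head_dim: int = 128) -> float:
    """Return the RoPE base frequency after NTK-aware alpha scaling.

    base * alpha ** (head_dim / (head_dim - 2)) is the stretch used by
    ExLlama-style loaders; alpha=1.0 leaves the base unchanged.
    """
    return base * alpha ** (head_dim / (head_dim - 2))

# alpha_value 2.65 stretches the base from 10000 to roughly 26900,
# which is what lets the 4K-trained positions cover an 8K window.
print(scaled_rope_base(2.65))
```

This is only an illustration of the scaling formula; in practice the loader does this internally when you pass `alpha_value`.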
SillyTavern presets:

```json
{
    "temp": 0.1,
    "temperature_last": true,
    "top_p": 1,
    "top_k": 0,
    "top_a": 0,
    "tfs": 1,
    "epsilon_cutoff": 0,
    "eta_cutoff": 0,
    "typical_p": 1,
    "min_p": 0.8,
    "rep_pen": 1.08,
    "rep_pen_range": 0,
    "no_repeat_ngram_size": 0,
    "penalty_alpha": 0,
    "num_beams": 1,
    "length_penalty": 1,
    "min_length": 0,
    "encoder_rep_pen": 1,
    "freq_pen": 0.01,
    "presence_pen": 0,
    "do_sample": true,
    "early_stopping": false,
    "add_bos_token": true,
    "truncation_length": 2048,
    "ban_eos_token": false,
    "skip_special_tokens": true,
    "streaming": true,
    "mirostat_mode": 0,
    "mirostat_tau": 5,
    "mirostat_eta": 0.1,
    "guidance_scale": 1,
    "negative_prompt": "",
    "grammar_string": "",
    "banned_tokens": "",
    "ignore_eos_token_aphrodite": false,
    "spaces_between_special_tokens_aphrodite": true,
    "sampler_order": [
        6,
        0,
        1,
        3,
        4,
        2,
        5
    ],
    "logit_bias": [],
    "n": 1,
    "rep_pen_size": 0,
    "genamt": 2048,
    "max_length": 8192
}
```
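The settings doing most of the work in this preset are `temp: 0.1` and `min_p: 0.8`; the rest are at neutral values. As a toy illustration (plain Python, not SillyTavern code) of how a min_p cutoff interacts with a low temperature:

```python
import math

def min_p_filter(logits, min_p=0.8, temp=0.1):
    """Return indices of tokens that survive a min_p cutoff.

    min_p keeps only tokens whose probability is at least
    min_p * (probability of the most likely token); a low temperature
    sharpens the distribution first, so sampling becomes near-greedy.
    """
    scaled = [l / temp for l in logits]
    m = max(scaled)                          # subtract max for stability
    probs = [math.exp(l - m) for l in scaled]
    z = sum(probs)
    probs = [p / z for p in probs]
    cutoff = min_p * max(probs)
    return [i for i, p in enumerate(probs) if p >= cutoff]

# At temp 0.1 even modest logit gaps leave only the top token in play;
# at temp 1.0 near-ties survive the same min_p cutoff.
print(min_p_filter([2.0, 1.0, 0.5]))
print(min_p_filter([1.0, 1.0], temp=1.0))
```

This is why the preset behaves almost deterministically: with `temp: 0.1` the top candidate usually dominates, and `min_p: 0.8` discards everything far below it.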

### Terms and Conditions of Use

The following table outlines the primary characteristics and intended uses of my YiSM-34B-0rn models:

| Model Type | Purpose | Target Users | Key Features |
| --- | --- | --- | --- |
| **Censored** | Suitable for general audiences and sensitive topics | Educational institutions, families, and individuals seeking age-appropriate content | Restricts explicit or mature material |
| **Neutral** (<u>**this one**</u>) | Balances accessibility with openness | Universities, researchers, and curious minds | Encourages exploration and intellectual exchange |
| **Uncensored** | Ideal for adults and specialized fields | Professionals, experts, and advanced scholars | Offers unfiltered access to diverse viewpoints and knowledge |

Please remember that all YiSM-34B-0rn models are released under the Apache 2.0 license, so familiarize yourself with its terms and conditions before using them.

### Quants

- [GGUF](https://huggingface.co/altomek/YiSM-34B-0rn-GGUF)
- [8bpw](https://huggingface.co/altomek/YiSM-34B-0rn-8bpw-EXL2)
- [measurements](https://huggingface.co/altomek/measurements/resolve/main/YiSM-34B-0rn_measurement.json) --> ExLlamaV2 measurements