Update README.md

README.md
---
license: apache-2.0
library_name: transformers
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- moe
- merge
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---
# Alsebay/RainyMotip-2x7B AWQ

- Model creator: [Alsebay](https://huggingface.co/Alsebay)
- Original model: [RainyMotip-2x7B](https://huggingface.co/Alsebay/RainyMotip-2x7B)

## Model Summary

What is it? A 2x7B MoE model for roleplay(?).

You will sometimes get GPT-like responses; just skip them and reroll (gacha time). Overall, I think it is good enough for roleplaying.

You may want to see this: https://huggingface.co/Alsebay/My_LLMs_Leaderboard

This model is a Mixture of Experts (MoE) made with the following models (a hypothetical merge sketch follows the list):

- udkai/Turdus
- Kquant03/Samlagast-7B-laser-bf16
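
The card does not publish the actual merge recipe. As a rough sketch only, a mergekit-moe config combining these two experts could look like the following; the base model choice, `gate_mode`, `dtype`, and `positive_prompts` are all placeholder assumptions, not the author's settings:

```yaml
# Hypothetical mergekit-moe config -- NOT the recipe used for RainyMotip-2x7B.
base_model: udkai/Turdus            # assumed base model (not stated in the card)
gate_mode: hidden                   # assumed gating mode
dtype: bfloat16                     # assumed dtype
experts:
  - source_model: udkai/Turdus
    positive_prompts:
      - "placeholder prompt steering tokens to this expert"
  - source_model: Kquant03/Samlagast-7B-laser-bf16
    positive_prompts:
      - "placeholder prompt steering tokens to this expert"
```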

If you use it, please let me know whether it is good or not. Thank you :)
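
For completeness, here is a minimal loading sketch for the AWQ quant. It assumes the quantized weights are published under a repo id like `solidrust/RainyMotip-2x7B-AWQ` (hypothetical; substitute the actual repo id), with `autoawq` and `accelerate` installed so `transformers` can load the 4-bit checkpoint:

```python
# Minimal AWQ inference sketch. Assumptions: `pip install transformers autoawq
# accelerate`, and a quantized repo id of "solidrust/RainyMotip-2x7B-AWQ"
# (hypothetical -- substitute the real one). The card sets `inference: false`,
# so run this locally on a GPU rather than via the hosted inference API.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "solidrust/RainyMotip-2x7B-AWQ"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place the quantized weights on the available GPU(s)
)

# Simple roleplay-style prompt; an AWQ checkpoint generates like any other
# transformers causal LM.
prompt = "You are a traveling bard. Describe the rainy village you just entered."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```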