TheMelonGod committed · verified
Commit f471f0c · Parent: 6c81797

Update README.md

Files changed (1): README.md (+57 −3)
README.md CHANGED
@@ -1,3 +1,57 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ language:
+ - en
+ quantized_by: TheMelonGod
+ pipeline_tag: text-generation
+ tags:
+ - quantized
+ - safetensors
+ - exllamav2
+ - qwen2
+ base_model:
+ - cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B
+ base_model_relation: quantized
+ ---
+ **Original Model by:** [Cognitive Computations](https://huggingface.co/cognitivecomputations)
+ **Original Model:** [Dolphin3.0-Qwen2.5-0.5B](https://huggingface.co/cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B)
+
+ For more information about the model, I highly recommend checking out the original model page and the creator's other work.
+
+ **ExLlamaV2 Quantizations:**
+
+ **8.0bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-8.0bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-8.0bpw)
+ **7.75bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-7.75bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-7.75bpw)
+ **7.5bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-7.5bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-7.5bpw)
+ **7.25bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-7.25bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-7.25bpw)
+ **7.0bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-7.0bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-7.0bpw)
+ **6.75bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-6.75bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-6.75bpw)
+ **6.5bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-6.5bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-6.5bpw)
+ **6.25bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-6.25bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-6.25bpw)
+ **6.0bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-6.0bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-6.0bpw)
+ **5.75bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-5.75bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-5.75bpw)
+ **5.5bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-5.5bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-5.5bpw)
+ **5.25bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-5.25bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-5.25bpw)
+ **5.0bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-5.0bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-5.0bpw)
+ **4.75bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-4.75bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-4.75bpw)
+ **4.5bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-4.5bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-4.5bpw)
+ **4.25bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-4.25bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-4.25bpw)
+ **4.0bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-4.0bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-4.0bpw)
+ **3.75bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-3.75bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-3.75bpw)
+ **3.5bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-3.5bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-3.5bpw)
+ **3.25bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-3.25bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-3.25bpw)
+ **3.0bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-3.0bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-3.0bpw)
+ **2.75bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-2.75bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-2.75bpw)
+ **2.5bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-2.5bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-2.5bpw)
+ **2.25bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-2.25bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-2.25bpw)
+ **2.0bpw**: [8hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/8hb-2.0bpw) | [6hb](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/tree/6hb-2.0bpw)
+
+ [Measurement File](https://huggingface.co/TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2/blob/main/Dolphin3.0-Qwen2.5-0.5B-measurement.json) _(The default/built-in calibration dataset was used.)_
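Each quantization listed above lives on its own branch, named after the pattern `<head-bits>hb-<bpw>bpw` (e.g. `8hb-4.0bpw`). As a minimal sketch of fetching one variant with the `huggingface_hub` library (the helper names and the local directory are illustrative, not part of this repo):

```python
def quant_revision(head_bits: int, bpw: str) -> str:
    """Build the branch name for a quant variant, following the
    "<head-bits>hb-<bpw>bpw" pattern used by the links above."""
    return f"{head_bits}hb-{bpw}bpw"

def download_quant(head_bits: int, bpw: str, local_dir: str) -> str:
    """Download a single quantization branch and return the local path.
    Requires the huggingface_hub package (imported lazily here so the
    helper above stays usable without it)."""
    from huggingface_hub import snapshot_download
    return snapshot_download(
        repo_id="TheMelonGod/Dolphin3.0-Qwen2.5-0.5B-exl2",
        revision=quant_revision(head_bits, bpw),
        local_dir=local_dir,
    )

# Example: fetch the 4.0 bpw / 8 head-bit variant
# download_quant(8, "4.0", "Dolphin3.0-Qwen2.5-0.5B-exl2")
```

The same branch name also works as the `--revision` flag of the `huggingface-cli download` command if you prefer the CLI.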
+
+ If you need a specific model quantized or a particular bits-per-weight level, please let me know. I'm happy to help.
+
+ Your feedback and suggestions are always welcome! They help me improve and make these quantizations better for everyone.
+
+ Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!