apepkuss79 committed on
Commit
07e3032
1 Parent(s): 0adf89e

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +51 -6
README.md CHANGED
@@ -102,11 +102,56 @@ quantized_by: Second State Inc.
  | [DeepSeek-Coder-V2-Instruct-Q4_K_S-00003-of-00005.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q4_K_S-00003-of-00005.gguf) | Q4_K_S | 4 | 29.9 GB| small, greater quality loss |
  | [DeepSeek-Coder-V2-Instruct-Q4_K_S-00004-of-00005.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q4_K_S-00004-of-00005.gguf) | Q4_K_S | 4 | 29.8 GB| small, greater quality loss |
  | [DeepSeek-Coder-V2-Instruct-Q4_K_S-00005-of-00005.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q4_K_S-00005-of-00005.gguf) | Q4_K_S | 4 | 14.8 GB| small, greater quality loss |
- | [DeepSeek-Coder-V2-Instruct-Q5_0.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_0.gguf) | Q5_0 | 5 | 10.8 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
- | [DeepSeek-Coder-V2-Instruct-Q5_K_M.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_M.gguf) | Q5_K_M | 5 | 11.9 GB| large, very low quality loss - recommended |
- | [DeepSeek-Coder-V2-Instruct-Q5_K_S.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_S.gguf) | Q5_K_S | 5 | 11.1 GB| large, low quality loss - recommended |
- | [DeepSeek-Coder-V2-Instruct-Q6_K.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q6_K.gguf) | Q6_K | 6 | 14.1 GB| very large, extremely low quality loss |
- | [DeepSeek-Coder-V2-Instruct-Q8_0.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0.gguf) | Q8_0 | 8 | 16.7 GB| very large, extremely low quality loss - not recommended |
- | [DeepSeek-Coder-V2-Instruct-f16.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16.gguf) | f16 | 16 | 31.4 GB| |
+ | [DeepSeek-Coder-V2-Instruct-Q5_0-00001-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_0-00001-of-00006.gguf) | Q5_0 | 5 | 29.4 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [DeepSeek-Coder-V2-Instruct-Q5_0-00002-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_0-00002-of-00006.gguf) | Q5_0 | 5 | 29.2 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [DeepSeek-Coder-V2-Instruct-Q5_0-00003-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_0-00003-of-00006.gguf) | Q5_0 | 5 | 30.0 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [DeepSeek-Coder-V2-Instruct-Q5_0-00004-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_0-00004-of-00006.gguf) | Q5_0 | 5 | 29.2 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [DeepSeek-Coder-V2-Instruct-Q5_0-00005-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_0-00005-of-00006.gguf) | Q5_0 | 5 | 29.2 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [DeepSeek-Coder-V2-Instruct-Q5_0-00006-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_0-00006-of-00006.gguf) | Q5_0 | 5 | 15.4 GB| legacy; medium, balanced quality - prefer using Q4_K_M |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_M-00001-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_M-00001-of-00006.gguf) | Q5_K_M | 5 | 29.7 GB| large, very low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_M-00002-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_M-00002-of-00006.gguf) | Q5_K_M | 5 | 29.7 GB| large, very low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_M-00003-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_M-00003-of-00006.gguf) | Q5_K_M | 5 | 29.7 GB| large, very low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_M-00004-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_M-00004-of-00006.gguf) | Q5_K_M | 5 | 29.9 GB| large, very low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_M-00005-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_M-00005-of-00006.gguf) | Q5_K_M | 5 | 29.9 GB| large, very low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_M-00006-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_M-00006-of-00006.gguf) | Q5_K_M | 5 | 18.3 GB| large, very low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_S-00001-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_S-00001-of-00006.gguf) | Q5_K_S | 5 | 29.4 GB| large, low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_S-00002-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_S-00002-of-00006.gguf) | Q5_K_S | 5 | 29.2 GB| large, low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_S-00003-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_S-00003-of-00006.gguf) | Q5_K_S | 5 | 30.0 GB| large, low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_S-00004-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_S-00004-of-00006.gguf) | Q5_K_S | 5 | 29.2 GB| large, low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_S-00005-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_S-00005-of-00006.gguf) | Q5_K_S | 5 | 29.2 GB| large, low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q5_K_S-00006-of-00006.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q5_K_S-00006-of-00006.gguf) | Q5_K_S | 5 | 15.4 GB| large, low quality loss - recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q6_K-00001-of-00007.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q6_K-00001-of-00007.gguf) | Q6_K | 6 | 29.6 GB| very large, extremely low quality loss |
+ | [DeepSeek-Coder-V2-Instruct-Q6_K-00002-of-00007.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q6_K-00002-of-00007.gguf) | Q6_K | 6 | 29.0 GB| very large, extremely low quality loss |
+ | [DeepSeek-Coder-V2-Instruct-Q6_K-00003-of-00007.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q6_K-00003-of-00007.gguf) | Q6_K | 6 | 29.5 GB| very large, extremely low quality loss |
+ | [DeepSeek-Coder-V2-Instruct-Q6_K-00004-of-00007.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q6_K-00004-of-00007.gguf) | Q6_K | 6 | 29.3 GB| very large, extremely low quality loss |
+ | [DeepSeek-Coder-V2-Instruct-Q6_K-00005-of-00007.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q6_K-00005-of-00007.gguf) | Q6_K | 6 | 29.3 GB| very large, extremely low quality loss |
+ | [DeepSeek-Coder-V2-Instruct-Q6_K-00006-of-00007.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q6_K-00006-of-00007.gguf) | Q6_K | 6 | 29.3 GB| very large, extremely low quality loss |
+ | [DeepSeek-Coder-V2-Instruct-Q6_K-00007-of-00007.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q6_K-00007-of-00007.gguf) | Q6_K | 6 | 17.3 GB| very large, extremely low quality loss |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00001-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00001-of-00009.gguf) | Q8_0 | 8 | 29.7 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00002-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00002-of-00009.gguf) | Q8_0 | 8 | 29.6 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00003-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00003-of-00009.gguf) | Q8_0 | 8 | 29.6 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00004-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00004-of-00009.gguf) | Q8_0 | 8 | 29.6 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00005-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00005-of-00009.gguf) | Q8_0 | 8 | 29.6 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00006-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00006-of-00009.gguf) | Q8_0 | 8 | 29.6 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00007-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00007-of-00009.gguf) | Q8_0 | 8 | 29.6 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00008-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00008-of-00009.gguf) | Q8_0 | 8 | 29.6 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-Q8_0-00009-of-00009.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-Q8_0-00009-of-00009.gguf) | Q8_0 | 8 | 14.0 GB| very large, extremely low quality loss - not recommended |
+ | [DeepSeek-Coder-V2-Instruct-f16-00001-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00001-of-00017.gguf) | f16 | 16 | 29.5 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00002-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00002-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00003-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00003-of-00017.gguf) | f16 | 16 | 28.9 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00004-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00004-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00005-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00005-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00006-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00006-of-00017.gguf) | f16 | 16 | 28.9 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00007-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00007-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00008-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00008-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00009-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00009-of-00017.gguf) | f16 | 16 | 28.9 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00010-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00010-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00011-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00011-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00012-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00012-of-00017.gguf) | f16 | 16 | 28.5 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00013-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00013-of-00017.gguf) | f16 | 16 | 29.7 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00014-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00014-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00015-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00015-of-00017.gguf) | f16 | 16 | 28.9 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00016-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00016-of-00017.gguf) | f16 | 16 | 29.3 GB| |
+ | [DeepSeek-Coder-V2-Instruct-f16-00017-of-00017.gguf](https://huggingface.co/second-state/DeepSeek-Coder-V2-Instruct-GGUF/blob/main/DeepSeek-Coder-V2-Instruct-f16-00017-of-00017.gguf) | f16 | 16 | 5.03 GB| |
 
  *Quantized with llama.cpp b3499*
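The shards added in this commit follow llama.cpp's `-NNNNN-of-NNNNN.gguf` split naming; every shard of a quantization must sit in the same directory before loading. As a small sketch (the helper name is ours, not part of this repo), the full shard list for a given quantization can be enumerated from that pattern:

```python
def shard_names(base: str, quant: str, n_shards: int) -> list[str]:
    """List llama.cpp-style split GGUF filenames, e.g.
    DeepSeek-Coder-V2-Instruct-Q5_0-00001-of-00006.gguf."""
    return [
        f"{base}-{quant}-{i:05d}-of-{n_shards:05d}.gguf"
        for i in range(1, n_shards + 1)
    ]

# All six Q5_0 shards listed in the table above:
for name in shard_names("DeepSeek-Coder-V2-Instruct", "Q5_0", 6):
    print(name)
```

Tools that understand split GGUFs (recent llama.cpp builds) are typically pointed at the first shard (`...-00001-of-00006.gguf`) and pick up the remaining shards from the same directory.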