---
base_model: migtissera/Tess-3-Mixtral-8x22B
language:
- en
library_name: transformers
license: apache-2.0
no_imatrix: Missing importance matrix for tensor blk.29.ffn_down_exps.weight in a
  very low-bit quantization
quantized_by: mradermacher
---
## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type:  -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/migtissera/Tess-3-Mixtral-8x22B

**llama.cpp crashes when creating some of the imatrix quants, so only the ones
where it did not crash are provided. Quality is likely reduced. When in doubt,
compare with the static quants, which should be safe.**

<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-GGUF

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
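
If you prefer to join the parts programmatically, here is a minimal Python sketch; the filenames are examples taken from the table below, so substitute the quant you actually downloaded:

```python
# Minimal sketch: join split GGUF parts back into a single file.
# Part naming follows this repo's convention; adjust for your chosen quant.
import shutil

parts = [
    "Tess-3-Mixtral-8x22B.i1-Q4_K_S.gguf.part1of2",
    "Tess-3-Mixtral-8x22B.i1-Q4_K_S.gguf.part2of2",
]

with open("Tess-3-Mixtral-8x22B.i1-Q4_K_S.gguf", "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            # Stream the bytes across so large parts never sit fully in RAM.
            shutil.copyfileobj(src, out)
```

The joined file can then be loaded by any llama.cpp-based runtime.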

## Provided Quants

(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants). A sketch for downloading the split parts follows the table.

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q2_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q2_K.gguf.part2of2) | i1-Q2_K | 52.2 | IQ3_XXS probably better |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ3_XXS.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ3_XXS.gguf.part2of2) | i1-IQ3_XXS | 55.0 | lower quality |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ3_XS.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ3_XS.gguf.part2of2) | i1-IQ3_XS | 58.3 |  |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ3_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ3_S.gguf.part2of2) | i1-IQ3_S | 61.6 | beats Q3_K* |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q3_K_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q3_K_S.gguf.part2of2) | i1-Q3_K_S | 61.6 | IQ3_XS probably better |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ3_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ3_M.gguf.part2of2) | i1-IQ3_M | 64.6 |  |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q3_K_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q3_K_M.gguf.part2of2) | i1-Q3_K_M | 67.9 | IQ3_S probably better |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q3_K_L.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q3_K_L.gguf.part2of2) | i1-Q3_K_L | 72.7 | IQ3_M probably better |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ4_XS.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-IQ4_XS.gguf.part2of2) | i1-IQ4_XS | 75.6 |  |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q4_0.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q4_0.gguf.part2of2) | i1-Q4_0 | 80.0 | fast, low quality |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q4_K_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q4_K_S.gguf.part2of2) | i1-Q4_K_S | 80.6 | optimal size/speed/quality |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q4_K_M.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q4_K_M.gguf.part2of2) | i1-Q4_K_M | 85.7 | fast, recommended |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q5_K_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q5_K_S.gguf.part2of2) | i1-Q5_K_S | 97.1 |  |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q5_K_M.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q5_K_M.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q5_K_M.gguf.part3of3) | i1-Q5_K_M | 100.1 |  |
| [PART 1](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q6_K.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q6_K.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF/resolve/main/Tess-3-Mixtral-8x22B.i1-Q6_K.gguf.part3of3) | i1-Q6_K | 115.6 | practically like static Q6_K |
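
The split parts can also be fetched with the `huggingface_hub` package. A minimal sketch, using one quant from the table above as an example:

```python
# Sketch: download both parts of a split quant from this repo,
# then join them as shown in the Usage section above.
from huggingface_hub import hf_hub_download

repo = "mradermacher/Tess-3-Mixtral-8x22B-i1-GGUF"
base = "Tess-3-Mixtral-8x22B.i1-Q4_K_S.gguf"  # example; pick any quant from the table

paths = [
    hf_hub_download(repo_id=repo, filename=f"{base}.part{i}of2")
    for i in (1, 2)
]
print(paths)  # local cache paths of the two downloaded parts
```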

Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)
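
For reference, plots like this typically report perplexity over a token sequence $x_1, \dots, x_N$ (lower is better):

$$\mathrm{PPL} = \exp\left(-\frac{1}{N}\sum_{i=1}^{N}\log p(x_i \mid x_{<i})\right)$$

A quant whose curve sits closer to the unquantized model loses less modeling quality.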

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for answers to common
questions and for requesting quants of other models.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.

<!-- end -->