Add AWQ quant links
README.md CHANGED

@@ -46,6 +46,8 @@ ExLlamaV2: https://huggingface.co/bartowski/NeuralHyperion-2.0-Mistral-7B-exl2
 
 GGUF: https://huggingface.co/bartowski/NeuralHyperion-2.0-Mistral-7B-GGUF
 
+AWQ: https://huggingface.co/solidrust/NeuralHyperion-2.0-Mistral-7B-AWQ
+
 ## How to Use
 
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
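The hunk's context lines cut off the model card's usage snippet right after the `transformers` import. For readers of this diff, here is a minimal sketch of how such a snippet typically continues; the model id is a placeholder (the base repo name is not shown in this diff), and everything beyond the import line is an assumption, not the card's actual code:

```python
def build_messages(user_message):
    """Build a chat-format message list accepted by apply_chat_template."""
    return [{"role": "user", "content": user_message}]


if __name__ == "__main__":
    # transformers is imported here so the helper above works without it installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Hypothetical model id -- substitute the actual base repo name.
    MODEL_ID = "NeuralHyperion-2.0-Mistral-7B"

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Format the prompt with the tokenizer's built-in chat template
    # (Mistral-style models ship one), then generate a reply.
    prompt = tokenizer.apply_chat_template(
        build_messages("Explain AWQ quantization in one paragraph."),
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The quantized repos linked above (exl2, GGUF, AWQ) are loaded through their own runtimes (ExLlamaV2, llama.cpp, AutoAWQ) rather than this plain-`transformers` path.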