legraphista committed
Commit • f1a898a
1 Parent(s): f1032d5
Upload README.md with huggingface_hub
README.md
CHANGED
@@ -82,8 +82,8 @@ Link: [here](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/im
| [xLAM-7b-r.Q5_K_S.gguf](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/xLAM-7b-r.Q5_K_S.gguf) | Q5_K_S | 5.00GB | ✅ Available | ⚪ Static | 📦 No |
| [xLAM-7b-r.Q4_K.gguf](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/xLAM-7b-r.Q4_K.gguf) | Q4_K | 4.37GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [xLAM-7b-r.Q4_K_S.gguf](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/xLAM-7b-r.Q4_K_S.gguf) | Q4_K_S | 4.14GB | ✅ Available | 🟢 IMatrix | 📦 No |
- | xLAM-7b-r.IQ4_NL | IQ4_NL |
- | xLAM-7b-r.IQ4_XS | IQ4_XS |
+ | [xLAM-7b-r.IQ4_NL.gguf](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/xLAM-7b-r.IQ4_NL.gguf) | IQ4_NL | 4.13GB | ✅ Available | 🟢 IMatrix | 📦 No |
+ | [xLAM-7b-r.IQ4_XS.gguf](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/xLAM-7b-r.IQ4_XS.gguf) | IQ4_XS | 3.91GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [xLAM-7b-r.Q3_K.gguf](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/xLAM-7b-r.Q3_K.gguf) | Q3_K | 3.52GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [xLAM-7b-r.Q3_K_L.gguf](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/xLAM-7b-r.Q3_K_L.gguf) | Q3_K_L | 3.82GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [xLAM-7b-r.Q3_K_S.gguf](https://huggingface.co/legraphista/xLAM-7b-r-IMat-GGUF/blob/main/xLAM-7b-r.Q3_K_S.gguf) | Q3_K_S | 3.16GB | ✅ Available | 🟢 IMatrix | 📦 No |
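
The rows above list the GGUF quants now available in this repo, including the newly added IQ4_NL and IQ4_XS files. As a minimal sketch (assuming `huggingface_hub` is installed and the filenames match the table; the chosen quant is just an example), one of the listed files could be fetched like this:

```python
from huggingface_hub import hf_hub_download

# Repo ID and filename are taken from the table above; swap in whichever quant you want.
path = hf_hub_download(
    repo_id="legraphista/xLAM-7b-r-IMat-GGUF",
    filename="xLAM-7b-r.Q4_K.gguf",
)
print(path)  # local path to the downloaded GGUF file
```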