Update README.md
README.md CHANGED
@@ -207,7 +207,7 @@ Please remember that all YiSM-34B-0rn models operate under the apache-2.0 licens
 - [6.5bpw](https://huggingface.co/altomek/YiSM-34B-0rn-6.5bpw-EXL2)
 - [4.65bpw](https://huggingface.co/altomek/YiSM-34B-0rn-4.65bpw-EXL2)
 - [4bpw](https://huggingface.co/altomek/YiSM-34B-0rn-4bpw-EXL2)
-- [3.2bpw](https://huggingface.co/altomek/YiSM-34B-0rn-3.2bpw-EXL2)
+- [3.2bpw](https://huggingface.co/altomek/YiSM-34B-0rn-3.2bpw-EXL2) -> Fits in 16 GB of VRAM, but not recommended; performance degrades significantly at lower quants.
 - [measurements](https://huggingface.co/altomek/measurements/resolve/main/YiSM-34B-0rn_measurement.json) --> ExLlamav2 measurments
 
 ### [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard)
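The "fits in 16 GB of VRAM" claim for the 3.2bpw quant can be sanity-checked with a rough back-of-the-envelope estimate. This is a sketch, not an exact sizing: it assumes weight memory is approximately parameters × bits-per-weight / 8, and ignores KV cache, activations, and runtime overhead, which also consume VRAM.

```python
def weights_gib(params_billion: float, bpw: float) -> float:
    """Approximate weight memory in GiB for a quantized model:
    (params in billions) * (bits per weight) / (8 bits per byte).
    Ignores KV cache and runtime overhead (assumption)."""
    total_bytes = params_billion * 1e9 * bpw / 8
    return total_bytes / 2**30

# Rough weight sizes for a 34B model at each listed EXL2 quant level.
for bpw in (6.5, 4.65, 4.0, 3.2):
    print(f"{bpw:>4} bpw -> ~{weights_gib(34, bpw):.1f} GiB of weights")
```

At 3.2 bpw the weights alone come to roughly 12.7 GiB, leaving a few GiB of a 16 GiB card for context cache and overhead, which is consistent with the note above; the higher quants clearly need more VRAM.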