---
license: cc-by-4.0
base_model: TheDrummer/Star-Command-R-32B-v1
pipeline_tag: text-generation
quantized_by: Orion-zhen
---

# Star-Command-R
These exl2 models are quantized using my [pixiv-novel](https://huggingface.co/datasets/Orion-zhen/tagged-pixiv-novel) dataset for calibration.
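For reference, a calibration file could be built from that dataset roughly as sketched below; the split name, output file name, and the assumption that exllamav2's `convert.py` accepts a parquet of text rows via `-c` are mine, not details confirmed by this card.

```python
# Minimal sketch: build a calibration parquet from the dataset above.
# Assumptions: the dataset exposes a "train" split, and exllamav2's
# convert.py can take a parquet of text rows as its calibration data (-c).
from datasets import load_dataset

ds = load_dataset("Orion-zhen/tagged-pixiv-novel", split="train")
ds.to_parquet("pixiv-novel-calibration.parquet")
```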
- [8.0bpw](https://huggingface.co/Orion-zhen/Star-Command-R-32B-v1-exl2/tree/8.0)
- [6.5bpw](https://huggingface.co/Orion-zhen/Star-Command-R-32B-v1-exl2/tree/6.5)
- [5.0bpw](https://huggingface.co/Orion-zhen/Star-Command-R-32B-v1-exl2/tree/5.0)
- [4.65bpw](https://huggingface.co/Orion-zhen/Star-Command-R-32B-v1-exl2/tree/4.65)
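
To try one of the branches above, something like the following should work. This is a minimal sketch assuming a recent exllamav2 release with the dynamic generator; the chosen 6.5 bpw branch, prompt, and token budget are placeholders.

```python
# Minimal sketch: fetch a quant branch and run it with exllamav2.
# Revision, prompt, and generation length below are placeholders.
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

# Each bitrate lives on its own branch, so pass it as the revision.
model_dir = snapshot_download(
    "Orion-zhen/Star-Command-R-32B-v1-exl2",
    revision="6.5",
)

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # allocated while the model loads
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="Write the opening line of a space opera.", max_new_tokens=64))
```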
---
My proxy broke, so I'm no longer able to upload the remaining quants 🥲

- [4.4bpw](https://huggingface.co/Orion-zhen/Star-Command-R-32B-v1-exl2/tree/4.4)
- [4.2bpw](https://huggingface.co/Orion-zhen/Star-Command-R-32B-v1-exl2/tree/4.2)
- [4.0bpw](https://huggingface.co/Orion-zhen/Star-Command-R-32B-v1-exl2/tree/4.0)