Quantized models (4bit) request

#4
by terminator33 - opened

Can we please get a 4-bit quantized model!

llama.cpp does not support this model yet; I have an issue open with them on their GitHub. I will quantize it as soon as that's resolved, and I assume others will as well.

Cohere Labs org

We released a 4-bit quantized model here today: https://huggingface.co/CohereForAI/c4ai-command-r-plus-4bit. Enjoy!

sarahooker changed discussion status to closed