GPTQ 4-bit, 128-groupsize CUDA quantization of CalderaAI's 13B Ouroboros.
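To illustrate what "4-bit, 128 groupsize" means, here is a toy NumPy sketch of per-group 4-bit quantization: every contiguous block of 128 weights shares one scale, and each weight is rounded to one of 16 signed levels. This is only a conceptual sketch — the real GPTQ algorithm additionally compensates rounding error using second-order (Hessian) information, which is not shown here.

```python
import numpy as np

def quantize_4bit_groups(weights, group_size=128):
    """Toy sketch of 4-bit group quantization (NOT the full GPTQ
    algorithm): each group of `group_size` weights shares one scale,
    and values are rounded to the signed 4-bit range [-8, 7]."""
    w = weights.reshape(-1, group_size)
    # Symmetric per-group scale mapping the max magnitude to 7.
    scales = np.abs(w).max(axis=1, keepdims=True) / 7.0
    scales[scales == 0] = 1.0  # avoid division by zero for all-zero groups
    q = np.clip(np.round(w / scales), -8, 7).astype(np.int8)
    # Dequantize to see the reconstruction error the model actually runs with.
    w_hat = (q * scales).reshape(weights.shape)
    return q, scales, w_hat

rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)
q, scales, w_hat = quantize_4bit_groups(w)
# Rounding error per weight is at most half of that group's scale.
print(q.min(), q.max(), float(np.abs(w - w_hat).max()))
```

At this group size, each 128-weight block carries one extra scale value on top of the 4-bit codes, which is the usual quality/size trade-off behind the "128g" naming convention in GPTQ model repos.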