---
license: unknown
---

This is a 2.4-bit EXL2 quantization of [Aurelian v0.5 70B 32K](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K-fp16), an interim checkpoint before v1.0. See that page for more details. This quantization fits on a single 24 GB GPU using Exllamav2 with the 8-bit cache at 10K context. It uses the newer experimental quantization method from turboderp.
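
Below is a minimal loading sketch with the Exllamav2 Python API, showing the 8-bit cache and a ~10K context limit mentioned above. The local model path is hypothetical, and the exact context length and sampling settings are assumptions you may want to adjust for your hardware.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache_8bit, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/aurelian-v0.5-70b-2.4bpw-exl2"  # hypothetical local path to this quant
config.prepare()
config.max_seq_len = 10240  # ~10K context to stay within a 24 GB card (assumption)

model = ExLlamaV2(config)
cache = ExLlamaV2Cache_8bit(model, lazy=True)  # 8-bit KV cache
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # example sampling value, not a recommendation

output = generator.generate_simple("Once upon a time,", settings, num_tokens=128)
print(output)
```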