Qwen2.5-0.5B

An ExLlamaV2 8 bpw (bits per weight) quantization of https://huggingface.co/Qwen/Qwen2.5-0.5B
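The "8 bpw" in the repo name is ExLlamaV2's bits-per-weight setting. As a rough sketch of what that implies for download size (assuming ~0.5e9 weights and ignoring embeddings, quantization metadata, and per-layer overhead):

```python
def estimate_quant_size_gb(n_params: float, bpw: float) -> float:
    """Rough on-disk size of the quantized weights:
    params * bits-per-weight / 8 bits-per-byte, in GB."""
    return n_params * bpw / 8 / 1e9

# Approximate weight size for an 8 bpw quant of a 0.5B-parameter model.
print(estimate_quant_size_gb(0.5e9, 8.0))  # -> 0.5
```

Actual file size will be somewhat larger because of the stored quantization scales and any tensors kept at higher precision.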


Model tree for altomek/Qwen2.5-0.5B-8bpw-EXL2

Base model: Qwen/Qwen2.5-0.5B
Quantized (58): this model