VRAM
#7
by
DKRacingFan
- opened
How much VRAM is needed to run this model?
For a 4-bit quant, it needs about 40 GB.
I'm currently running it on a 3090 with a Q5_K_M quant; it works, although it's slow.
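The 40 GB figure is consistent with weights-only arithmetic for a model in the roughly 70B-parameter range. A minimal back-of-envelope sketch (the function, the ~70B parameter count, and the flat 2 GB overhead are illustrative assumptions, not measured values):

```python
def estimate_vram_gb(num_params_billion, bits_per_weight, overhead_gb=2.0):
    """Rough VRAM estimate in GB: weight storage only, plus a flat
    allowance for KV cache and activations (very approximate)."""
    weight_gb = num_params_billion * bits_per_weight / 8  # billions of bytes -> GB
    return weight_gb + overhead_gb

# e.g. a ~70B-parameter model at ~4.5 effective bits per weight
# (Q4_K_M-class quants average slightly above 4 bits):
print(round(estimate_vram_gb(70, 4.5), 1))  # → 41.4
```

Actual usage varies with context length and backend, so treat this as a lower bound rather than a guarantee.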