Is there a way to run on the GPU of RTX3060 with VRAM12GB? Will a lightweight model be released

#1
by michaelj - opened

Is there a way to run on the GPU of RTX3060 with VRAM12GB? Will a lightweight model be released

Rhymes.AI org

In the inference code, `device_map` is set to `auto`, so CPU offloading will be initiated automatically when VRAM is insufficient.
Running on two 3090s works, but I'm not sure about a 3060; maybe you can help us test it and let us know?

Rhymes.AI org

Is there a way to run on the GPU of RTX3060 with VRAM12GB? Will a lightweight model be released

Check this: it seems the author tried loading it on an RTX 4060 Ti 16 GB.
https://huggingface.co/rhymes-ai/Aria-sequential_mlp/discussions/1#6713372f0fac3235c596f388
