Is there a way to run on the GPU of RTX3060 with VRAM12GB? Will a lightweight model be released
#1 by michaelj - opened
In the inference code, device_map is set to "auto", so CPU offloading will kick in when VRAM is insufficient.
It runs fine on two RTX 3090s, but I'm not sure about a 3060. Maybe you can help us test it and let us know?
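For a rough sense of why offloading kicks in on 12 GB: the weight footprint alone is roughly parameter count times bytes per parameter, before counting the KV cache and activations. A minimal sketch (the ~25B parameter figure for Aria and bfloat16 weights are assumptions; check the model card):

```python
def weight_footprint_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just for the weights, in GiB.

    Excludes KV cache, activations, and framework overhead, so the
    real requirement is higher.
    """
    return n_params * bytes_per_param / 1024**3

# Assuming ~25B parameters stored in bfloat16 (2 bytes each):
print(round(weight_footprint_gb(25e9), 1))  # ~46.6 GiB, far above 12 GB
```

Since the weights alone exceed a single 3060's VRAM several times over, device_map="auto" places the overflow layers on CPU RAM, which works but is slow.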
Check this: it seems the author tried loading it on an RTX 4060 Ti 16 GB.
https://huggingface.co/rhymes-ai/Aria-sequential_mlp/discussions/1#6713372f0fac3235c596f388