GPU memory

by gebaltso

How much GPU memory is needed to run SD 3.5 Large? I constantly get torch.OutOfMemoryError: CUDA out of memory, even though I am using an NVIDIA 4090 with 24 GB.

Edit: I used
pipe.enable_model_cpu_offload()

instead of
pipe = pipe.to("cuda")

and it ran properly.
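
For reference, here is a minimal sketch of what that working setup looks like, assuming the diffusers StableDiffusion3Pipeline and the stabilityai/stable-diffusion-3.5-large checkpoint; the dtype, prompt, and sampler settings are just examples:

```python
import torch
from diffusers import StableDiffusion3Pipeline

# Load the pipeline in a reduced-precision dtype to lower memory use.
pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3.5-large",
    torch_dtype=torch.bfloat16,
)

# Instead of pipe.to("cuda"), keep components on the CPU and move each one
# to the GPU only while it is needed, which avoids the 24 GB OOM.
pipe.enable_model_cpu_offload()

image = pipe(
    "a photo of an astronaut riding a horse",
    num_inference_steps=28,
    guidance_scale=4.5,
).images[0]
image.save("output.png")
```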
