Resource Requirements for Running DeepSeek v3 Locally
Hi, I’m interested in running the DeepSeek v3 model locally, and I’d like to know the resource requirements (CPU, GPU, RAM, and storage) for good performance. I currently have access to an NVIDIA H100 GPU with 80GB of VRAM, and I’m curious whether this setup is sufficient, or overkill, for running the model efficiently.
Any suggestions are welcome!
Thanks in advance for your support!
A GPU is just a compute accelerator, nothing more. LLMs need a huge amount of memory first; speed comes second. So the problem is the 80GB. I've loaded a Q5_K Medium GGUF quant of it on a literally 10-year-old Gigabyte motherboard with 12 RAM slots, and it takes 502GB of RAM to load (and that's GGUF, already reduced from the original, which is even more demanding; in my experience with GGUFs, required RAM ≈ file size on disk + 10%). If we had terabyte-class GPUs today, this would be no problem.
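The rule of thumb above can be written as a tiny sketch. The file size used here is an illustrative assumption, not a measured number for any specific DeepSeek v3 quant:

```python
# Rough RAM estimate for loading a GGUF model, per the rule of thumb above:
# required RAM ≈ model file size on disk + ~10% overhead (context buffers, etc.).

def estimate_ram_gb(gguf_size_gb: float, overhead: float = 0.10) -> float:
    """Return the estimated RAM (GB) needed to load a GGUF of the given size."""
    return gguf_size_gb * (1 + overhead)

# Example with a hypothetical ~450 GB GGUF file:
size_gb = 450.0
print(f"Estimated RAM needed: {estimate_ram_gb(size_gb):.0f} GB")  # prints "Estimated RAM needed: 495 GB"
```

This is only a first-order estimate; actual usage also grows with context length and KV-cache settings.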
Your GPU is great for video-generation models and for training or merging LLMs.