Sharded GGUF version of openbmb/MiniCPM3-4B-GGUF.

Format: GGUF
Model size: 4.07B params
Architecture: minicpm3

Model tree for Felladrin/gguf-Q4_K_M-MiniCPM3-4B: quantized (2) → this model