Inference with vLLM not working

#2 opened by llamameta

OSError: openbmb/MiniCPM-V-2_6-gguf does not appear to have a file named config.json. Checkout 'https://huggingface.co/openbmb/MiniCPM-V-2_6-gguf/tree/main' for available files.
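
For context (not stated in the thread itself): this error occurs because passing a repo id to vLLM goes through the Transformers loading path, which requires a config.json, while a GGUF repo like openbmb/MiniCPM-V-2_6-gguf contains only .gguf weight files. A minimal sketch of one possible workaround, assuming the original safetensors repo is openbmb/MiniCPM-V-2_6 and that your vLLM build supports MiniCPM-V:

```python
from vllm import LLM, SamplingParams

# Point vLLM at the original (non-GGUF) repo, which does contain config.json.
# "openbmb/MiniCPM-V-2_6" is an assumption -- verify the repo id on the Hub.
llm = LLM(
    model="openbmb/MiniCPM-V-2_6",
    trust_remote_code=True,  # MiniCPM-V ships custom modeling code
    max_model_len=4096,
)

out = llm.generate(
    ["Describe the MiniCPM-V architecture in one sentence."],
    SamplingParams(max_tokens=64),
)
print(out[0].outputs[0].text)
```

If the GGUF weights specifically are needed, vLLM's experimental GGUF support takes a local .gguf file path as `model` together with a `tokenizer` repo id, but it targets text-only models and may not load a vision-language GGUF; llama.cpp remains the intended runtime for these files.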

llamameta changed discussion title from "Inference with vLLM" to "Inference with vLLM not working"
