Can you provide GGUF model usable with Ollama locally

#1
by ryg81 - opened

Can you provide GGUF model usable with Ollama locally?

This model's architecture isn't supported by Ollama or llama.cpp, so a GGUF conversion wouldn't be usable.
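For context, the usual path to a GGUF that works with Ollama goes through llama.cpp's converter. If llama.cpp ever gains support for this architecture, the workflow would look roughly like the sketch below (the model directory, output name, and quantization type are placeholders, not something this repo provides):

```shell
# Hypothetical workflow, assuming llama.cpp supports the architecture:
git clone https://github.com/ggerganov/llama.cpp
pip install -r llama.cpp/requirements.txt

# Convert the Hugging Face checkout (./model-dir is a placeholder path)
python llama.cpp/convert_hf_to_gguf.py ./model-dir \
    --outfile model.gguf --outtype q8_0

# Minimal Ollama Modelfile pointing at the converted file
cat > Modelfile <<'EOF'
FROM ./model.gguf
EOF

ollama create my-model -f Modelfile
ollama run my-model
```

Until llama.cpp adds the architecture, the conversion step will fail with an "unsupported model architecture" style error, which is why no GGUF can be published yet.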
