ggml_bakllava-1
This repo contains GGUF files for running BakLLaVA-1 inference with llama.cpp end-to-end, without any extra dependencies.
Note: the mmproj-model-f16.gguf file structure is experimental and may change. Always use the latest code in llama.cpp.
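A minimal usage sketch is shown below. The binary name, flag set, and model filename are assumptions that depend on your llama.cpp build and on which quantization you downloaded from this repo; only mmproj-model-f16.gguf is named in the note above.

```sh
# Sketch: run BakLLaVA-1 through llama.cpp's LLaVA example binary.
# Assumptions:
#   - Recent llama.cpp builds ship the binary as `llava-cli`; older builds
#     call it `llava`. Build it from the latest llama.cpp source.
#   - `ggml-model-q4_k.gguf` stands in for whichever model file you
#     downloaded from this repo; `photo.jpg` is your own input image.
./llava-cli \
  -m ./ggml-model-q4_k.gguf \
  --mmproj ./mmproj-model-f16.gguf \
  --image ./photo.jpg \
  -p "Describe the image in detail." \
  --temp 0.1
```

If the projector file and your llama.cpp build disagree, rebuilding from the latest source (as the note above advises) is usually the fix.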