# MiniCPM-V-2_6-GGUF
## Original Model

[openbmb/MiniCPM-V-2_6](https://huggingface.co/openbmb/MiniCPM-V-2_6)
## Run with GaiaNet

- Prompt template:

  `prompt template: minicpmv`

- Context size:

  `chat_ctx_size: 128000`

- Run with GaiaNet:

  - Quick start: https://docs.gaianet.ai/node-guide/quick-start
  - Customize your node: https://docs.gaianet.ai/node-guide/customize
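As a minimal sketch of the steps above, the commands below configure a GaiaNet node to serve this model with the `minicpmv` prompt template and the 128k context size. The `--chat-url` value (including the quantized filename) is a hypothetical example, and the flag names follow the customize guide linked above; verify both against the docs and your installed CLI version.

```shell
# Point the node at a GGUF file from this repo (filename is an assumption;
# pick the actual quantization you downloaded), then set the prompt template
# and context size listed in this model card.
gaianet config \
  --chat-url "https://huggingface.co/gaianet/MiniCPM-V-2_6-GGUF/resolve/main/MiniCPM-V-2_6-Q5_K_M.gguf" \
  --chat-ctx-size 128000 \
  --prompt-template minicpmv

# Re-initialize the node with the new config, then start it.
gaianet init
gaianet start
```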
Quantized with llama.cpp b4120
Available quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit, and 16-bit.
## Model tree for gaianet/MiniCPM-V-2_6-GGUF

Base model: [openbmb/MiniCPM-V-2_6](https://huggingface.co/openbmb/MiniCPM-V-2_6)