license: apache-2.0
language:
- fi
- en

Original model: [Poro-34B-chat](https://huggingface.co/LumiOpen/Poro-34B-chat)
### Description | |
These are GGUF-format model files quantized using [llama.cpp](https://github.com/ggerganov/llama.cpp). Q4_K_M and Q5_K_M quantizations are available. A usage sketch follows below.
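
As a minimal sketch, the GGUF files can be loaded with `llama-cpp-python` (a Python binding for llama.cpp). The file name and generation parameters below are assumptions; substitute the actual quantized file you download from this repository.

```python
# Minimal sketch: running a Q4_K_M GGUF file with llama-cpp-python.
# Assumes the quantized file has been downloaded locally; the file name
# below is illustrative, not the exact name in this repository.
from llama_cpp import Llama

llm = Llama(
    model_path="poro-34b-chat.Q4_K_M.gguf",  # assumed local path/filename
    n_ctx=2048,        # context length; adjust to your use case
    n_gpu_layers=-1,   # offload all layers if built with GPU support
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Kerro lyhyesti Suomesta."}],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```

`create_chat_completion` applies the chat template stored in the GGUF metadata, so the prompt formatting expected by Poro-34B-chat should be handled automatically; if it is not, consult the original model card for the correct template.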