akx/Poro-34B-gguf

Likes: 3
Tags: GGUF, Inference Endpoints
License: apache-2.0
Branch: main
Contributors (2): akx, valstu
History: 8 commits
Latest commit: Update README.md (#2), 739b6e1 (verified), 10 months ago
File                  Size           Last commit                                        Updated
.gitattributes        1.69 kB        Upload ggml-model-Q5_K.gguf with huggingface_hub   about 1 year ago
README.md             495 Bytes      Update README.md (#2)                              10 months ago
ggml-model-Q3_K.gguf  18.6 GB (LFS)  1000B                                              10 months ago
ggml-model-Q4_K.gguf  22.4 GB (LFS)  1000B                                              10 months ago
ggml-model-Q5_K.gguf  26.1 GB (LFS)  1000B                                              10 months ago
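The GGUF files above can be fetched programmatically with the `huggingface_hub` library. A minimal sketch, assuming the library is installed; it downloads the small README.md here rather than a multi-gigabyte weight file, but the same call works for any filename in the listing:

```python
from huggingface_hub import hf_hub_download

# Download one file from the repo and return its local cache path.
# README.md is 495 bytes; the GGUF weights above are 18-26 GB each,
# so swap in e.g. filename="ggml-model-Q5_K.gguf" only deliberately.
path = hf_hub_download(
    repo_id="akx/Poro-34B-gguf",
    filename="README.md",
)
print(path)
```

The downloaded `.gguf` file can then be loaded directly by GGUF-aware runtimes such as llama.cpp.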