Hugging Face
SpongeEngine/patricide-12B-Unslop-Mell-i1-GGUF
Tags: GGUF · English · SpongeQuant · i1-GGUF · Inference Endpoints · imatrix
License: MIT
Branch: main · 1 contributor · History: 3 commits
Latest commit: "Update README.md" by dclipca (9b70e99, verified, about 15 hours ago)
All files except README.md were added 7 days ago in an "Upload folder using huggingface_hub" commit; README.md was last changed about 15 hours ago ("Update README.md"). The imatrix file and all .gguf quantizations are stored via Git LFS.

| File | Size |
|------|------|
| .gitattributes | 3.05 kB |
| README.md | 2.6 kB |
| patricide-12B-Unslop-Mell.imatrix.dat | 7.05 MB |
| patricide-12b-unslop-mell-i1-IQ2_M.gguf | 4.44 GB |
| patricide-12b-unslop-mell-i1-IQ2_S.gguf | 4.14 GB |
| patricide-12b-unslop-mell-i1-IQ2_XS.gguf | 3.92 GB |
| patricide-12b-unslop-mell-i1-IQ2_XXS.gguf | 3.59 GB |
| patricide-12b-unslop-mell-i1-IQ3_M.gguf | 5.72 GB |
| patricide-12b-unslop-mell-i1-IQ3_S.gguf | 5.56 GB |
| patricide-12b-unslop-mell-i1-IQ3_XS.gguf | 5.31 GB |
| patricide-12b-unslop-mell-i1-IQ3_XXS.gguf | 4.95 GB |
| patricide-12b-unslop-mell-i1-IQ4_NL.gguf | 7.1 GB |
| patricide-12b-unslop-mell-i1-IQ4_XS.gguf | 6.74 GB |
| patricide-12b-unslop-mell-i1-Q2_K.gguf | 4.79 GB |
| patricide-12b-unslop-mell-i1-Q3_K_L.gguf | 6.56 GB |
| patricide-12b-unslop-mell-i1-Q3_K_M.gguf | 6.08 GB |
| patricide-12b-unslop-mell-i1-Q3_K_S.gguf | 5.53 GB |
| patricide-12b-unslop-mell-i1-Q4_K_M.gguf | 7.48 GB |
| patricide-12b-unslop-mell-i1-Q4_K_S.gguf | 7.12 GB |
| patricide-12b-unslop-mell-i1-Q5_K_M.gguf | 8.73 GB |
| patricide-12b-unslop-mell-i1-Q5_K_S.gguf | 8.52 GB |
| patricide-12b-unslop-mell-i1-Q6_K.gguf | 10.1 GB |
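Any file in this listing can be fetched directly over HTTPS. A minimal sketch, assuming the Hub's standard `resolve` URL pattern (`https://huggingface.co/{repo_id}/resolve/{revision}/{filename}`) still applies; in practice `huggingface_hub.hf_hub_download(repo_id, filename)` is the usual programmatic route, but the snippet below only constructs the URL:

```python
# Build the direct download URL for a file in this Hugging Face model repo.
# Assumed URL pattern: https://huggingface.co/{repo_id}/resolve/{revision}/{filename}
REPO_ID = "SpongeEngine/patricide-12B-Unslop-Mell-i1-GGUF"

def resolve_url(filename: str, revision: str = "main") -> str:
    """Return the raw-file URL the Hub serves for `filename` at `revision`."""
    return f"https://huggingface.co/{REPO_ID}/resolve/{revision}/{filename}"

# Example: the ~7.48 GB Q4_K_M quantization from the listing above.
url = resolve_url("patricide-12b-unslop-mell-i1-Q4_K_M.gguf")
print(url)
```

Picking a quantization is a size/quality trade-off: the IQ2 files fit in far less memory, while Q5/Q6 variants track the original weights more closely.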