LoneStriker/wolfram_miqu-1-120b-GGUF
Tags: Transformers · GGUF · 5 languages · mergekit · Merge · Inference Endpoints · conversational
Branch: main · 2 contributors · History: 3 commits
Latest commit: Update README.md (#1) by wolfram / LoneStriker, 3226711 (verified), 9 months ago
All files are marked Safe by the Hub's file scanner.

File                                      Size     LFS  Commit message                         Last modified
.gitattributes                            1.12 kB       Upload folder using huggingface_hub    9 months ago
README.md                                 3.92 kB       Update README.md (#1)                  9 months ago
huggingface-metadata.txt                  2.74 kB       Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q2_K.gguf             44.2 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q3_K_L.gguf-part-a    31.5 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q3_K_L.gguf-part-b    31.5 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q3_K_M.gguf-part-a    28.9 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q3_K_M.gguf-part-b    28.9 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q3_K_S.gguf-part-a    25.9 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q3_K_S.gguf-part-b    25.9 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q4_K_M.gguf-part-a    36.1 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q4_K_M.gguf-part-b    36.1 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q4_K_S.gguf-part-a    34.1 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q4_K_S.gguf-part-b    34.1 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q5_K_M.gguf-part-a    42.5 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q5_K_M.gguf-part-b    42.5 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q5_K_S.gguf-part-a    41.4 GB  LFS  Upload folder using huggingface_hub    9 months ago
wolfram_miqu-1-120b-Q5_K_S.gguf-part-b    41.4 GB  LFS  Upload folder using huggingface_hub    9 months ago
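
Every quant except Q2_K is stored as a -part-a / -part-b pair, presumably to stay under the Hub's per-file size limit. Below is a minimal sketch of pulling one quant with huggingface_hub and rejoining the parts; it assumes the parts are plain byte-wise splits of a single GGUF file (the naming convention suggests this, but check the model card). The repo id and file names are taken from the listing above.

```python
from huggingface_hub import hf_hub_download

repo_id = "LoneStriker/wolfram_miqu-1-120b-GGUF"
quant = "wolfram_miqu-1-120b-Q4_K_M.gguf"  # pick the variant you want

# Download both parts; huggingface_hub caches them locally and returns the paths.
part_paths = [
    hf_hub_download(repo_id=repo_id, filename=f"{quant}-part-{suffix}")
    for suffix in ("a", "b")
]

# Rejoin the parts in order into a single GGUF file (assumed byte-wise split).
with open(quant, "wb") as out:
    for path in part_paths:
        with open(path, "rb") as part:
            while chunk := part.read(64 * 1024 * 1024):  # copy in 64 MiB chunks
                out.write(chunk)

print(f"wrote {quant}")
```

For the single-file Q2_K quant, hf_hub_download alone is enough. On the command line, the same rejoin can be done with `cat <file>-part-a <file>-part-b > <file>.gguf`, again assuming the parts really are raw splits; the resulting file can then be loaded with llama.cpp or any other GGUF-compatible runtime.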