LoneStriker / wolfram_miqu-1-120b-GGUF

Tags: Transformers · GGUF · 5 languages · mergekit · Merge · Inference Endpoints · conversational
History: 2 commits, 2 contributors
Latest commit: 09ec7f1 (verified) by LoneStriker, "Upload folder using huggingface_hub", 9 months ago
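The repository was uploaded with huggingface_hub, and the same library can fetch individual files from it. Below is a minimal download sketch; the repo_id comes from the page header and the filename from the listing that follows, while the choice of hf_hub_download (rather than, say, snapshot_download) is simply one reasonable option, not something this page prescribes.

```python
# A minimal sketch using huggingface_hub (the library named in the commit message).
# hf_hub_download caches the file locally and returns its path on disk.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="LoneStriker/wolfram_miqu-1-120b-GGUF",
    filename="wolfram_miqu-1-120b-Q2_K.gguf",  # the only quantization stored as a single file
)
print(gguf_path)  # path inside the local Hugging Face cache
```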
All files are marked Safe by the Hub scanner and were uploaded in a single commit ("Upload folder using huggingface_hub", 9 months ago).

File                                        Size     Storage
.gitattributes                              1.12 kB
README.md                                   3.34 kB
huggingface-metadata.txt                    2.74 kB
wolfram_miqu-1-120b-Q2_K.gguf               44.2 GB  LFS
wolfram_miqu-1-120b-Q3_K_L.gguf-part-a      31.5 GB  LFS
wolfram_miqu-1-120b-Q3_K_L.gguf-part-b      31.5 GB  LFS
wolfram_miqu-1-120b-Q3_K_M.gguf-part-a      28.9 GB  LFS
wolfram_miqu-1-120b-Q3_K_M.gguf-part-b      28.9 GB  LFS
wolfram_miqu-1-120b-Q3_K_S.gguf-part-a      25.9 GB  LFS
wolfram_miqu-1-120b-Q3_K_S.gguf-part-b      25.9 GB  LFS
wolfram_miqu-1-120b-Q4_K_M.gguf-part-a      36.1 GB  LFS
wolfram_miqu-1-120b-Q4_K_M.gguf-part-b      36.1 GB  LFS
wolfram_miqu-1-120b-Q4_K_S.gguf-part-a      34.1 GB  LFS
wolfram_miqu-1-120b-Q4_K_S.gguf-part-b      34.1 GB  LFS
wolfram_miqu-1-120b-Q5_K_M.gguf-part-a      42.5 GB  LFS
wolfram_miqu-1-120b-Q5_K_M.gguf-part-b      42.5 GB  LFS
wolfram_miqu-1-120b-Q5_K_S.gguf-part-a      41.4 GB  LFS
wolfram_miqu-1-120b-Q5_K_S.gguf-part-b      41.4 GB  LFS
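Every quantization except Q2_K is split into -part-a and -part-b files. The sketch below downloads both halves and joins them into a single .gguf; it assumes the parts are plain binary splits to be concatenated in alphabetical order (the usual convention for -part-a/-part-b uploads), which should be confirmed against this repository's README.

```python
# A sketch, assuming the -part-a/-part-b files are plain binary splits that can be
# concatenated in order (confirm against the repository README).
import shutil
from pathlib import Path

from huggingface_hub import hf_hub_download

REPO_ID = "LoneStriker/wolfram_miqu-1-120b-GGUF"
QUANT = "wolfram_miqu-1-120b-Q4_K_M.gguf"  # any split quantization from the table above

# Download both halves; huggingface_hub caches them and returns local paths.
parts = [
    hf_hub_download(repo_id=REPO_ID, filename=f"{QUANT}-part-{suffix}")
    for suffix in ("a", "b")
]

# Concatenate the halves in order into a single GGUF file in the working directory.
out_path = Path(QUANT)
with out_path.open("wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)

print(f"Wrote {out_path} ({out_path.stat().st_size / 1e9:.1f} GB)")
```

The reassembled file can then be passed to a GGUF-aware runtime such as llama.cpp.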