LiteLLMs / Mixtral-8x22B-v0.1-GGUF
Tags: GGUF · 5 languages · Mixture of Experts · Eval Results · Inference Endpoints
License: apache-2.0
Mixtral-8x22B-v0.1-GGUF / Q3_K_S (revision 4c9bed8)
1 contributor · History: 1 commit
andrijdavid · Upload folder using huggingface_hub · 55e36b4 (verified) · 9 months ago
File                                            Size
Mixtral-8x22B-v0.1-Q3_K_S-00001-of-00009.gguf   586 MB
Mixtral-8x22B-v0.1-Q3_K_S-00002-of-00009.gguf   501 MB
Mixtral-8x22B-v0.1-Q3_K_S-00003-of-00009.gguf   508 MB
Mixtral-8x22B-v0.1-Q3_K_S-00004-of-00009.gguf   527 MB
Mixtral-8x22B-v0.1-Q3_K_S-00005-of-00009.gguf   517 MB
Mixtral-8x22B-v0.1-Q3_K_S-00006-of-00009.gguf   508 MB
Mixtral-8x22B-v0.1-Q3_K_S-00007-of-00009.gguf   18.6 GB
Mixtral-8x22B-v0.1-Q3_K_S-00008-of-00009.gguf   22.1 GB
Mixtral-8x22B-v0.1-Q3_K_S-00009-of-00009.gguf   17.6 GB

All nine shards are stored with Git LFS, flagged Safe by the file scanner, and were added in the same commit ("Upload folder using huggingface_hub", 9 months ago).
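The nine shards above can be fetched with the huggingface_hub client, the same library named in the commit message. Below is a minimal sketch, assuming the Q3_K_S/ folder layout and the 4c9bed8 revision shown in this listing; the local_dir destination is a placeholder you would choose yourself.

```python
# Minimal sketch: download only the Q3_K_S shards from this repo.
# Repo id, folder name, and revision come from the listing above;
# local_dir is an arbitrary placeholder path.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="LiteLLMs/Mixtral-8x22B-v0.1-GGUF",
    revision="4c9bed8",                        # commit shown in this file view
    allow_patterns=["Q3_K_S/*.gguf"],          # restrict to the nine Q3_K_S shards
    local_dir="models/mixtral-8x22b-q3_k_s",   # placeholder destination directory
)
print(f"Shards downloaded to: {local_path}")
```

Tools that understand split GGUF files (for example, recent llama.cpp builds) can typically be pointed at the first shard, Mixtral-8x22B-v0.1-Q3_K_S-00001-of-00009.gguf, and will load the remaining parts from the same directory.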