Mozilla / Mixtral-8x7B-Instruct-v0.1-llamafile
Tags: Transformers, llamafile, 5 languages, mixtral
License: apache-2.0
Files and versions
1 contributor, 99 commits
Latest commit: 91f4f1b (verified) by jartine, 8 months ago: Quantize Q5_K_M with llamafile-0.8.5
File                                              Size      LFS  Updated           Last commit
.gitattributes                                    3.31 kB        8 months ago      Quantize Q5_K_S with llamafile-0.8.5
README.md                                         20.8 kB        9 months ago      Update README.md
config.json                                       31 Bytes       about 1 year ago  Add config.json to repo
mixtral-8x7b-instruct-v0.1.BF16.llamafile.cat0    50 GB     LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 BF16
mixtral-8x7b-instruct-v0.1.BF16.llamafile.cat1    43.4 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 BF16
mixtral-8x7b-instruct-v0.1.F16.llamafile.cat0     50 GB     LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 F16
mixtral-8x7b-instruct-v0.1.F16.llamafile.cat1     43.4 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 F16
mixtral-8x7b-instruct-v0.1.Q2_K.llamafile         17.3 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 Q2_K
mixtral-8x7b-instruct-v0.1.Q3_K_M.llamafile       22.6 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 Q3_K_M
mixtral-8x7b-instruct-v0.1.Q3_K_S.llamafile       20.5 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 Q3_K_S
mixtral-8x7b-instruct-v0.1.Q4_0.llamafile         26.5 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 Q4_0
mixtral-8x7b-instruct-v0.1.Q4_1.llamafile         29.4 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.7.3 Q4_1
mixtral-8x7b-instruct-v0.1.Q4_K_M.llamafile       28.5 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 Q4_K_M
mixtral-8x7b-instruct-v0.1.Q5_0.llamafile         32.3 GB   LFS  10 months ago     Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.7 Q5_0
mixtral-8x7b-instruct-v0.1.Q5_K_M.llamafile       33.3 GB   LFS  8 months ago      Quantize Q5_K_M with llamafile-0.8.5
mixtral-8x7b-instruct-v0.1.Q5_K_S.llamafile       32.3 GB   LFS  8 months ago      Quantize Q5_K_S with llamafile-0.8.5
mixtral-8x7b-instruct-v0.1.Q6_K.llamafile         38.4 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 Q6_K
mixtral-8x7b-instruct-v0.1.Q8_0.llamafile         49.7 GB   LFS  9 months ago      Quantize mixtral-8x7b-instruct-v0.1 with llamafile-0.8.4 Q8_0