mozilla / Meta-Llama-3.1-8B-Instruct-llamafile

Tags: llamafile · PyTorch · 8 languages · facebook · meta · llama · llama-3
arXiv: 2204.05149
License: llama3.1
1 contributor · History: 42 commits
Latest commit: 3ec89e9 (verified), 4 months ago, by jartine: "Quantize Q8_0 with llamafile-0.8.13"
| File | Size | LFS | Last commit | Date |
|---|---|---|---|---|
| .gitattributes | 2.78 kB | – | Quantize Q5_1 with llamafile-0.8.11 | 5 months ago |
| LICENSE | 17.3 kB | – | Update LICENSE | 5 months ago |
| Meta-Llama-3.1-8B-Instruct.BF16.llamafile | 16.3 GB | yes | Quantize BF16 with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.F16.llamafile | 16.3 GB | yes | Quantize F16 with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q2_K.llamafile | 3.42 GB | yes | Quantize Q2_K with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q3_K_L.llamafile | 4.56 GB | yes | Quantize Q3_K_L with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q3_K_M.llamafile | 4.26 GB | yes | Quantize Q3_K_M with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q3_K_S.llamafile | 3.91 GB | yes | Quantize Q3_K_S with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q4_0.llamafile | 4.69 GB | yes | Quantize Q4_0 with llamafile-0.8.11 | 5 months ago |
| Meta-Llama-3.1-8B-Instruct.Q4_1.llamafile | 5.16 GB | yes | Quantize Q4_1 with llamafile-0.8.11 | 5 months ago |
| Meta-Llama-3.1-8B-Instruct.Q4_K_M.llamafile | 5.16 GB | yes | Quantize Q4_K_M with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q4_K_S.llamafile | 4.93 GB | yes | Quantize Q4_K_S with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q5_0.llamafile | 5.84 GB | yes | Quantize Q5_0 with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q5_1.llamafile | 6.1 GB | yes | Quantize Q5_1 with llamafile-0.8.11 | 5 months ago |
| Meta-Llama-3.1-8B-Instruct.Q5_K_M.llamafile | 5.97 GB | yes | Quantize Q5_K_M with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q5_K_S.llamafile | 5.84 GB | yes | Quantize Q5_K_S with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q6_K.llamafile | 6.84 GB | yes | Quantize Q6_K with llamafile-0.8.13 | 4 months ago |
| Meta-Llama-3.1-8B-Instruct.Q8_0.llamafile | 8.78 GB | yes | Quantize Q8_0 with llamafile-0.8.13 | 4 months ago |
| README.md | 31.3 kB | – | Update README.md | 5 months ago |
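A llamafile is a self-contained executable, so any one of the weights above can be run directly after download. A minimal sketch on Linux/macOS, using the Q4_K_M file from the listing as an example (on Windows, the file must instead be renamed to end in `.exe` before running):

```shell
# Mark the downloaded llamafile as executable, then launch it.
# By default it starts a local chat/server interface.
chmod +x Meta-Llama-3.1-8B-Instruct.Q4_K_M.llamafile
./Meta-Llama-3.1-8B-Instruct.Q4_K_M.llamafile
```

Which quantization to pick is a size/quality trade-off: Q4_K_M is a common middle ground, Q8_0 is closer to full quality at 8.78 GB, and BF16/F16 are the unquantized 16.3 GB weights.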