RichardErkhov / YeungNLP_-_firefly-qwen1.5-en-7b-gguf
Hugging Face model repository
Tags: GGUF, Inference Endpoints, conversational
arxiv: 2305.18290
Files and versions
1 contributor (RichardErkhov)
History: 23 commits; latest commit b1d42f6 (verified): "uploaded model", about 2 months ago
All files below were added in "uploaded model" commits about 2 months ago and are flagged Safe; the .gguf files are stored via Git LFS.

File                                  Size
.gitattributes                        3.04 kB
firefly-qwen1.5-en-7b.IQ3_M.gguf      3.74 GB
firefly-qwen1.5-en-7b.IQ3_S.gguf      3.57 GB
firefly-qwen1.5-en-7b.IQ3_XS.gguf     3.42 GB
firefly-qwen1.5-en-7b.IQ4_NL.gguf     4.53 GB
firefly-qwen1.5-en-7b.IQ4_XS.gguf     4.32 GB
firefly-qwen1.5-en-7b.Q2_K.gguf       3.1 GB
firefly-qwen1.5-en-7b.Q3_K.gguf       3.92 GB
firefly-qwen1.5-en-7b.Q3_K_L.gguf     4.22 GB
firefly-qwen1.5-en-7b.Q3_K_M.gguf     3.92 GB
firefly-qwen1.5-en-7b.Q3_K_S.gguf     3.57 GB
firefly-qwen1.5-en-7b.Q4_0.gguf       4.51 GB
firefly-qwen1.5-en-7b.Q4_1.gguf       4.96 GB
firefly-qwen1.5-en-7b.Q4_K.gguf       4.77 GB
firefly-qwen1.5-en-7b.Q4_K_M.gguf     4.77 GB
firefly-qwen1.5-en-7b.Q4_K_S.gguf     4.54 GB
firefly-qwen1.5-en-7b.Q5_0.gguf       5.4 GB
firefly-qwen1.5-en-7b.Q5_1.gguf       5.84 GB
firefly-qwen1.5-en-7b.Q5_K.gguf       5.53 GB
firefly-qwen1.5-en-7b.Q5_K_M.gguf     5.53 GB
firefly-qwen1.5-en-7b.Q5_K_S.gguf     5.4 GB
firefly-qwen1.5-en-7b.Q6_K.gguf       6.34 GB
firefly-qwen1.5-en-7b.Q8_0.gguf       8.21 GB