Arki05 / Grok-1-GGUF
Likes: 64
Tags: Transformers, GGUF, Grok, Inference Endpoints
License: apache-2.0
Branch: main
Path: Grok-1-GGUF/Q3_K_M
2 contributors
History: 1 commit
Latest commit: Arki05, "more quants (from f32) with ggerganov's IQ3_S imatrix" (#17), d4359f5 (verified), 9 months ago
File                                 Size      Storage
grok-1-Q3_K_M-00001-of-00009.gguf    18.5 GB   LFS
grok-1-Q3_K_M-00002-of-00009.gguf    17.4 GB   LFS
grok-1-Q3_K_M-00003-of-00009.gguf    17.1 GB   LFS
grok-1-Q3_K_M-00004-of-00009.gguf    16.6 GB   LFS
grok-1-Q3_K_M-00005-of-00009.gguf    17.3 GB   LFS
grok-1-Q3_K_M-00006-of-00009.gguf    16.9 GB   LFS
grok-1-Q3_K_M-00007-of-00009.gguf    16.9 GB   LFS
grok-1-Q3_K_M-00008-of-00009.gguf    16.8 GB   LFS
grok-1-Q3_K_M-00009-of-00009.gguf    13.7 GB   LFS

All nine shards are flagged Safe and were added in the single commit above ("more quants (from f32) with ggerganov's IQ3_S imatrix", #17), 9 months ago.
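The Q3_K_M quantization is split into nine GGUF shards totalling roughly 152 GB, per the sizes listed above. Below is a minimal sketch of fetching just these shards with the huggingface_hub Python library; the repository ID and folder name come from this page, while the allow_patterns filter, the local usage, and the llama.cpp command in the trailing comment are illustrative assumptions rather than anything documented in the repository.

```python
# Sketch: download only the Q3_K_M shards from this repository.
# Requires: pip install huggingface_hub
from huggingface_hub import snapshot_download

# Restrict the snapshot to the nine Q3_K_M split files.
local_dir = snapshot_download(
    repo_id="Arki05/Grok-1-GGUF",        # repository shown on this page
    allow_patterns=["Q3_K_M/*.gguf"],    # skip the other quantization folders
)
print(f"Shards downloaded under: {local_dir}/Q3_K_M")

# Assumption: recent llama.cpp builds pick up the remaining splits automatically
# when pointed at the first shard, e.g.:
#   ./llama-cli -m <local_dir>/Q3_K_M/grok-1-Q3_K_M-00001-of-00009.gguf -p "Hello"
```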