Hugging Face
anthracite-org/magnum-v2-72b-gguf
Text Generation · GGUF · 9 languages · chat · Inference Endpoints · conversational
License: tongyi-qianwen
Files and versions
4 contributors · History: 12 commits
Latest commit: 9a8ce83 (verified) · lucyknada · Upload magnum-v2-72b-Q5_K_M_split.gguf-00007-of-00008.gguf with huggingface_hub · 4 months ago
.gitattributes · Safe · 3.28 kB · Upload magnum-v2-72b-Q5_K_M_split.gguf-00007-of-00008.gguf with huggingface_hub · 4 months ago
README.md · Safe · 2.48 kB · Create README.md · 4 months ago
imatrix-magnum-v2-72b.dat · 25.2 MB · LFS · Upload imatrix-magnum-v2-72b.dat with huggingface_hub · 4 months ago
magnum-v2-72b-IQ3_M.gguf · Safe · 35.5 GB · LFS · Upload magnum-v2-72b-IQ3_M.gguf with huggingface_hub · 4 months ago
magnum-v2-72b-IQ3_S.gguf · Safe · 34.5 GB · LFS · Upload magnum-v2-72b-IQ3_S.gguf with huggingface_hub · 4 months ago
magnum-v2-72b-IQ4_XS.gguf · Safe · 39.7 GB · LFS · Upload magnum-v2-72b-IQ4_XS.gguf with huggingface_hub · 4 months ago
magnum-v2-72b-Q4_K_M.gguf · Safe · 47.4 GB · LFS · Upload magnum-v2-72b-Q4_K_M.gguf with huggingface_hub · 4 months ago
magnum-v2-72b-Q5_K_M_split.gguf-00007-of-00008.gguf · Safe · 7.25 GB · LFS · Upload magnum-v2-72b-Q5_K_M_split.gguf-00007-of-00008.gguf with huggingface_hub · 4 months ago
magnum-v2-72b-Q6_K_split.gguf-00002-of-00008.gguf · Safe · 8.71 GB · LFS · Upload magnum-v2-72b-Q6_K_split.gguf-00002-of-00008.gguf with huggingface_hub · 4 months ago
magnum-v2-72b-Q6_K_split.gguf-00007-of-00008.gguf · Safe · 8.46 GB · LFS · Upload magnum-v2-72b-Q6_K_split.gguf-00007-of-00008.gguf with huggingface_hub · 4 months ago
magnum-v2-72b-Q8_0_split.gguf-00002-of-00008.gguf · Safe · 10.4 GB · LFS · Upload magnum-v2-72b-Q8_0_split.gguf-00002-of-00008.gguf with huggingface_hub · 4 months ago
magnum-v2-72b-f16-00001-of-00015.gguf · Safe · 10 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00002-of-00015.gguf · Safe · 9.56 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00003-of-00015.gguf · Safe · 9.75 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00004-of-00015.gguf · Safe · 9.56 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00005-of-00015.gguf · Safe · 9.75 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00006-of-00015.gguf · Safe · 9.56 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00007-of-00015.gguf · Safe · 9.75 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00008-of-00015.gguf · Safe · 9.56 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00009-of-00015.gguf · Safe · 9.75 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00010-of-00015.gguf · Safe · 9.56 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00011-of-00015.gguf · Safe · 9.75 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00012-of-00015.gguf · Safe · 9.56 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00013-of-00015.gguf · Safe · 9.75 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00014-of-00015.gguf · Safe · 9.56 GB · LFS · Upload folder using huggingface_hub · 4 months ago
magnum-v2-72b-f16-00015-of-00015.gguf · Safe · 10 GB · LFS · Upload folder using huggingface_hub · 4 months ago
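Several of the quants above are sharded with a "-NNNNN-of-MMMMM.gguf" suffix, and only some shards of the Q5_K_M, Q6_K, and Q8_0 splits appear in this listing, so it is easy to end up with an incomplete set after downloading. A minimal sketch of a completeness check, parsing that naming scheme (the `missing_shards` helper is hypothetical, not part of huggingface_hub):

```python
import re
from collections import defaultdict

# Matches the shard suffix used in the listing above, e.g.
# "magnum-v2-72b-f16-00001-of-00015.gguf" -> base, part 1, total 15.
SHARD_RE = re.compile(r"^(?P<base>.+)-(?P<part>\d{5})-of-(?P<total>\d{5})\.gguf$")

def missing_shards(filenames):
    """Group sharded GGUF files by base name and list missing part numbers."""
    parts_seen = defaultdict(set)
    totals = {}
    for name in filenames:
        m = SHARD_RE.match(name)
        if not m:
            continue  # single-file quant such as magnum-v2-72b-Q4_K_M.gguf
        base = m.group("base")
        parts_seen[base].add(int(m.group("part")))
        totals[base] = int(m.group("total"))
    return {
        base: sorted(set(range(1, totals[base] + 1)) - parts)
        for base, parts in parts_seen.items()
    }
```

For example, given only parts 1 and 3 of the 15-part f16 set, the helper reports parts 2 and 4 through 15 as missing. Depending on how the shards were produced, they may need to be concatenated or merged (for instance with llama.cpp's gguf-split tool) before loading.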