mradermacher/Hermes-2-Pro-Llama-3-70B-GGUF
Tags: Transformers, GGUF, teknium/OpenHermes-2.5, English, Llama-3, instruct, finetune, chatml, DPO, RLHF, gpt4, synthetic data, distillation, function calling, json mode, axolotl, Inference Endpoints, conversational
1 contributor · History: 18 commits
Latest commit by mradermacher: auto-patch README.md (cd869ea, verified) · 6 months ago
All .gguf files are stored via Git LFS and were uploaded from nethype/kaos 6 months ago. README.md was last changed by the "auto-patch README.md" commit, also 6 months ago.

File                                          Size
.gitattributes                                2.71 kB
Hermes-2-Pro-Llama-3-70B.IQ3_M.gguf           31.9 GB
Hermes-2-Pro-Llama-3-70B.IQ3_S.gguf           30.9 GB
Hermes-2-Pro-Llama-3-70B.IQ3_XS.gguf          29.3 GB
Hermes-2-Pro-Llama-3-70B.IQ4_XS.gguf          38.3 GB
Hermes-2-Pro-Llama-3-70B.Q2_K.gguf            26.4 GB
Hermes-2-Pro-Llama-3-70B.Q3_K_L.gguf          37.1 GB
Hermes-2-Pro-Llama-3-70B.Q3_K_M.gguf          34.3 GB
Hermes-2-Pro-Llama-3-70B.Q3_K_S.gguf          30.9 GB
Hermes-2-Pro-Llama-3-70B.Q4_K_M.gguf          42.5 GB
Hermes-2-Pro-Llama-3-70B.Q4_K_S.gguf          40.3 GB
Hermes-2-Pro-Llama-3-70B.Q5_K_M.gguf          49.9 GB
Hermes-2-Pro-Llama-3-70B.Q5_K_S.gguf          48.7 GB
Hermes-2-Pro-Llama-3-70B.Q6_K.gguf.part1of2   29 GB
Hermes-2-Pro-Llama-3-70B.Q6_K.gguf.part2of2   28.9 GB
Hermes-2-Pro-Llama-3-70B.Q8_0.gguf.part1of2   37.6 GB
Hermes-2-Pro-Llama-3-70B.Q8_0.gguf.part2of2   37.4 GB
README.md                                     4.17 kB