DavidAU/How-To-Set-and-Manage-MOE-Mix-of-Experts-Model-Activation-of-Experts
Text Generation · English
Tags: MOE, Mixture of Experts, Mixtral, 4X8, 2X8, deepseek, reasoning, reason, thinking, all use cases, bfloat16, float32, float16, role play, sillytavern, backyard, lmstudio, Text Generation WebUI, llama 3, mistral, llama 3.1, qwen 2.5, context 128k, mergekit, Merge
License: apache-2.0