DavidAU/How-To-Set-and-Manage-MOE-Mix-of-Experts-Model-Activation-of-Experts
Tags: Text Generation, English, MOE, Mixture of Experts, Mixtral, 4X8, 2X8, deepseek, reasoning, reason, thinking, all use cases, bfloat16, float32, float16, role play, sillytavern, backyard, lmstudio, Text Generation WebUI, llama 3, mistral, llama 3.1, qwen 2.5, context 128k, mergekit, Merge
License: apache-2.0
.gitattributes
Commit history: initial commit 9e231ff (verified), committed by DavidAU 1 day ago