Hugging Face
DavidAU / How-To-Set-and-Manage-MOE-Mix-of-Experts-Model-Activation-of-Experts
Tags: Text Generation · English · MOE · Mixture of Experts · Mixtral · 4X8 · 2X8 · deepseek · reasoning · reason · thinking · all use cases · bfloat16 · float32 · float16 · role play · sillytavern · backyard · lmstudio · Text Generation WebUI · llama 3 · mistral · llama 3.1 · qwen 2.5 · context 128k · mergekit · Merge
License: apache-2.0
1 contributor · History: 1 commit · initial commit 9e231ff (verified) by DavidAU, 1 day ago
.gitattributes · 1.52 kB · initial commit · 1 day ago
README.md · 31 Bytes · initial commit · 1 day ago