DavidAU/How-To-Set-and-Manage-MOE-Mix-of-Experts-Model-Activation-of-Experts
Tags: Text Generation, English, MOE, Mixture of Experts, Mixtral, 4X8, 2X8, deepseek, reasoning, reason, thinking, all use cases, bfloat16, float32, float16, role play, sillytavern, backyard, lmstudio, Text Generation WebUI, llama 3, mistral, llama 3.1, qwen 2.5, context 128k, mergekit, Merge

License: apache-2.0