cloudyu/Mixtral_7Bx2_MoE
Tags: Text Generation · Transformers · Safetensors · mixtral · Eval Results · text-generation-inference · Inference Endpoints
License: cc-by-nc-4.0
Add MOE (mixture of experts) tag #6
Opened Jan 13, 2024 by davanstrien (HF staff)
base: refs/heads/main ← from: refs/pr/6
Files changed: +3 −1
davanstrien (Jan 13, 2024) — No description provided.
Commit: Add MOE (mixture of experts) tag (514e201b)
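The exact diff is not shown on this page, but a +3/−1 change adding an MOE tag would normally touch the YAML front matter of `README.md`. A minimal sketch of what such a model-card metadata change could look like (the surrounding fields are assumptions; only the license is confirmed above):

```yaml
---
# Hypothetical README.md front matter; the `moe` tag is what PR #6 adds.
license: cc-by-nc-4.0
tags:
  - moe
---
```

On the Hugging Face Hub, tags listed in this front matter drive model search and filtering, which is why adding `moe` here makes the model discoverable under the mixture-of-experts filter.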
Cannot merge: this branch has merge conflicts in README.md.