Aratako / Swallow-MoE-4x7B-lisa
Tags: Text Generation · Transformers · Safetensors · 5 datasets · Japanese · mixtral · Merge · Mixture of Experts · lisa · text-generation-inference · Inference Endpoints
arXiv: 2403.17919
License: cc-by-nc-sa-4.0
Commit History (main)

Update README.md · 48d666e (verified) · Aratako committed on Apr 5
Update README.md · 26ac48f (verified) · Aratako committed on Apr 3
Update tokenizer_config.json · 2115ad7 (verified) · Aratako committed on Apr 3
Update tokenizer_config.json · e2b7015 (verified) · Aratako committed on Apr 3
Upload mergekit_moe_config.yml · f810a80 (verified) · Aratako committed on Apr 3
Upload japanese_mt_bench.png · e0c93fc (verified) · Aratako committed on Apr 3
Update README.md · 92582b8 (verified) · Aratako committed on Apr 3
Upload tokenizer · 1ff23e6 (verified) · Aratako committed on Apr 2
Upload MixtralForCausalLM · 399674e (verified) · Aratako committed on Apr 2
initial commit · 248011f (verified) · Aratako committed on Apr 2