---
license: apache-2.0
datasets:
- yentinglin/v1
language:
- zh
tags:
- traditional mandarin
- traditional chinese
- taiwan
- moe
- mixtral
- zh-tw
- zh-hant
pretty_name: twllm-moe
---

# Taiwan LLM Mixture of Experts - Pilot run
## Model Details

### Model Description
- Developed by: Yen-Ting Lin 林彥廷
- Compute Funded by: HelperAI
- Model type: Mixtral
- Language(s) (NLP): Traditional Mandarin (zh-tw)
- License: Apache-2.0
- Finetuned from model: mistralai/Mixtral-8x7B-Instruct-v0.1
- TMMLU+ score: 38.09
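
Because this is a standard Mixtral-architecture checkpoint fine-tuned from mistralai/Mixtral-8x7B-Instruct-v0.1, it should load with Hugging Face Transformers like any other causal LM. The sketch below is illustrative only: the repo id `yentinglin/twllm-moe-pilot` is a placeholder for the actual pilot-run checkpoint, and 4-bit bitsandbytes quantization is assumed purely to make the ~47B-parameter MoE fit on a single large GPU.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Placeholder repo id for this pilot checkpoint -- substitute the real one.
model_id = "yentinglin/twllm-moe-pilot"

# 4-bit quantization so the 8x7B MoE weights can fit on a single large GPU.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # shard layers/experts across available devices
)

# Chat-formatted prompt; the chat template should follow the base
# Mixtral-8x7B-Instruct format. The prompt asks, in Traditional Chinese,
# for an introduction to Taiwan's night market culture.
messages = [{"role": "user", "content": "請用繁體中文介紹台灣的夜市文化。"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```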
### Model Sources
- Repository: [MiuLab/Taiwan-LLM](https://github.com/MiuLab/Taiwan-LLM)
- Paper: [Taiwan-LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model](https://arxiv.org/abs/2311.17487)
- Demo: Taiwan LLM ChatUI
## Citation

BibTeX:

```bibtex
@misc{lin2023taiwan,
      title={Taiwan LLM: Bridging the Linguistic Divide with a Culturally Aligned Language Model},
      author={Yen-Ting Lin and Yun-Nung Chen},
      year={2023},
      eprint={2311.17487},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```