Taiwan LLM based on LLaMA-2-7B

Continued pretraining on 20 billion tokens of Traditional Chinese text, followed by instruction fine-tuning on millions of conversations.

This version does NOT include Common Crawl data.
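As a usage sketch, a chat prompt for this model can be assembled in the LLaMA-2 `[INST]` convention, since the model is based on LLaMA-2-7B. Note this template is an assumption here, not confirmed by the card; check the tokenizer's chat template before relying on it.

```python
# Sketch: building a LLaMA-2-style chat prompt for Taiwan LLM.
# ASSUMPTION: the model follows the base LLaMA-2 [INST]/<<SYS>> convention.

def build_prompt(system: str, user: str) -> str:
    """Wrap a system message and a user turn in LLaMA-2 chat markup."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = build_prompt(
    "You are a helpful assistant that answers in Traditional Chinese.",
    "台灣最高的山是哪座?",
)
print(prompt)
```

The resulting string would then be tokenized and passed to the model (e.g. via `transformers`' `AutoTokenizer` / `AutoModelForCausalLM`) in the usual way.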

🌟 Check out the new Taiwan-LLM demo Chat-UI 🌟

Collaboration with Ubitus K.K. 💪💪💪

Taiwan LLM v2 was developed in collaboration with Ubitus K.K., which provided valuable technical support and compute resources for the project.

Format: Safetensors
Model size: 6.74B params
Tensor type: BF16