---
language:
  - zh
license: apache-2.0
---

# Mengzi-T5 model (Chinese)

Pretrained on a 300 GB Chinese corpus. See the technical report: *Mengzi: A Lightweight yet Powerful Chinese Pre-trained Language Model*.

## Usage

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Load the tokenizer and the pretrained Mengzi-T5 checkpoint from the Hugging Face Hub
tokenizer = T5Tokenizer.from_pretrained("langboat/mengzi-t5-base")
model = T5ForConditionalGeneration.from_pretrained("langboat/mengzi-t5-base")
```
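
Once loaded, the checkpoint can be used for conditional generation. The snippet below is a minimal sketch; the input sentence and generation settings are illustrative assumptions rather than part of the model card, and in practice the pretrained checkpoint is usually fine-tuned on a downstream task first.

```python
# Illustrative input only; any Chinese text can be used here.
input_text = "中国的首都是哪里？"

# Tokenize and generate with default settings (max_new_tokens is an arbitrary choice).
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```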

## Citation

If you find the technical report or this resource useful, please cite the technical report in your paper.

Example: