KoichiYasuoka committed
Commit b439784
1 Parent(s): 016b706

links added

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -17,7 +17,7 @@ widget:
 
 ## Model Description
 
-This is a RoBERTa model pre-trained on Classical Chinese texts, derived from [GuwenBERT-large](https://huggingface.co/ethanyt/guwenbert-large). Character-embeddings are enhanced into traditional/simplified characters. You can fine-tune `roberta-classical-chinese-large-char` for downstream tasks, such as sentencization, POS-tagging, dependency-parsing, and so on.
+This is a RoBERTa model pre-trained on Classical Chinese texts, derived from [GuwenBERT-large](https://huggingface.co/ethanyt/guwenbert-large). Character-embeddings are enhanced into traditional/simplified characters. You can fine-tune `roberta-classical-chinese-large-char` for downstream tasks, such as [sentence-segmentation](https://huggingface.co/KoichiYasuoka/roberta-classical-chinese-large-sentence-segmentation), [POS-tagging](https://huggingface.co/KoichiYasuoka/roberta-classical-chinese-large-upos), [dependency-parsing](https://github.com/KoichiYasuoka/SuPar-Kanbun), and so on.
 
 ## How to Use
 
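For reference, a minimal sketch of loading the model described in the changed paragraph with the Hugging Face Transformers `AutoModel` API; the model ID `KoichiYasuoka/roberta-classical-chinese-large-char` is assumed from the repository name rather than stated in this diff.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Model ID assumed from the repository name (not given explicitly in this diff)
model_id = "KoichiYasuoka/roberta-classical-chinese-large-char"

# Load the character-level tokenizer and the RoBERTa masked-LM weights;
# this model can then be fine-tuned for the downstream tasks listed above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
```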