gpt2-small-japanese-wikipedia-juman-ud-causal

Model Description

This is a GPT-2 model for POS-tagging and dependency-parsing, derived from gpt2-small-japanese-wikipedia-juman-upos and UD_Japanese-GSDLUW.

How to Use

from transformers import pipeline
nlp=pipeline("universal-dependencies","KoichiYasuoka/gpt2-small-japanese-wikipedia-juman-ud-causal",trust_remote_code=True)
print(nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている"))

fugashi is required.
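
The pipeline's output can be post-processed for downstream use. The following is a minimal sketch, assuming the custom "universal-dependencies" pipeline returns its analysis as a CoNLL-U formatted string (one tab-separated line per token); the column indices follow the CoNLL-U specification and are not stated in this card.

from transformers import pipeline
nlp=pipeline("universal-dependencies","KoichiYasuoka/gpt2-small-japanese-wikipedia-juman-ud-causal",trust_remote_code=True)
# Assumption: the result is a CoNLL-U formatted string
conllu=nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている")
for line in conllu.strip().split("\n"):
  if line.startswith("#") or line.strip()=="":
    continue  # skip sentence-level comments and blank lines
  cols=line.split("\t")
  # FORM (col 2), UPOS (col 4), HEAD (col 7), DEPREL (col 8)
  print(cols[1],cols[3],cols[6],cols[7])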

