---
language:
- ja
tags:
- japanese
- pos
- dependency-parsing
base_model: KoichiYasuoka/gpt2-small-japanese-wikipedia-juman-upos
datasets:
- universal_dependencies
license: cc-by-sa-4.0
pipeline_tag: token-classification
widget:
- text: 全学年にわたって小学校の国語の教科書に挿し絵が用いられている
---

# gpt2-small-japanese-wikipedia-juman-ud-causal
## Model Description

This is a GPT-2 model for POS-tagging and dependency-parsing, derived from [gpt2-small-japanese-wikipedia-juman-upos](https://huggingface.co/KoichiYasuoka/gpt2-small-japanese-wikipedia-juman-upos) and fine-tuned on [UD_Japanese-GSDLUW](https://github.com/UniversalDependencies/UD_Japanese-GSDLUW).
## How to Use

```py
from transformers import pipeline

# load the custom "universal-dependencies" pipeline bundled with the model repository
nlp = pipeline("universal-dependencies", "KoichiYasuoka/gpt2-small-japanese-wikipedia-juman-ud-causal", trust_remote_code=True)
# parse a Japanese sentence and print the analysis
print(nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている"))
```
[fugashi](https://pypi.org/project/fugashi) is required.
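
If further processing of the parse is needed, a minimal sketch follows. It assumes the pipeline returns a CoNLL-U formatted string (this is not stated in the card, only inferred from similar ud-causal models) and uses the third-party `conllu` package from PyPI to walk over the tokens:

```py
# Minimal sketch, not from the model card: assumes the pipeline output is a
# CoNLL-U formatted string and uses the third-party `conllu` PyPI package.
import conllu
from transformers import pipeline

nlp = pipeline("universal-dependencies", "KoichiYasuoka/gpt2-small-japanese-wikipedia-juman-ud-causal", trust_remote_code=True)
parsed = conllu.parse(nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている"))
for token in parsed[0]:
    # each token record carries the surface form, UPOS tag, head index, and dependency relation
    print(token["form"], token["upos"], token["head"], token["deprel"])
```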