Tokara-0.5B-ud-causal

Model Description

This is a Qwen1.5 model for POS-tagging and dependency-parsing, derived from Tokara-0.5B-v0.1 and refined for UD_Japanese-GSDLUW.

How to Use

from transformers import pipeline
nlp = pipeline("universal-dependencies", "KoichiYasuoka/Tokara-0.5B-ud-causal", trust_remote_code=True)
print(nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている"))
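The pipeline output can be post-processed once parsed into per-token tuples. Below is a minimal sketch of a CoNLL-U reader, assuming the "universal-dependencies" pipeline returns CoNLL-U-formatted text (an assumption here, not confirmed by this card); the sample string is illustrative, not actual model output.

```python
def read_conllu(text):
    """Parse CoNLL-U text into one list of (id, form, upos, head, deprel) tuples per sentence."""
    sentences, tokens = [], []
    for line in text.splitlines():
        if not line.strip():
            # blank line ends the current sentence
            if tokens:
                sentences.append(tokens)
                tokens = []
            continue
        if line.startswith("#"):
            continue  # skip sentence-level comment lines
        cols = line.split("\t")
        if "-" in cols[0] or "." in cols[0]:
            continue  # skip multiword-token ranges and empty nodes
        # columns: ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC
        tokens.append((int(cols[0]), cols[1], cols[3], int(cols[6]), cols[7]))
    if tokens:
        sentences.append(tokens)
    return sentences

# hypothetical two-token example, not actual model output
sample = (
    "1\t挿し絵\t挿し絵\tNOUN\t_\t_\t2\tnsubj\t_\t_\n"
    "2\t用い\t用いる\tVERB\t_\t_\t0\troot\t_\t_\n"
)
print(read_conllu(sample))
```

Each inner tuple pairs a token with its head index (0 marks the root) and Universal Dependencies relation label.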

Reference

Koichi Yasuoka: NINJAL-Long-Unit-Word Dependency Parsing with GPT-type Language Models, Proceedings of the Computers and the Humanities Symposium "Jinmonkon 2024" (December 2024), pp. 83-90 (in Japanese).
