modernbert-base-japanese-aozora-upos

Model Description

This is a ModernBERT model pre-trained on 青空文庫 (Aozora Bunko) texts for POS-tagging and dependency-parsing, derived from modernbert-base-japanese-aozora. Every short-unit word is tagged with UPOS (Universal Part-Of-Speech) and FEATS (universal features).
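
The tag inventory ships with the checkpoint and can be inspected before running the tagger. The following is a minimal sketch, assuming the labels are exposed through the standard id2label mapping of a token-classification config:

from transformers import AutoConfig

config = AutoConfig.from_pretrained("KoichiYasuoka/modernbert-base-japanese-aozora-upos", trust_remote_code=True)
# each entry pairs a UPOS tag with optional FEATS; the exact label format depends on the checkpoint
print(sorted(set(config.id2label.values())))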

How to Use

from transformers import pipeline

# "upos" is a custom pipeline shipped with the model repository, hence trust_remote_code=True;
# aggregation_strategy="simple" merges subword tokens into word-level tags
nlp = pipeline("upos", "KoichiYasuoka/modernbert-base-japanese-aozora-upos", trust_remote_code=True, aggregation_strategy="simple")
print(nlp("国境の長いトンネルを抜けると雪国であった。"))

or, using esupar:

import esupar

# esupar bundles tokenization, POS-tagging, and dependency-parsing on top of this model
nlp = esupar.load("KoichiYasuoka/modernbert-base-japanese-aozora-upos")
print(nlp("国境の長いトンネルを抜けると雪国であった。"))
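
The object returned by esupar prints as a Universal Dependencies analysis, so the tree can also be visualized with deplacy (a minimal sketch, assuming deplacy is installed alongside esupar):

import deplacy
doc = nlp("国境の長いトンネルを抜けると雪国であった。")
deplacy.render(doc)  # text rendering of the dependency tree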

See Also

esupar: Tokenizer, POS-tagger, and Dependency-parser with BERT/RoBERTa/DeBERTa/GPT models
