KoBERT-LM

  • A further-pretrained KoBERT model with a retrained masked-language-modeling (MLM) head

How to use

If you want to load the KoBERT tokenizer with AutoTokenizer, you must pass trust_remote_code=True.

from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert-lm")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)
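
Since the model card describes a retrained MLM head, the checkpoint can also be loaded with AutoModelForMaskedLM and used for masked-token prediction. The sketch below is an assumption-laden example, not part of the official card: it assumes network access to the Hub, that the checkpoint includes usable MLM head weights, and it uses the tokenizer's own mask token rather than a hard-coded string. The Korean sentence is an arbitrary illustration.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

# trust_remote_code=True is required for the custom KoBERT tokenizer
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained("monologg/kobert-lm")

fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

# Use the tokenizer's mask token so this works regardless of its exact string form
sentence = f"한국어 자연어 처리는 {tokenizer.mask_token} 분야입니다."
predictions = fill(sentence)

for pred in predictions[:3]:
    print(pred["token_str"], round(pred["score"], 4))
```

Each prediction is a dict containing the filled token (`token_str`), its probability (`score`), and the completed sequence.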

