---
license: apache-2.0
language:
  - ko
inference: false
---

# KoBERT-LM

- Further pretrained model for re-training the masked LM head

## How to use

If you want to load the KoBERT tokenizer with `AutoTokenizer`, you must pass `trust_remote_code=True`, since the custom tokenizer code lives in the model repository.

```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert-lm")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)
```
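Once both are loaded, encoding text and running a forward pass follow the standard `transformers` API. A minimal sketch (assumes the model weights download successfully and that the model is a standard BERT-base encoder):

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)
model = AutoModel.from_pretrained("monologg/kobert-lm")

# Tokenize a Korean sentence and run it through the encoder.
inputs = tokenizer("한국어 모델을 공유합니다.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```

Note that `AutoModel` returns the bare encoder without the LM head; for masked-token prediction you would load the checkpoint with a masked-LM model class instead.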

## Reference