KoBERT-LM
- A further-pretrained KoBERT model for re-training the LM mask head
How to use
If you want to load the KoBERT tokenizer with `AutoTokenizer`, you should pass `trust_remote_code=True`.
```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert-lm")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)
```
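Once the model and tokenizer are loaded as above, they can be used to encode Korean text. The sketch below is illustrative: the example sentence is an assumption, and it requires `transformers`, `torch`, and network access to download the checkpoint. KoBERT follows the BERT-base configuration, so the encoder produces a 768-dimensional vector per token.

```python
import torch
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("monologg/kobert-lm")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)

# Example sentence (illustrative): "Sharing a Korean language model."
inputs = tokenizer("한국어 모델을 공유합니다.", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One 768-dim hidden state per input token (BERT-base hidden size).
print(outputs.last_hidden_state.shape)
```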