---
license: apache-2.0
language:
- ko
inference: false
---
# KoBERT-LM
- KoBERT further pretrained for re-training the masked-LM head
## How to use
> To load the KoBERT tokenizer with `AutoTokenizer`, you must pass `trust_remote_code=True`.
```python
from transformers import AutoModel, AutoTokenizer
model = AutoModel.from_pretrained("monologg/kobert-lm")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert-lm", trust_remote_code=True)
```
## Reference
- https://github.com/SKTBrain/KoBERT