---
language:
- en
- ru
- multilingual
license: apache-2.0
---
# XLM-RoBERTa large model whole word masking finetuned on SQuAD
Pretrained model on English and Russian languages using a masked language modeling (MLM) objective, fine-tuned for extractive question answering. The underlying XLM-RoBERTa architecture was introduced in
[this paper](https://arxiv.org/abs/1911.02116) and first released in
[this repository](https://github.com/facebookresearch/fairseq). This model is case-sensitive: it makes a difference between english and English.
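A minimal usage sketch with the 🤗 Transformers question-answering pipeline. The model ID below is assumed from this repository's owner and task; substitute the actual Hub path if it differs:

```python
from transformers import pipeline

# Model ID assumed from this repository; replace with the actual Hub path if different.
qa = pipeline(
    "question-answering",
    model="AlexKay/xlm-roberta-large-qa-multilingual-finedtuned-ru",
)

# Extractive QA in Russian ("Where is the Eiffel Tower?" /
# "The Eiffel Tower is in Paris, France."); English works the same way.
result = qa(
    question="Где находится Эйфелева башня?",
    context="Эйфелева башня находится в Париже, во Франции.",
)
print(result["answer"], result["score"])
```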
## Used Datasets
SQuAD + SberQuAD

The [original SberQuAD paper](https://arxiv.org/pdf/1912.09723.pdf) is recommended reading.
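For reference, a sketch of loading both corpora with the 🤗 `datasets` library. The `sberquad` Hub ID is an assumption, and script-based datasets may require `trust_remote_code=True` in newer library versions:

```python
from datasets import load_dataset

# English SQuAD v1.1 from the Hub.
squad = load_dataset("squad", split="train")

# Russian SberQuAD; the "sberquad" Hub ID is assumed here.
sberquad = load_dataset("sberquad", split="train")

print(squad[0]["question"])
print(sberquad[0]["question"])
```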
## Evaluation results
The results obtained on SberQuAD are the following:
```
f1 = 84.3
exact_match = 65.3
```
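
For context, SQuAD-style F1 and exact match are computed over normalized answer strings. A minimal sketch with the 🤗 `evaluate` library; the IDs and answers below are made up for illustration:

```python
import evaluate

# SQuAD v1.1 metric: token-overlap F1 and exact match on normalized answers.
squad_metric = evaluate.load("squad")

predictions = [{"id": "1", "prediction_text": "в Париже"}]
references = [
    # answer_start is required by the schema but not used by the v1.1 metric.
    {"id": "1", "answers": {"text": ["в Париже"], "answer_start": [25]}}
]

print(squad_metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```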