---
language:
  - en-ru
  - ru
  - multilingual
license: apache-2.0
---

# XLM-RoBERTa large model with whole word masking fine-tuned on SQuAD

A model pretrained on English and Russian using a masked language modeling (MLM) objective, then fine-tuned for question answering.
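
Below is a minimal usage sketch with the `transformers` question-answering pipeline. The repo ID used here is an assumption based on this card's author and title, not something stated in the original text; adjust it to the actual Hub repository.

```python
# Hedged usage sketch: the repo ID below is an assumption, not taken from this card.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="AlexKay/xlm-roberta-large-qa-multilingual-finedtuned-ru",  # assumed Hub repo ID
)

answer = qa(
    question="Где живут пингвины?",
    context="Пингвины живут в Антарктиде и на прилегающих островах.",
)
print(answer)  # dict with 'answer', 'score', 'start', 'end'
```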

## Used Datasets

SQuAD + SberQuAD

The original SberQuAD paper is recommended reading.
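
As a sketch of how these two datasets can be pulled for fine-tuning or evaluation, the snippet below uses the `datasets` library. The Hub dataset IDs ("squad", "sberquad") are assumptions, not given in the original card.

```python
# Hedged sketch: dataset IDs are assumed Hub names, not taken from this card.
from datasets import load_dataset

squad = load_dataset("squad")        # English QA data
sberquad = load_dataset("sberquad")  # Russian QA data (SQuAD-style)

print(squad["train"][0]["question"])
print(sberquad["train"][0]["question"])
```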

## Evaluation results

The results obtained on SberQuAD are the following:

```
f1 = 84.3
exact_match = 65.3
```
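
For reference, SQuAD-style `f1` and `exact_match` scores like the ones above can be computed with the `evaluate` library. This is an illustrative sketch with toy predictions, not the script that produced the numbers reported here.

```python
# Illustrative only: toy predictions/references, not the actual SberQuAD evaluation.
import evaluate

squad_metric = evaluate.load("squad")
predictions = [{"id": "1", "prediction_text": "в Антарктиде"}]
references = [{"id": "1", "answers": {"text": ["в Антарктиде"], "answer_start": [17]}}]
print(squad_metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```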