---
language:
  - en
  - ru
  - multilingual
license: apache-2.0
---

# XLM-RoBERTa large model whole word masking finetuned on SQuAD

Pretrained with a masked language modeling (MLM) objective and fine-tuned on English and Russian question-answering datasets.
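
Below is a minimal usage sketch with the `transformers` question-answering pipeline. The model identifier is an assumption (this repository's id); replace it with the actual path if it differs.

```python
# Minimal sketch: extractive QA with the transformers pipeline.
# The model id is assumed to be this repository's id; adjust if needed.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="AlexKay/xlm-roberta-large-qa-multilingual-finedtuned-ru",
)

# English example
result_en = qa(
    question="What objective was used for pretraining?",
    context="The model was pretrained with a masked language modeling objective.",
)
print(result_en["answer"], result_en["score"])

# Russian example
result_ru = qa(
    question="Где находится Красная площадь?",
    context="Красная площадь находится в Москве.",
)
print(result_ru["answer"], result_ru["score"])
```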

## Used QA Datasets

SQuAD + SberQuAD

The original SberQuAD paper is recommended reading.

## Evaluation results

The results obtained on SberQuAD are the following:

f1 = 84.3
exact_match = 65.3
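
For reference, SQuAD-style `f1` and `exact_match` scores can be computed with the `evaluate` library as sketched below; this is an illustrative example, not the exact script that produced the numbers above.

```python
# Sketch: computing SQuAD-style f1 / exact_match with the `evaluate` library.
# The prediction and reference below are toy examples, not SberQuAD data.
import evaluate

squad_metric = evaluate.load("squad")

predictions = [
    {"id": "1", "prediction_text": "в Москве"},
]
references = [
    {"id": "1", "answers": {"text": ["в Москве"], "answer_start": [25]}},
]

results = squad_metric.compute(predictions=predictions, references=references)
print(results)  # e.g. {'exact_match': 100.0, 'f1': 100.0}
```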