AlexKay committed on
Commit
2eb256b
1 Parent(s): b9dc41d

Update README.md

Files changed (1)
  1. README.md +1 -4
README.md CHANGED
@@ -6,10 +6,7 @@ language:
  license: apache-2.0
  ---
  # XLM-RoBERTa large model whole word masking finetuned on SQuAD
- Pretrained model on English and Russian languages using a masked language modeling (MLM) objective. It was introduced in
- [this paper](https://arxiv.org/abs/1810.04805) and first released in
- [this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference
-
+ Pretrained model on English and Russian languages using a masked language modeling (MLM) objective.
 
  ## Used Datasets
  SQuAD + SberQuAD
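
The model card edited by this commit describes an XLM-RoBERTa large checkpoint finetuned on SQuAD and SberQuAD for extractive question answering. Below is a minimal sketch of how such a checkpoint could be loaded with the `transformers` question-answering pipeline; the repository id used here is an assumption and is not stated anywhere in this commit.

```python
# Minimal usage sketch (not part of the commit).
# Assumption: the checkpoint is published under the repo id below; adjust as needed.
# Requires the `transformers` library.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="AlexKay/xlm-roberta-large-qa-multilingual-finedtuned-ru",  # hypothetical repo id
)

# SQuAD-style extractive QA: the answer is a span copied out of the context.
result = qa(
    question="Which datasets was the model finetuned on?",
    context="The XLM-RoBERTa large model was finetuned on the SQuAD and SberQuAD datasets.",
)
print(result["answer"], result["score"])
```

Because the underlying model is multilingual (English and Russian, per the card), the same pipeline call can be used with Russian questions and contexts without any configuration change.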