lukecq committed on
Commit
76a49b5
1 Parent(s): b74f79b

Update README.md

Files changed (1)
  1. README.md +4 -2
README.md CHANGED
@@ -16,9 +16,11 @@ The model backbone is RoBERTa-base.
  ## Model description
  The model is tuned with unlabeled data using a learning objective called first sentence prediction (FSP).
  The FSP task is designed by considering both the nature of the unlabeled corpus and the input/output format of classification tasks.
- The training and validation sets are constructed from the unlabeled corpus using FSP. During tuning, BERT-like pre-trained masked language
+ The training and validation sets are constructed from the unlabeled corpus using FSP.
+
+ During tuning, BERT-like pre-trained masked language
  models such as RoBERTa and ALBERT are employed as the backbone, and an output layer for classification is added.
- The learning objective for FSP is to predict the index of the positive option.
+ The learning objective for FSP is to predict the index of the correct label.
  A cross-entropy loss is used for tuning the model.

  ## Intended uses & limitations
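
The FSP objective described in the updated text is, in effect, a multiple-choice classification: the model scores a set of candidate options and a cross-entropy loss is taken over the index of the correct one. Below is a minimal sketch of that objective, assuming a RoBERTa backbone with a classification head; the model name, option format, and shapes are illustrative assumptions, not the authors' actual training code.

```python
# Illustrative sketch of an FSP-style objective: predict the index of the
# correct option with a cross-entropy loss over option logits.
# Assumes a RoBERTa backbone with an added classification layer (hypothetical
# setup, not the released training code).
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
# num_labels = maximum number of candidate options (assumed value)
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=20)

# One constructed example: candidate first sentences plus the rest of the
# document, packed into a single input; the target is the index of the
# option that is actually the document's first sentence.
options = ["Candidate sentence A.", "Candidate sentence B.", "Candidate sentence C."]
context = "Remaining sentences of the document ..."
text = " ".join(f"({i}) {opt}" for i, opt in enumerate(options)) + " " + context

inputs = tokenizer(text, return_tensors="pt", truncation=True)
logits = model(**inputs).logits          # shape: (1, num_labels)
target = torch.tensor([0])               # index of the correct option
loss = F.cross_entropy(logits, target)   # cross-entropy tuning objective
```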