Fairseq · German · Catalan
AudreyVM committed · verified
Commit 3dd528e · 1 Parent(s): cc4c122

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -56,7 +56,7 @@ However, we are well aware that our models may be biased. We intend to conduct r
 
 The model was trained on a combination of the following datasets:
 
-| Dataset |
+| Datasets |
 |-------------------|
 | Multi CCAligned |
 | WikiMatrix |
@@ -128,7 +128,7 @@ The model was trained for a total of 29.000 updates. Weights were saved every 10
 
 ### Variable and metrics
 
-We use the BLEU score for evaluation on the [Flores-101](https://github.com/facebookresearch/flores) and [NTREX](https://github.com/MicrosoftTranslator/NTREX) test sets.
+We use the BLEU score for evaluation on the [Flores-101](https://github.com/facebookresearch/flores), NTEU (unpublished) and [NTREX](https://github.com/MicrosoftTranslator/NTREX) test sets.
 
 ### Evaluation results
 
@@ -141,7 +141,7 @@ Below are the evaluation results on the machine translation from German to Catal
 | Flores 101 devtest |29,2 | **35,9** | 33,2 |
 | NTEU | 38,9 | 39,1 | **42,9** |
 | NTREX | 25,7 | **31,2** | 29,1 |
-| Average | 30,7 | **35,3** | 34,3 |
+| **Average** | 30,7 | **35,3** | 34,3 |
 
 ## Additional information
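The updated sentence states that BLEU is the evaluation metric on the Flores-101, NTEU and NTREX test sets. As a minimal sketch of how corpus-level BLEU is computed — a simplified, unsmoothed, single-reference version in plain Python; the model card's actual scores were presumably produced with a standard tool such as sacrebleu, which may apply tokenization and smoothing not shown here:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU: clipped n-gram precisions (n=1..max_n),
    geometric mean, and a brevity penalty. Unsmoothed, so any
    hypothesis set with zero higher-order matches scores 0."""
    matches = [0] * max_n   # clipped n-gram matches per order
    totals = [0] * max_n    # hypothesis n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_counts = Counter(ngrams(h, n))
            r_counts = Counter(ngrams(r, n))
            # Clip each hypothesis n-gram count by its reference count.
            matches[n - 1] += sum(min(c, r_counts[g]) for g, c in h_counts.items())
            totals[n - 1] += max(len(h) - n + 1, 0)
    if min(matches) == 0:
        return 0.0
    log_prec = sum(math.log(m / t) for m, t in zip(matches, totals)) / max_n
    # Brevity penalty: punish hypotheses shorter than the references.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100.0 * bp * math.exp(log_prec)

# A perfect hypothesis scores 100.
print(corpus_bleu(["the cat sat on the mat"], ["the cat sat on the mat"]))  # → 100.0
```

In practice, published scores like those in the table above are only comparable when the exact tool and its signature (tokenizer, casing, smoothing) are reported, which is one reason sacrebleu is the common choice.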