Vicent Ahuir Esteve committed
Commit 5bc63bf
Parent(s): 542f956
Update README.md

README.md CHANGED

# The NASes model

News Abstractive Summarization for Spanish (NASes) is a Transformer encoder-decoder model, with the same hyper-parameters as BART, for summarizing Spanish news articles. It is pre-trained on a combination of several self-supervised tasks that help increase the abstractiveness of the generated summaries. Four pre-training tasks are combined: sentence permutation, text infilling, Gap Sentence Generation, and Next Segment Generation. Spanish newspaper articles and Spanish Wikipedia articles were used to pre-train the model (21 GB of raw text, 8.5 million documents).
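
To give an idea of the kind of input corruption these self-supervised tasks apply, the sketch below shows simplified versions of two of them (sentence permutation and BART-style text infilling) on plain strings. This is only an illustration, not the implementation used to pre-train NASes; the mask token, masking rate, and span lengths are assumptions.

```python
import random
import re

MASK = "<mask>"  # assumed mask token; the real pre-training vocabulary may differ


def sentence_permutation(text: str) -> str:
    """Split the document into sentences and shuffle their order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    random.shuffle(sentences)
    return " ".join(sentences)


def text_infilling(text: str, mask_ratio: float = 0.3, mean_span: int = 3) -> str:
    """Replace random token spans with a single mask token (BART-style infilling)."""
    tokens = text.split()
    out, i = [], 0
    while i < len(tokens):
        if random.random() < mask_ratio / mean_span:
            out.append(MASK)  # one mask token stands in for the whole masked span
            i += max(1, int(random.expovariate(1 / mean_span)))
        else:
            out.append(tokens[i])
            i += 1
    return " ".join(out)


if __name__ == "__main__":
    doc = ("El modelo resume noticias. Fue preentrenado con texto en español. "
           "Genera resúmenes abstractivos.")
    print(sentence_permutation(doc))
    print(text_infilling(doc))
```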
NASes is fine-tuned for the summarization task on 1,802,919 (document, summary) pairs from the Dataset for Automatic summarization of Catalan and Spanish newspaper Articles (DACSA).
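
A minimal usage sketch with the Hugging Face `transformers` library is shown below. The model identifier is a placeholder (substitute the actual repository name of this model), it assumes the checkpoint is published as a standard seq2seq model, and the generation settings (beam size, length limits) are illustrative assumptions rather than recommended values.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder identifier: replace with the actual model repository name.
MODEL_ID = "your-org/nases-summarization"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

article = (
    "El Gobierno ha aprobado hoy un nuevo paquete de medidas económicas "
    "destinado a apoyar a las pequeñas y medianas empresas..."
)

# Tokenize the article and generate an abstractive summary.
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(
    **inputs,
    num_beams=4,       # assumed decoding settings, not tuned values
    max_length=128,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```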
### BibTeX entry
```bibtex