habdine committed
Commit 0fac219 · verified · 1 parent: 83600df

Update app.py

Files changed (1)
  1. app.py +3 -2
app.py CHANGED
@@ -39,9 +39,10 @@ def get_input(text) -> Iterator[str]:
  yield "".join(outputs)

  desc = f'''
- This is a demo for Greek News summarization using greekbart-news24-abstract, a finetuned version of GreekBART
+ This is a demo for Greek News summarization using [greekbart-news24-abstract](https://huggingface.co/dascim/greekbart-news24-abstract), a finetuned version of [GreekBART](https://huggingface.co/dascim/greekbart).
  GreekBART is the first Greek sequence-to-sequence pretrained model. It is pretrained on 77GB of Greek raw text using the CNRS Jean Zay supercomputer. Our model is based on BART. Unlike existing BERT-based Greek language models such as GreekBERT and Electra, GreekBART is particularly well-suited for generative tasks, since not only its encoder but also its decoder is pretrained. Our model is competitive with GreekBERT and XLM-R on discriminative tasks, and it is the first BART-base model that can perform generative tasks, such as abstractive summarization, for the Greek language.
- Paper: [GreekBART: The First Pretrained Greek Sequence-to-Sequence Model](https://arxiv.org/abs/2304.00869)
+ 
+ 📑 Paper: [GreekBART: The First Pretrained Greek Sequence-to-Sequence Model](https://arxiv.org/abs/2304.00869)

  Enter your text (maximum of 1024 tokens of Greek news article) to get a summary.
  '''
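For context, the description edited above belongs to a streaming handler: the hunk header shows `def get_input(text) -> Iterator[str]:` and the context line `yield "".join(outputs)`. Below is a minimal sketch of how such a handler might be implemented with transformers' `TextIteratorStreamer`. This is not the repo's actual app.py: only the function name, its signature, the final `yield` line, and the model id linked in this commit come from the diff; the streaming setup and generation settings are assumptions.

```python
# Hypothetical sketch, NOT the actual app.py from this repo.
from threading import Thread
from typing import Iterator

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, TextIteratorStreamer

MODEL_ID = "dascim/greekbart-news24-abstract"  # model linked in the updated description

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)


def get_input(text) -> Iterator[str]:
    # The demo caps input at 1024 tokens, so truncate to that length.
    inputs = tokenizer(text, max_length=1024, truncation=True, return_tensors="pt")
    streamer = TextIteratorStreamer(tokenizer, skip_special_tokens=True)
    # Run generate() in a background thread so decoded chunks can be
    # yielded to the UI as they are produced (streaming requires greedy
    # decoding, i.e. num_beams=1).
    thread = Thread(
        target=model.generate,
        kwargs=dict(**inputs, streamer=streamer, max_new_tokens=256, num_beams=1),
    )
    thread.start()
    outputs = []
    for chunk in streamer:
        outputs.append(chunk)
        yield "".join(outputs)
```

Yielding the accumulated string, rather than each chunk on its own, matches how a Gradio generator replaces the output component's value on every yield instead of appending to it.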