jarodrigues committed
Commit: 6f88984
Parent(s): 57bd1b1
Update README.md
README.md CHANGED
@@ -60,12 +60,12 @@ be run on consumer-grade hardware.
 **Gervásio 7B PT-BR** is developed by NLX-Natural Language and Speech Group, at the University of Lisbon, Faculty of Sciences, Department of Informatics, Portugal.
 
 For the record, its full name is **Gervásio Produz Textos em Português**, to which corresponds the natural acronym **GPT PT**,
-and which is
+and which is known more shortly as Gervásio PT-* or, even more briefly, just as Gervásio, among its acquaintances.
 
 These models are fully documented in the respective [publication](https://arxiv.org/abs/?):
 
 ``` latex
-@misc{
+@misc{gervasio,
 title={Advancing Generative AI for Portuguese with Open Decoder Gervásio~PT*},
 author={Rodrigo Santos, João Silva, Luís Gomes, João Rodrigues, António Branco},
 year={2024},
@@ -92,7 +92,7 @@ Gervásio-7B-PTBR-Decoder is distributed under an [MIT license](https://huggingf
 
 # Training Data
 
-**Gervásio 7B PT-
+**Gervásio 7B PT-BR** was trained over standard supervised fine-tuning, and to keep some alignment with mainstream benchmarks for English, we resorted to tasks and respective datasets in the GLUE and the SuperGLUE collections.
 
 
 We selected those datasets where the outcome of their machine translation into American Portuguese could preserve, in the target language, the linguistic properties at stake.
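
The new Training Data paragraph describes a pipeline: pick GLUE and SuperGLUE tasks, machine-translate them into American Portuguese, and recast the result as supervised fine-tuning data. Below is a minimal sketch of that recasting step, assuming the Hugging Face `datasets` package and using GLUE's RTE task purely for illustration; the translation stub, prompt wording, and task choice are assumptions for this sketch, not the actual Gervásio pipeline.

```python
# Hypothetical sketch of the data preparation the card describes: take a GLUE
# task, machine-translate it into American Portuguese, and reformat it as
# prompt/completion pairs for supervised fine-tuning.
from datasets import load_dataset

def translate_to_ptbr(text: str) -> str:
    # Stand-in for the English -> American Portuguese MT step; the commit
    # does not name the translation system, so a real MT model or service
    # would be called here.
    return text

def to_sft_example(example: dict) -> dict:
    s1 = translate_to_ptbr(example["sentence1"])
    s2 = translate_to_ptbr(example["sentence2"])
    # GLUE RTE labels: 0 = entailment, 1 = not_entailment.
    answer = "sim" if example["label"] == 0 else "não"
    # Illustrative prompt template, not the one used to train Gervásio.
    prompt = f"Frase 1: {s1}\nFrase 2: {s2}\nA primeira frase implica a segunda?"
    return {"prompt": prompt, "completion": answer}

# RTE is one GLUE task; whether it was among those actually selected for
# Gervásio is an assumption made purely for illustration.
rte = load_dataset("glue", "rte", split="train")
sft_data = rte.map(to_sft_example, remove_columns=rte.column_names)
print(sft_data[0])
```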