Update README.md
**Albertina PT-*** is a foundation large language model for the **Portuguese language**.

It is an **encoder** of the BERT family, based on the Transformer neural architecture and
developed over the DeBERTa model, with highly competitive performance for this language.
It has different versions that were trained for different variants of Portuguese (PT),
namely the European variant from Portugal (PT-PT) and the American variant from Brazil (PT-BR),
and it is distributed free of charge and under a most permissive license.

**Albertina PT-BR** is the version for **American Portuguese from Brazil**,
and to the best of our knowledge, at the time of its initial distribution,
it sets a new state of the art as an encoder specifically developed for this language and variant
that is made publicly available and distributed for reuse.

It was developed by a joint team from the University of Lisbon and the University of Porto, Portugal.
For further details, check the respective publication:
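Since the model is an encoder distributed for reuse, a typical way to try it is masked-token prediction with the Hugging Face `transformers` library. The sketch below is an assumption, not part of this README: the model identifier `PORTULAN/albertina-ptbr` and the `[MASK]` token convention should be checked against the model hub page before use.

```python
def top_predictions(text: str, model_name: str = "PORTULAN/albertina-ptbr", k: int = 3):
    """Return the k most likely fillers for the [MASK] token in `text`.

    NOTE: the default model id is an assumption for illustration; verify it
    on the Hugging Face Hub. `transformers` is a third-party dependency,
    imported lazily inside the function.
    """
    from transformers import pipeline

    # fill-mask is the standard pipeline for BERT/DeBERTa-style encoders.
    unmasker = pipeline("fill-mask", model=model_name, top_k=k)
    return [candidate["token_str"] for candidate in unmasker(text)]


if __name__ == "__main__":
    # Example Portuguese sentence with one masked token.
    print(top_predictions("A capital do Brasil é [MASK]."))
```

This downloads the model weights on first use; for production, pin a model revision and cache the pipeline instead of rebuilding it per call.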