jarodrigues committed
Commit aa8a6d4 • Parent(s): f0666b9
Update README.md

README.md CHANGED
@@ -31,6 +31,12 @@ widget:
 
 # Albertina PT-BR
 
+---
+<img align="left" width="30" height="30" src="https://github.githubassets.com/images/icons/emoji/unicode/1f917.png">
+<p> We will soon release distilled models and <b>Albertina PT-BR V2</b>, which has been trained on PT-BR data sourced from OSCAR.</p>
+
+---
+
 **Albertina PT-*** is a foundation, large language model for the **Portuguese language**.
 
 It is an **encoder** of the BERT family, based on the neural architecture Transformer and
@@ -44,6 +50,8 @@ and to the best of our knowledge, at the time of its initial distribution,
 it sets a new state of the art for this language and variant
 that is made publicly available and distributed for reuse.
 
+
+
 It was developed by a joint team from the University of Lisbon and the University of Porto, Portugal.
 For further details, check the respective publication:
 