jarodrigues committed
Commit b1cecba · verified · 1 parent: e582c02

Update README.md

Files changed (1): README.md (+3 −5)
README.md CHANGED
@@ -40,9 +40,8 @@ Its further improvement through additional training was done over language resou
  It has different versions that were trained for different variants of Portuguese (PT),
  namely the European variant from Portugal (**PT-PT**) and the American variant from Brazil (**PT-BR**).

- All versions of Gervásio are distributed for free and under a fully open license, including for either research or commercial usage, and can
+ All versions of Gervásio are **distributed for free and under a fully open license**, including for either research or commercial usage, and can
  be run on consumer-grade hardware, thus seeking to contribute to the advancement of research and innovation in language technology for Portuguese.
- ++++++++

  **Gervásio PT-PT 7B Instruct** is developed by NLX-Natural Language and Speech Group, at the University of Lisbon, Faculty of Sciences, Department of Informatics, Portugal.

@@ -70,9 +69,8 @@ Please use the above cannonical reference when using or citing this model.

  # Model Description

- **This model card is for Gervásio-7B-PTPT-Instruct-Decoder**, with 7 billion parameters, ? layers and a hidden size of ?.
-
- Gervásio-PT-BR base is distributed under an [Apache 2.0 license](https://huggingface.co/PORTULAN/gervasio-ptpt-base/blob/main/LICENSE) (like Pythia).
+ **This model card is for Gervásio-7B-PTPT-Instruct-Decoder**, with 7 billion parameters, a hidden size of 4096 units, an intermediate size of 11,008 units, 32 attention heads, 32 hidden layers, and a tokenizer obtained using the Byte-Pair Encoding (BPE) algorithm implemented with SentencePiece, featuring a vocabulary size of 32,000.
+ Gervásio-7B-PTPT-Instruct-Decoder is distributed under an [MIT license](https://huggingface.co/PORTULAN/albertina-ptpt/blob/main/LICENSE).


  <br>
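The new model card text pins the architecture down to a LLaMA-style 7B configuration (hidden size 4096, intermediate size 11,008, 32 attention heads, 32 hidden layers, 32,000-token SentencePiece BPE vocabulary), and the surrounding prose claims the model runs on consumer-grade hardware. Below is a minimal sketch of how one might check those numbers and load the model with the `transformers` library; the hub ID `PORTULAN/gervasio-7b-portuguese-ptpt-decoder` and the half-precision loading strategy are assumptions for illustration, not details confirmed by this commit.

```python
# Minimal sketch: verify the hyperparameters quoted in the model card and
# run the model in half precision. The hub ID below is an assumption about
# where this checkpoint is published, not confirmed by this commit.
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "PORTULAN/gervasio-7b-portuguese-ptpt-decoder"  # assumed hub ID

# The config alone is enough to check the numbers quoted in the model card.
config = AutoConfig.from_pretrained(model_id)
assert config.hidden_size == 4096
assert config.intermediate_size == 11008
assert config.num_attention_heads == 32
assert config.num_hidden_layers == 32
assert config.vocab_size == 32000

tokenizer = AutoTokenizer.from_pretrained(model_id)

# float16 halves memory relative to float32 (~14 GB of weights for 7B
# parameters), one way to fit the model on consumer-grade hardware.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "A língua portuguesa é"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

`device_map="auto"` lets Accelerate spill layers to CPU when the GPU is too small; quantized loading (e.g. 4-bit via `BitsAndBytesConfig`) would reduce the footprint further at some cost in fidelity.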