---
language: es
tags:
- T5
- Seq2Seq
- EncoderDecoder
- Spanish
datasets:
- large_spanish_corpus
widget:
- text: "Érase una vez un"
license: mit
---

# Spanish T5 (small) trained on [large_spanish_corpus](https://huggingface.co/datasets/viewer/?dataset=large_spanish_corpus)

This is a Spanish **T5** model (small architecture) trained from scratch on the [large_spanish_corpus](https://huggingface.co/datasets/viewer/?dataset=large_spanish_corpus), also known as BETO's corpus, with [Flax](https://github.com/google/flax).

This model is part of the [Flax/JAX Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/), with TPU usage sponsored by Google.
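
The snippet below is a minimal usage sketch with the 🤗 Transformers library. The repository id used here (`flax-community/spanish-t5-small`) is an assumption, not stated in this card, so replace it with the model's actual Hub path; it also assumes PyTorch weights are available (otherwise pass `from_flax=True` to `from_pretrained`).

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Assumed repo id: replace with this model's actual path on the Hugging Face Hub.
model_id = "flax-community/spanish-t5-small"

tokenizer = T5Tokenizer.from_pretrained(model_id)
# If only Flax weights are published, add from_flax=True here.
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Prompt taken from the widget example in the card metadata.
inputs = tokenizer("Érase una vez un", return_tensors="pt")
outputs = model.generate(**inputs, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```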

## Dataset

The dataset is about 20 GB. 95% of the data was used for training and the remaining 5% for validation.
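
For illustration, a 95/5 split like the one described above can be produced with the 🤗 Datasets library as sketched below. The `"combined"` config name and the seed are assumptions, and this is not necessarily the exact procedure used for training.

```python
from datasets import load_dataset

# Download BETO's corpus (~20 GB); "combined" is an assumed config name that
# aggregates all sources of large_spanish_corpus.
dataset = load_dataset("large_spanish_corpus", "combined", split="train")

# Hold out 5% of the examples for validation (seed chosen arbitrarily).
splits = dataset.train_test_split(test_size=0.05, seed=42)
train_ds, valid_ds = splits["train"], splits["test"]
```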

## Metrics (on evaluation dataset)

- Loss: 2.413
- Perplexity: 11.36
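
For context, the reported perplexity is (approximately) the exponential of the cross-entropy evaluation loss; the small gap between `exp(2.413)` and 11.36 presumably comes from rounding in the logged values.

```python
import math

eval_loss = 2.413
print(math.exp(eval_loss))  # ≈ 11.17, close to the reported perplexity of 11.36
```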

## Team members

- Manuel Romero ([mrm8488](https://huggingface.co/mrm8488))
- María Grandury ([mariagrandury](https://huggingface.co/mariagrandury))

## Useful links

- [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- [Community Week thread](https://discuss.huggingface.co/t/pretrain-gpt2-from-scratch-in-spanish/7086/8)