---
language:
- es
tags:
- masked-lm
- roberta
license: mit
datasets:
- filevich/uy22
---
|
|
|
|
|
# ROUBERTa cased |
|
|
|
This is a RoBERTa-base language model trained from scratch exclusively on Uruguayan press text [1].
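Since this is a masked language model, it can be queried through the standard `fill-mask` pipeline. This is a minimal sketch; the Hub id `filevich/rouberta-cased` is an assumption, so substitute the actual repository id of this model.

```python
from transformers import pipeline

# NOTE: the model id below is a placeholder assumption --
# replace it with the actual Hub repository id of this model.
fill_mask = pipeline("fill-mask", model="filevich/rouberta-cased")

# RoBERTa-style tokenizers use <mask> as the mask token.
preds = fill_mask("La capital de Uruguay es <mask>.")
for p in preds:
    # Each prediction carries the filled token and its probability.
    print(p["token_str"], round(p["score"], 3))
```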
|
|
|
## Cite this work
|
|
|
```bibtex
@inproceedings{rouberta2024,
  title={A Language Model Trained on Uruguayan Spanish News Text},
  author={Filevich, Juan Pablo and Marco, Gonzalo and Castro, Santiago and Chiruzzo, Luis and Ros{\'a}, Aiala},
  booktitle={Proceedings of the Second International Workshop Towards Digital Language Equality (TDLE): Focusing on Sustainability @ LREC-COLING 2024},
  pages={53--60},
  year={2024}
}
```
|
|
|
|
|
[1] [huggingface.co/datasets/pln-udelar/uy22](https://huggingface.co/datasets/pln-udelar/uy22) |