---
license: cc-by-nc-sa-4.0
language:
- it
- lld
---

# Ladin-Val Badia to Italian Translation Model

## Description

This model is designed for translating text between Ladin (Val Badia) and Italian. It was developed and trained as part of the research presented in the paper "Rule-Based, Neural and LLM Back-Translation: Comparative Insights from a Variant of Ladin", published at [LoResMT @ ACL 2024](https://sites.google.com/view/loresmt/).

## Paper

The details of the model, including its architecture, training process, and evaluation, are discussed in the paper:

- [Rule-Based, Neural and LLM Back-Translation: Comparative Insights from a Variant of Ladin](https://arxiv.org/abs/2407.08819)

## License

This model is licensed under the [CC BY-NC-SA 4.0 License](https://creativecommons.org/licenses/by-nc-sa/4.0/).

## Usage

Prepend the target-language prefix to the input text: use `>>ita<<` to translate into Italian and `>>lld_Latn<<` to translate into Ladin (Val Badia). A minimal example is shown below.
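
The following is a minimal, illustrative sketch using the Hugging Face `transformers` library. It assumes the checkpoint is a Marian-style sequence-to-sequence model; `MODEL_ID`, the `translate` helper, and the example sentence are hypothetical placeholders and not part of this model card.

```python
# Minimal usage sketch (assumptions: Marian-style seq2seq checkpoint on the
# Hugging Face Hub; MODEL_ID below is a hypothetical placeholder for this repo).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "<this-model-repository>"  # replace with the actual repository ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

def translate(text: str, target_prefix: str) -> str:
    # Prepend the target-language prefix, e.g. ">>ita<<" or ">>lld_Latn<<".
    inputs = tokenizer(f"{target_prefix} {text}", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Italian -> Ladin (Val Badia)
print(translate("Buongiorno, come stai?", ">>lld_Latn<<"))

# Ladin (Val Badia) -> Italian: pass Ladin source text with the ">>ita<<" prefix.
```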

## Citation

If you use this model, please cite the following paper:

```bibtex
@inproceedings{frontull-moser-2024-rule,
    title = "Rule-Based, Neural and {LLM} Back-Translation: Comparative Insights from a Variant of {L}adin",
    author = "Frontull, Samuel and
      Moser, Georg",
    editor = "Ojha, Atul Kr. and
      Liu, Chao-hong and
      Vylomova, Ekaterina and
      Pirinen, Flammie and
      Abbott, Jade and
      Washington, Jonathan and
      Oco, Nathaniel and
      Malykh, Valentin and
      Logacheva, Varvara and
      Zhao, Xiaobing",
    booktitle = "Proceedings of the The Seventh Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2024)",
    month = aug,
    year = "2024",
    address = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.loresmt-1.13",
    pages = "128--138",
    abstract = "This paper explores the impact of different back-translation approaches on machine translation for Ladin, specifically the Val Badia variant. Given the limited amount of parallel data available for this language (only 18k Ladin-Italian sentence pairs), we investigate the performance of a multilingual neural machine translation model fine-tuned for Ladin-Italian. In addition to the available authentic data, we synthesise further translations by using three different models: a fine-tuned neural model, a rule-based system developed specifically for this language pair, and a large language model. Our experiments show that all approaches achieve comparable translation quality in this low-resource scenario, yet round-trip translations highlight differences in model performance.",
}
```