imdbo committed on
Commit
deaef8a
1 Parent(s): f4d9973

Update README.md

Files changed (1)
  1. README.md +30 -3
README.md CHANGED
@@ -1,3 +1,30 @@
- ---
- license: mit
- ---
+ ---
+ license: mit
+ ---
+ **Model Description**
+
+ This model was created with OpenNMT-py 3.2 for the Spanish-Aragonese pair, using a transformer architecture. It was then converted to the CTranslate2 format.
+ The model was trained for the paper *Training and fine-tuning NMT models for low-resource languages using Apertium-based synthetic corpora*.
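+
+ For reference, an OpenNMT-py checkpoint is typically converted to the CTranslate2 format with the converter bundled with CTranslate2. The snippet below is only an illustrative sketch; the checkpoint and output directory names are placeholders, not files shipped in this repository.
+ ```python
+ # Illustrative sketch: convert an OpenNMT-py checkpoint into a CTranslate2 model directory.
+ import ctranslate2.converters
+
+ # "model_step_100000.pt" is a placeholder checkpoint name, not a file in this repository.
+ converter = ctranslate2.converters.OpenNMTPyConverter("model_step_100000.pt")
+ converter.convert("ct2-arn")  # output directory loadable by ctranslate2.Translator
+ ```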
+
+ **How to Translate with this Model**
+
+ + Install [Python 3.9](https://www.python.org/downloads/release/python-390/)
+ + Install [CTranslate2 3.2](https://github.com/OpenNMT/CTranslate2)
+ + Translate an input text using the NOS-MT-es-arn model with the following commands (a sketch of the translate.py step is shown after them):
+ ```bash
+ # Tokenize the raw Spanish input with the tokenizer.perl script
+ perl tokenizer.perl < input.txt > input.tok
+ ```
+ ```bash
+ # Apply the Spanish BPE model to the tokenized text
+ subword_nmt.apply_bpe -c ./bpe/es.bpe < input.tok > input.bpe
+ ```
+ ```bash
+ # Translate the BPE-encoded file with the CTranslate2 model
+ python3 translate.py ./ct2-arn input.bpe > output.txt
+ ```
+ ```bash
+ # Remove the BPE separators ("@@ ") from the output
+ sed -i 's/@@ //g' output.txt
+ ```
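+
+ translate.py is the small driver script referenced above. The snippet below is only a minimal sketch of what such a script might look like, assuming it simply wraps the CTranslate2 Python API and takes the model directory and the BPE-encoded file as arguments (matching the command above); the actual script shipped with the model may differ.
+ ```python
+ # Minimal illustrative sketch of a translate.py driver, not the exact script shipped with the model.
+ import sys
+
+ import ctranslate2
+
+ def main():
+     model_dir, input_path = sys.argv[1], sys.argv[2]  # e.g. ./ct2-arn input.bpe
+     translator = ctranslate2.Translator(model_dir, device="cpu")
+
+     # Each line of the BPE-encoded input is a whitespace-separated token sequence.
+     with open(input_path, encoding="utf-8") as f:
+         sources = [line.strip().split() for line in f]
+
+     results = translator.translate_batch(sources, beam_size=5)
+
+     # Print the best hypothesis per sentence, still BPE-encoded
+     # (the sed command above then strips the "@@ " separators).
+     for result in results:
+         print(" ".join(result.hypotheses[0]))
+
+ if __name__ == "__main__":
+     main()
+ ```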
+
+ ## Citation
+
+ If you use this model in your research, please cite the following paper:
+
+ Sant, A., Bardanca Outeiriño, D., Pichel Campos, J. R., De Luca Fornaciari, F., Escolano, C., García Gilabert, J., Gamallo Otero, P., Mash, A., Liao, X., & Melero, M. (2023). Training and fine-tuning NMT models for low-resource languages using Apertium-based synthetic corpora. arXiv.