Model Details
Model Description
This is a machine translation model finetuned from NLLB-200's distilled 1.3B checkpoint. It is intended for machine translation of education-related data.
- Finetuning code repository: the code used to finetune this model can be found here
How to Get Started with the Model
Use the code below to get started with the model.
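A minimal sketch using the Hugging Face transformers library. The repository ID of this finetuned checkpoint is not stated in the card, so the base NLLB-200 distilled 1.3B model is loaded as a stand-in; substitute this model's actual ID. The language codes (`eng_Latn`, `fra_Latn`) are illustrative.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Stand-in: the base checkpoint this model was finetuned from.
# Replace with this finetuned model's repository ID.
MODEL_ID = "facebook/nllb-200-distilled-1.3B"

# NLLB tokenizers take the source language as a FLORES-200 code.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

text = "The school year begins in September."
inputs = tokenizer(text, return_tensors="pt")

# NLLB selects the target language by forcing its code as the first
# generated token.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
    max_length=128,
)
translation = tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
print(translation)
```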
Training Procedure
The model was finetuned on three datasets: a general-purpose dataset, a tourism dataset, and an education dataset. Training ran for two epochs on a single A100 40 GB GPU.
Evaluation
Testing Data
Metrics
Model performance was measured using BLEU, spBLEU, TER, and chrF++ metrics.
Results