---
library_name: transformers
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-en-ar
tags:
- generated_from_trainer
model-index:
- name: English-To-EgyptianArabic-Translator
  results: []
---

# English-To-EgyptianArabic-Translator

This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-ar on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.8722
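A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub under the repository name above and that `transformers` with a PyTorch backend is installed (downloading the model requires network access):

```python
from transformers import pipeline

# Load the fine-tuned Marian MT checkpoint for English -> Egyptian Arabic.
translator = pipeline(
    "translation",
    model="oddadmix/English-To-EgyptianArabic-Translator",
)

result = translator("How are you today?", max_length=128)
print(result[0]["translation_text"])
```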

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
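The hyperparameters above can be sketched as a `Seq2SeqTrainingArguments` configuration (a hypothetical reconstruction; the output directory and the per-epoch evaluation strategy are assumptions, not taken from this card):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="English-To-EgyptianArabic-Translator",  # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,                    # Native AMP mixed-precision training
    eval_strategy="epoch",        # assumed: validation loss is reported once per epoch
)
```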

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.5425        | 1.0   | 4701  | 1.9165          |
| 1.4814        | 2.0   | 9402  | 1.8411          |
| 1.3308        | 3.0   | 14103 | 1.8098          |
| 1.2401        | 4.0   | 18804 | 1.7975          |
| 1.1253        | 5.0   | 23505 | 1.7835          |
| 1.02          | 6.0   | 28206 | 1.7948          |
| 0.9592        | 7.0   | 32907 | 1.8001          |
| 0.8913        | 8.0   | 37608 | 1.8011          |
| 0.808         | 9.0   | 42309 | 1.8004          |
| 0.7824        | 10.0  | 47010 | 1.8144          |
| 0.7382        | 11.0  | 51711 | 1.8230          |
| 0.6639        | 12.0  | 56412 | 1.8272          |
| 0.6307        | 13.0  | 61113 | 1.8363          |
| 0.6096        | 14.0  | 65814 | 1.8416          |
| 0.5885        | 15.0  | 70515 | 1.8533          |
| 0.549         | 16.0  | 75216 | 1.8609          |
| 0.5113        | 17.0  | 79917 | 1.8633          |
| 0.5041        | 18.0  | 84618 | 1.8700          |
| 0.4829        | 19.0  | 89319 | 1.8681          |
| 0.4847        | 20.0  | 94020 | 1.8722          |

### Framework versions

- Transformers 4.46.3
- Pytorch 2.1.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3