opus-mt-mul-en-finetuned-dz-to-en-n1

This model was fine-tuned from opus-mt-mul-en for Dzongkha-to-English translation; the training dataset is not specified in this card. It achieves the following results on the evaluation set (a sketch of how the metrics can be computed appears after the list):

  • Loss: 2.2735
  • Bleu: 12.0953
  • Gen Len: 15.8819
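
The card does not state how these metrics were computed. A minimal sketch, assuming the standard sacrebleu-via-`evaluate` setup used in the Hugging Face translation fine-tuning examples (function names and the tokenizer argument are illustrative, not taken from the authors' script):

```python
import numpy as np
import evaluate

# Assumption: BLEU is computed with sacrebleu via the `evaluate` library,
# as in the standard Hugging Face translation examples (not confirmed by this card).
bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds, tokenizer):
    preds, labels = eval_preds
    if isinstance(preds, tuple):
        preds = preds[0]
    # Replace label padding (-100) with the pad token id before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)

    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = bleu.compute(
        predictions=[p.strip() for p in decoded_preds],
        references=[[l.strip()] for l in decoded_labels],
    )
    # "Gen Len" = average length of the generated sequences, in tokens.
    pred_lens = [np.count_nonzero(p != tokenizer.pad_token_id) for p in preds]
    return {"bleu": result["score"], "gen_len": float(np.mean(pred_lens))}
```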

Model description

More information needed

Intended uses & limitations

The model produces inaccurate translations; use at your own risk.
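
For anyone who still wants to try it, the checkpoint loads like any Marian-based OPUS-MT model. A minimal sketch, assuming the model is published on the Hub under a repository ID ending in opus-mt-mul-en-finetuned-dz-to-en-n1 (the namespace and the example sentence below are placeholders):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder repository ID; replace the namespace with the actual one on the Hub.
model_id = "your-username/opus-mt-mul-en-finetuned-dz-to-en-n1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Dzongkha input text (a common greeting); replace with your own sentence.
src_text = "བཀྲ་ཤིས་བདེ་ལེགས།"

inputs = tokenizer(src_text, return_tensors="pt")
generated = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```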

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 9
  • mixed_precision_training: Native AMP
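
These settings map directly onto transformers' Seq2SeqTrainingArguments. A minimal reconstruction sketch; the original training script is not part of this card, so output_dir and the generation-related flag are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative reconstruction of the hyperparameters listed above;
# not the authors' actual training script.
training_args = Seq2SeqTrainingArguments(
    output_dir="opus-mt-mul-en-finetuned-dz-to-en-n1",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), eps=1e-08 (library defaults)
    lr_scheduler_type="linear",
    num_train_epochs=9,
    fp16=True,                    # native AMP mixed-precision training
    predict_with_generate=True,   # assumed, so BLEU / Gen Len can be computed at eval time
)
```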

Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len |
|---------------|-------|------|-----------------|---------|---------|
| 3.2761        | 1.0   | 562  | 2.9572          | 5.5363  | 15.5395 |
| 2.8321        | 2.0   | 1124 | 2.6795          | 7.9287  | 15.2723 |
| 2.5606        | 3.0   | 1686 | 2.5424          | 9.3415  | 15.5245 |
| 2.3460        | 4.0   | 2248 | 2.4262          | 10.8110 | 15.8619 |
| 2.1696        | 5.0   | 2810 | 2.3613          | 10.7336 | 16.0801 |
| 2.0639        | 6.0   | 3372 | 2.3282          | 11.2871 | 15.8388 |
| 1.9692        | 7.0   | 3934 | 2.2879          | 12.7783 | 15.4164 |
| 1.8930        | 8.0   | 4496 | 2.2758          | 12.6661 | 15.3634 |
| 1.7954        | 9.0   | 5058 | 2.2735          | 12.0953 | 15.8819 |

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0

Model size

77.1M parameters (F32, Safetensors)