mt5-small-finetuned-amazon-en-es

This model is a fine-tuned version of google/mt5-small (the card does not specify the fine-tuning dataset; the model name suggests English/Spanish Amazon review summarization). It achieves the following results on the evaluation set:

  • Loss: 3.0609
  • ROUGE-1: 35.0709
  • ROUGE-2: 16.7086
  • ROUGE-L: 34.3217
  • ROUGE-Lsum: 34.3182
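
The card does not include a usage example; below is a minimal inference sketch, assuming the model is loaded as a summarization pipeline under the repo id shown on the model page (the review text is a placeholder):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a summarization pipeline.
summarizer = pipeline(
    "summarization",
    model="magnustragardh/mt5-small-finetuned-amazon-en-es",
)

# Placeholder review; any short English or Spanish product review should work.
review = (
    "I loved this e-reader: the screen is crisp, the battery lasts for weeks, "
    "and it is light enough to hold with one hand."
)
print(summarizer(review, max_length=30)[0]["summary_text"])
```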

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the code sketch after the list):

  • learning_rate: 5.6e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8

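For reference, these hyperparameters map onto Seq2SeqTrainingArguments roughly as sketched below. Argument names follow Transformers 4.31.0; output_dir, evaluation_strategy, and predict_with_generate are assumptions, and the Adam settings listed above are the library defaults:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-es",  # assumed
    learning_rate=5.6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 4 * 2 = 8
    lr_scheduler_type="linear",
    num_train_epochs=8,
    evaluation_strategy="epoch",     # assumed; the card reports one eval per epoch
    predict_with_generate=True,      # assumed; needed to compute ROUGE during eval
)
```
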
Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|
| No log        | 1.0   | 1209 | 3.3452          | 28.1494 | 11.5385 | 27.8138 | 27.9215    |
| 5.3779        | 2.0   | 2418 | 3.2066          | 29.2799 | 14.9292 | 28.3282 | 28.4643    |
| 5.3779        | 3.0   | 3627 | 3.1105          | 31.9146 | 15.8212 | 31.0157 | 30.9702    |
| 3.5145        | 4.0   | 4836 | 3.0808          | 32.6703 | 15.9624 | 31.568  | 31.5303    |
| 3.5145        | 5.0   | 6045 | 3.0837          | 33.8454 | 16.3402 | 32.6727 | 32.8738    |
| 3.2939        | 6.0   | 7254 | 3.0655          | 32.4588 | 15.713  | 31.7059 | 31.7646    |
| 3.2939        | 7.0   | 8463 | 3.0576          | 34.764  | 16.6023 | 34.1524 | 34.0333    |
| 3.2076        | 8.0   | 9672 | 3.0609          | 35.0709 | 16.7086 | 34.3217 | 34.3182    |
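
ROUGE scores of this form are typically computed with the `evaluate` library; a minimal sketch follows (the card's actual evaluation code is not included, and the inputs below are placeholders):

```python
import evaluate

# Load the ROUGE metric from the evaluate library.
rouge = evaluate.load("rouge")

# Placeholder model outputs and gold summaries.
predictions = ["the battery lasts for weeks"]
references = ["great battery life, lasts weeks"]

scores = rouge.compute(predictions=predictions, references=references)
# `scores` maps rouge1/rouge2/rougeL/rougeLsum to floats in [0, 1];
# the table above reports the same metrics scaled by 100.
print(scores)
```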

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.4
  • Tokenizers 0.13.3