
byt5-base-es_maq

This model is a fine-tuned version of google/byt5-base on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):

  • Loss: 1.0862
  • Bleu: 16.0295
  • Gen Len: 98.8829
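
For context, "Bleu" and "Gen Len" are the metrics the standard Hugging Face translation fine-tuning setup logs: corpus BLEU via sacrebleu, and the mean length of the generated sequences. The exact evaluation code for this model is not documented; the following is a minimal sketch in that style, with `tokenizer` assumed to be the model's tokenizer.

```python
# Hedged sketch of a compute_metrics function that yields "bleu" and "gen_len",
# in the style of the standard translation fine-tuning scripts; the actual
# function used for this model is not documented. `tokenizer` is assumed.
import numpy as np
import evaluate

sacrebleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # Labels are padded with -100; restore the pad token id before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    bleu = sacrebleu.compute(
        predictions=decoded_preds,
        references=[[ref] for ref in decoded_labels],
    )
    # "Gen Len": mean number of non-pad tokens in the generated sequences.
    gen_len = np.mean(
        [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in preds]
    )
    return {"bleu": bleu["score"], "gen_len": gen_len}
```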

Model description

More information needed

Intended uses & limitations

More information needed
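
Absent documentation, the model name suggests Spanish ("es") source text, and "maq" is the ISO 639-3 code for Chiquihuitlán Mazatec, though the card does not confirm the translation direction. A minimal inference sketch, assuming the checkpoint is published under a repo id like the hypothetical one below:

```python
# Minimal usage sketch. The repo id below is hypothetical; substitute the
# actual path of this checkpoint. The es->maq translation direction is an
# assumption based on the model name, not documented in the card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-username/byt5-base-es_maq"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("¿Cómo estás?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```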

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a training sketch mirroring them follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 65
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
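
The card does not include the training script; below is a minimal Seq2SeqTrainer sketch mirroring these hyperparameters. The model, tokenizer, datasets, and compute_metrics function (see the metric sketch above) are assumed to be defined, and train_batch_size is read as the per-device batch size.

```python
# Hedged sketch mirroring the reported hyperparameters; data loading and
# preprocessing are omitted because the training data is undocumented.
from transformers import Seq2SeqTrainer, Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="byt5-base-es_maq",
    learning_rate=5e-05,
    per_device_train_batch_size=16,   # assumes train_batch_size is per device
    per_device_eval_batch_size=16,
    seed=65,
    num_train_epochs=100.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    evaluation_strategy="epoch",      # per-epoch rows in the results table
    predict_with_generate=True,       # required for Bleu / Gen Len metrics
)

trainer = Seq2SeqTrainer(
    model=model,                      # assumed: google/byt5-base loaded for seq2seq
    args=args,
    train_dataset=train_dataset,      # assumed: tokenized train split
    eval_dataset=eval_dataset,        # assumed: tokenized eval split
    tokenizer=tokenizer,              # assumed: the ByT5 tokenizer
    compute_metrics=compute_metrics,  # assumed: e.g. the sacrebleu sketch above
)
trainer.train()
```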

Training results

| Training Loss | Epoch | Step  | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:-------:|
| No log        | 1.0   | 398   | 0.9955          | 0.0907 | 19.0    |
| 1.4196        | 2.0   | 796   | 0.8670          | 0.5548 | 19.0    |
| 0.9762        | 3.0   | 1194  | 0.8083          | 0.2508 | 19.0    |
| 0.8703        | 4.0   | 1592  | 0.7638          | 0.5692 | 19.0    |
| 0.8703        | 5.0   | 1990  | 0.7335          | 0.3461 | 19.0    |
| 0.8098        | 6.0   | 2388  | 0.7079          | 0.399  | 19.0    |
| 0.7592        | 7.0   | 2786  | 0.6846          | 0.3376 | 19.0    |
| 0.7167        | 8.0   | 3184  | 0.6675          | 0.4617 | 19.0    |
| 0.6881        | 9.0   | 3582  | 0.6496          | 0.438  | 19.0    |
| 0.6881        | 10.0  | 3980  | 0.6297          | 0.4397 | 19.0    |
| 0.6543        | 11.0  | 4378  | 0.6144          | 0.4078 | 19.0    |
| 0.6245        | 12.0  | 4776  | 0.6091          | 0.3468 | 19.0    |
| 0.5959        | 13.0  | 5174  | 0.6039          | 0.433  | 19.0    |
| 0.5766        | 14.0  | 5572  | 0.5971          | 0.4332 | 19.0    |
| 0.5766        | 15.0  | 5970  | 0.5931          | 0.4291 | 19.0    |
| 0.5541        | 16.0  | 6368  | 0.5877          | 0.4504 | 19.0    |
| 0.5331        | 17.0  | 6766  | 0.5873          | 0.4359 | 19.0    |
| 0.5169        | 18.0  | 7164  | 0.5864          | 0.419  | 19.0    |
| 0.4991        | 19.0  | 7562  | 0.5880          | 0.4191 | 19.0    |
| 0.4991        | 20.0  | 7960  | 0.5845          | 0.4535 | 19.0    |
| 0.4827        | 21.0  | 8358  | 0.5889          | 0.4614 | 19.0    |
| 0.4646        | 22.0  | 8756  | 0.5894          | 0.4075 | 19.0    |
| 0.4523        | 23.0  | 9154  | 0.5905          | 0.4399 | 19.0    |
| 0.437         | 24.0  | 9552  | 0.5985          | 0.4369 | 19.0    |
| 0.437         | 25.0  | 9950  | 0.5960          | 0.4056 | 19.0    |
| 0.4229        | 26.0  | 10348 | 0.5962          | 0.4252 | 19.0    |
| 0.4091        | 27.0  | 10746 | 0.6049          | 0.4713 | 19.0    |
| 0.3965        | 28.0  | 11144 | 0.6118          | 0.4242 | 19.0    |
| 0.3842        | 29.0  | 11542 | 0.6170          | 0.3924 | 19.0    |
| 0.3842        | 30.0  | 11940 | 0.6114          | 0.3984 | 19.0    |
| 0.3718        | 31.0  | 12338 | 0.6304          | 0.4186 | 19.0    |
| 0.3585        | 32.0  | 12736 | 0.6364          | 0.3846 | 19.0    |
| 0.3473        | 33.0  | 13134 | 0.6325          | 0.4058 | 19.0    |
| 0.3377        | 34.0  | 13532 | 0.6434          | 0.3669 | 19.0    |
| 0.3377        | 35.0  | 13930 | 0.6559          | 0.396  | 19.0    |
| 0.3258        | 36.0  | 14328 | 0.6614          | 0.4449 | 19.0    |
| 0.3144        | 37.0  | 14726 | 0.6619          | 0.3988 | 19.0    |
| 0.3062        | 38.0  | 15124 | 0.6812          | 0.4133 | 19.0    |
| 0.2976        | 39.0  | 15522 | 0.6795          | 0.4102 | 19.0    |
| 0.2976        | 40.0  | 15920 | 0.6798          | 0.3953 | 19.0    |
| 0.2883        | 41.0  | 16318 | 0.7088          | 0.3846 | 19.0    |
| 0.2791        | 42.0  | 16716 | 0.7110          | 0.3701 | 19.0    |
| 0.2701        | 43.0  | 17114 | 0.7160          | 0.3985 | 19.0    |
| 0.2619        | 44.0  | 17512 | 0.7150          | 0.3654 | 19.0    |
| 0.2619        | 45.0  | 17910 | 0.7197          | 0.394  | 19.0    |
| 0.2527        | 46.0  | 18308 | 0.7387          | 0.4033 | 19.0    |
| 0.2444        | 47.0  | 18706 | 0.7438          | 0.389  | 19.0    |
| 0.239         | 48.0  | 19104 | 0.7597          | 0.3948 | 19.0    |
| 0.2303        | 49.0  | 19502 | 0.7645          | 0.3976 | 19.0    |
| 0.2303        | 50.0  | 19900 | 0.7786          | 0.385  | 19.0    |
| 0.2212        | 51.0  | 20298 | 0.7699          | 0.3948 | 19.0    |
| 0.2157        | 52.0  | 20696 | 0.7902          | 0.4265 | 19.0    |
| 0.2108        | 53.0  | 21094 | 0.7906          | 0.3924 | 19.0    |
| 0.2108        | 54.0  | 21492 | 0.8098          | 0.3849 | 19.0    |
| 0.2041        | 55.0  | 21890 | 0.8167          | 0.3888 | 19.0    |
| 0.1959        | 56.0  | 22288 | 0.8317          | 0.4139 | 19.0    |
| 0.1899        | 57.0  | 22686 | 0.8345          | 0.4136 | 19.0    |
| 0.1868        | 58.0  | 23084 | 0.8484          | 0.4093 | 19.0    |
| 0.1868        | 59.0  | 23482 | 0.8663          | 0.4013 | 19.0    |
| 0.1815        | 60.0  | 23880 | 0.8709          | 0.3858 | 19.0    |
| 0.1744        | 61.0  | 24278 | 0.8845          | 0.3716 | 19.0    |
| 0.1709        | 62.0  | 24676 | 0.8787          | 0.3781 | 19.0    |
| 0.1659        | 63.0  | 25074 | 0.8844          | 0.3642 | 19.0    |
| 0.1659        | 64.0  | 25472 | 0.9034          | 0.3818 | 19.0    |
| 0.1625        | 65.0  | 25870 | 0.9117          | 0.3522 | 19.0    |
| 0.1568        | 66.0  | 26268 | 0.9059          | 0.3892 | 19.0    |
| 0.1539        | 67.0  | 26666 | 0.9160          | 0.398  | 19.0    |
| 0.1501        | 68.0  | 27064 | 0.9333          | 0.3831 | 19.0    |
| 0.1501        | 69.0  | 27462 | 0.9351          | 0.4036 | 19.0    |
| 0.1461        | 70.0  | 27860 | 0.9484          | 0.3727 | 19.0    |
| 0.1413        | 71.0  | 28258 | 0.9522          | 0.3638 | 19.0    |
| 0.1405        | 72.0  | 28656 | 0.9725          | 0.3501 | 19.0    |
| 0.1365        | 73.0  | 29054 | 0.9698          | 0.372  | 19.0    |
| 0.1365        | 74.0  | 29452 | 0.9703          | 0.3727 | 19.0    |
| 0.1328        | 75.0  | 29850 | 0.9798          | 0.3834 | 19.0    |
| 0.1298        | 76.0  | 30248 | 0.9850          | 0.4008 | 19.0    |
| 0.1283        | 77.0  | 30646 | 0.9988          | 0.3815 | 19.0    |
| 0.1247        | 78.0  | 31044 | 0.9896          | 0.3621 | 19.0    |
| 0.1247        | 79.0  | 31442 | 1.0035          | 0.3761 | 19.0    |
| 0.1222        | 80.0  | 31840 | 1.0223          | 0.3729 | 19.0    |
| 0.1195        | 81.0  | 32238 | 1.0171          | 0.3866 | 19.0    |
| 0.1189        | 82.0  | 32636 | 1.0247          | 0.3698 | 19.0    |
| 0.1175        | 83.0  | 33034 | 1.0151          | 0.3657 | 19.0    |
| 0.1175        | 84.0  | 33432 | 1.0388          | 0.3786 | 19.0    |
| 0.1146        | 85.0  | 33830 | 1.0413          | 0.3737 | 19.0    |
| 0.1124        | 86.0  | 34228 | 1.0402          | 0.3803 | 19.0    |
| 0.1125        | 87.0  | 34626 | 1.0519          | 0.3746 | 19.0    |
| 0.1102        | 88.0  | 35024 | 1.0542          | 0.3863 | 19.0    |
| 0.1102        | 89.0  | 35422 | 1.0626          | 0.3839 | 19.0    |
| 0.1075        | 90.0  | 35820 | 1.0602          | 0.3615 | 19.0    |
| 0.1069        | 91.0  | 36218 | 1.0701          | 0.3692 | 19.0    |
| 0.1062        | 92.0  | 36616 | 1.0699          | 0.3719 | 19.0    |
| 0.1051        | 93.0  | 37014 | 1.0732          | 0.3667 | 19.0    |
| 0.1051        | 94.0  | 37412 | 1.0749          | 0.3701 | 19.0    |
| 0.1041        | 95.0  | 37810 | 1.0796          | 0.3744 | 19.0    |
| 0.1034        | 96.0  | 38208 | 1.0823          | 0.3771 | 19.0    |
| 0.1031        | 97.0  | 38606 | 1.0797          | 0.3775 | 19.0    |
| 0.1015        | 98.0  | 39004 | 1.0842          | 0.3822 | 19.0    |
| 0.1015        | 99.0  | 39402 | 1.0859          | 0.3839 | 19.0    |
| 0.1007        | 100.0 | 39800 | 1.0862          | 0.3829 | 19.0    |
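
Note that validation loss bottoms out around epoch 20 (0.5845) and rises steadily thereafter while training loss keeps falling, the usual signature of overfitting; the final checkpoint corresponds to the highest validation loss of the run. The per-epoch Bleu and Gen Len values also sit on a different scale than the headline figures (16.0295 and 98.8829), which suggests the final evaluation used different generation settings, though the card does not say. If retraining, early stopping with best-checkpoint reloading would likely help; a hedged sketch (the patience value is an assumption):

```python
# Hedged sketch: stop when eval_loss stops improving and reload the best
# checkpoint. The patience of 5 epochs is an assumption, not from the card.
from transformers import (
    EarlyStoppingCallback,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

args = Seq2SeqTrainingArguments(
    output_dir="byt5-base-es_maq",
    evaluation_strategy="epoch",
    save_strategy="epoch",            # must match evaluation_strategy
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",
    greater_is_better=False,
    num_train_epochs=100.0,
)

trainer = Seq2SeqTrainer(
    model=model,                      # assumed, as in the training sketch
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    callbacks=[EarlyStoppingCallback(early_stopping_patience=5)],
)
trainer.train()
```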

Framework versions

  • Transformers 4.29.2
  • Pytorch 2.0.1+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3