# t5-small-finetuned-chinese-to-hausa
This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an undocumented Chinese-to-Hausa parallel dataset. It achieves the following results on the evaluation set:
- Loss: 1.6751
- Bleu: 12.5282
- Gen Len: 18.5325
## Model description
More information needed
## Intended uses & limitations
More information needed
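The card does not document usage, but the checkpoint loads like any `transformers` seq2seq model. A minimal inference sketch follows; the task prefix is a guess, since the prefix used during fine-tuning (if any) is not documented:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "samu/t5-small-finetuned-chinese-to-hausa"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# T5 models are usually steered with a task prefix; the prefix used in
# this fine-tune (if any) is undocumented, so this one is hypothetical.
text = "translate Chinese to Hausa: 你好，世界。"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```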
## Training and evaluation data
More information needed
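For reference, a typical preprocessing step for a T5 translation fine-tune looks like the sketch below. The data source, the `zh`/`ha` column names, the prefix, and the truncation length are all placeholders, not taken from this card:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
raw_datasets = load_dataset("json", data_files={"train": "train.json"})  # placeholder source

prefix = "translate Chinese to Hausa: "  # hypothetical task prefix
max_length = 128  # placeholder; the actual truncation length is not documented

def preprocess(examples):
    # "zh" / "ha" are hypothetical column names for source and target text.
    inputs = [prefix + zh for zh in examples["zh"]]
    model_inputs = tokenizer(inputs, max_length=max_length, truncation=True)
    labels = tokenizer(text_target=examples["ha"], max_length=max_length, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw_datasets.map(preprocess, batched=True)
```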
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
- mixed_precision_training: Native AMP
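In `transformers` 4.44 terms, these settings correspond roughly to the following `Seq2SeqTrainingArguments`. This is a sketch: `output_dir`, the evaluation strategy, and `predict_with_generate` are assumptions inferred from the per-epoch results table below.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-chinese-to-hausa",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    fp16=True,                    # "Native AMP" mixed precision
    eval_strategy="epoch",        # inferred from the per-epoch results below
    predict_with_generate=True,   # assumed; needed to report Bleu / Gen Len
)
# The Adam betas (0.9, 0.999) and epsilon 1e-08 listed above are the
# Trainer's default optimizer settings, so nothing extra is needed here.
```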
### Training results
Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
---|---|---|---|---|---|
2.6717 | 1.0 | 846 | 1.9694 | 12.1664 | 18.7954 |
2.0063 | 2.0 | 1692 | 1.8146 | 10.5635 | 18.8736 |
1.8317 | 3.0 | 2538 | 1.7341 | 10.7724 | 18.9023 |
1.7706 | 4.0 | 3384 | 1.6942 | 11.676 | 18.0272 |
1.6908 | 5.0 | 4230 | 1.6608 | 11.654 | 17.8361 |
1.6333 | 6.0 | 5076 | 1.6336 | 11.6008 | 18.0251 |
1.5922 | 7.0 | 5922 | 1.6249 | 11.1834 | 18.7068 |
1.541 | 8.0 | 6768 | 1.6106 | 12.827 | 18.6533 |
1.5121 | 9.0 | 7614 | 1.6082 | 10.873 | 14.6468 |
1.4769 | 10.0 | 8460 | 1.5994 | 9.1287 | 15.2999 |
1.4358 | 11.0 | 9306 | 1.5943 | 12.1784 | 18.0381 |
1.4141 | 12.0 | 10152 | 1.5960 | 12.3004 | 18.6165 |
1.3879 | 13.0 | 10998 | 1.6087 | 11.6896 | 18.6615 |
1.3526 | 14.0 | 11844 | 1.6015 | 12.2844 | 18.6508 |
1.3365 | 15.0 | 12690 | 1.6085 | 11.9235 | 17.9056 |
1.3142 | 16.0 | 13536 | 1.6165 | 11.8504 | 17.6737 |
1.2846 | 17.0 | 14382 | 1.6198 | 12.4398 | 18.5284 |
1.2654 | 18.0 | 15228 | 1.6252 | 12.8486 | 17.7201 |
1.2532 | 19.0 | 16074 | 1.6363 | 12.1792 | 17.5936 |
1.231 | 20.0 | 16920 | 1.6423 | 12.3326 | 17.7331 |
1.2128 | 21.0 | 17766 | 1.6461 | 12.5054 | 18.668 |
1.2029 | 22.0 | 18612 | 1.6509 | 12.5588 | 18.5612 |
1.1899 | 23.0 | 19458 | 1.6570 | 12.1469 | 17.7074 |
1.1804 | 24.0 | 20304 | 1.6620 | 12.4007 | 17.7623 |
1.1728 | 25.0 | 21150 | 1.6681 | 12.6017 | 18.5794 |
1.1697 | 26.0 | 21996 | 1.6688 | 12.4686 | 18.5296 |
1.1661 | 27.0 | 22842 | 1.6720 | 12.5375 | 18.5321 |
1.1617 | 28.0 | 23688 | 1.6744 | 12.5354 | 18.5309 |
1.1606 | 29.0 | 24534 | 1.6752 | 12.529 | 18.533 |
1.1599 | 30.0 | 25380 | 1.6751 | 12.5282 | 18.5325 |
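Validation loss bottoms out around epoch 11 (1.5943) while training loss keeps falling, and Bleu peaks at epoch 18 (12.8486), so the final epoch-30 checkpoint sits somewhat past the best validation point. Bleu and Gen Len are presumably computed the way the standard `transformers` translation recipe does; the exact metric code is not documented, but it typically looks like this sketch:

```python
import evaluate
import numpy as np
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("samu/t5-small-finetuned-chinese-to-hausa")
metric = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    if isinstance(preds, tuple):
        preds = preds[0]
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Replace the -100 label padding used for loss masking before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = metric.compute(
        predictions=[p.strip() for p in decoded_preds],
        references=[[l.strip()] for l in decoded_labels],
    )
    # Gen Len = mean number of non-padding tokens in the generated outputs.
    gen_len = np.mean(
        [np.count_nonzero(p != tokenizer.pad_token_id) for p in preds]
    )
    return {"bleu": result["score"], "gen_len": gen_len}
```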
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1