# flanT5_small_EN2AR
This model is a fine-tuned version of [google/flan-t5-small](https://huggingface.co/google/flan-t5-small) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 3.2622
- Bleu: 2.0525
- Rouge: 0.087
- Gen Len: 13.1861
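
The card does not say which ROUGE variant is reported. For context, metrics of this kind are typically computed with the `evaluate` library; the sketch below uses placeholder predictions and references, since this model's actual evaluation data is unknown.

```python
import evaluate

# Placeholder predictions and references; the model's actual evaluation data
# is unknown (real outputs would be Arabic text).
predictions = ["example translation one", "example translation two"]
references = [["example reference one"], ["example reference two"]]

# sacreBLEU expects one list of reference strings per prediction.
bleu = evaluate.load("sacrebleu")
print("BLEU:", bleu.compute(predictions=predictions, references=references)["score"])

# ROUGE expects one reference string per prediction and returns
# rouge1/rouge2/rougeL scores.
rouge = evaluate.load("rouge")
print("ROUGE:", rouge.compute(predictions=predictions,
                              references=[refs[0] for refs in references]))

# "Gen Len" is the mean length of the generated sequences in tokens,
# approximated here with whitespace tokenization.
print("Gen Len:", sum(len(p.split()) for p in predictions) / len(predictions))
```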
## Model description

More information needed
## Intended uses & limitations

More information needed
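
For reference, the checkpoint loads like any `transformers` seq2seq model. The sketch below assumes the Hub repository ID `yasmineee/flanT5_small_EN2AR` and a FLAN-style translation prefix; since the training data is undocumented, the exact prompt format used during fine-tuning is an assumption.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "yasmineee/flanT5_small_EN2AR"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# FLAN-style instruction prefix; the prefix used in fine-tuning is an assumption.
text = "translate English to Arabic: The weather is nice today."
inputs = tokenizer(text, return_tensors="pt")

# The evaluation Gen Len is ~13 tokens, so 64 new tokens leaves ample headroom.
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```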
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 6
- eval_batch_size: 6
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 18
- mixed_precision_training: Native AMP
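
A minimal sketch of how the list above maps onto `Seq2SeqTrainingArguments` (the Adam betas and epsilon listed are the library defaults); the `output_dir` and the per-epoch evaluation cadence are assumptions inferred from the results table below.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical output directory; the remaining values mirror the list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="flanT5_small_EN2AR",
    learning_rate=2e-5,
    per_device_train_batch_size=6,
    per_device_eval_batch_size=6,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=18,
    fp16=True,                    # "Native AMP" mixed-precision training
    eval_strategy="epoch",        # assumption: matches the per-epoch results table
    predict_with_generate=True,   # required so BLEU/ROUGE/Gen Len can be computed
)
```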
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu   | Rouge  | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:-------:|
| No log        | 1.0   | 350  | 3.2894          | 2.0707 | 0.0829 | 12.7312 |
| 3.5472        | 2.0   | 700  | 3.2747          | 2.1163 | 0.0825 | 12.7439 |
| 3.4556        | 3.0   | 1050 | 3.2685          | 2.1947 | 0.0869 | 13.6647 |
| 3.4556        | 4.0   | 1400 | 3.2640          | 2.1784 | 0.0857 | 13.9304 |
| 3.3994        | 5.0   | 1750 | 3.2588          | 2.1677 | 0.0877 | 13.19   |
| 3.353         | 6.0   | 2100 | 3.2581          | 2.1906 | 0.0876 | 12.9535 |
| 3.353         | 7.0   | 2450 | 3.2581          | 2.1056 | 0.0873 | 13.232  |
| 3.3052        | 8.0   | 2800 | 3.2579          | 2.0887 | 0.0871 | 13.1443 |
| 3.2749        | 9.0   | 3150 | 3.2561          | 2.0541 | 0.0873 | 12.8396 |
| 3.2725        | 10.0  | 3500 | 3.2567          | 2.1265 | 0.0881 | 12.9237 |
| 3.2725        | 11.0  | 3850 | 3.2592          | 2.103  | 0.0876 | 12.9612 |
| 3.2151        | 12.0  | 4200 | 3.2606          | 2.0482 | 0.087  | 13.1824 |
| 3.2171        | 13.0  | 4550 | 3.2607          | 2.021  | 0.0867 | 13.2216 |
| 3.2171        | 14.0  | 4900 | 3.2609          | 2.0482 | 0.0878 | 13.2296 |
| 3.2013        | 15.0  | 5250 | 3.2614          | 2.0651 | 0.088  | 13.0349 |
| 3.1821        | 16.0  | 5600 | 3.2616          | 2.063  | 0.0879 | 12.9586 |
| 3.1821        | 17.0  | 5950 | 3.2623          | 2.0537 | 0.0869 | 13.1869 |
| 3.1866        | 18.0  | 6300 | 3.2622          | 2.0525 | 0.087  | 13.1861 |
### Framework versions
- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1