# flan-t5-small-en-simplif

This model is a fine-tuned version of google/flan-t5-small on an unspecified English text-simplification dataset. It achieves the following results on the evaluation set:

- Loss: 1.0235
- Rouge1: 37.4861
- Rouge2: 24.6089
- RougeL: 36.9311
- RougeLsum: 37.0089
- Gen Len: 18.2196
- Bleu: 6.4559
- Corpus Bleu: 1.7137
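The card reports both a sentence-averaged Bleu and a Corpus Bleu; the two differ because corpus-level BLEU pools n-gram counts across all sentence pairs before computing precision, rather than averaging per-sentence scores. A minimal, illustrative corpus-BLEU (uniform 1–4-gram weights, single reference per hypothesis) looks like this; it is a sketch of the standard formula, not the exact scorer used to produce the numbers above:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU: hypotheses/references are aligned lists of token lists."""
    clipped = [0] * max_n   # clipped n-gram matches, per order
    totals = [0] * max_n    # hypothesis n-gram counts, per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        hyp_len += len(hyp)
        ref_len += len(ref)
        for n in range(1, max_n + 1):
            hyp_counts = Counter(ngrams(hyp, n))
            ref_counts = Counter(ngrams(ref, n))
            totals[n - 1] += sum(hyp_counts.values())
            # Clip each n-gram's count by its count in the reference.
            clipped[n - 1] += sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
    if min(clipped) == 0:
        return 0.0  # any order with zero matches drives the geometric mean to 0
    log_precision = sum(math.log(c / t) for c, t in zip(clipped, totals)) / max_n
    # Brevity penalty: penalize hypotheses shorter than the references overall.
    brevity = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return brevity * math.exp(log_precision)
```

A perfect match scores 1.0 (BLEU is often reported ×100, as in the table below).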

## Model description

More information needed

## Intended uses & limitations

More information needed
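A hypothetical usage sketch, assuming the checkpoint is loaded with the Transformers `Auto` classes: the repo id comes from this card, while the `simplify` helper name and the generation settings are illustrative, not part of the released model.

```python
def simplify(text: str, model_id: str = "test3333333/flan-t5-small-en-simplif") -> str:
    """Illustrative helper: simplify one English sentence with this checkpoint."""
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    # The evaluation Gen Len averages ~18 tokens, so a small budget suffices.
    output_ids = model.generate(**inputs, max_new_tokens=32, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example (requires network access to the Hugging Face Hub):
# print(simplify("The committee deliberated at considerable length before reaching a verdict."))
```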

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
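The linear scheduler, assuming zero warmup steps (the Trainer default), decays the learning rate from `learning_rate` straight down to 0 over the total number of optimizer steps — 4180 here, i.e. 836 steps per epoch × 5 epochs, per the results table. A minimal sketch of that schedule:

```python
LEARNING_RATE = 5e-5
TOTAL_STEPS = 4180  # 836 steps/epoch * 5 epochs, from the training-results table

def linear_lr(step: int, base_lr: float = LEARNING_RATE, total: int = TOTAL_STEPS) -> float:
    """Linear decay with no warmup: base_lr at step 0, 0.0 at the final step."""
    return base_lr * max(0.0, (total - step) / total)
```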

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len | Bleu   | Corpus Bleu |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|:------:|:-----------:|
| 1.5910        | 1.0   | 836  | 1.2587          | 36.4802 | 23.5099 | 35.8961 | 35.9879   | 18.2100 | 5.6488 | 1.6728      |
| 1.2642        | 2.0   | 1672 | 1.1112          | 37.0889 | 24.1259 | 36.5258 | 36.6133   | 18.2148 | 5.7688 | 1.7062      |
| 1.1999        | 3.0   | 2508 | 1.0687          | 37.5481 | 24.7370 | 36.9950 | 37.0752   | 18.2148 | 6.5724 | 1.7500      |
| 1.1533        | 4.0   | 3344 | 1.0338          | 37.4724 | 24.5806 | 36.8996 | 36.9784   | 18.2196 | 6.4559 | 1.7049      |
| 1.1067        | 5.0   | 4180 | 1.0235          | 37.4861 | 24.6089 | 36.9311 | 37.0089   | 18.2196 | 6.4559 | 1.7137      |

## Framework versions

- Transformers 4.35.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.7
- Tokenizers 0.14.1