SQL_Kaggle

This model is a fine-tuned version of google/flan-t5-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0548
  • Bleu: 43.7723
  • Gen Len: 18.9204
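The checkpoint can be loaded with the standard `transformers` seq2seq API. A minimal sketch follows; the repo id is taken from this card, but the natural-language prompt below is a hypothetical example, since the card does not document the input template used during training.

```python
# Minimal usage sketch for the fine-tuned checkpoint.
# The prompt format is an assumption; the card does not
# specify how inputs were formatted during training.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "AayushShah/SQL_Kaggle"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical text-to-SQL question.
question = "Show the names of all employees hired after 2020."
inputs = tokenizer(question, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```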

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 12
  • eval_batch_size: 12
  • seed: 42
  • gradient_accumulation_steps: 20
  • total_train_batch_size: 240
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
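The total_train_batch_size above is not an independent setting: it is the per-device batch size multiplied by the gradient-accumulation steps. A one-line sketch of that arithmetic, using the values from the list above:

```python
# Effective (total) train batch size = per-device batch size
# * gradient-accumulation steps, as reported in this card.
train_batch_size = 12
gradient_accumulation_steps = 20
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # → 240
```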

Training results

Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len
0.3225        | 0.12  | 100  | 0.1124          | 42.6318 | 18.9124
0.1451        | 0.23  | 200  | 0.0851          | 43.0481 | 18.9202
0.1218        | 0.35  | 300  | 0.0733          | 43.4809 | 18.9254
0.1062        | 0.46  | 400  | 0.0670          | 43.4753 | 18.9186
0.0978        | 0.58  | 500  | 0.0621          | 43.59   | 18.9205
0.0901        | 0.69  | 600  | 0.0587          | 43.68   | 18.9214
0.0859        | 0.81  | 700  | 0.0565          | 43.7207 | 18.9206
0.0824        | 0.93  | 800  | 0.0548          | 43.7723 | 18.9204

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.0.0
  • Datasets 2.14.5
  • Tokenizers 0.13.3