---
license: apache-2.0
tags:
  - generated_from_trainer
model-index:
  - name: t5-text2sql_v1
    results: []
---

# t5-text2sql_v1

This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.0733
- Rouge2 Precision: 0.9124
- Rouge2 Recall: 0.405
- Rouge2 Fmeasure: 0.5291

## Model description

More information needed

## Intended uses & limitations

More information needed
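
Until fuller documentation is added, the sketch below shows one plausible way to query the model for text-to-SQL generation with Hugging Face Transformers. The hub id `mousaazari/t5-text2sql_v1`, the absence of a task prefix, and the example question are assumptions, not details confirmed by this card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hub id inferred from this card's name and owner; adjust if the checkpoint lives elsewhere.
model_id = "mousaazari/t5-text2sql_v1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The input format used during fine-tuning (e.g. a task prefix or a serialized
# database schema) is not documented here; a plain natural-language question is assumed.
question = "How many employees work in the sales department?"
inputs = tokenizer(question, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```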

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `Seq2SeqTrainingArguments` reconstruction follows the list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
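
For reference, these values map onto Transformers training arguments roughly as in the sketch below. This is reconstructed from the list above; `output_dir` and `predict_with_generate` are assumptions not recorded in the card.

```python
from transformers import Seq2SeqTrainingArguments

# Reconstructed from the hyperparameter list above; output_dir and
# predict_with_generate are assumptions, not recorded in this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-text2sql_v1",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    predict_with_generate=True,  # needed for generation-based metrics such as ROUGE
)
```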

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure |
|:-------------:|:-----:|:----:|:---------------:|:----------------:|:-------------:|:---------------:|
| No log        | 1.0   | 11   | 1.9724          | 0.0931           | 0.0257        | 0.0386          |
| No log        | 2.0   | 22   | 1.2952          | 0.0923           | 0.0261        | 0.0386          |
| No log        | 3.0   | 33   | 0.8639          | 0.1081           | 0.0261        | 0.0405          |
| No log        | 4.0   | 44   | 0.5684          | 0.2273           | 0.0945        | 0.1281          |
| No log        | 5.0   | 55   | 0.3793          | 0.4304           | 0.1823        | 0.2449          |
| No log        | 6.0   | 66   | 0.2920          | 0.6664           | 0.3135        | 0.401           |
| No log        | 7.0   | 77   | 0.2351          | 0.7222           | 0.3114        | 0.4113          |
| No log        | 8.0   | 88   | 0.2055          | 0.7373           | 0.3141        | 0.4147          |
| No log        | 9.0   | 99   | 0.1787          | 0.7308           | 0.3023        | 0.4055          |
| No log        | 10.0  | 110  | 0.1540          | 0.7755           | 0.3297        | 0.4373          |
| No log        | 11.0  | 121  | 0.1406          | 0.7661           | 0.3217        | 0.4269          |
| No log        | 12.0  | 132  | 0.1299          | 0.8462           | 0.3729        | 0.488           |
| No log        | 13.0  | 143  | 0.1172          | 0.8169           | 0.353         | 0.4671          |
| No log        | 14.0  | 154  | 0.1133          | 0.8509           | 0.382         | 0.4972          |
| No log        | 15.0  | 165  | 0.1049          | 0.8509           | 0.382         | 0.4972          |
| No log        | 16.0  | 176  | 0.0988          | 0.8234           | 0.3471        | 0.462           |
| No log        | 17.0  | 187  | 0.0921          | 0.8696           | 0.3852        | 0.5048          |
| No log        | 18.0  | 198  | 0.0877          | 0.8575           | 0.3676        | 0.488           |
| No log        | 19.0  | 209  | 0.0878          | 0.8575           | 0.3648        | 0.485           |
| No log        | 20.0  | 220  | 0.0849          | 0.8575           | 0.3648        | 0.485           |
| No log        | 21.0  | 231  | 0.0806          | 0.8784           | 0.3785        | 0.499           |
| No log        | 22.0  | 242  | 0.0791          | 0.9217           | 0.4101        | 0.5348          |
| No log        | 23.0  | 253  | 0.0794          | 0.8959           | 0.3901        | 0.5133          |
| No log        | 24.0  | 264  | 0.0773          | 0.9198           | 0.412         | 0.537           |
| No log        | 25.0  | 275  | 0.0744          | 0.9217           | 0.4101        | 0.5348          |
| No log        | 26.0  | 286  | 0.0735          | 0.9217           | 0.4101        | 0.5348          |
| No log        | 27.0  | 297  | 0.0742          | 0.9257           | 0.4136        | 0.5394          |
| No log        | 28.0  | 308  | 0.0740          | 0.9124           | 0.405         | 0.5291          |
| No log        | 29.0  | 319  | 0.0734          | 0.9124           | 0.405         | 0.5291          |
| No log        | 30.0  | 330  | 0.0733          | 0.9124           | 0.405         | 0.5291          |
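
The metric code is not included in the card. A typical `compute_metrics` function that yields these Rouge2 precision/recall/F-measure columns, using `datasets.load_metric` as was common with Transformers 4.21 / Datasets 2.4, would look roughly like the sketch below; the tokenizer choice and rounding are assumptions.

```python
import numpy as np
from datasets import load_metric
from transformers import AutoTokenizer

rouge = load_metric("rouge")  # requires the rouge_score package
tokenizer = AutoTokenizer.from_pretrained("t5-base")  # assumed; the card does not name the tokenizer

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # Replace label padding (-100) before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = rouge.compute(predictions=decoded_preds, references=decoded_labels,
                           rouge_types=["rouge2"])["rouge2"].mid
    return {
        "rouge2_precision": round(result.precision, 4),
        "rouge2_recall": round(result.recall, 4),
        "rouge2_fmeasure": round(result.fmeasure, 4),
    }
```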

### Framework versions

- Transformers 4.21.1
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1