# finetuned-baseline-phase-1
This model is a fine-tuned version of [valhalla/t5-small-e2e-qg](https://huggingface.co/valhalla/t5-small-e2e-qg) on an unspecified dataset. It achieves the following result on the evaluation set:
- Loss: 3.1073
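Since the card provides no usage snippet, here is a minimal inference sketch. The model path is a placeholder for the actual Hub ID or local checkpoint, and the `generate questions:` prefix and `<sep>` separator are assumptions carried over from the valhalla/t5-small-e2e-qg base model's convention; verify they match the fine-tuning data.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_path = "finetuned-baseline-phase-1"  # placeholder: replace with the actual Hub ID or local path
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSeq2SeqLM.from_pretrained(model_path)

# The base model expects a "generate questions:" prefix and emits
# questions separated by "<sep>" (assumed here, not confirmed by the card).
text = (
    "generate questions: Python is a high-level programming language "
    "created by Guido van Rossum and first released in 1991."
)
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_length=64, num_beams=4)

decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
questions = [q.strip() for q in decoded.split("<sep>") if q.strip()]
print(questions)
```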
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
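The hyperparameters above map directly onto a `Seq2SeqTrainingArguments` object. The sketch below is an assumption-laden reconstruction, not the author's actual script: the `output_dir` is a placeholder, the Adam betas and epsilon listed above are the library defaults so they need no explicit arguments, and the dataset and `Trainer` wiring are omitted since the training data is unspecified.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="finetuned-baseline-phase-1",  # placeholder output directory
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=16,  # 4 * 16 = 64 effective train batch size
    num_train_epochs=20,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="steps",
    eval_steps=5,  # matches the 5-step evaluation cadence in the results table
)
```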
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
7.3947 | 0.14 | 5 | 6.5866 |
6.1276 | 0.29 | 10 | 5.0631 |
4.8984 | 0.43 | 15 | 4.1654 |
4.4942 | 0.57 | 20 | 3.9987 |
4.2374 | 0.72 | 25 | 3.7471 |
3.9935 | 0.86 | 30 | 3.6307 |
3.8155 | 1.01 | 35 | 3.5470 |
3.7181 | 1.15 | 40 | 3.4950 |
3.6391 | 1.29 | 45 | 3.4587 |
3.6432 | 1.44 | 50 | 3.4328 |
3.5728 | 1.58 | 55 | 3.4103 |
3.6185 | 1.72 | 60 | 3.3889 |
3.5931 | 1.87 | 65 | 3.3722 |
3.5249 | 2.01 | 70 | 3.3605 |
3.595 | 2.15 | 75 | 3.3459 |
3.5795 | 2.3 | 80 | 3.3356 |
3.4731 | 2.44 | 85 | 3.3281 |
3.4917 | 2.59 | 90 | 3.3216 |
3.4628 | 2.73 | 95 | 3.3140 |
3.4421 | 2.87 | 100 | 3.3065 |
3.4528 | 3.02 | 105 | 3.2972 |
3.4554 | 3.16 | 110 | 3.2884 |
3.4619 | 3.3 | 115 | 3.2827 |
3.4654 | 3.45 | 120 | 3.2778 |
3.3787 | 3.59 | 125 | 3.2735 |
3.3945 | 3.73 | 130 | 3.2690 |
3.458 | 3.88 | 135 | 3.2647 |
3.4034 | 4.02 | 140 | 3.2569 |
3.4042 | 4.17 | 145 | 3.2499 |
3.4147 | 4.31 | 150 | 3.2463 |
3.4611 | 4.45 | 155 | 3.2423 |
3.3803 | 4.6 | 160 | 3.2392 |
3.3861 | 4.74 | 165 | 3.2364 |
3.3503 | 4.88 | 170 | 3.2335 |
3.4182 | 5.03 | 175 | 3.2299 |
3.356 | 5.17 | 180 | 3.2286 |
3.3826 | 5.31 | 185 | 3.2260 |
3.3368 | 5.46 | 190 | 3.2221 |
3.3739 | 5.6 | 195 | 3.2160 |
3.4032 | 5.75 | 200 | 3.2112 |
3.3825 | 5.89 | 205 | 3.2075 |
3.3381 | 6.03 | 210 | 3.2055 |
3.3162 | 6.18 | 215 | 3.2033 |
3.2946 | 6.32 | 220 | 3.1988 |
3.3505 | 6.46 | 225 | 3.1944 |
3.3643 | 6.61 | 230 | 3.1921 |
3.336 | 6.75 | 235 | 3.1904 |
3.374 | 6.89 | 240 | 3.1905 |
3.3148 | 7.04 | 245 | 3.1859 |
3.3649 | 7.18 | 250 | 3.1829 |
3.2273 | 7.32 | 255 | 3.1835 |
3.305 | 7.47 | 260 | 3.1821 |
3.3225 | 7.61 | 265 | 3.1795 |
3.3526 | 7.76 | 270 | 3.1757 |
3.3127 | 7.9 | 275 | 3.1746 |
3.3137 | 8.04 | 280 | 3.1766 |
3.2641 | 8.19 | 285 | 3.1739 |
3.2587 | 8.33 | 290 | 3.1683 |
3.2954 | 8.47 | 295 | 3.1669 |
3.3443 | 8.62 | 300 | 3.1682 |
3.2783 | 8.76 | 305 | 3.1641 |
3.2698 | 8.9 | 310 | 3.1597 |
3.3021 | 9.05 | 315 | 3.1577 |
3.3145 | 9.19 | 320 | 3.1578 |
3.2308 | 9.34 | 325 | 3.1589 |
3.2509 | 9.48 | 330 | 3.1574 |
3.2615 | 9.62 | 335 | 3.1544 |
3.2387 | 9.77 | 340 | 3.1521 |
3.2738 | 9.91 | 345 | 3.1501 |
3.2565 | 10.05 | 350 | 3.1494 |
3.2863 | 10.2 | 355 | 3.1495 |
3.1892 | 10.34 | 360 | 3.1496 |
3.2688 | 10.48 | 365 | 3.1460 |
3.2417 | 10.63 | 370 | 3.1441 |
3.3144 | 10.77 | 375 | 3.1421 |
3.292 | 10.92 | 380 | 3.1390 |
3.2722 | 11.06 | 385 | 3.1372 |
3.2685 | 11.2 | 390 | 3.1368 |
3.2317 | 11.35 | 395 | 3.1367 |
3.2512 | 11.49 | 400 | 3.1390 |
3.2268 | 11.63 | 405 | 3.1400 |
3.2148 | 11.78 | 410 | 3.1386 |
3.2577 | 11.92 | 415 | 3.1368 |
3.2406 | 12.06 | 420 | 3.1344 |
3.2415 | 12.21 | 425 | 3.1343 |
3.2433 | 12.35 | 430 | 3.1348 |
3.2126 | 12.5 | 435 | 3.1324 |
3.2706 | 12.64 | 440 | 3.1295 |
3.189 | 12.78 | 445 | 3.1267 |
3.2343 | 12.93 | 450 | 3.1253 |
3.1968 | 13.07 | 455 | 3.1247 |
3.242 | 13.21 | 460 | 3.1255 |
3.2193 | 13.36 | 465 | 3.1259 |
3.2464 | 13.5 | 470 | 3.1254 |
3.2374 | 13.64 | 475 | 3.1241 |
3.2849 | 13.79 | 480 | 3.1217 |
3.2263 | 13.93 | 485 | 3.1203 |
3.2702 | 14.08 | 490 | 3.1187 |
3.3134 | 14.22 | 495 | 3.1177 |
3.1861 | 14.36 | 500 | 3.1176 |
3.2232 | 14.51 | 505 | 3.1180 |
3.1825 | 14.65 | 510 | 3.1180 |
3.2067 | 14.79 | 515 | 3.1178 |
3.1963 | 14.94 | 520 | 3.1165 |
3.2425 | 15.08 | 525 | 3.1153 |
3.1739 | 15.22 | 530 | 3.1150 |
3.1967 | 15.37 | 535 | 3.1152 |
3.2015 | 15.51 | 540 | 3.1156 |
3.1911 | 15.66 | 545 | 3.1156 |
3.2413 | 15.8 | 550 | 3.1146 |
3.2284 | 15.94 | 555 | 3.1138 |
3.2534 | 16.09 | 560 | 3.1128 |
3.2333 | 16.23 | 565 | 3.1118 |
3.1774 | 16.37 | 570 | 3.1117 |
3.1782 | 16.52 | 575 | 3.1118 |
3.1897 | 16.66 | 580 | 3.1123 |
3.197 | 16.8 | 585 | 3.1119 |
3.2257 | 16.95 | 590 | 3.1107 |
3.1869 | 17.09 | 595 | 3.1100 |
3.1515 | 17.24 | 600 | 3.1096 |
3.2433 | 17.38 | 605 | 3.1096 |
3.241 | 17.52 | 610 | 3.1089 |
3.2323 | 17.67 | 615 | 3.1090 |
3.1672 | 17.81 | 620 | 3.1088 |
3.1555 | 17.95 | 625 | 3.1087 |
3.2066 | 18.1 | 630 | 3.1087 |
3.1844 | 18.24 | 635 | 3.1087 |
3.2146 | 18.38 | 640 | 3.1086 |
3.2339 | 18.53 | 645 | 3.1083 |
3.2031 | 18.67 | 650 | 3.1080 |
3.1772 | 18.82 | 655 | 3.1078 |
3.1573 | 18.96 | 660 | 3.1076 |
3.2879 | 19.1 | 665 | 3.1074 |
3.2407 | 19.25 | 670 | 3.1073 |
3.1676 | 19.39 | 675 | 3.1073 |
3.2272 | 19.53 | 680 | 3.1073 |
### Framework versions
- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1