nandakishormpai committed on
Commit
7400230
1 Parent(s): 94428c5

update model card README.md

Files changed (1)
  1. README.md +34 -35
README.md CHANGED
@@ -16,11 +16,11 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.6488
- - Rouge1: 25.2912
- - Rouge2: 9.5617
- - Rougel: 22.6455
- - Rougelsum: 22.617
+ - Loss: 1.8196
+ - Rouge1: 25.0142
+ - Rouge2: 8.1802
+ - Rougel: 22.77
+ - Rougelsum: 22.8017
  - Gen Len: 19.0
 
  ## Model description
@@ -46,42 +46,41 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 30
+ - num_epochs: 40
  - mixed_precision_training: Native AMP
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
  |:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
- | 1.7857 | 1.0 | 66 | 1.2317 | 2.609 | 0.0 | 2.5996 | 2.6132 | 19.0 |
- | 1.2147 | 2.0 | 132 | 1.0105 | 1.8041 | 0.0866 | 1.7432 | 1.7451 | 18.9848 |
- | 1.0625 | 3.0 | 198 | 0.9154 | 2.5794 | 0.5266 | 2.4272 | 2.4305 | 19.0 |
- | 0.9849 | 4.0 | 264 | 0.8615 | 19.2823 | 4.6729 | 17.639 | 17.6209 | 19.0 |
- | 0.9363 | 5.0 | 330 | 0.8248 | 22.4371 | 5.4177 | 20.0806 | 20.1164 | 19.0 |
- | 0.8995 | 6.0 | 396 | 0.7943 | 24.162 | 6.1729 | 21.3733 | 21.3621 | 19.0 |
- | 0.8774 | 7.0 | 462 | 0.7736 | 24.0765 | 6.4219 | 21.2588 | 21.2715 | 19.0 |
- | 0.8544 | 8.0 | 528 | 0.7558 | 24.4842 | 6.7685 | 21.8275 | 21.8459 | 19.0 |
- | 0.8334 | 9.0 | 594 | 0.7416 | 25.009 | 7.8025 | 22.3227 | 22.3454 | 19.0 |
- | 0.8212 | 10.0 | 660 | 0.7300 | 24.9532 | 7.9013 | 22.4275 | 22.4138 | 19.0 |
- | 0.8118 | 11.0 | 726 | 0.7208 | 25.4191 | 7.8727 | 22.696 | 22.6894 | 19.0 |
- | 0.7994 | 12.0 | 792 | 0.7114 | 25.4852 | 8.1776 | 22.4479 | 22.4522 | 19.0 |
- | 0.7904 | 13.0 | 858 | 0.7020 | 25.4509 | 8.7603 | 22.7333 | 22.7213 | 19.0 |
- | 0.7829 | 14.0 | 924 | 0.6958 | 25.0587 | 8.9197 | 22.6393 | 22.6207 | 19.0 |
- | 0.7764 | 15.0 | 990 | 0.6897 | 25.0867 | 9.0392 | 22.6598 | 22.6808 | 19.0 |
- | 0.7703 | 16.0 | 1056 | 0.6841 | 25.2402 | 9.3991 | 22.6384 | 22.6226 | 19.0 |
- | 0.7633 | 17.0 | 1122 | 0.6781 | 25.7124 | 9.5485 | 23.0809 | 23.0677 | 19.0 |
- | 0.7591 | 18.0 | 1188 | 0.6744 | 25.0679 | 9.4176 | 22.5225 | 22.4913 | 19.0 |
- | 0.7553 | 19.0 | 1254 | 0.6695 | 25.3046 | 9.2343 | 22.931 | 22.8948 | 19.0 |
- | 0.7514 | 20.0 | 1320 | 0.6661 | 25.3134 | 9.3234 | 22.8281 | 22.8198 | 19.0 |
- | 0.746 | 21.0 | 1386 | 0.6630 | 25.3837 | 9.2876 | 22.806 | 22.7907 | 19.0 |
- | 0.741 | 22.0 | 1452 | 0.6592 | 25.4751 | 9.3792 | 22.9321 | 22.9158 | 19.0 |
- | 0.7404 | 23.0 | 1518 | 0.6566 | 25.5734 | 9.4539 | 23.0627 | 23.063 | 19.0 |
- | 0.735 | 24.0 | 1584 | 0.6555 | 25.2529 | 9.5285 | 22.6775 | 22.6504 | 19.0 |
- | 0.7334 | 25.0 | 1650 | 0.6536 | 25.2281 | 9.4984 | 22.3494 | 22.3364 | 19.0 |
- | 0.7352 | 26.0 | 1716 | 0.6514 | 25.3464 | 9.7302 | 22.6918 | 22.6786 | 19.0 |
- | 0.7322 | 27.0 | 1782 | 0.6502 | 25.2349 | 9.6516 | 22.6298 | 22.6005 | 19.0 |
- | 0.7333 | 28.0 | 1848 | 0.6492 | 25.288 | 9.5646 | 22.6836 | 22.6629 | 19.0 |
- | 0.7291 | 29.0 | 1914 | 0.6488 | 25.2912 | 9.5617 | 22.6455 | 22.617 | 19.0 |
+ | 2.8665 | 1.0 | 87 | 2.3813 | 14.8999 | 2.5462 | 13.9887 | 13.8228 | 19.0 |
+ | 2.4139 | 2.0 | 174 | 2.1669 | 18.1676 | 3.2466 | 16.6849 | 16.6869 | 19.0 |
+ | 2.2398 | 3.0 | 261 | 2.0699 | 19.2735 | 4.639 | 17.8309 | 17.8346 | 19.0 |
+ | 2.1557 | 4.0 | 348 | 2.0234 | 20.529 | 4.8054 | 19.0494 | 19.0052 | 19.0 |
+ | 2.097 | 5.0 | 435 | 1.9936 | 21.4325 | 5.6856 | 19.6801 | 19.6991 | 19.0 |
+ | 2.06 | 6.0 | 522 | 1.9644 | 21.2128 | 5.6864 | 19.6366 | 19.6068 | 19.0 |
+ | 2.0136 | 7.0 | 609 | 1.9457 | 21.9194 | 5.9025 | 20.1281 | 20.068 | 19.0 |
+ | 1.9781 | 8.0 | 696 | 1.9335 | 22.2101 | 6.366 | 20.6869 | 20.6609 | 19.0 |
+ | 1.9459 | 9.0 | 783 | 1.9192 | 22.8154 | 6.612 | 21.1065 | 21.1091 | 19.0 |
+ | 1.943 | 10.0 | 870 | 1.9074 | 23.7665 | 7.0722 | 22.033 | 22.0371 | 19.0 |
+ | 1.9309 | 11.0 | 957 | 1.8946 | 24.0329 | 7.3522 | 22.2798 | 22.2827 | 19.0 |
+ | 1.9028 | 12.0 | 1044 | 1.8856 | 24.6311 | 7.6312 | 22.6211 | 22.6265 | 19.0 |
+ | 1.8837 | 13.0 | 1131 | 1.8808 | 23.8252 | 7.2919 | 22.2651 | 22.2359 | 19.0 |
+ | 1.8606 | 14.0 | 1218 | 1.8751 | 23.875 | 7.6105 | 21.9304 | 21.9311 | 19.0 |
+ | 1.8386 | 15.0 | 1305 | 1.8661 | 24.5944 | 7.394 | 22.6082 | 22.5901 | 19.0 |
+ | 1.8313 | 16.0 | 1392 | 1.8598 | 24.6417 | 7.8094 | 22.6391 | 22.6353 | 19.0 |
+ | 1.821 | 17.0 | 1479 | 1.8568 | 24.6872 | 7.55 | 22.7157 | 22.7871 | 19.0 |
+ | 1.8092 | 18.0 | 1566 | 1.8508 | 24.6133 | 7.6888 | 22.6948 | 22.7972 | 19.0 |
+ | 1.8024 | 19.0 | 1653 | 1.8483 | 25.1081 | 7.61 | 22.8889 | 22.8858 | 19.0 |
+ | 1.7963 | 20.0 | 1740 | 1.8417 | 24.8799 | 7.6186 | 22.9405 | 22.9435 | 19.0 |
+ | 1.7775 | 21.0 | 1827 | 1.8383 | 25.3856 | 8.0504 | 23.1873 | 23.1687 | 19.0 |
+ | 1.7782 | 22.0 | 1914 | 1.8366 | 25.3015 | 8.2145 | 23.3779 | 23.3797 | 19.0 |
+ | 1.7619 | 23.0 | 2001 | 1.8329 | 24.8709 | 7.397 | 22.5124 | 22.5905 | 19.0 |
+ | 1.7625 | 24.0 | 2088 | 1.8304 | 25.0142 | 8.1525 | 22.9442 | 23.0429 | 19.0 |
+ | 1.7461 | 25.0 | 2175 | 1.8260 | 25.2686 | 8.3042 | 23.1614 | 23.2863 | 19.0 |
+ | 1.7433 | 26.0 | 2262 | 1.8228 | 25.4987 | 8.4777 | 23.2049 | 23.2753 | 19.0 |
+ | 1.7439 | 27.0 | 2349 | 1.8199 | 25.0074 | 8.2618 | 22.799 | 22.8579 | 19.0 |
+ | 1.7182 | 28.0 | 2436 | 1.8196 | 25.0142 | 8.1802 | 22.77 | 22.8017 | 19.0 |
 
 
  ### Framework versions
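
The updated card lists the training configuration but no training snippet. As a rough illustration only, the sketch below shows how the hyperparameters in this diff could map onto `Seq2SeqTrainingArguments` from `transformers`, assuming the standard `Seq2SeqTrainer` workflow; the output directory, learning rate, batch sizes, dataset, and metric function are not visible in this diff and are left as placeholders. This is not the author's actual script.

```python
from transformers import Seq2SeqTrainingArguments

# Mirror of the hyperparameters listed in this card; anything not visible in
# the diff (output dir, learning rate, batch sizes, dataset) is a placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned",   # placeholder, not the author's path
    seed=42,
    adam_beta1=0.9,                    # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                 # and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=40,               # num_epochs after this commit (was 30)
    fp16=True,                         # mixed_precision_training: Native AMP (needs a CUDA device)
    predict_with_generate=True,        # so ROUGE can be computed from generated summaries at eval time
    # Per-epoch evaluation would match the results table; the argument is
    # evaluation_strategy="epoch" on older transformers, eval_strategy on newer.
    # learning_rate=..., per_device_train_batch_size=...,  # not shown in this hunk
)

# The model is a t5-small fine-tune, so training would wire up roughly as:
# model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
# trainer = Seq2SeqTrainer(model=model, args=training_args,
#                          train_dataset=..., eval_dataset=...,
#                          tokenizer=tokenizer, compute_metrics=compute_rouge)
# trainer.train()
```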