zera09 committed on
Commit 7e40baa · verified · 1 Parent(s): fbef173

End of training

Files changed (1):
  1. README.md +38 -28
README.md CHANGED
@@ -18,12 +18,12 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google/long-t5-tglobal-base](https://huggingface.co/google/long-t5-tglobal-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.5044
- - Rouge1: 0.5154
- - Rouge2: 0.3285
- - Rougel: 0.4684
- - Rougelsum: 0.4689
- - Gen Len: 26.0255
+ - Loss: 2.1612
+ - Rouge1: 0.5309
+ - Rouge2: 0.3406
+ - Rougel: 0.4779
+ - Rougelsum: 0.4778
+ - Gen Len: 30.6175
 
 ## Model description
 
@@ -42,38 +42,48 @@ More information needed
 
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - learning_rate: 2e-05
+ - learning_rate: 0.0001
 - train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 20
+ - num_epochs: 30
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
 |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
- | 2.287 | 1.0 | 1000 | 1.7000 | 0.46 | 0.2766 | 0.4154 | 0.4159 | 25.549 |
- | 2.0938 | 2.0 | 2000 | 1.6285 | 0.4831 | 0.2971 | 0.4376 | 0.4382 | 26.247 |
- | 1.9825 | 3.0 | 3000 | 1.5920 | 0.4882 | 0.3029 | 0.4429 | 0.4434 | 25.7415 |
- | 1.9024 | 4.0 | 4000 | 1.5721 | 0.4936 | 0.3088 | 0.448 | 0.4489 | 25.4545 |
- | 1.8594 | 5.0 | 5000 | 1.5558 | 0.4933 | 0.3085 | 0.4477 | 0.448 | 26.5305 |
- | 1.7911 | 6.0 | 6000 | 1.5474 | 0.4971 | 0.312 | 0.4519 | 0.4525 | 26.0055 |
- | 1.7713 | 7.0 | 7000 | 1.5340 | 0.4984 | 0.314 | 0.4529 | 0.4536 | 25.853 |
- | 1.7167 | 8.0 | 8000 | 1.5247 | 0.5029 | 0.3186 | 0.4571 | 0.4579 | 26.2305 |
- | 1.6652 | 9.0 | 9000 | 1.5205 | 0.5052 | 0.3191 | 0.4595 | 0.46 | 26.017 |
- | 1.6657 | 10.0 | 10000 | 1.5170 | 0.5058 | 0.3196 | 0.4592 | 0.4598 | 26.2015 |
- | 1.6267 | 11.0 | 11000 | 1.5121 | 0.5078 | 0.3218 | 0.4617 | 0.4624 | 25.857 |
- | 1.5988 | 12.0 | 12000 | 1.5113 | 0.5094 | 0.3245 | 0.4631 | 0.4637 | 25.9415 |
- | 1.5983 | 13.0 | 13000 | 1.5049 | 0.5074 | 0.3214 | 0.4616 | 0.462 | 25.74 |
- | 1.5806 | 14.0 | 14000 | 1.5083 | 0.5114 | 0.3253 | 0.4642 | 0.4648 | 26.214 |
- | 1.588 | 15.0 | 15000 | 1.5038 | 0.5116 | 0.3257 | 0.4646 | 0.4649 | 26.2325 |
- | 1.5611 | 16.0 | 16000 | 1.5039 | 0.5126 | 0.3267 | 0.4666 | 0.4671 | 26.1765 |
- | 1.5723 | 17.0 | 17000 | 1.5021 | 0.5137 | 0.327 | 0.4667 | 0.4672 | 25.9735 |
- | 1.5319 | 18.0 | 18000 | 1.5041 | 0.5132 | 0.3265 | 0.4667 | 0.4672 | 25.9945 |
- | 1.5403 | 19.0 | 19000 | 1.5045 | 0.5151 | 0.3283 | 0.4682 | 0.4687 | 26.089 |
- | 1.5472 | 20.0 | 20000 | 1.5044 | 0.5154 | 0.3285 | 0.4684 | 0.4689 | 26.0255 |
+ | 2.0161 | 1.0 | 1000 | 1.5665 | 0.4911 | 0.3059 | 0.4451 | 0.4451 | 25.5255 |
+ | 1.7658 | 2.0 | 2000 | 1.5150 | 0.5026 | 0.3142 | 0.4559 | 0.4557 | 26.8015 |
+ | 1.5969 | 3.0 | 3000 | 1.5031 | 0.51 | 0.3238 | 0.4628 | 0.4626 | 26.0075 |
+ | 1.4638 | 4.0 | 4000 | 1.5048 | 0.5189 | 0.3348 | 0.4724 | 0.4724 | 26.878 |
+ | 1.3675 | 5.0 | 5000 | 1.5363 | 0.5233 | 0.3369 | 0.4769 | 0.477 | 27.204 |
+ | 1.249 | 6.0 | 6000 | 1.5550 | 0.5206 | 0.3376 | 0.4762 | 0.4759 | 25.569 |
+ | 1.1861 | 7.0 | 7000 | 1.5511 | 0.5283 | 0.3444 | 0.4825 | 0.4824 | 26.8355 |
+ | 1.0985 | 8.0 | 8000 | 1.5838 | 0.5284 | 0.342 | 0.4792 | 0.4792 | 28.631 |
+ | 1.0178 | 9.0 | 9000 | 1.6231 | 0.5331 | 0.3451 | 0.4827 | 0.4828 | 28.7125 |
+ | 0.9649 | 10.0 | 10000 | 1.6392 | 0.5262 | 0.3384 | 0.4762 | 0.4762 | 29.0855 |
+ | 0.9069 | 11.0 | 11000 | 1.6758 | 0.5307 | 0.3421 | 0.4808 | 0.4804 | 28.9355 |
+ | 0.8472 | 12.0 | 12000 | 1.7137 | 0.5304 | 0.3458 | 0.481 | 0.4809 | 29.29 |
+ | 0.8087 | 13.0 | 13000 | 1.7478 | 0.5287 | 0.342 | 0.4789 | 0.4786 | 29.5185 |
+ | 0.773 | 14.0 | 14000 | 1.7628 | 0.5302 | 0.3436 | 0.4801 | 0.4801 | 29.725 |
+ | 0.7271 | 15.0 | 15000 | 1.8112 | 0.5293 | 0.3418 | 0.4789 | 0.4786 | 30.188 |
+ | 0.6919 | 16.0 | 16000 | 1.8520 | 0.5293 | 0.342 | 0.4778 | 0.4778 | 30.4125 |
+ | 0.665 | 17.0 | 17000 | 1.8738 | 0.5341 | 0.3432 | 0.4821 | 0.482 | 29.534 |
+ | 0.6242 | 18.0 | 18000 | 1.9228 | 0.5314 | 0.3439 | 0.4793 | 0.4792 | 29.2675 |
+ | 0.6024 | 19.0 | 19000 | 1.9288 | 0.535 | 0.347 | 0.4824 | 0.4823 | 29.852 |
+ | 0.5791 | 20.0 | 20000 | 1.9614 | 0.531 | 0.3417 | 0.4793 | 0.4791 | 29.754 |
+ | 0.5445 | 21.0 | 21000 | 2.0021 | 0.5302 | 0.3411 | 0.4784 | 0.4783 | 31.0095 |
+ | 0.5355 | 22.0 | 22000 | 2.0283 | 0.5318 | 0.3432 | 0.4792 | 0.4794 | 30.2985 |
+ | 0.5172 | 23.0 | 23000 | 2.0588 | 0.5296 | 0.3413 | 0.4775 | 0.4774 | 30.463 |
+ | 0.4968 | 24.0 | 24000 | 2.0907 | 0.5311 | 0.3423 | 0.4781 | 0.478 | 31.0295 |
+ | 0.4821 | 25.0 | 25000 | 2.0964 | 0.5318 | 0.3428 | 0.4792 | 0.4793 | 30.8365 |
+ | 0.4727 | 26.0 | 26000 | 2.1195 | 0.5317 | 0.3424 | 0.4789 | 0.4788 | 30.391 |
+ | 0.458 | 27.0 | 27000 | 2.1357 | 0.5301 | 0.3391 | 0.4761 | 0.4761 | 30.9145 |
+ | 0.4454 | 28.0 | 28000 | 2.1648 | 0.531 | 0.3409 | 0.4774 | 0.4774 | 31.1835 |
+ | 0.444 | 29.0 | 29000 | 2.1570 | 0.532 | 0.3418 | 0.4792 | 0.4791 | 30.596 |
+ | 0.4349 | 30.0 | 30000 | 2.1612 | 0.5309 | 0.3406 | 0.4779 | 0.4778 | 30.6175 |
 
 
 ### Framework versions
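
This commit raises the peak learning rate from 2e-05 to 0.0001 and extends training from 20 to 30 epochs, still with `lr_scheduler_type: linear` at 1000 optimizer steps per epoch. As a minimal sketch of what that schedule implies (assuming zero warmup steps, which the card does not state either way):

```python
# Sketch of a linear learning-rate decay matching the card's settings:
# learning_rate=0.0001, lr_scheduler_type=linear, 30 epochs x 1000 steps.
# Zero warmup is an assumption; the card does not list a warmup count.

BASE_LR = 1e-4
TOTAL_STEPS = 30_000  # 30 epochs x 1000 optimizer steps per epoch


def linear_lr(step: int, base_lr: float = BASE_LR,
              total_steps: int = TOTAL_STEPS) -> float:
    """Decay the rate linearly from base_lr at step 0 to 0 at total_steps."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps


print(linear_lr(0))       # 0.0001  (full rate at the start)
print(linear_lr(15_000))  # 5e-05   (half the rate at the midpoint)
print(linear_lr(30_000))  # 0.0     (fully decayed at the final step)
```

Under this schedule the effective rate during most of training is well above the old 2e-05 peak, which is consistent with the new table: training loss falls much faster, while validation loss bottoms out around epoch 3-4 and then climbs.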