sickcell69 committed on
Commit d437d6c
1 Parent(s): 4ed5b4f

End of training

README.md CHANGED
@@ -2,6 +2,8 @@
  base_model: dccuchile/bert-base-spanish-wwm-uncased
  tags:
  - generated_from_trainer
  model-index:
  - name: bert-base-spanish-wwm-uncased-finetuned-github_cybersecurity_READMEs
  results: []
@@ -14,7 +16,8 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-uncased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 2.6423

  ## Model description

@@ -33,26 +36,215 @@ More information needed
  ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 2e-05
- - train_batch_size: 16
- - eval_batch_size: 16
  - seed: 42
- - gradient_accumulation_steps: 2
- - total_train_batch_size: 32
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - lr_scheduler_warmup_steps: 500
- - num_epochs: 5

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss |
- |:-------------:|:-----:|:----:|:---------------:|
- | No log | 1.0 | 58 | 4.9842 |
- | No log | 2.0 | 116 | 3.8913 |
- | No log | 3.0 | 174 | 3.2654 |
- | No log | 4.0 | 232 | 2.9281 |
- | No log | 5.0 | 290 | 2.6200 |

  ### Framework versions
 
  base_model: dccuchile/bert-base-spanish-wwm-uncased
  tags:
  - generated_from_trainer
+ metrics:
+ - accuracy
  model-index:
  - name: bert-base-spanish-wwm-uncased-finetuned-github_cybersecurity_READMEs
  results: []
 
  This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-uncased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased) on an unknown dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 1.3626
+ - Accuracy: 0.7721
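
The card does not state the downstream task. Given that the base checkpoint is a masked-language model and the evaluation reports a per-prediction accuracy, loading the fine-tuned checkpoint for fill-mask is the most plausible usage. A minimal sketch; the repo id (commit author plus model name) and the example sentence are assumptions, not taken from the card:

```python
from transformers import pipeline

# Assumed repo id: the commit author's namespace plus the model name from this card.
repo_id = "sickcell69/bert-base-spanish-wwm-uncased-finetuned-github_cybersecurity_READMEs"

# Fill-mask is an assumption based on the BERT masked-LM base model;
# the card itself leaves the task unspecified.
fill_mask = pipeline("fill-mask", model=repo_id)

# Hypothetical prompt; [MASK] is the mask token of the BETO tokenizer.
print(fill_mask("Esta herramienta detecta [MASK] en la red."))
```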

  ## Model description

 
  ### Training hyperparameters

  The following hyperparameters were used during training:
+ - learning_rate: 3e-05
+ - train_batch_size: 32
+ - eval_batch_size: 32
  - seed: 42
+ - gradient_accumulation_steps: 4
+ - total_train_batch_size: 128
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 1000
+ - num_epochs: 200
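
The reported total_train_batch_size of 128 is the per-device batch size of 32 multiplied by the 4 gradient-accumulation steps. As a minimal sketch of how these values map onto `transformers.TrainingArguments` (not the author's actual training script; the output directory and evaluation strategy are assumptions):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-finetuned-github_cybersecurity_READMEs",  # assumed name
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 * 4 = 128 effective train batch size
    num_train_epochs=200,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    seed=42,
    evaluation_strategy="epoch",     # assumed; the card reports per-epoch metrics
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default
# optimizer, so that line of the card needs no extra arguments.
```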
 
  ### Training results

+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:-------------:|:------:|:----:|:---------------:|:--------:|
+ | No log | 0.97 | 14 | 6.6700 | 0.2868 |
+ | No log | 2.0 | 29 | 6.1329 | 0.3035 |
+ | No log | 2.97 | 43 | 5.3933 | 0.3612 |
+ | No log | 4.0 | 58 | 5.0109 | 0.3777 |
+ | No log | 4.97 | 72 | 4.8244 | 0.3982 |
+ | No log | 6.0 | 87 | 4.3103 | 0.4191 |
+ | No log | 6.97 | 101 | 3.9390 | 0.4472 |
+ | No log | 8.0 | 116 | 3.7105 | 0.4643 |
+ | No log | 8.97 | 130 | 3.6200 | 0.4682 |
+ | No log | 10.0 | 145 | 3.3792 | 0.4746 |
+ | No log | 10.97 | 159 | 3.2035 | 0.5035 |
+ | No log | 12.0 | 174 | 3.0204 | 0.5292 |
+ | No log | 12.97 | 188 | 2.9428 | 0.5446 |
+ | No log | 14.0 | 203 | 2.8275 | 0.5586 |
+ | No log | 14.97 | 217 | 2.8530 | 0.5389 |
+ | No log | 16.0 | 232 | 2.7320 | 0.5552 |
+ | No log | 16.97 | 246 | 2.6976 | 0.5569 |
+ | No log | 18.0 | 261 | 2.6423 | 0.5672 |
+ | No log | 18.97 | 275 | 2.5589 | 0.5768 |
+ | No log | 20.0 | 290 | 2.5393 | 0.5725 |
+ | No log | 20.97 | 304 | 2.4149 | 0.5883 |
+ | No log | 22.0 | 319 | 2.3377 | 0.6106 |
+ | No log | 22.97 | 333 | 2.3686 | 0.6006 |
+ | No log | 24.0 | 348 | 2.3694 | 0.5896 |
+ | No log | 24.97 | 362 | 2.3411 | 0.6006 |
+ | No log | 26.0 | 377 | 2.1990 | 0.6192 |
+ | No log | 26.97 | 391 | 2.1937 | 0.6187 |
+ | No log | 28.0 | 406 | 2.1599 | 0.6263 |
+ | No log | 28.97 | 420 | 2.1169 | 0.6288 |
+ | No log | 30.0 | 435 | 2.1136 | 0.6363 |
+ | No log | 30.97 | 449 | 2.1705 | 0.6269 |
+ | No log | 32.0 | 464 | 1.9909 | 0.6551 |
+ | No log | 32.97 | 478 | 1.9930 | 0.6452 |
+ | No log | 34.0 | 493 | 1.9380 | 0.6622 |
+ | 3.3393 | 34.97 | 507 | 2.0509 | 0.6429 |
+ | 3.3393 | 36.0 | 522 | 1.9449 | 0.6556 |
+ | 3.3393 | 36.97 | 536 | 1.9595 | 0.6500 |
+ | 3.3393 | 38.0 | 551 | 1.8646 | 0.6703 |
+ | 3.3393 | 38.97 | 565 | 1.9297 | 0.6553 |
+ | 3.3393 | 40.0 | 580 | 1.8071 | 0.6820 |
+ | 3.3393 | 40.97 | 594 | 1.9239 | 0.6564 |
+ | 3.3393 | 42.0 | 609 | 1.7737 | 0.6769 |
+ | 3.3393 | 42.97 | 623 | 1.7695 | 0.6889 |
+ | 3.3393 | 44.0 | 638 | 1.7444 | 0.6842 |
+ | 3.3393 | 44.97 | 652 | 1.7503 | 0.6839 |
+ | 3.3393 | 46.0 | 667 | 1.7654 | 0.6932 |
+ | 3.3393 | 46.97 | 681 | 1.7225 | 0.6862 |
+ | 3.3393 | 48.0 | 696 | 1.8165 | 0.6815 |
+ | 3.3393 | 48.97 | 710 | 1.7971 | 0.6840 |
+ | 3.3393 | 50.0 | 725 | 1.7177 | 0.6942 |
+ | 3.3393 | 50.97 | 739 | 1.6890 | 0.6982 |
+ | 3.3393 | 52.0 | 754 | 1.7212 | 0.6990 |
+ | 3.3393 | 52.97 | 768 | 1.7562 | 0.6892 |
+ | 3.3393 | 54.0 | 783 | 1.7142 | 0.6971 |
+ | 3.3393 | 54.97 | 797 | 1.6899 | 0.6955 |
+ | 3.3393 | 56.0 | 812 | 1.7568 | 0.6898 |
+ | 3.3393 | 56.97 | 826 | 1.6427 | 0.7137 |
+ | 3.3393 | 58.0 | 841 | 1.5932 | 0.7183 |
+ | 3.3393 | 58.97 | 855 | 1.6001 | 0.7193 |
+ | 3.3393 | 60.0 | 870 | 1.6482 | 0.7109 |
+ | 3.3393 | 60.97 | 884 | 1.5384 | 0.7211 |
+ | 3.3393 | 62.0 | 899 | 1.6092 | 0.7085 |
+ | 3.3393 | 62.97 | 913 | 1.6621 | 0.7068 |
+ | 3.3393 | 64.0 | 928 | 1.5781 | 0.7108 |
+ | 3.3393 | 64.97 | 942 | 1.5365 | 0.7297 |
+ | 3.3393 | 66.0 | 957 | 1.5426 | 0.7155 |
+ | 3.3393 | 66.97 | 971 | 1.6601 | 0.7051 |
+ | 3.3393 | 68.0 | 986 | 1.5874 | 0.7218 |
+ | 1.654 | 68.97 | 1000 | 1.6337 | 0.7148 |
+ | 1.654 | 70.0 | 1015 | 1.5324 | 0.7244 |
+ | 1.654 | 70.97 | 1029 | 1.5848 | 0.7245 |
+ | 1.654 | 72.0 | 1044 | 1.4755 | 0.7301 |
+ | 1.654 | 72.97 | 1058 | 1.5183 | 0.7323 |
+ | 1.654 | 74.0 | 1073 | 1.4930 | 0.7307 |
+ | 1.654 | 74.97 | 1087 | 1.4618 | 0.7350 |
+ | 1.654 | 76.0 | 1102 | 1.5082 | 0.7381 |
+ | 1.654 | 76.97 | 1116 | 1.4550 | 0.7402 |
+ | 1.654 | 78.0 | 1131 | 1.4609 | 0.7350 |
+ | 1.654 | 78.97 | 1145 | 1.5692 | 0.7258 |
+ | 1.654 | 80.0 | 1160 | 1.4066 | 0.7524 |
+ | 1.654 | 80.97 | 1174 | 1.5256 | 0.7283 |
+ | 1.654 | 82.0 | 1189 | 1.4466 | 0.7396 |
+ | 1.654 | 82.97 | 1203 | 1.4642 | 0.7357 |
+ | 1.654 | 84.0 | 1218 | 1.4985 | 0.7364 |
+ | 1.654 | 84.97 | 1232 | 1.4829 | 0.7421 |
+ | 1.654 | 86.0 | 1247 | 1.4528 | 0.7423 |
+ | 1.654 | 86.97 | 1261 | 1.3744 | 0.7470 |
+ | 1.654 | 88.0 | 1276 | 1.4098 | 0.7534 |
+ | 1.654 | 88.97 | 1290 | 1.4666 | 0.7439 |
+ | 1.654 | 90.0 | 1305 | 1.3889 | 0.7606 |
+ | 1.654 | 90.97 | 1319 | 1.4525 | 0.7436 |
+ | 1.654 | 92.0 | 1334 | 1.3673 | 0.7547 |
+ | 1.654 | 92.97 | 1348 | 1.4549 | 0.7430 |
+ | 1.654 | 94.0 | 1363 | 1.4008 | 0.7417 |
+ | 1.654 | 94.97 | 1377 | 1.3820 | 0.7472 |
+ | 1.654 | 96.0 | 1392 | 1.3900 | 0.7592 |
+ | 1.654 | 96.97 | 1406 | 1.4227 | 0.7458 |
+ | 1.654 | 98.0 | 1421 | 1.4179 | 0.7546 |
+ | 1.654 | 98.97 | 1435 | 1.4474 | 0.7476 |
+ | 1.654 | 100.0 | 1450 | 1.4092 | 0.7485 |
+ | 1.654 | 100.97 | 1464 | 1.3163 | 0.7678 |
+ | 1.654 | 102.0 | 1479 | 1.3801 | 0.7631 |
+ | 1.654 | 102.97 | 1493 | 1.4153 | 0.7496 |
+ | 1.1613 | 104.0 | 1508 | 1.3168 | 0.7616 |
+ | 1.1613 | 104.97 | 1522 | 1.3385 | 0.7607 |
+ | 1.1613 | 106.0 | 1537 | 1.4633 | 0.7406 |
+ | 1.1613 | 106.97 | 1551 | 1.4509 | 0.7473 |
+ | 1.1613 | 108.0 | 1566 | 1.3938 | 0.7577 |
+ | 1.1613 | 108.97 | 1580 | 1.4659 | 0.7451 |
+ | 1.1613 | 110.0 | 1595 | 1.4536 | 0.7403 |
+ | 1.1613 | 110.97 | 1609 | 1.4069 | 0.7529 |
+ | 1.1613 | 112.0 | 1624 | 1.2818 | 0.7721 |
+ | 1.1613 | 112.97 | 1638 | 1.3530 | 0.7618 |
+ | 1.1613 | 114.0 | 1653 | 1.3854 | 0.7555 |
+ | 1.1613 | 114.97 | 1667 | 1.3213 | 0.7589 |
+ | 1.1613 | 116.0 | 1682 | 1.3547 | 0.7578 |
+ | 1.1613 | 116.97 | 1696 | 1.4230 | 0.7544 |
+ | 1.1613 | 118.0 | 1711 | 1.3296 | 0.7650 |
+ | 1.1613 | 118.97 | 1725 | 1.3777 | 0.7616 |
+ | 1.1613 | 120.0 | 1740 | 1.3832 | 0.7639 |
+ | 1.1613 | 120.97 | 1754 | 1.4333 | 0.7524 |
+ | 1.1613 | 122.0 | 1769 | 1.3613 | 0.7655 |
+ | 1.1613 | 122.97 | 1783 | 1.4481 | 0.7533 |
+ | 1.1613 | 124.0 | 1798 | 1.4398 | 0.7550 |
+ | 1.1613 | 124.97 | 1812 | 1.3509 | 0.7678 |
+ | 1.1613 | 126.0 | 1827 | 1.3034 | 0.7705 |
+ | 1.1613 | 126.97 | 1841 | 1.4733 | 0.7468 |
+ | 1.1613 | 128.0 | 1856 | 1.4400 | 0.7557 |
+ | 1.1613 | 128.97 | 1870 | 1.3901 | 0.7599 |
+ | 1.1613 | 130.0 | 1885 | 1.3529 | 0.7683 |
+ | 1.1613 | 130.97 | 1899 | 1.3677 | 0.7568 |
+ | 1.1613 | 132.0 | 1914 | 1.4481 | 0.7561 |
+ | 1.1613 | 132.97 | 1928 | 1.2518 | 0.7826 |
+ | 1.1613 | 134.0 | 1943 | 1.4324 | 0.7527 |
+ | 1.1613 | 134.97 | 1957 | 1.3740 | 0.7591 |
+ | 1.1613 | 136.0 | 1972 | 1.3782 | 0.7628 |
+ | 1.1613 | 136.97 | 1986 | 1.2933 | 0.7735 |
+ | 0.9181 | 138.0 | 2001 | 1.3451 | 0.7709 |
+ | 0.9181 | 138.97 | 2015 | 1.4064 | 0.7646 |
+ | 0.9181 | 140.0 | 2030 | 1.3908 | 0.7661 |
+ | 0.9181 | 140.97 | 2044 | 1.3139 | 0.7692 |
+ | 0.9181 | 142.0 | 2059 | 1.3602 | 0.7698 |
+ | 0.9181 | 142.97 | 2073 | 1.3171 | 0.7763 |
+ | 0.9181 | 144.0 | 2088 | 1.3736 | 0.7627 |
+ | 0.9181 | 144.97 | 2102 | 1.3348 | 0.7670 |
+ | 0.9181 | 146.0 | 2117 | 1.3745 | 0.7672 |
+ | 0.9181 | 146.97 | 2131 | 1.3725 | 0.7657 |
+ | 0.9181 | 148.0 | 2146 | 1.3939 | 0.7662 |
+ | 0.9181 | 148.97 | 2160 | 1.3793 | 0.7654 |
+ | 0.9181 | 150.0 | 2175 | 1.3246 | 0.7713 |
+ | 0.9181 | 150.97 | 2189 | 1.2930 | 0.7767 |
+ | 0.9181 | 152.0 | 2204 | 1.2810 | 0.7786 |
+ | 0.9181 | 152.97 | 2218 | 1.3552 | 0.7677 |
+ | 0.9181 | 154.0 | 2233 | 1.4365 | 0.7662 |
+ | 0.9181 | 154.97 | 2247 | 1.3108 | 0.7701 |
+ | 0.9181 | 156.0 | 2262 | 1.2976 | 0.7802 |
+ | 0.9181 | 156.97 | 2276 | 1.3652 | 0.7743 |
+ | 0.9181 | 158.0 | 2291 | 1.3912 | 0.7628 |
+ | 0.9181 | 158.97 | 2305 | 1.3401 | 0.7689 |
+ | 0.9181 | 160.0 | 2320 | 1.2996 | 0.7723 |
+ | 0.9181 | 160.97 | 2334 | 1.3340 | 0.7764 |
+ | 0.9181 | 162.0 | 2349 | 1.2927 | 0.7751 |
+ | 0.9181 | 162.97 | 2363 | 1.3123 | 0.7766 |
+ | 0.9181 | 164.0 | 2378 | 1.3185 | 0.7712 |
+ | 0.9181 | 164.97 | 2392 | 1.3288 | 0.7737 |
+ | 0.9181 | 166.0 | 2407 | 1.3510 | 0.7685 |
+ | 0.9181 | 166.97 | 2421 | 1.3598 | 0.7699 |
+ | 0.9181 | 168.0 | 2436 | 1.3490 | 0.7638 |
+ | 0.9181 | 168.97 | 2450 | 1.3381 | 0.7643 |
+ | 0.9181 | 170.0 | 2465 | 1.3074 | 0.7761 |
+ | 0.9181 | 170.97 | 2479 | 1.3886 | 0.7631 |
+ | 0.9181 | 172.0 | 2494 | 1.3931 | 0.7634 |
+ | 0.7949 | 172.97 | 2508 | 1.3627 | 0.7662 |
+ | 0.7949 | 174.0 | 2523 | 1.4032 | 0.7653 |
+ | 0.7949 | 174.97 | 2537 | 1.3016 | 0.7740 |
+ | 0.7949 | 176.0 | 2552 | 1.3341 | 0.7710 |
+ | 0.7949 | 176.97 | 2566 | 1.3820 | 0.7624 |
+ | 0.7949 | 178.0 | 2581 | 1.3502 | 0.7761 |
+ | 0.7949 | 178.97 | 2595 | 1.3273 | 0.7752 |
+ | 0.7949 | 180.0 | 2610 | 1.3915 | 0.7623 |
+ | 0.7949 | 180.97 | 2624 | 1.4012 | 0.7616 |
+ | 0.7949 | 182.0 | 2639 | 1.3881 | 0.7692 |
+ | 0.7949 | 182.97 | 2653 | 1.2757 | 0.7807 |
+ | 0.7949 | 184.0 | 2668 | 1.3941 | 0.7629 |
+ | 0.7949 | 184.97 | 2682 | 1.3301 | 0.7800 |
+ | 0.7949 | 186.0 | 2697 | 1.3781 | 0.7735 |
+ | 0.7949 | 186.97 | 2711 | 1.3267 | 0.7782 |
+ | 0.7949 | 188.0 | 2726 | 1.3695 | 0.7688 |
+ | 0.7949 | 188.97 | 2740 | 1.3516 | 0.7752 |
+ | 0.7949 | 190.0 | 2755 | 1.3627 | 0.7733 |
+ | 0.7949 | 190.97 | 2769 | 1.3846 | 0.7713 |
+ | 0.7949 | 192.0 | 2784 | 1.3710 | 0.7662 |
+ | 0.7949 | 192.97 | 2798 | 1.3902 | 0.7660 |
+ | 0.7949 | 193.1 | 2800 | 1.4705 | 0.7550 |
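
The Accuracy column above is produced by the Trainer's evaluation loop. A hedged sketch of how such a per-prediction accuracy is commonly wired in; the metric library, the `-100` label masking, and the function itself are assumptions rather than the author's script:

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # Hypothetical reconstruction: take the most likely token/label per position
    # and score it only where a reference label exists (labels of -100 mark
    # positions the Trainer is told to ignore).
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    mask = labels != -100
    return accuracy.compute(predictions=predictions[mask], references=labels[mask])
```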

  ### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:be99baf076cdfdaa3f037da779a58f8aff55b03446109a43a16eb476eab611a5
+ oid sha256:2ca374497a905cb673865b92408406fabf94c6c4f9bb33670fbe902e9c75a858
  size 439557376
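
These three lines are a Git LFS pointer: the repository stores only the spec version, SHA-256 oid, and byte size, while the roughly 440 MB `model.safetensors` payload lives in LFS storage. A small sketch for checking a locally downloaded copy against the new oid; the local path is an assumption:

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so the full checkpoint never sits in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Expected value from the LFS pointer after this commit.
expected = "2ca374497a905cb673865b92408406fabf94c6c4f9bb33670fbe902e9c75a858"
assert sha256_of_file("model.safetensors") == expected
```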
runs/Apr14_16-43-40_DESKTOP-7EBBP1S/events.out.tfevents.1713084222.DESKTOP-7EBBP1S.1146.6 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:69061135b56d91912af35df60d3e4f2b91af0f0a7e947387b9db09f813455262
- size 61361
+ oid sha256:6e12008c602f184475cadafb68d0ac19683484b64fcdddd778a3c2db23d5c56a
+ size 68821
runs/Apr14_16-43-40_DESKTOP-7EBBP1S/events.out.tfevents.1713087516.DESKTOP-7EBBP1S.1146.7 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:52484b6df5cd1992b78b2140c010c7864fa7c3db279eb7e3c5c3e8c456201a3c
+ size 411