Upload ./training.log with huggingface_hub
training.log +245 -0
training.log
ADDED
@@ -0,0 +1,245 @@
2024-03-26 10:45:09,966 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,966 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(31103, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2024-03-26 10:45:09,966 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,966 Corpus: 758 train + 94 dev + 96 test sentences
2024-03-26 10:45:09,966 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,966 Train: 758 sentences
2024-03-26 10:45:09,966 (train_with_dev=False, train_with_test=False)
2024-03-26 10:45:09,966 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,966 Training Params:
2024-03-26 10:45:09,966 - learning_rate: "5e-05"
2024-03-26 10:45:09,966 - mini_batch_size: "8"
2024-03-26 10:45:09,966 - max_epochs: "10"
2024-03-26 10:45:09,966 - shuffle: "True"
2024-03-26 10:45:09,967 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,967 Plugins:
2024-03-26 10:45:09,967 - TensorboardLogger
2024-03-26 10:45:09,967 - LinearScheduler | warmup_fraction: '0.1'
2024-03-26 10:45:09,967 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,967 Final evaluation on model from best epoch (best-model.pt)
2024-03-26 10:45:09,967 - metric: "('micro avg', 'f1-score')"
2024-03-26 10:45:09,967 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,967 Computation:
2024-03-26 10:45:09,967 - compute on device: cuda:0
2024-03-26 10:45:09,967 - embedding storage: none
2024-03-26 10:45:09,967 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,967 Model training base path: "flair-co-funer-gbert_base-bs8-e10-lr5e-05-5"
2024-03-26 10:45:09,967 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,967 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:09,967 Logging anything other than scalars to TensorBoard is currently not supported.
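
The setup above describes fine-tuning a transformer-based Flair SequenceTagger: a 12-layer BERT encoder with a 31,103-token German vocabulary, locked dropout, and a single 17-way linear tag head (no RNN, no CRF), trained for 10 epochs at lr 5e-05 with mini-batches of 8 and a linear LR schedule with 10% warm-up. The training script itself is not part of this upload; the sketch below shows how a comparable run could be configured in Flair. The data paths, file names, tag type, and the "deepset/gbert-base" checkpoint name are illustrative assumptions, not taken from the log.

    from flair.datasets import ColumnCorpus
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer

    # Hypothetical CoNLL-style column corpus; the actual data files are not part of this commit.
    corpus = ColumnCorpus(
        "data/",
        {0: "text", 1: "ner"},
        train_file="train.txt",
        dev_file="dev.txt",
        test_file="test.txt",
    )
    label_dict = corpus.make_label_dictionary(label_type="ner")

    # German BERT backbone; "deepset/gbert-base" is an assumption based on the
    # "gbert_base" base path and the 31103-token vocabulary in the model summary.
    embeddings = TransformerWordEmbeddings(
        "deepset/gbert-base",
        layers="-1",
        subtoken_pooling="first",
        fine_tune=True,
    )

    # Matches the printed architecture: no RNN, no CRF, a single 17-way linear head.
    tagger = SequenceTagger(
        hidden_size=256,
        embeddings=embeddings,
        tag_dictionary=label_dict,
        tag_type="ner",
        use_crf=False,
        use_rnn=False,
        reproject_embeddings=False,
    )

    trainer = ModelTrainer(tagger, corpus)

    # fine_tune() defaults to AdamW with a linear schedule and 0.1 warm-up fraction,
    # consistent with the "LinearScheduler | warmup_fraction: '0.1'" plugin above.
    trainer.fine_tune(
        "flair-co-funer-gbert_base-bs8-e10-lr5e-05-5",
        learning_rate=5e-5,
        mini_batch_size=8,
        max_epochs=10,
    )
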
2024-03-26 10:45:11,842 epoch 1 - iter 9/95 - loss 3.40476800 - time (sec): 1.87 - samples/sec: 1672.42 - lr: 0.000004 - momentum: 0.000000
2024-03-26 10:45:13,726 epoch 1 - iter 18/95 - loss 3.14482635 - time (sec): 3.76 - samples/sec: 1764.79 - lr: 0.000009 - momentum: 0.000000
2024-03-26 10:45:16,019 epoch 1 - iter 27/95 - loss 2.87497592 - time (sec): 6.05 - samples/sec: 1713.71 - lr: 0.000014 - momentum: 0.000000
2024-03-26 10:45:17,503 epoch 1 - iter 36/95 - loss 2.68868444 - time (sec): 7.54 - samples/sec: 1792.56 - lr: 0.000018 - momentum: 0.000000
2024-03-26 10:45:19,648 epoch 1 - iter 45/95 - loss 2.50863173 - time (sec): 9.68 - samples/sec: 1775.18 - lr: 0.000023 - momentum: 0.000000
2024-03-26 10:45:21,225 epoch 1 - iter 54/95 - loss 2.33901067 - time (sec): 11.26 - samples/sec: 1797.28 - lr: 0.000028 - momentum: 0.000000
2024-03-26 10:45:22,866 epoch 1 - iter 63/95 - loss 2.19277919 - time (sec): 12.90 - samples/sec: 1814.29 - lr: 0.000033 - momentum: 0.000000
2024-03-26 10:45:24,750 epoch 1 - iter 72/95 - loss 2.04776243 - time (sec): 14.78 - samples/sec: 1806.01 - lr: 0.000037 - momentum: 0.000000
2024-03-26 10:45:26,812 epoch 1 - iter 81/95 - loss 1.89776627 - time (sec): 16.85 - samples/sec: 1789.76 - lr: 0.000042 - momentum: 0.000000
2024-03-26 10:45:28,446 epoch 1 - iter 90/95 - loss 1.78488952 - time (sec): 18.48 - samples/sec: 1781.78 - lr: 0.000047 - momentum: 0.000000
2024-03-26 10:45:29,199 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:29,199 EPOCH 1 done: loss 1.7294 - lr: 0.000047
2024-03-26 10:45:30,025 DEV : loss 0.4357704520225525 - f1-score (micro avg) 0.6862
2024-03-26 10:45:30,026 saving best model
2024-03-26 10:45:30,316 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:32,580 epoch 2 - iter 9/95 - loss 0.52047333 - time (sec): 2.26 - samples/sec: 1685.56 - lr: 0.000050 - momentum: 0.000000
2024-03-26 10:45:34,500 epoch 2 - iter 18/95 - loss 0.47727333 - time (sec): 4.18 - samples/sec: 1671.94 - lr: 0.000049 - momentum: 0.000000
2024-03-26 10:45:36,830 epoch 2 - iter 27/95 - loss 0.42420900 - time (sec): 6.51 - samples/sec: 1642.22 - lr: 0.000048 - momentum: 0.000000
2024-03-26 10:45:38,199 epoch 2 - iter 36/95 - loss 0.41117217 - time (sec): 7.88 - samples/sec: 1753.17 - lr: 0.000048 - momentum: 0.000000
2024-03-26 10:45:40,147 epoch 2 - iter 45/95 - loss 0.38005126 - time (sec): 9.83 - samples/sec: 1716.83 - lr: 0.000047 - momentum: 0.000000
2024-03-26 10:45:41,463 epoch 2 - iter 54/95 - loss 0.37659104 - time (sec): 11.15 - samples/sec: 1763.25 - lr: 0.000047 - momentum: 0.000000
2024-03-26 10:45:43,037 epoch 2 - iter 63/95 - loss 0.36250726 - time (sec): 12.72 - samples/sec: 1779.04 - lr: 0.000046 - momentum: 0.000000
2024-03-26 10:45:45,111 epoch 2 - iter 72/95 - loss 0.35529566 - time (sec): 14.79 - samples/sec: 1771.61 - lr: 0.000046 - momentum: 0.000000
2024-03-26 10:45:47,033 epoch 2 - iter 81/95 - loss 0.36371319 - time (sec): 16.72 - samples/sec: 1770.30 - lr: 0.000045 - momentum: 0.000000
2024-03-26 10:45:48,968 epoch 2 - iter 90/95 - loss 0.35037735 - time (sec): 18.65 - samples/sec: 1773.33 - lr: 0.000045 - momentum: 0.000000
2024-03-26 10:45:49,542 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:49,542 EPOCH 2 done: loss 0.3500 - lr: 0.000045
2024-03-26 10:45:50,439 DEV : loss 0.2682156562805176 - f1-score (micro avg) 0.8339
2024-03-26 10:45:50,440 saving best model
2024-03-26 10:45:50,864 ----------------------------------------------------------------------------------------------------
2024-03-26 10:45:52,083 epoch 3 - iter 9/95 - loss 0.26780059 - time (sec): 1.22 - samples/sec: 2130.20 - lr: 0.000044 - momentum: 0.000000
2024-03-26 10:45:54,395 epoch 3 - iter 18/95 - loss 0.22616209 - time (sec): 3.53 - samples/sec: 1818.98 - lr: 0.000043 - momentum: 0.000000
2024-03-26 10:45:56,091 epoch 3 - iter 27/95 - loss 0.23208566 - time (sec): 5.22 - samples/sec: 1867.58 - lr: 0.000043 - momentum: 0.000000
2024-03-26 10:45:57,779 epoch 3 - iter 36/95 - loss 0.21819078 - time (sec): 6.91 - samples/sec: 1904.95 - lr: 0.000042 - momentum: 0.000000
2024-03-26 10:45:59,221 epoch 3 - iter 45/95 - loss 0.20710332 - time (sec): 8.36 - samples/sec: 1900.72 - lr: 0.000042 - momentum: 0.000000
2024-03-26 10:46:01,327 epoch 3 - iter 54/95 - loss 0.19913518 - time (sec): 10.46 - samples/sec: 1846.61 - lr: 0.000041 - momentum: 0.000000
2024-03-26 10:46:02,997 epoch 3 - iter 63/95 - loss 0.19480733 - time (sec): 12.13 - samples/sec: 1834.76 - lr: 0.000041 - momentum: 0.000000
2024-03-26 10:46:05,260 epoch 3 - iter 72/95 - loss 0.18974922 - time (sec): 14.39 - samples/sec: 1802.02 - lr: 0.000040 - momentum: 0.000000
2024-03-26 10:46:07,425 epoch 3 - iter 81/95 - loss 0.19095483 - time (sec): 16.56 - samples/sec: 1797.60 - lr: 0.000040 - momentum: 0.000000
2024-03-26 10:46:09,176 epoch 3 - iter 90/95 - loss 0.18550780 - time (sec): 18.31 - samples/sec: 1787.02 - lr: 0.000039 - momentum: 0.000000
2024-03-26 10:46:10,044 ----------------------------------------------------------------------------------------------------
2024-03-26 10:46:10,044 EPOCH 3 done: loss 0.1844 - lr: 0.000039
2024-03-26 10:46:10,947 DEV : loss 0.24703261256217957 - f1-score (micro avg) 0.855
2024-03-26 10:46:10,947 saving best model
2024-03-26 10:46:11,444 ----------------------------------------------------------------------------------------------------
2024-03-26 10:46:14,220 epoch 4 - iter 9/95 - loss 0.10523460 - time (sec): 2.77 - samples/sec: 1538.62 - lr: 0.000039 - momentum: 0.000000
2024-03-26 10:46:15,257 epoch 4 - iter 18/95 - loss 0.13184327 - time (sec): 3.81 - samples/sec: 1746.28 - lr: 0.000038 - momentum: 0.000000
2024-03-26 10:46:17,747 epoch 4 - iter 27/95 - loss 0.11703814 - time (sec): 6.30 - samples/sec: 1688.16 - lr: 0.000037 - momentum: 0.000000
2024-03-26 10:46:20,345 epoch 4 - iter 36/95 - loss 0.11749788 - time (sec): 8.90 - samples/sec: 1630.81 - lr: 0.000037 - momentum: 0.000000
2024-03-26 10:46:22,047 epoch 4 - iter 45/95 - loss 0.10936671 - time (sec): 10.60 - samples/sec: 1661.40 - lr: 0.000036 - momentum: 0.000000
2024-03-26 10:46:23,750 epoch 4 - iter 54/95 - loss 0.10799577 - time (sec): 12.30 - samples/sec: 1674.24 - lr: 0.000036 - momentum: 0.000000
2024-03-26 10:46:25,652 epoch 4 - iter 63/95 - loss 0.11015899 - time (sec): 14.21 - samples/sec: 1700.40 - lr: 0.000035 - momentum: 0.000000
2024-03-26 10:46:27,322 epoch 4 - iter 72/95 - loss 0.11128318 - time (sec): 15.88 - samples/sec: 1748.34 - lr: 0.000035 - momentum: 0.000000
2024-03-26 10:46:28,354 epoch 4 - iter 81/95 - loss 0.11258121 - time (sec): 16.91 - samples/sec: 1786.99 - lr: 0.000034 - momentum: 0.000000
2024-03-26 10:46:29,751 epoch 4 - iter 90/95 - loss 0.11335549 - time (sec): 18.31 - samples/sec: 1813.05 - lr: 0.000034 - momentum: 0.000000
2024-03-26 10:46:30,283 ----------------------------------------------------------------------------------------------------
2024-03-26 10:46:30,284 EPOCH 4 done: loss 0.1157 - lr: 0.000034
2024-03-26 10:46:31,185 DEV : loss 0.1816115528345108 - f1-score (micro avg) 0.8905
2024-03-26 10:46:31,185 saving best model
2024-03-26 10:46:31,667 ----------------------------------------------------------------------------------------------------
2024-03-26 10:46:33,307 epoch 5 - iter 9/95 - loss 0.11400357 - time (sec): 1.64 - samples/sec: 1999.32 - lr: 0.000033 - momentum: 0.000000
2024-03-26 10:46:35,270 epoch 5 - iter 18/95 - loss 0.09404016 - time (sec): 3.60 - samples/sec: 1977.31 - lr: 0.000032 - momentum: 0.000000
2024-03-26 10:46:37,387 epoch 5 - iter 27/95 - loss 0.07875278 - time (sec): 5.72 - samples/sec: 1851.46 - lr: 0.000032 - momentum: 0.000000
2024-03-26 10:46:38,711 epoch 5 - iter 36/95 - loss 0.08929817 - time (sec): 7.04 - samples/sec: 1909.11 - lr: 0.000031 - momentum: 0.000000
2024-03-26 10:46:40,790 epoch 5 - iter 45/95 - loss 0.08692335 - time (sec): 9.12 - samples/sec: 1868.06 - lr: 0.000031 - momentum: 0.000000
2024-03-26 10:46:41,966 epoch 5 - iter 54/95 - loss 0.08984232 - time (sec): 10.30 - samples/sec: 1901.42 - lr: 0.000030 - momentum: 0.000000
2024-03-26 10:46:43,434 epoch 5 - iter 63/95 - loss 0.09315014 - time (sec): 11.77 - samples/sec: 1916.15 - lr: 0.000030 - momentum: 0.000000
2024-03-26 10:46:45,346 epoch 5 - iter 72/95 - loss 0.09260548 - time (sec): 13.68 - samples/sec: 1888.07 - lr: 0.000029 - momentum: 0.000000
2024-03-26 10:46:47,105 epoch 5 - iter 81/95 - loss 0.09102178 - time (sec): 15.44 - samples/sec: 1876.45 - lr: 0.000029 - momentum: 0.000000
2024-03-26 10:46:49,506 epoch 5 - iter 90/95 - loss 0.08955795 - time (sec): 17.84 - samples/sec: 1843.26 - lr: 0.000028 - momentum: 0.000000
2024-03-26 10:46:50,472 ----------------------------------------------------------------------------------------------------
2024-03-26 10:46:50,472 EPOCH 5 done: loss 0.0873 - lr: 0.000028
2024-03-26 10:46:51,369 DEV : loss 0.2073821723461151 - f1-score (micro avg) 0.8891
2024-03-26 10:46:51,370 ----------------------------------------------------------------------------------------------------
2024-03-26 10:46:53,310 epoch 6 - iter 9/95 - loss 0.06490025 - time (sec): 1.94 - samples/sec: 1680.45 - lr: 0.000027 - momentum: 0.000000
2024-03-26 10:46:55,746 epoch 6 - iter 18/95 - loss 0.07018426 - time (sec): 4.38 - samples/sec: 1694.31 - lr: 0.000027 - momentum: 0.000000
2024-03-26 10:46:56,900 epoch 6 - iter 27/95 - loss 0.08971755 - time (sec): 5.53 - samples/sec: 1787.62 - lr: 0.000026 - momentum: 0.000000
2024-03-26 10:46:58,504 epoch 6 - iter 36/95 - loss 0.08177094 - time (sec): 7.13 - samples/sec: 1808.76 - lr: 0.000026 - momentum: 0.000000
2024-03-26 10:47:00,439 epoch 6 - iter 45/95 - loss 0.07622431 - time (sec): 9.07 - samples/sec: 1801.35 - lr: 0.000025 - momentum: 0.000000
2024-03-26 10:47:02,578 epoch 6 - iter 54/95 - loss 0.07246962 - time (sec): 11.21 - samples/sec: 1765.45 - lr: 0.000025 - momentum: 0.000000
2024-03-26 10:47:04,260 epoch 6 - iter 63/95 - loss 0.07253559 - time (sec): 12.89 - samples/sec: 1781.84 - lr: 0.000024 - momentum: 0.000000
2024-03-26 10:47:05,812 epoch 6 - iter 72/95 - loss 0.07235842 - time (sec): 14.44 - samples/sec: 1803.22 - lr: 0.000024 - momentum: 0.000000
2024-03-26 10:47:07,044 epoch 6 - iter 81/95 - loss 0.07052301 - time (sec): 15.67 - samples/sec: 1833.84 - lr: 0.000023 - momentum: 0.000000
2024-03-26 10:47:08,914 epoch 6 - iter 90/95 - loss 0.06692717 - time (sec): 17.54 - samples/sec: 1831.47 - lr: 0.000023 - momentum: 0.000000
2024-03-26 10:47:10,426 ----------------------------------------------------------------------------------------------------
2024-03-26 10:47:10,426 EPOCH 6 done: loss 0.0643 - lr: 0.000023
2024-03-26 10:47:11,339 DEV : loss 0.1831476390361786 - f1-score (micro avg) 0.9163
2024-03-26 10:47:11,341 saving best model
2024-03-26 10:47:11,849 ----------------------------------------------------------------------------------------------------
2024-03-26 10:47:13,513 epoch 7 - iter 9/95 - loss 0.03474363 - time (sec): 1.66 - samples/sec: 1892.57 - lr: 0.000022 - momentum: 0.000000
2024-03-26 10:47:15,001 epoch 7 - iter 18/95 - loss 0.04622635 - time (sec): 3.15 - samples/sec: 1866.44 - lr: 0.000021 - momentum: 0.000000
2024-03-26 10:47:16,389 epoch 7 - iter 27/95 - loss 0.05885735 - time (sec): 4.54 - samples/sec: 1865.33 - lr: 0.000021 - momentum: 0.000000
2024-03-26 10:47:18,625 epoch 7 - iter 36/95 - loss 0.05082974 - time (sec): 6.78 - samples/sec: 1875.84 - lr: 0.000020 - momentum: 0.000000
2024-03-26 10:47:20,539 epoch 7 - iter 45/95 - loss 0.05326851 - time (sec): 8.69 - samples/sec: 1873.99 - lr: 0.000020 - momentum: 0.000000
2024-03-26 10:47:22,218 epoch 7 - iter 54/95 - loss 0.05190215 - time (sec): 10.37 - samples/sec: 1870.24 - lr: 0.000019 - momentum: 0.000000
2024-03-26 10:47:23,762 epoch 7 - iter 63/95 - loss 0.05035584 - time (sec): 11.91 - samples/sec: 1892.08 - lr: 0.000019 - momentum: 0.000000
2024-03-26 10:47:25,270 epoch 7 - iter 72/95 - loss 0.05035426 - time (sec): 13.42 - samples/sec: 1883.43 - lr: 0.000018 - momentum: 0.000000
2024-03-26 10:47:27,978 epoch 7 - iter 81/95 - loss 0.04799126 - time (sec): 16.13 - samples/sec: 1821.55 - lr: 0.000018 - momentum: 0.000000
2024-03-26 10:47:29,584 epoch 7 - iter 90/95 - loss 0.04814565 - time (sec): 17.73 - samples/sec: 1831.24 - lr: 0.000017 - momentum: 0.000000
2024-03-26 10:47:30,756 ----------------------------------------------------------------------------------------------------
2024-03-26 10:47:30,756 EPOCH 7 done: loss 0.0469 - lr: 0.000017
2024-03-26 10:47:31,668 DEV : loss 0.20823831856250763 - f1-score (micro avg) 0.9111
2024-03-26 10:47:31,670 ----------------------------------------------------------------------------------------------------
2024-03-26 10:47:33,828 epoch 8 - iter 9/95 - loss 0.04578165 - time (sec): 2.16 - samples/sec: 1566.09 - lr: 0.000016 - momentum: 0.000000
2024-03-26 10:47:35,339 epoch 8 - iter 18/95 - loss 0.03518377 - time (sec): 3.67 - samples/sec: 1663.23 - lr: 0.000016 - momentum: 0.000000
2024-03-26 10:47:37,327 epoch 8 - iter 27/95 - loss 0.03932391 - time (sec): 5.66 - samples/sec: 1734.34 - lr: 0.000015 - momentum: 0.000000
2024-03-26 10:47:39,294 epoch 8 - iter 36/95 - loss 0.03690593 - time (sec): 7.62 - samples/sec: 1765.82 - lr: 0.000015 - momentum: 0.000000
2024-03-26 10:47:40,714 epoch 8 - iter 45/95 - loss 0.03555396 - time (sec): 9.04 - samples/sec: 1819.70 - lr: 0.000014 - momentum: 0.000000
2024-03-26 10:47:42,199 epoch 8 - iter 54/95 - loss 0.03629148 - time (sec): 10.53 - samples/sec: 1887.44 - lr: 0.000014 - momentum: 0.000000
2024-03-26 10:47:43,778 epoch 8 - iter 63/95 - loss 0.03665286 - time (sec): 12.11 - samples/sec: 1878.15 - lr: 0.000013 - momentum: 0.000000
2024-03-26 10:47:45,866 epoch 8 - iter 72/95 - loss 0.03495998 - time (sec): 14.20 - samples/sec: 1844.37 - lr: 0.000013 - momentum: 0.000000
2024-03-26 10:47:47,429 epoch 8 - iter 81/95 - loss 0.03616178 - time (sec): 15.76 - samples/sec: 1868.49 - lr: 0.000012 - momentum: 0.000000
2024-03-26 10:47:49,493 epoch 8 - iter 90/95 - loss 0.03584999 - time (sec): 17.82 - samples/sec: 1844.78 - lr: 0.000012 - momentum: 0.000000
2024-03-26 10:47:50,135 ----------------------------------------------------------------------------------------------------
2024-03-26 10:47:50,135 EPOCH 8 done: loss 0.0355 - lr: 0.000012
2024-03-26 10:47:51,048 DEV : loss 0.18582023680210114 - f1-score (micro avg) 0.9237
2024-03-26 10:47:51,049 saving best model
2024-03-26 10:47:51,516 ----------------------------------------------------------------------------------------------------
2024-03-26 10:47:54,058 epoch 9 - iter 9/95 - loss 0.01804784 - time (sec): 2.54 - samples/sec: 1697.92 - lr: 0.000011 - momentum: 0.000000
2024-03-26 10:47:55,622 epoch 9 - iter 18/95 - loss 0.02575402 - time (sec): 4.10 - samples/sec: 1761.89 - lr: 0.000010 - momentum: 0.000000
2024-03-26 10:47:58,103 epoch 9 - iter 27/95 - loss 0.02713474 - time (sec): 6.59 - samples/sec: 1716.53 - lr: 0.000010 - momentum: 0.000000
2024-03-26 10:47:59,921 epoch 9 - iter 36/95 - loss 0.03071684 - time (sec): 8.40 - samples/sec: 1724.52 - lr: 0.000009 - momentum: 0.000000
2024-03-26 10:48:01,085 epoch 9 - iter 45/95 - loss 0.02979760 - time (sec): 9.57 - samples/sec: 1782.86 - lr: 0.000009 - momentum: 0.000000
2024-03-26 10:48:02,835 epoch 9 - iter 54/95 - loss 0.02644262 - time (sec): 11.32 - samples/sec: 1772.87 - lr: 0.000008 - momentum: 0.000000
2024-03-26 10:48:04,233 epoch 9 - iter 63/95 - loss 0.02858852 - time (sec): 12.72 - samples/sec: 1816.71 - lr: 0.000008 - momentum: 0.000000
2024-03-26 10:48:05,406 epoch 9 - iter 72/95 - loss 0.02824175 - time (sec): 13.89 - samples/sec: 1865.41 - lr: 0.000007 - momentum: 0.000000
2024-03-26 10:48:06,937 epoch 9 - iter 81/95 - loss 0.02648636 - time (sec): 15.42 - samples/sec: 1863.01 - lr: 0.000007 - momentum: 0.000000
2024-03-26 10:48:09,683 epoch 9 - iter 90/95 - loss 0.02833540 - time (sec): 18.17 - samples/sec: 1815.23 - lr: 0.000006 - momentum: 0.000000
2024-03-26 10:48:10,449 ----------------------------------------------------------------------------------------------------
2024-03-26 10:48:10,449 EPOCH 9 done: loss 0.0274 - lr: 0.000006
2024-03-26 10:48:11,383 DEV : loss 0.19166618585586548 - f1-score (micro avg) 0.9304
2024-03-26 10:48:11,384 saving best model
2024-03-26 10:48:11,875 ----------------------------------------------------------------------------------------------------
2024-03-26 10:48:14,339 epoch 10 - iter 9/95 - loss 0.01910654 - time (sec): 2.46 - samples/sec: 1639.28 - lr: 0.000005 - momentum: 0.000000
2024-03-26 10:48:15,920 epoch 10 - iter 18/95 - loss 0.01670295 - time (sec): 4.04 - samples/sec: 1724.49 - lr: 0.000005 - momentum: 0.000000
2024-03-26 10:48:17,881 epoch 10 - iter 27/95 - loss 0.01753978 - time (sec): 6.00 - samples/sec: 1678.44 - lr: 0.000004 - momentum: 0.000000
2024-03-26 10:48:19,886 epoch 10 - iter 36/95 - loss 0.01982955 - time (sec): 8.01 - samples/sec: 1703.88 - lr: 0.000004 - momentum: 0.000000
2024-03-26 10:48:21,752 epoch 10 - iter 45/95 - loss 0.01704063 - time (sec): 9.87 - samples/sec: 1717.78 - lr: 0.000003 - momentum: 0.000000
2024-03-26 10:48:22,878 epoch 10 - iter 54/95 - loss 0.01803494 - time (sec): 11.00 - samples/sec: 1780.87 - lr: 0.000003 - momentum: 0.000000
2024-03-26 10:48:24,504 epoch 10 - iter 63/95 - loss 0.02231773 - time (sec): 12.63 - samples/sec: 1802.17 - lr: 0.000002 - momentum: 0.000000
2024-03-26 10:48:26,323 epoch 10 - iter 72/95 - loss 0.02212617 - time (sec): 14.45 - samples/sec: 1790.82 - lr: 0.000002 - momentum: 0.000000
2024-03-26 10:48:27,998 epoch 10 - iter 81/95 - loss 0.02345921 - time (sec): 16.12 - samples/sec: 1801.11 - lr: 0.000001 - momentum: 0.000000
2024-03-26 10:48:30,771 epoch 10 - iter 90/95 - loss 0.02165914 - time (sec): 18.89 - samples/sec: 1764.48 - lr: 0.000001 - momentum: 0.000000
2024-03-26 10:48:31,339 ----------------------------------------------------------------------------------------------------
2024-03-26 10:48:31,339 EPOCH 10 done: loss 0.0221 - lr: 0.000001
2024-03-26 10:48:32,266 DEV : loss 0.1952328383922577 - f1-score (micro avg) 0.9342
2024-03-26 10:48:32,267 saving best model
2024-03-26 10:48:33,061 ----------------------------------------------------------------------------------------------------
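
Every epoch block above ends with a "DEV : loss ... - f1-score (micro avg) ..." line. A small self-contained parser along these lines (a sketch, assuming the file is saved locally as training.log) can recover the per-epoch dev scores for quick inspection or plotting:

    import re

    # Matches lines such as:
    # 2024-03-26 10:45:30,025 DEV : loss 0.4357704520225525 - f1-score (micro avg) 0.6862
    DEV_LINE = re.compile(
        r"DEV : loss (?P<loss>[0-9.]+) - f1-score \(micro avg\)\s+(?P<f1>[0-9.]+)"
    )

    def dev_scores(log_path: str = "training.log"):
        """Return a list of (epoch, dev_loss, dev_micro_f1) tuples in log order."""
        scores = []
        with open(log_path, encoding="utf-8") as fh:
            for line in fh:
                match = DEV_LINE.search(line)
                if match:
                    scores.append(
                        (len(scores) + 1, float(match.group("loss")), float(match.group("f1")))
                    )
        return scores

    if __name__ == "__main__":
        for epoch, loss, f1 in dev_scores():
            print(f"epoch {epoch:2d}  dev loss {loss:.4f}  dev micro-F1 {f1:.4f}")
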
2024-03-26 10:48:33,062 Loading model from best epoch ...
2024-03-26 10:48:34,008 SequenceTagger predicts: Dictionary with 17 tags: O, S-Unternehmen, B-Unternehmen, E-Unternehmen, I-Unternehmen, S-Auslagerung, B-Auslagerung, E-Auslagerung, I-Auslagerung, S-Ort, B-Ort, E-Ort, I-Ort, S-Software, B-Software, E-Software, I-Software
2024-03-26 10:48:34,853
Results:
- F-score (micro) 0.911
- F-score (macro) 0.692
- Accuracy 0.8402

By class:
              precision    recall  f1-score   support

 Unternehmen     0.9255    0.8872    0.9060       266
 Auslagerung     0.8654    0.9036    0.8841       249
         Ort     0.9638    0.9925    0.9779       134
    Software     0.0000    0.0000    0.0000         0

   micro avg     0.9069    0.9153    0.9110       649
   macro avg     0.6887    0.6958    0.6920       649
weighted avg     0.9103    0.9153    0.9124       649

2024-03-26 10:48:34,853 ----------------------------------------------------------------------------------------------------
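
The run closes with a test-set evaluation of best-model.pt (micro-F1 0.911). A minimal usage sketch for that checkpoint, assuming the base path from the log; the example sentence and its entity names are invented for illustration:

    from flair.data import Sentence
    from flair.models import SequenceTagger

    # Path assumed from the "Model training base path" reported in the log.
    tagger = SequenceTagger.load("flair-co-funer-gbert_base-bs8-e10-lr5e-05-5/best-model.pt")

    # Hypothetical German example sentence.
    sentence = Sentence("Die Musterbank AG lagert ihren IT-Betrieb an die Beispiel GmbH in Frankfurt aus.")
    tagger.predict(sentence)

    # The BIOES tags listed above are merged back into labelled spans.
    for span in sentence.get_spans(tagger.label_type):
        label = span.get_label(tagger.label_type)
        print(span.text, label.value, round(label.score, 3))
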