Upload folder using huggingface_hub
- best-model.pt +3 -0
- dev.tsv +0 -0
- final-model.pt +3 -0
- loss.tsv +11 -0
- runs/events.out.tfevents.1697141368.c8b2203b18a8.2408.10 +3 -0
- test.tsv +0 -0
- training.log +261 -0
best-model.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:80ad1644f97dad861e0a99d9cf497f0890578caa76d95b48b1cb221e2564d652
+size 870793839
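The model files in this commit are Git LFS pointers, so only the pointer text appears in the diff. A minimal sketch for fetching best-model.pt programmatically with huggingface_hub (the repo_id below is a placeholder, not taken from this commit):

    # Hypothetical repo_id; replace it with the repository this commit belongs to.
    from huggingface_hub import hf_hub_download

    local_path = hf_hub_download(repo_id="user/repo", filename="best-model.pt")
    print(local_path)  # local cache path to the ~870 MB checkpoint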
dev.tsv
ADDED
The diff for this file is too large to render. See raw diff.
final-model.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d3498347eabf491f49c8094c2cf8925945128d516cbbe40c63818f38525bfa3c
+size 870793956
loss.tsv
ADDED
@@ -0,0 +1,11 @@
+EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
+1 20:16:45 0.0001 0.8849 0.1860 0.3199 0.3192 0.3195 0.2320
+2 20:24:10 0.0001 0.1129 0.0952 0.8491 0.7789 0.8125 0.6949
+3 20:31:38 0.0001 0.0675 0.0834 0.8713 0.8182 0.8439 0.7437
+4 20:39:09 0.0001 0.0460 0.0958 0.9032 0.8099 0.8540 0.7531
+5 20:46:30 0.0001 0.0332 0.1017 0.8914 0.8140 0.8510 0.7505
+6 20:53:46 0.0001 0.0249 0.1183 0.8772 0.8264 0.8511 0.7498
+7 21:00:57 0.0001 0.0182 0.1187 0.8732 0.8326 0.8525 0.7554
+8 21:08:04 0.0000 0.0150 0.1392 0.8836 0.8233 0.8524 0.7519
+9 21:15:11 0.0000 0.0104 0.1443 0.8795 0.8295 0.8538 0.7568
+10 21:22:24 0.0000 0.0077 0.1513 0.8816 0.8233 0.8515 0.7533
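For a quick look at these per-epoch metrics outside the Hub, a minimal sketch assuming pandas is installed, loss.tsv has been downloaded, and the file uses the usual tab-separated layout:

    import pandas as pd

    # Load the per-epoch metrics written during training.
    df = pd.read_csv("loss.tsv", sep="\t")

    # Report the epoch with the highest dev F1 (epoch 4, 0.8540, in the table above).
    best = df.loc[df["DEV_F1"].idxmax()]
    print(f"best dev F1 {best['DEV_F1']:.4f} at epoch {int(best['EPOCH'])}")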
runs/events.out.tfevents.1697141368.c8b2203b18a8.2408.10
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:2ecc7a8c510512d5c96d0bc0ab49c783355b068a5c135597502a18d037117a04
+size 808480
test.tsv
ADDED
The diff for this file is too large to render. See raw diff.
training.log
ADDED
@@ -0,0 +1,261 @@
+2023-10-12 20:09:28,155 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,157 Model: "SequenceTagger(
+  (embeddings): ByT5Embeddings(
+    (model): T5EncoderModel(
+      (shared): Embedding(384, 1472)
+      (encoder): T5Stack(
+        (embed_tokens): Embedding(384, 1472)
+        (block): ModuleList(
+          (0): T5Block(
+            (layer): ModuleList(
+              (0): T5LayerSelfAttention(
+                (SelfAttention): T5Attention(
+                  (q): Linear(in_features=1472, out_features=384, bias=False)
+                  (k): Linear(in_features=1472, out_features=384, bias=False)
+                  (v): Linear(in_features=1472, out_features=384, bias=False)
+                  (o): Linear(in_features=384, out_features=1472, bias=False)
+                  (relative_attention_bias): Embedding(32, 6)
+                )
+                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (1): T5LayerFF(
+                (DenseReluDense): T5DenseGatedActDense(
+                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
+                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
+                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
+                  (dropout): Dropout(p=0.1, inplace=False)
+                  (act): NewGELUActivation()
+                )
+                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+          )
+          (1-11): 11 x T5Block(
+            (layer): ModuleList(
+              (0): T5LayerSelfAttention(
+                (SelfAttention): T5Attention(
+                  (q): Linear(in_features=1472, out_features=384, bias=False)
+                  (k): Linear(in_features=1472, out_features=384, bias=False)
+                  (v): Linear(in_features=1472, out_features=384, bias=False)
+                  (o): Linear(in_features=384, out_features=1472, bias=False)
+                )
+                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (1): T5LayerFF(
+                (DenseReluDense): T5DenseGatedActDense(
+                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
+                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
+                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
+                  (dropout): Dropout(p=0.1, inplace=False)
+                  (act): NewGELUActivation()
+                )
+                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+          )
+        )
+        (final_layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=1472, out_features=13, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-12 20:09:28,157 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,157 MultiCorpus: 5777 train + 722 dev + 723 test sentences
+ - NER_ICDAR_EUROPEANA Corpus: 5777 train + 722 dev + 723 test sentences - /root/.flair/datasets/ner_icdar_europeana/nl
+2023-10-12 20:09:28,157 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,158 Train: 5777 sentences
+2023-10-12 20:09:28,158 (train_with_dev=False, train_with_test=False)
+2023-10-12 20:09:28,158 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,158 Training Params:
+2023-10-12 20:09:28,158 - learning_rate: "0.00015"
+2023-10-12 20:09:28,158 - mini_batch_size: "4"
+2023-10-12 20:09:28,158 - max_epochs: "10"
+2023-10-12 20:09:28,158 - shuffle: "True"
+2023-10-12 20:09:28,158 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,158 Plugins:
+2023-10-12 20:09:28,158 - TensorboardLogger
+2023-10-12 20:09:28,158 - LinearScheduler | warmup_fraction: '0.1'
+2023-10-12 20:09:28,158 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,158 Final evaluation on model from best epoch (best-model.pt)
+2023-10-12 20:09:28,158 - metric: "('micro avg', 'f1-score')"
+2023-10-12 20:09:28,159 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,159 Computation:
+2023-10-12 20:09:28,159 - compute on device: cuda:0
+2023-10-12 20:09:28,159 - embedding storage: none
+2023-10-12 20:09:28,159 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,159 Model training base path: "hmbench-icdar/nl-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs4-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-5"
+2023-10-12 20:09:28,159 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,159 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:09:28,159 Logging anything other than scalars to TensorBoard is currently not supported.
+2023-10-12 20:10:10,201 epoch 1 - iter 144/1445 - loss 2.53513164 - time (sec): 42.04 - samples/sec: 429.57 - lr: 0.000015 - momentum: 0.000000
+2023-10-12 20:10:51,555 epoch 1 - iter 288/1445 - loss 2.38125034 - time (sec): 83.39 - samples/sec: 432.87 - lr: 0.000030 - momentum: 0.000000
+2023-10-12 20:11:32,357 epoch 1 - iter 432/1445 - loss 2.14339240 - time (sec): 124.20 - samples/sec: 421.72 - lr: 0.000045 - momentum: 0.000000
+2023-10-12 20:12:14,329 epoch 1 - iter 576/1445 - loss 1.85422411 - time (sec): 166.17 - samples/sec: 421.64 - lr: 0.000060 - momentum: 0.000000
+2023-10-12 20:12:56,106 epoch 1 - iter 720/1445 - loss 1.58043777 - time (sec): 207.95 - samples/sec: 422.36 - lr: 0.000075 - momentum: 0.000000
+2023-10-12 20:13:37,386 epoch 1 - iter 864/1445 - loss 1.36953584 - time (sec): 249.22 - samples/sec: 420.10 - lr: 0.000090 - momentum: 0.000000
+2023-10-12 20:14:19,750 epoch 1 - iter 1008/1445 - loss 1.20108235 - time (sec): 291.59 - samples/sec: 419.86 - lr: 0.000105 - momentum: 0.000000
+2023-10-12 20:14:59,913 epoch 1 - iter 1152/1445 - loss 1.07934407 - time (sec): 331.75 - samples/sec: 419.62 - lr: 0.000119 - momentum: 0.000000
+2023-10-12 20:15:40,857 epoch 1 - iter 1296/1445 - loss 0.97634839 - time (sec): 372.70 - samples/sec: 421.06 - lr: 0.000134 - momentum: 0.000000
+2023-10-12 20:16:23,000 epoch 1 - iter 1440/1445 - loss 0.88821258 - time (sec): 414.84 - samples/sec: 422.96 - lr: 0.000149 - momentum: 0.000000
+2023-10-12 20:16:24,456 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:16:24,456 EPOCH 1 done: loss 0.8849 - lr: 0.000149
+2023-10-12 20:16:45,374 DEV : loss 0.18604247272014618 - f1-score (micro avg) 0.3195
+2023-10-12 20:16:45,410 saving best model
+2023-10-12 20:16:46,297 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:17:28,351 epoch 2 - iter 144/1445 - loss 0.13676022 - time (sec): 42.05 - samples/sec: 414.84 - lr: 0.000148 - momentum: 0.000000
+2023-10-12 20:18:10,440 epoch 2 - iter 288/1445 - loss 0.12929980 - time (sec): 84.14 - samples/sec: 421.50 - lr: 0.000147 - momentum: 0.000000
+2023-10-12 20:18:51,718 epoch 2 - iter 432/1445 - loss 0.12842442 - time (sec): 125.42 - samples/sec: 414.49 - lr: 0.000145 - momentum: 0.000000
+2023-10-12 20:19:33,782 epoch 2 - iter 576/1445 - loss 0.12658184 - time (sec): 167.48 - samples/sec: 416.65 - lr: 0.000143 - momentum: 0.000000
+2023-10-12 20:20:17,490 epoch 2 - iter 720/1445 - loss 0.12315068 - time (sec): 211.19 - samples/sec: 413.09 - lr: 0.000142 - momentum: 0.000000
+2023-10-12 20:20:59,183 epoch 2 - iter 864/1445 - loss 0.12053054 - time (sec): 252.88 - samples/sec: 414.43 - lr: 0.000140 - momentum: 0.000000
+2023-10-12 20:21:41,327 epoch 2 - iter 1008/1445 - loss 0.12141328 - time (sec): 295.03 - samples/sec: 413.84 - lr: 0.000138 - momentum: 0.000000
+2023-10-12 20:22:23,104 epoch 2 - iter 1152/1445 - loss 0.11874436 - time (sec): 336.80 - samples/sec: 415.42 - lr: 0.000137 - momentum: 0.000000
+2023-10-12 20:23:06,226 epoch 2 - iter 1296/1445 - loss 0.11649198 - time (sec): 379.93 - samples/sec: 416.37 - lr: 0.000135 - momentum: 0.000000
+2023-10-12 20:23:47,727 epoch 2 - iter 1440/1445 - loss 0.11308681 - time (sec): 421.43 - samples/sec: 416.69 - lr: 0.000133 - momentum: 0.000000
+2023-10-12 20:23:49,078 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:23:49,079 EPOCH 2 done: loss 0.1129 - lr: 0.000133
+2023-10-12 20:24:10,065 DEV : loss 0.09517565369606018 - f1-score (micro avg) 0.8125
+2023-10-12 20:24:10,102 saving best model
+2023-10-12 20:24:12,992 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:24:56,006 epoch 3 - iter 144/1445 - loss 0.07327007 - time (sec): 43.01 - samples/sec: 416.27 - lr: 0.000132 - momentum: 0.000000
+2023-10-12 20:25:38,617 epoch 3 - iter 288/1445 - loss 0.07041840 - time (sec): 85.62 - samples/sec: 420.04 - lr: 0.000130 - momentum: 0.000000
+2023-10-12 20:26:20,786 epoch 3 - iter 432/1445 - loss 0.06999514 - time (sec): 127.79 - samples/sec: 418.84 - lr: 0.000128 - momentum: 0.000000
+2023-10-12 20:27:03,456 epoch 3 - iter 576/1445 - loss 0.07112475 - time (sec): 170.46 - samples/sec: 417.00 - lr: 0.000127 - momentum: 0.000000
+2023-10-12 20:27:45,281 epoch 3 - iter 720/1445 - loss 0.07021459 - time (sec): 212.29 - samples/sec: 419.44 - lr: 0.000125 - momentum: 0.000000
+2023-10-12 20:28:27,466 epoch 3 - iter 864/1445 - loss 0.07000170 - time (sec): 254.47 - samples/sec: 423.51 - lr: 0.000123 - momentum: 0.000000
+2023-10-12 20:29:09,522 epoch 3 - iter 1008/1445 - loss 0.06982317 - time (sec): 296.53 - samples/sec: 422.11 - lr: 0.000122 - momentum: 0.000000
+2023-10-12 20:29:50,667 epoch 3 - iter 1152/1445 - loss 0.06964722 - time (sec): 337.67 - samples/sec: 421.05 - lr: 0.000120 - momentum: 0.000000
+2023-10-12 20:30:31,746 epoch 3 - iter 1296/1445 - loss 0.06839580 - time (sec): 378.75 - samples/sec: 419.07 - lr: 0.000118 - momentum: 0.000000
+2023-10-12 20:31:14,591 epoch 3 - iter 1440/1445 - loss 0.06745276 - time (sec): 421.60 - samples/sec: 416.31 - lr: 0.000117 - momentum: 0.000000
+2023-10-12 20:31:16,021 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:31:16,022 EPOCH 3 done: loss 0.0675 - lr: 0.000117
+2023-10-12 20:31:37,989 DEV : loss 0.08340664207935333 - f1-score (micro avg) 0.8439
+2023-10-12 20:31:38,020 saving best model
+2023-10-12 20:31:40,553 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:32:23,274 epoch 4 - iter 144/1445 - loss 0.05297012 - time (sec): 42.72 - samples/sec: 419.86 - lr: 0.000115 - momentum: 0.000000
+2023-10-12 20:33:04,940 epoch 4 - iter 288/1445 - loss 0.05061011 - time (sec): 84.38 - samples/sec: 411.81 - lr: 0.000113 - momentum: 0.000000
+2023-10-12 20:33:47,048 epoch 4 - iter 432/1445 - loss 0.04713746 - time (sec): 126.49 - samples/sec: 414.49 - lr: 0.000112 - momentum: 0.000000
+2023-10-12 20:34:30,755 epoch 4 - iter 576/1445 - loss 0.04639052 - time (sec): 170.20 - samples/sec: 419.88 - lr: 0.000110 - momentum: 0.000000
+2023-10-12 20:35:13,049 epoch 4 - iter 720/1445 - loss 0.04446589 - time (sec): 212.49 - samples/sec: 419.88 - lr: 0.000108 - momentum: 0.000000
+2023-10-12 20:35:54,366 epoch 4 - iter 864/1445 - loss 0.04321469 - time (sec): 253.81 - samples/sec: 418.40 - lr: 0.000107 - momentum: 0.000000
+2023-10-12 20:36:39,400 epoch 4 - iter 1008/1445 - loss 0.04351905 - time (sec): 298.84 - samples/sec: 412.20 - lr: 0.000105 - momentum: 0.000000
+2023-10-12 20:37:22,743 epoch 4 - iter 1152/1445 - loss 0.04392393 - time (sec): 342.18 - samples/sec: 413.28 - lr: 0.000103 - momentum: 0.000000
+2023-10-12 20:38:05,982 epoch 4 - iter 1296/1445 - loss 0.04654178 - time (sec): 385.42 - samples/sec: 412.16 - lr: 0.000102 - momentum: 0.000000
+2023-10-12 20:38:47,608 epoch 4 - iter 1440/1445 - loss 0.04601732 - time (sec): 427.05 - samples/sec: 411.73 - lr: 0.000100 - momentum: 0.000000
+2023-10-12 20:38:48,759 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:38:48,759 EPOCH 4 done: loss 0.0460 - lr: 0.000100
+2023-10-12 20:39:09,821 DEV : loss 0.09578309208154678 - f1-score (micro avg) 0.854
+2023-10-12 20:39:09,854 saving best model
+2023-10-12 20:39:12,430 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:39:55,317 epoch 5 - iter 144/1445 - loss 0.03906087 - time (sec): 42.88 - samples/sec: 440.19 - lr: 0.000098 - momentum: 0.000000
+2023-10-12 20:40:37,355 epoch 5 - iter 288/1445 - loss 0.03326712 - time (sec): 84.92 - samples/sec: 427.34 - lr: 0.000097 - momentum: 0.000000
+2023-10-12 20:41:17,016 epoch 5 - iter 432/1445 - loss 0.03091069 - time (sec): 124.58 - samples/sec: 416.23 - lr: 0.000095 - momentum: 0.000000
+2023-10-12 20:41:56,651 epoch 5 - iter 576/1445 - loss 0.03004124 - time (sec): 164.22 - samples/sec: 414.41 - lr: 0.000093 - momentum: 0.000000
+2023-10-12 20:42:38,714 epoch 5 - iter 720/1445 - loss 0.03116302 - time (sec): 206.28 - samples/sec: 419.94 - lr: 0.000092 - momentum: 0.000000
+2023-10-12 20:43:20,128 epoch 5 - iter 864/1445 - loss 0.03024606 - time (sec): 247.69 - samples/sec: 419.91 - lr: 0.000090 - momentum: 0.000000
+2023-10-12 20:44:02,818 epoch 5 - iter 1008/1445 - loss 0.03089293 - time (sec): 290.38 - samples/sec: 421.95 - lr: 0.000088 - momentum: 0.000000
+2023-10-12 20:44:44,492 epoch 5 - iter 1152/1445 - loss 0.03113353 - time (sec): 332.06 - samples/sec: 422.52 - lr: 0.000087 - momentum: 0.000000
+2023-10-12 20:45:26,335 epoch 5 - iter 1296/1445 - loss 0.03083010 - time (sec): 373.90 - samples/sec: 422.31 - lr: 0.000085 - momentum: 0.000000
+2023-10-12 20:46:07,798 epoch 5 - iter 1440/1445 - loss 0.03275021 - time (sec): 415.36 - samples/sec: 422.21 - lr: 0.000083 - momentum: 0.000000
+2023-10-12 20:46:09,256 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:46:09,256 EPOCH 5 done: loss 0.0332 - lr: 0.000083
+2023-10-12 20:46:30,762 DEV : loss 0.10168028622865677 - f1-score (micro avg) 0.851
+2023-10-12 20:46:30,793 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:47:12,084 epoch 6 - iter 144/1445 - loss 0.02321712 - time (sec): 41.29 - samples/sec: 416.75 - lr: 0.000082 - momentum: 0.000000
+2023-10-12 20:47:52,553 epoch 6 - iter 288/1445 - loss 0.02481388 - time (sec): 81.76 - samples/sec: 422.31 - lr: 0.000080 - momentum: 0.000000
+2023-10-12 20:48:33,748 epoch 6 - iter 432/1445 - loss 0.02757483 - time (sec): 122.95 - samples/sec: 426.21 - lr: 0.000078 - momentum: 0.000000
+2023-10-12 20:49:15,702 epoch 6 - iter 576/1445 - loss 0.02650027 - time (sec): 164.91 - samples/sec: 426.98 - lr: 0.000077 - momentum: 0.000000
+2023-10-12 20:49:57,116 epoch 6 - iter 720/1445 - loss 0.02588266 - time (sec): 206.32 - samples/sec: 427.16 - lr: 0.000075 - momentum: 0.000000
+2023-10-12 20:50:40,206 epoch 6 - iter 864/1445 - loss 0.02373812 - time (sec): 249.41 - samples/sec: 427.26 - lr: 0.000073 - momentum: 0.000000
+2023-10-12 20:51:22,906 epoch 6 - iter 1008/1445 - loss 0.02602288 - time (sec): 292.11 - samples/sec: 425.85 - lr: 0.000072 - momentum: 0.000000
+2023-10-12 20:52:03,431 epoch 6 - iter 1152/1445 - loss 0.02504745 - time (sec): 332.64 - samples/sec: 423.75 - lr: 0.000070 - momentum: 0.000000
+2023-10-12 20:52:43,013 epoch 6 - iter 1296/1445 - loss 0.02425352 - time (sec): 372.22 - samples/sec: 423.56 - lr: 0.000068 - momentum: 0.000000
+2023-10-12 20:53:24,660 epoch 6 - iter 1440/1445 - loss 0.02500831 - time (sec): 413.86 - samples/sec: 424.47 - lr: 0.000067 - momentum: 0.000000
+2023-10-12 20:53:25,858 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:53:25,858 EPOCH 6 done: loss 0.0249 - lr: 0.000067
+2023-10-12 20:53:46,613 DEV : loss 0.11829700320959091 - f1-score (micro avg) 0.8511
+2023-10-12 20:53:46,644 ----------------------------------------------------------------------------------------------------
+2023-10-12 20:54:27,677 epoch 7 - iter 144/1445 - loss 0.02617530 - time (sec): 41.03 - samples/sec: 429.90 - lr: 0.000065 - momentum: 0.000000
+2023-10-12 20:55:08,659 epoch 7 - iter 288/1445 - loss 0.01907134 - time (sec): 82.01 - samples/sec: 433.79 - lr: 0.000063 - momentum: 0.000000
+2023-10-12 20:55:48,427 epoch 7 - iter 432/1445 - loss 0.01974084 - time (sec): 121.78 - samples/sec: 427.77 - lr: 0.000062 - momentum: 0.000000
+2023-10-12 20:56:27,995 epoch 7 - iter 576/1445 - loss 0.01857623 - time (sec): 161.35 - samples/sec: 425.46 - lr: 0.000060 - momentum: 0.000000
+2023-10-12 20:57:09,150 epoch 7 - iter 720/1445 - loss 0.02000500 - time (sec): 202.50 - samples/sec: 428.61 - lr: 0.000058 - momentum: 0.000000
+2023-10-12 20:57:50,603 epoch 7 - iter 864/1445 - loss 0.01835280 - time (sec): 243.96 - samples/sec: 427.19 - lr: 0.000057 - momentum: 0.000000
+2023-10-12 20:58:32,369 epoch 7 - iter 1008/1445 - loss 0.01795777 - time (sec): 285.72 - samples/sec: 426.74 - lr: 0.000055 - momentum: 0.000000
+2023-10-12 20:59:13,015 epoch 7 - iter 1152/1445 - loss 0.01805962 - time (sec): 326.37 - samples/sec: 425.71 - lr: 0.000053 - momentum: 0.000000
+2023-10-12 20:59:53,241 epoch 7 - iter 1296/1445 - loss 0.01759878 - time (sec): 366.60 - samples/sec: 426.95 - lr: 0.000052 - momentum: 0.000000
+2023-10-12 21:00:34,943 epoch 7 - iter 1440/1445 - loss 0.01818970 - time (sec): 408.30 - samples/sec: 429.81 - lr: 0.000050 - momentum: 0.000000
+2023-10-12 21:00:36,340 ----------------------------------------------------------------------------------------------------
+2023-10-12 21:00:36,340 EPOCH 7 done: loss 0.0182 - lr: 0.000050
+2023-10-12 21:00:57,548 DEV : loss 0.11870528757572174 - f1-score (micro avg) 0.8525
+2023-10-12 21:00:57,579 ----------------------------------------------------------------------------------------------------
+2023-10-12 21:01:39,477 epoch 8 - iter 144/1445 - loss 0.01437875 - time (sec): 41.90 - samples/sec: 443.00 - lr: 0.000048 - momentum: 0.000000
+2023-10-12 21:02:19,894 epoch 8 - iter 288/1445 - loss 0.01688151 - time (sec): 82.31 - samples/sec: 436.25 - lr: 0.000047 - momentum: 0.000000
+2023-10-12 21:02:59,900 epoch 8 - iter 432/1445 - loss 0.01434297 - time (sec): 122.32 - samples/sec: 431.85 - lr: 0.000045 - momentum: 0.000000
+2023-10-12 21:03:41,511 epoch 8 - iter 576/1445 - loss 0.01368236 - time (sec): 163.93 - samples/sec: 438.79 - lr: 0.000043 - momentum: 0.000000
+2023-10-12 21:04:21,650 epoch 8 - iter 720/1445 - loss 0.01360876 - time (sec): 204.07 - samples/sec: 437.61 - lr: 0.000042 - momentum: 0.000000
+2023-10-12 21:05:01,428 epoch 8 - iter 864/1445 - loss 0.01358988 - time (sec): 243.85 - samples/sec: 433.72 - lr: 0.000040 - momentum: 0.000000
+2023-10-12 21:05:42,391 epoch 8 - iter 1008/1445 - loss 0.01382289 - time (sec): 284.81 - samples/sec: 432.03 - lr: 0.000038 - momentum: 0.000000
+2023-10-12 21:06:21,709 epoch 8 - iter 1152/1445 - loss 0.01364830 - time (sec): 324.13 - samples/sec: 430.45 - lr: 0.000037 - momentum: 0.000000
+2023-10-12 21:07:02,459 epoch 8 - iter 1296/1445 - loss 0.01538390 - time (sec): 364.88 - samples/sec: 432.75 - lr: 0.000035 - momentum: 0.000000
+2023-10-12 21:07:42,880 epoch 8 - iter 1440/1445 - loss 0.01492242 - time (sec): 405.30 - samples/sec: 433.56 - lr: 0.000033 - momentum: 0.000000
+2023-10-12 21:07:44,100 ----------------------------------------------------------------------------------------------------
+2023-10-12 21:07:44,100 EPOCH 8 done: loss 0.0150 - lr: 0.000033
+2023-10-12 21:08:04,932 DEV : loss 0.13916537165641785 - f1-score (micro avg) 0.8524
+2023-10-12 21:08:04,963 ----------------------------------------------------------------------------------------------------
+2023-10-12 21:08:45,633 epoch 9 - iter 144/1445 - loss 0.00328491 - time (sec): 40.67 - samples/sec: 452.52 - lr: 0.000032 - momentum: 0.000000
+2023-10-12 21:09:27,458 epoch 9 - iter 288/1445 - loss 0.01469900 - time (sec): 82.49 - samples/sec: 450.93 - lr: 0.000030 - momentum: 0.000000
+2023-10-12 21:10:07,918 epoch 9 - iter 432/1445 - loss 0.01343190 - time (sec): 122.95 - samples/sec: 447.75 - lr: 0.000028 - momentum: 0.000000
+2023-10-12 21:10:47,029 epoch 9 - iter 576/1445 - loss 0.01174943 - time (sec): 162.06 - samples/sec: 438.96 - lr: 0.000027 - momentum: 0.000000
+2023-10-12 21:11:25,827 epoch 9 - iter 720/1445 - loss 0.01124109 - time (sec): 200.86 - samples/sec: 433.13 - lr: 0.000025 - momentum: 0.000000
+2023-10-12 21:12:06,252 epoch 9 - iter 864/1445 - loss 0.01129143 - time (sec): 241.29 - samples/sec: 434.82 - lr: 0.000023 - momentum: 0.000000
+2023-10-12 21:12:46,994 epoch 9 - iter 1008/1445 - loss 0.01164716 - time (sec): 282.03 - samples/sec: 434.48 - lr: 0.000022 - momentum: 0.000000
+2023-10-12 21:13:28,846 epoch 9 - iter 1152/1445 - loss 0.01179582 - time (sec): 323.88 - samples/sec: 436.77 - lr: 0.000020 - momentum: 0.000000
+2023-10-12 21:14:08,610 epoch 9 - iter 1296/1445 - loss 0.01095141 - time (sec): 363.64 - samples/sec: 435.91 - lr: 0.000018 - momentum: 0.000000
+2023-10-12 21:14:49,912 epoch 9 - iter 1440/1445 - loss 0.01040016 - time (sec): 404.95 - samples/sec: 433.81 - lr: 0.000017 - momentum: 0.000000
+2023-10-12 21:14:51,213 ----------------------------------------------------------------------------------------------------
+2023-10-12 21:14:51,213 EPOCH 9 done: loss 0.0104 - lr: 0.000017
+2023-10-12 21:15:11,280 DEV : loss 0.1443011313676834 - f1-score (micro avg) 0.8538
+2023-10-12 21:15:11,310 ----------------------------------------------------------------------------------------------------
+2023-10-12 21:15:53,223 epoch 10 - iter 144/1445 - loss 0.00659592 - time (sec): 41.91 - samples/sec: 429.81 - lr: 0.000015 - momentum: 0.000000
+2023-10-12 21:16:33,015 epoch 10 - iter 288/1445 - loss 0.00689152 - time (sec): 81.70 - samples/sec: 412.85 - lr: 0.000013 - momentum: 0.000000
+2023-10-12 21:17:13,952 epoch 10 - iter 432/1445 - loss 0.00829991 - time (sec): 122.64 - samples/sec: 413.32 - lr: 0.000012 - momentum: 0.000000
+2023-10-12 21:17:55,529 epoch 10 - iter 576/1445 - loss 0.00927037 - time (sec): 164.22 - samples/sec: 420.87 - lr: 0.000010 - momentum: 0.000000
+2023-10-12 21:18:36,103 epoch 10 - iter 720/1445 - loss 0.00865224 - time (sec): 204.79 - samples/sec: 421.42 - lr: 0.000008 - momentum: 0.000000
+2023-10-12 21:19:17,606 epoch 10 - iter 864/1445 - loss 0.00794145 - time (sec): 246.29 - samples/sec: 425.13 - lr: 0.000007 - momentum: 0.000000
+2023-10-12 21:19:59,178 epoch 10 - iter 1008/1445 - loss 0.00815458 - time (sec): 287.87 - samples/sec: 428.24 - lr: 0.000005 - momentum: 0.000000
+2023-10-12 21:20:39,135 epoch 10 - iter 1152/1445 - loss 0.00746884 - time (sec): 327.82 - samples/sec: 426.96 - lr: 0.000003 - momentum: 0.000000
+2023-10-12 21:21:19,627 epoch 10 - iter 1296/1445 - loss 0.00801406 - time (sec): 368.31 - samples/sec: 428.13 - lr: 0.000002 - momentum: 0.000000
+2023-10-12 21:22:01,376 epoch 10 - iter 1440/1445 - loss 0.00775432 - time (sec): 410.06 - samples/sec: 428.56 - lr: 0.000000 - momentum: 0.000000
+2023-10-12 21:22:02,551 ----------------------------------------------------------------------------------------------------
+2023-10-12 21:22:02,552 EPOCH 10 done: loss 0.0077 - lr: 0.000000
+2023-10-12 21:22:24,387 DEV : loss 0.15128682553768158 - f1-score (micro avg) 0.8515
+2023-10-12 21:22:25,328 ----------------------------------------------------------------------------------------------------
+2023-10-12 21:22:25,330 Loading model from best epoch ...
+2023-10-12 21:22:29,244 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG
+2023-10-12 21:22:50,647
+Results:
+- F-score (micro) 0.8113
+- F-score (macro) 0.7148
+- Accuracy 0.6935
+
+By class:
+              precision    recall  f1-score   support
+
+         PER     0.8591    0.7718    0.8131       482
+         LOC     0.9056    0.8166    0.8588       458
+         ORG     0.5172    0.4348    0.4724        69
+
+   micro avg     0.8584    0.7691    0.8113      1009
+   macro avg     0.7606    0.6744    0.7148      1009
+weighted avg     0.8568    0.7691    0.8105      1009
+
+2023-10-12 21:22:50,647 ----------------------------------------------------------------------------------------------------
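As a usage note, the uploaded checkpoint can be loaded directly with Flair for inference. A minimal sketch, assuming flair is installed, best-model.pt has been downloaded to the working directory, and the label type is "ner" (the example sentence is made up, chosen only because the corpus above is Dutch ICDAR-Europeana NER):

    from flair.data import Sentence
    from flair.models import SequenceTagger

    # Load the fine-tuned ByT5-based SequenceTagger uploaded in this commit.
    tagger = SequenceTagger.load("best-model.pt")

    # Tag a made-up Dutch sentence and print the predicted entity spans.
    sentence = Sentence("Vincent van Gogh werd geboren in Zundert.")
    tagger.predict(sentence)
    for span in sentence.get_spans("ner"):  # label type assumed to be "ner"
        print(span)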