Upload folder using huggingface_hub
- best-model.pt +3 -0
- dev.tsv +0 -0
- final-model.pt +3 -0
- loss.tsv +11 -0
- runs/events.out.tfevents.1697199110.6d4c7681f95b.3224.4 +3 -0
- test.tsv +0 -0
- training.log +262 -0
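
The files above were pushed in a single commit via the `huggingface_hub` client, as the commit message indicates. A minimal sketch of such an upload, assuming a placeholder repository id and that an access token is already configured locally:

```python
# Sketch only: repo_id and folder_path are placeholders, not values from this commit.
from huggingface_hub import HfApi

api = HfApi()  # picks up the token from `huggingface-cli login` or the HF_TOKEN env var
api.upload_folder(
    folder_path="training-output",            # local folder holding best-model.pt, loss.tsv, training.log, ...
    repo_id="your-username/your-model-repo",  # placeholder target repository
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)
```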
best-model.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5dc0206d7f36a4465f2efb6cfd28b51f89db372b750cb0da71d5389463f1b40f
+size 870793839
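
The checkpoint itself is stored via Git LFS, so the diff only shows the pointer file above. Once the actual weights are downloaded, the checkpoint loads as a Flair `SequenceTagger` (see training.log below); a small usage sketch, with a made-up example sentence and assuming the file has been fetched locally:

```python
# Sketch: run the fine-tuned tagger on one sentence.
# Assumes best-model.pt has been downloaded next to this script
# (e.g. with huggingface_hub.hf_hub_download).
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("best-model.pt")

# Made-up example; the tag set covers LOC, BUILDING and STREET spans.
sentence = Sentence("He lived on Fleet Street in London.")
tagger.predict(sentence)
print(sentence)
```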
dev.tsv
ADDED
The diff for this file is too large to render.
See raw diff
final-model.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:613f7c226ad90ad008efd9c7e728624c024953e5043db4e5ff220abc9194fea2
+size 870793956
loss.tsv
ADDED
@@ -0,0 +1,11 @@
+EPOCH  TIMESTAMP  LEARNING_RATE  TRAIN_LOSS  DEV_LOSS  DEV_PRECISION  DEV_RECALL  DEV_F1  DEV_ACCURACY
+1      12:19:20   0.0001         1.1083      0.1016    0.0000         0.0000      0.0000  0.0000
+2      12:26:30   0.0001         0.1042      0.0559    0.7965         0.7764      0.7863  0.6691
+3      12:33:47   0.0001         0.0611      0.0515    0.7782         0.8143      0.7959  0.6772
+4      12:41:06   0.0001         0.0403      0.0635    0.7669         0.8608      0.8111  0.6915
+5      12:48:23   0.0001         0.0259      0.0732    0.7895         0.8228      0.8058  0.6890
+6      12:55:28   0.0001         0.0174      0.0744    0.7565         0.8650      0.8071  0.6949
+7      13:02:31   0.0001         0.0125      0.0863    0.7727         0.8608      0.8144  0.7010
+8      13:09:37   0.0000         0.0086      0.0887    0.7977         0.8650      0.8300  0.7270
+9      13:16:45   0.0000         0.0063      0.0927    0.7821         0.8481      0.8138  0.7003
+10     13:24:07   0.0000         0.0045      0.0917    0.7812         0.8439      0.8114  0.6993
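
loss.tsv records one row per epoch with the training loss and the dev metrics reported in training.log. A small sketch for inspecting it, assuming the file has been downloaded locally and pandas is available:

```python
# Sketch: find the epoch with the best dev F1 in loss.tsv.
import pandas as pd

df = pd.read_csv("loss.tsv", sep="\t")
best = df.loc[df["DEV_F1"].idxmax()]
print(f"Best dev F1 {best['DEV_F1']:.4f} at epoch {int(best['EPOCH'])}")  # 0.8300 at epoch 8
```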
runs/events.out.tfevents.1697199110.6d4c7681f95b.3224.4
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5670489d08ebedf838b1f37e1ed840ee9dbf2a70cc51ec67f689acf985c86006
+size 434848
test.tsv
ADDED
The diff for this file is too large to render.
See raw diff
training.log
ADDED
@@ -0,0 +1,262 @@
+2023-10-13 12:11:50,968 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,971 Model: "SequenceTagger(
+  (embeddings): ByT5Embeddings(
+    (model): T5EncoderModel(
+      (shared): Embedding(384, 1472)
+      (encoder): T5Stack(
+        (embed_tokens): Embedding(384, 1472)
+        (block): ModuleList(
+          (0): T5Block(
+            (layer): ModuleList(
+              (0): T5LayerSelfAttention(
+                (SelfAttention): T5Attention(
+                  (q): Linear(in_features=1472, out_features=384, bias=False)
+                  (k): Linear(in_features=1472, out_features=384, bias=False)
+                  (v): Linear(in_features=1472, out_features=384, bias=False)
+                  (o): Linear(in_features=384, out_features=1472, bias=False)
+                  (relative_attention_bias): Embedding(32, 6)
+                )
+                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (1): T5LayerFF(
+                (DenseReluDense): T5DenseGatedActDense(
+                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
+                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
+                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
+                  (dropout): Dropout(p=0.1, inplace=False)
+                  (act): NewGELUActivation()
+                )
+                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+          )
+          (1-11): 11 x T5Block(
+            (layer): ModuleList(
+              (0): T5LayerSelfAttention(
+                (SelfAttention): T5Attention(
+                  (q): Linear(in_features=1472, out_features=384, bias=False)
+                  (k): Linear(in_features=1472, out_features=384, bias=False)
+                  (v): Linear(in_features=1472, out_features=384, bias=False)
+                  (o): Linear(in_features=384, out_features=1472, bias=False)
+                )
+                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (1): T5LayerFF(
+                (DenseReluDense): T5DenseGatedActDense(
+                  (wi_0): Linear(in_features=1472, out_features=3584, bias=False)
+                  (wi_1): Linear(in_features=1472, out_features=3584, bias=False)
+                  (wo): Linear(in_features=3584, out_features=1472, bias=False)
+                  (dropout): Dropout(p=0.1, inplace=False)
+                  (act): NewGELUActivation()
+                )
+                (layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+          )
+        )
+        (final_layer_norm): FusedRMSNorm(torch.Size([1472]), eps=1e-06, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=1472, out_features=13, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-13 12:11:50,971 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,971 MultiCorpus: 6183 train + 680 dev + 2113 test sentences
+ - NER_HIPE_2022 Corpus: 6183 train + 680 dev + 2113 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/topres19th/en/with_doc_seperator
+2023-10-13 12:11:50,971 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,971 Train: 6183 sentences
+2023-10-13 12:11:50,971 (train_with_dev=False, train_with_test=False)
+2023-10-13 12:11:50,971 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,971 Training Params:
+2023-10-13 12:11:50,972 - learning_rate: "0.00015"
+2023-10-13 12:11:50,972 - mini_batch_size: "8"
+2023-10-13 12:11:50,972 - max_epochs: "10"
+2023-10-13 12:11:50,972 - shuffle: "True"
+2023-10-13 12:11:50,972 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,972 Plugins:
+2023-10-13 12:11:50,972 - TensorboardLogger
+2023-10-13 12:11:50,972 - LinearScheduler | warmup_fraction: '0.1'
+2023-10-13 12:11:50,972 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,972 Final evaluation on model from best epoch (best-model.pt)
+2023-10-13 12:11:50,972 - metric: "('micro avg', 'f1-score')"
+2023-10-13 12:11:50,972 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,972 Computation:
+2023-10-13 12:11:50,972 - compute on device: cuda:0
+2023-10-13 12:11:50,973 - embedding storage: none
+2023-10-13 12:11:50,973 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,973 Model training base path: "hmbench-topres19th/en-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2"
+2023-10-13 12:11:50,973 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,973 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:11:50,973 Logging anything other than scalars to TensorBoard is currently not supported.
+2023-10-13 12:12:34,518 epoch 1 - iter 77/773 - loss 2.57123659 - time (sec): 43.54 - samples/sec: 283.97 - lr: 0.000015 - momentum: 0.000000
+2023-10-13 12:13:16,883 epoch 1 - iter 154/773 - loss 2.54004423 - time (sec): 85.91 - samples/sec: 278.44 - lr: 0.000030 - momentum: 0.000000
+2023-10-13 12:13:59,430 epoch 1 - iter 231/773 - loss 2.38127903 - time (sec): 128.46 - samples/sec: 285.60 - lr: 0.000045 - momentum: 0.000000
+2023-10-13 12:14:44,917 epoch 1 - iter 308/773 - loss 2.14419240 - time (sec): 173.94 - samples/sec: 289.56 - lr: 0.000060 - momentum: 0.000000
+2023-10-13 12:15:28,830 epoch 1 - iter 385/773 - loss 1.91099630 - time (sec): 217.86 - samples/sec: 288.68 - lr: 0.000075 - momentum: 0.000000
+2023-10-13 12:16:11,609 epoch 1 - iter 462/773 - loss 1.70018509 - time (sec): 260.63 - samples/sec: 285.81 - lr: 0.000089 - momentum: 0.000000
+2023-10-13 12:16:53,244 epoch 1 - iter 539/773 - loss 1.51041487 - time (sec): 302.27 - samples/sec: 284.97 - lr: 0.000104 - momentum: 0.000000
+2023-10-13 12:17:35,038 epoch 1 - iter 616/773 - loss 1.35148076 - time (sec): 344.06 - samples/sec: 285.00 - lr: 0.000119 - momentum: 0.000000
+2023-10-13 12:18:17,952 epoch 1 - iter 693/773 - loss 1.21911109 - time (sec): 386.98 - samples/sec: 286.83 - lr: 0.000134 - momentum: 0.000000
+2023-10-13 12:19:01,031 epoch 1 - iter 770/773 - loss 1.11179986 - time (sec): 430.06 - samples/sec: 287.98 - lr: 0.000149 - momentum: 0.000000
+2023-10-13 12:19:02,586 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:19:02,586 EPOCH 1 done: loss 1.1083 - lr: 0.000149
+2023-10-13 12:19:20,096 DEV : loss 0.10156270116567612 - f1-score (micro avg)  0.0
+2023-10-13 12:19:20,135 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:20:03,733 epoch 2 - iter 77/773 - loss 0.14343911 - time (sec): 43.59 - samples/sec: 277.88 - lr: 0.000148 - momentum: 0.000000
+2023-10-13 12:20:45,246 epoch 2 - iter 154/773 - loss 0.14111574 - time (sec): 85.11 - samples/sec: 292.96 - lr: 0.000147 - momentum: 0.000000
+2023-10-13 12:21:24,773 epoch 2 - iter 231/773 - loss 0.13134074 - time (sec): 124.63 - samples/sec: 298.67 - lr: 0.000145 - momentum: 0.000000
+2023-10-13 12:22:04,927 epoch 2 - iter 308/773 - loss 0.12226088 - time (sec): 164.79 - samples/sec: 303.09 - lr: 0.000143 - momentum: 0.000000
+2023-10-13 12:22:44,343 epoch 2 - iter 385/773 - loss 0.11876136 - time (sec): 204.20 - samples/sec: 301.47 - lr: 0.000142 - momentum: 0.000000
+2023-10-13 12:23:25,052 epoch 2 - iter 462/773 - loss 0.11190683 - time (sec): 244.91 - samples/sec: 304.46 - lr: 0.000140 - momentum: 0.000000
+2023-10-13 12:24:07,027 epoch 2 - iter 539/773 - loss 0.11157185 - time (sec): 286.89 - samples/sec: 304.72 - lr: 0.000138 - momentum: 0.000000
+2023-10-13 12:24:48,683 epoch 2 - iter 616/773 - loss 0.10933321 - time (sec): 328.54 - samples/sec: 303.00 - lr: 0.000137 - momentum: 0.000000
+2023-10-13 12:25:30,303 epoch 2 - iter 693/773 - loss 0.10704561 - time (sec): 370.16 - samples/sec: 302.26 - lr: 0.000135 - momentum: 0.000000
+2023-10-13 12:26:11,495 epoch 2 - iter 770/773 - loss 0.10437133 - time (sec): 411.36 - samples/sec: 301.17 - lr: 0.000133 - momentum: 0.000000
+2023-10-13 12:26:12,989 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:26:12,990 EPOCH 2 done: loss 0.1042 - lr: 0.000133
+2023-10-13 12:26:30,440 DEV : loss 0.055891476571559906 - f1-score (micro avg)  0.7863
+2023-10-13 12:26:30,470 saving best model
+2023-10-13 12:26:31,427 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:27:13,239 epoch 3 - iter 77/773 - loss 0.07137218 - time (sec): 41.81 - samples/sec: 304.93 - lr: 0.000132 - momentum: 0.000000
+2023-10-13 12:27:54,626 epoch 3 - iter 154/773 - loss 0.06623441 - time (sec): 83.20 - samples/sec: 303.37 - lr: 0.000130 - momentum: 0.000000
+2023-10-13 12:28:36,413 epoch 3 - iter 231/773 - loss 0.06634603 - time (sec): 124.98 - samples/sec: 295.12 - lr: 0.000128 - momentum: 0.000000
+2023-10-13 12:29:19,074 epoch 3 - iter 308/773 - loss 0.06925348 - time (sec): 167.64 - samples/sec: 298.65 - lr: 0.000127 - momentum: 0.000000
+2023-10-13 12:29:59,867 epoch 3 - iter 385/773 - loss 0.06673849 - time (sec): 208.44 - samples/sec: 297.41 - lr: 0.000125 - momentum: 0.000000
+2023-10-13 12:30:41,086 epoch 3 - iter 462/773 - loss 0.06675991 - time (sec): 249.66 - samples/sec: 297.38 - lr: 0.000123 - momentum: 0.000000
+2023-10-13 12:31:23,319 epoch 3 - iter 539/773 - loss 0.06556065 - time (sec): 291.89 - samples/sec: 298.20 - lr: 0.000122 - momentum: 0.000000
+2023-10-13 12:32:05,608 epoch 3 - iter 616/773 - loss 0.06289494 - time (sec): 334.18 - samples/sec: 299.27 - lr: 0.000120 - momentum: 0.000000
+2023-10-13 12:32:46,787 epoch 3 - iter 693/773 - loss 0.06170302 - time (sec): 375.36 - samples/sec: 299.34 - lr: 0.000118 - momentum: 0.000000
+2023-10-13 12:33:27,330 epoch 3 - iter 770/773 - loss 0.06115844 - time (sec): 415.90 - samples/sec: 297.40 - lr: 0.000117 - momentum: 0.000000
+2023-10-13 12:33:28,989 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:33:28,989 EPOCH 3 done: loss 0.0611 - lr: 0.000117
+2023-10-13 12:33:47,675 DEV : loss 0.051485899835824966 - f1-score (micro avg)  0.7959
+2023-10-13 12:33:47,708 saving best model
+2023-10-13 12:33:50,361 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:34:30,646 epoch 4 - iter 77/773 - loss 0.04947376 - time (sec): 40.28 - samples/sec: 280.87 - lr: 0.000115 - momentum: 0.000000
+2023-10-13 12:35:12,235 epoch 4 - iter 154/773 - loss 0.04506746 - time (sec): 81.87 - samples/sec: 298.04 - lr: 0.000113 - momentum: 0.000000
+2023-10-13 12:35:53,297 epoch 4 - iter 231/773 - loss 0.04471770 - time (sec): 122.93 - samples/sec: 293.77 - lr: 0.000112 - momentum: 0.000000
+2023-10-13 12:36:36,166 epoch 4 - iter 308/773 - loss 0.04502334 - time (sec): 165.80 - samples/sec: 301.74 - lr: 0.000110 - momentum: 0.000000
+2023-10-13 12:37:17,047 epoch 4 - iter 385/773 - loss 0.04266320 - time (sec): 206.68 - samples/sec: 299.71 - lr: 0.000108 - momentum: 0.000000
+2023-10-13 12:37:59,706 epoch 4 - iter 462/773 - loss 0.04112529 - time (sec): 249.34 - samples/sec: 299.60 - lr: 0.000107 - momentum: 0.000000
+2023-10-13 12:38:41,524 epoch 4 - iter 539/773 - loss 0.04114812 - time (sec): 291.16 - samples/sec: 299.15 - lr: 0.000105 - momentum: 0.000000
+2023-10-13 12:39:22,331 epoch 4 - iter 616/773 - loss 0.03933986 - time (sec): 331.97 - samples/sec: 300.79 - lr: 0.000103 - momentum: 0.000000
+2023-10-13 12:40:03,518 epoch 4 - iter 693/773 - loss 0.04060396 - time (sec): 373.15 - samples/sec: 299.37 - lr: 0.000102 - momentum: 0.000000
+2023-10-13 12:40:46,318 epoch 4 - iter 770/773 - loss 0.04042546 - time (sec): 415.95 - samples/sec: 297.67 - lr: 0.000100 - momentum: 0.000000
+2023-10-13 12:40:47,905 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:40:47,905 EPOCH 4 done: loss 0.0403 - lr: 0.000100
+2023-10-13 12:41:06,017 DEV : loss 0.06352678686380386 - f1-score (micro avg)  0.8111
+2023-10-13 12:41:06,046 saving best model
+2023-10-13 12:41:08,698 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:41:50,692 epoch 5 - iter 77/773 - loss 0.02612534 - time (sec): 41.99 - samples/sec: 291.71 - lr: 0.000098 - momentum: 0.000000
+2023-10-13 12:42:31,783 epoch 5 - iter 154/773 - loss 0.02825120 - time (sec): 83.08 - samples/sec: 293.80 - lr: 0.000097 - momentum: 0.000000
+2023-10-13 12:43:13,014 epoch 5 - iter 231/773 - loss 0.02760483 - time (sec): 124.31 - samples/sec: 303.77 - lr: 0.000095 - momentum: 0.000000
+2023-10-13 12:43:55,603 epoch 5 - iter 308/773 - loss 0.02738820 - time (sec): 166.90 - samples/sec: 303.64 - lr: 0.000093 - momentum: 0.000000
+2023-10-13 12:44:37,368 epoch 5 - iter 385/773 - loss 0.02615460 - time (sec): 208.67 - samples/sec: 298.58 - lr: 0.000092 - momentum: 0.000000
+2023-10-13 12:45:20,043 epoch 5 - iter 462/773 - loss 0.02553182 - time (sec): 251.34 - samples/sec: 297.00 - lr: 0.000090 - momentum: 0.000000
+2023-10-13 12:46:01,461 epoch 5 - iter 539/773 - loss 0.02485120 - time (sec): 292.76 - samples/sec: 295.15 - lr: 0.000088 - momentum: 0.000000
+2023-10-13 12:46:43,286 epoch 5 - iter 616/773 - loss 0.02594767 - time (sec): 334.58 - samples/sec: 297.42 - lr: 0.000087 - momentum: 0.000000
+2023-10-13 12:47:22,820 epoch 5 - iter 693/773 - loss 0.02558115 - time (sec): 374.12 - samples/sec: 298.56 - lr: 0.000085 - momentum: 0.000000
+2023-10-13 12:48:04,090 epoch 5 - iter 770/773 - loss 0.02571670 - time (sec): 415.39 - samples/sec: 297.82 - lr: 0.000083 - momentum: 0.000000
+2023-10-13 12:48:05,685 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:48:05,685 EPOCH 5 done: loss 0.0259 - lr: 0.000083
+2023-10-13 12:48:23,336 DEV : loss 0.07320675998926163 - f1-score (micro avg)  0.8058
+2023-10-13 12:48:23,366 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:49:03,912 epoch 6 - iter 77/773 - loss 0.01551022 - time (sec): 40.54 - samples/sec: 276.10 - lr: 0.000082 - momentum: 0.000000
+2023-10-13 12:49:46,020 epoch 6 - iter 154/773 - loss 0.01573350 - time (sec): 82.65 - samples/sec: 292.58 - lr: 0.000080 - momentum: 0.000000
+2023-10-13 12:50:27,182 epoch 6 - iter 231/773 - loss 0.01748381 - time (sec): 123.81 - samples/sec: 293.12 - lr: 0.000078 - momentum: 0.000000
+2023-10-13 12:51:06,849 epoch 6 - iter 308/773 - loss 0.01658042 - time (sec): 163.48 - samples/sec: 296.68 - lr: 0.000077 - momentum: 0.000000
+2023-10-13 12:51:45,807 epoch 6 - iter 385/773 - loss 0.01809232 - time (sec): 202.44 - samples/sec: 299.61 - lr: 0.000075 - momentum: 0.000000
+2023-10-13 12:52:27,059 epoch 6 - iter 462/773 - loss 0.01736914 - time (sec): 243.69 - samples/sec: 302.60 - lr: 0.000073 - momentum: 0.000000
+2023-10-13 12:53:08,010 epoch 6 - iter 539/773 - loss 0.01676794 - time (sec): 284.64 - samples/sec: 302.43 - lr: 0.000072 - momentum: 0.000000
+2023-10-13 12:53:48,567 epoch 6 - iter 616/773 - loss 0.01830890 - time (sec): 325.20 - samples/sec: 304.50 - lr: 0.000070 - momentum: 0.000000
+2023-10-13 12:54:28,837 epoch 6 - iter 693/773 - loss 0.01814792 - time (sec): 365.47 - samples/sec: 304.81 - lr: 0.000068 - momentum: 0.000000
+2023-10-13 12:55:09,144 epoch 6 - iter 770/773 - loss 0.01745267 - time (sec): 405.78 - samples/sec: 304.88 - lr: 0.000067 - momentum: 0.000000
+2023-10-13 12:55:10,695 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:55:10,695 EPOCH 6 done: loss 0.0174 - lr: 0.000067
+2023-10-13 12:55:28,242 DEV : loss 0.07442227005958557 - f1-score (micro avg)  0.8071
+2023-10-13 12:55:28,272 ----------------------------------------------------------------------------------------------------
+2023-10-13 12:56:09,091 epoch 7 - iter 77/773 - loss 0.00999803 - time (sec): 40.82 - samples/sec: 302.86 - lr: 0.000065 - momentum: 0.000000
+2023-10-13 12:56:48,526 epoch 7 - iter 154/773 - loss 0.01303046 - time (sec): 80.25 - samples/sec: 303.67 - lr: 0.000063 - momentum: 0.000000
+2023-10-13 12:57:27,898 epoch 7 - iter 231/773 - loss 0.01221184 - time (sec): 119.62 - samples/sec: 305.15 - lr: 0.000062 - momentum: 0.000000
+2023-10-13 12:58:09,432 epoch 7 - iter 308/773 - loss 0.01126403 - time (sec): 161.16 - samples/sec: 310.29 - lr: 0.000060 - momentum: 0.000000
+2023-10-13 12:58:50,442 epoch 7 - iter 385/773 - loss 0.01243846 - time (sec): 202.17 - samples/sec: 309.82 - lr: 0.000058 - momentum: 0.000000
+2023-10-13 12:59:31,439 epoch 7 - iter 462/773 - loss 0.01232868 - time (sec): 243.17 - samples/sec: 306.15 - lr: 0.000057 - momentum: 0.000000
+2023-10-13 13:00:12,190 epoch 7 - iter 539/773 - loss 0.01239029 - time (sec): 283.92 - samples/sec: 304.08 - lr: 0.000055 - momentum: 0.000000
+2023-10-13 13:00:52,186 epoch 7 - iter 616/773 - loss 0.01259808 - time (sec): 323.91 - samples/sec: 305.64 - lr: 0.000054 - momentum: 0.000000
+2023-10-13 13:01:32,889 epoch 7 - iter 693/773 - loss 0.01254284 - time (sec): 364.62 - samples/sec: 306.14 - lr: 0.000052 - momentum: 0.000000
+2023-10-13 13:02:13,324 epoch 7 - iter 770/773 - loss 0.01237555 - time (sec): 405.05 - samples/sec: 306.02 - lr: 0.000050 - momentum: 0.000000
+2023-10-13 13:02:14,776 ----------------------------------------------------------------------------------------------------
+2023-10-13 13:02:14,776 EPOCH 7 done: loss 0.0125 - lr: 0.000050
+2023-10-13 13:02:31,857 DEV : loss 0.08628595620393753 - f1-score (micro avg)  0.8144
+2023-10-13 13:02:31,886 saving best model
+2023-10-13 13:02:34,611 ----------------------------------------------------------------------------------------------------
+2023-10-13 13:03:14,909 epoch 8 - iter 77/773 - loss 0.00855586 - time (sec): 40.29 - samples/sec: 299.53 - lr: 0.000048 - momentum: 0.000000
+2023-10-13 13:03:55,802 epoch 8 - iter 154/773 - loss 0.01073063 - time (sec): 81.19 - samples/sec: 309.92 - lr: 0.000047 - momentum: 0.000000
+2023-10-13 13:04:35,209 epoch 8 - iter 231/773 - loss 0.00879026 - time (sec): 120.59 - samples/sec: 308.27 - lr: 0.000045 - momentum: 0.000000
+2023-10-13 13:05:15,170 epoch 8 - iter 308/773 - loss 0.00919919 - time (sec): 160.55 - samples/sec: 303.49 - lr: 0.000043 - momentum: 0.000000
+2023-10-13 13:05:55,095 epoch 8 - iter 385/773 - loss 0.00880865 - time (sec): 200.48 - samples/sec: 299.32 - lr: 0.000042 - momentum: 0.000000
+2023-10-13 13:06:36,217 epoch 8 - iter 462/773 - loss 0.00921083 - time (sec): 241.60 - samples/sec: 302.65 - lr: 0.000040 - momentum: 0.000000
+2023-10-13 13:07:17,052 epoch 8 - iter 539/773 - loss 0.01001243 - time (sec): 282.44 - samples/sec: 306.16 - lr: 0.000039 - momentum: 0.000000
+2023-10-13 13:07:58,590 epoch 8 - iter 616/773 - loss 0.00960529 - time (sec): 323.97 - samples/sec: 306.67 - lr: 0.000037 - momentum: 0.000000
+2023-10-13 13:08:39,476 epoch 8 - iter 693/773 - loss 0.00919135 - time (sec): 364.86 - samples/sec: 306.10 - lr: 0.000035 - momentum: 0.000000
+2023-10-13 13:09:19,241 epoch 8 - iter 770/773 - loss 0.00868187 - time (sec): 404.63 - samples/sec: 305.60 - lr: 0.000034 - momentum: 0.000000
+2023-10-13 13:09:20,852 ----------------------------------------------------------------------------------------------------
+2023-10-13 13:09:20,853 EPOCH 8 done: loss 0.0086 - lr: 0.000034
+2023-10-13 13:09:37,627 DEV : loss 0.08868994563817978 - f1-score (micro avg)  0.83
+2023-10-13 13:09:37,655 saving best model
+2023-10-13 13:09:40,347 ----------------------------------------------------------------------------------------------------
+2023-10-13 13:10:21,144 epoch 9 - iter 77/773 - loss 0.00335222 - time (sec): 40.79 - samples/sec: 306.70 - lr: 0.000032 - momentum: 0.000000
+2023-10-13 13:11:01,271 epoch 9 - iter 154/773 - loss 0.00362335 - time (sec): 80.92 - samples/sec: 313.31 - lr: 0.000030 - momentum: 0.000000
+2023-10-13 13:11:41,379 epoch 9 - iter 231/773 - loss 0.00379545 - time (sec): 121.03 - samples/sec: 314.51 - lr: 0.000028 - momentum: 0.000000
+2023-10-13 13:12:21,676 epoch 9 - iter 308/773 - loss 0.00485971 - time (sec): 161.32 - samples/sec: 310.82 - lr: 0.000027 - momentum: 0.000000
+2023-10-13 13:13:01,870 epoch 9 - iter 385/773 - loss 0.00587335 - time (sec): 201.52 - samples/sec: 312.32 - lr: 0.000025 - momentum: 0.000000
+2023-10-13 13:13:43,524 epoch 9 - iter 462/773 - loss 0.00613505 - time (sec): 243.17 - samples/sec: 309.28 - lr: 0.000024 - momentum: 0.000000
+2023-10-13 13:14:24,212 epoch 9 - iter 539/773 - loss 0.00629327 - time (sec): 283.86 - samples/sec: 305.77 - lr: 0.000022 - momentum: 0.000000
+2023-10-13 13:15:04,808 epoch 9 - iter 616/773 - loss 0.00630642 - time (sec): 324.46 - samples/sec: 307.27 - lr: 0.000020 - momentum: 0.000000
+2023-10-13 13:15:45,380 epoch 9 - iter 693/773 - loss 0.00629380 - time (sec): 365.03 - samples/sec: 306.37 - lr: 0.000019 - momentum: 0.000000
+2023-10-13 13:16:26,131 epoch 9 - iter 770/773 - loss 0.00636277 - time (sec): 405.78 - samples/sec: 305.61 - lr: 0.000017 - momentum: 0.000000
+2023-10-13 13:16:27,532 ----------------------------------------------------------------------------------------------------
+2023-10-13 13:16:27,533 EPOCH 9 done: loss 0.0063 - lr: 0.000017
+2023-10-13 13:16:45,497 DEV : loss 0.09266868978738785 - f1-score (micro avg)  0.8138
+2023-10-13 13:16:45,526 ----------------------------------------------------------------------------------------------------
+2023-10-13 13:17:26,201 epoch 10 - iter 77/773 - loss 0.00190450 - time (sec): 40.67 - samples/sec: 301.95 - lr: 0.000015 - momentum: 0.000000
+2023-10-13 13:18:07,106 epoch 10 - iter 154/773 - loss 0.00359834 - time (sec): 81.58 - samples/sec: 308.59 - lr: 0.000014 - momentum: 0.000000
+2023-10-13 13:18:46,933 epoch 10 - iter 231/773 - loss 0.00380266 - time (sec): 121.40 - samples/sec: 308.56 - lr: 0.000012 - momentum: 0.000000
+2023-10-13 13:19:27,460 epoch 10 - iter 308/773 - loss 0.00362008 - time (sec): 161.93 - samples/sec: 307.67 - lr: 0.000010 - momentum: 0.000000
+2023-10-13 13:20:09,363 epoch 10 - iter 385/773 - loss 0.00387252 - time (sec): 203.83 - samples/sec: 304.86 - lr: 0.000009 - momentum: 0.000000
+2023-10-13 13:20:50,798 epoch 10 - iter 462/773 - loss 0.00419240 - time (sec): 245.27 - samples/sec: 300.94 - lr: 0.000007 - momentum: 0.000000
+2023-10-13 13:21:33,881 epoch 10 - iter 539/773 - loss 0.00456516 - time (sec): 288.35 - samples/sec: 298.57 - lr: 0.000005 - momentum: 0.000000
+2023-10-13 13:22:19,370 epoch 10 - iter 616/773 - loss 0.00451077 - time (sec): 333.84 - samples/sec: 292.87 - lr: 0.000004 - momentum: 0.000000
+2023-10-13 13:23:04,395 epoch 10 - iter 693/773 - loss 0.00444498 - time (sec): 378.87 - samples/sec: 293.39 - lr: 0.000002 - momentum: 0.000000
+2023-10-13 13:23:47,529 epoch 10 - iter 770/773 - loss 0.00454130 - time (sec): 422.00 - samples/sec: 293.04 - lr: 0.000000 - momentum: 0.000000
+2023-10-13 13:23:49,225 ----------------------------------------------------------------------------------------------------
+2023-10-13 13:23:49,225 EPOCH 10 done: loss 0.0045 - lr: 0.000000
+2023-10-13 13:24:07,865 DEV : loss 0.09171322733163834 - f1-score (micro avg)  0.8114
+2023-10-13 13:24:08,877 ----------------------------------------------------------------------------------------------------
+2023-10-13 13:24:08,879 Loading model from best epoch ...
+2023-10-13 13:24:13,748 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-BUILDING, B-BUILDING, E-BUILDING, I-BUILDING, S-STREET, B-STREET, E-STREET, I-STREET
+2023-10-13 13:25:13,121
+Results:
+- F-score (micro) 0.8083
+- F-score (macro) 0.7183
+- Accuracy 0.6988
+
+By class:
+              precision    recall  f1-score   support
+
+         LOC     0.8551    0.8668    0.8609       946
+    BUILDING     0.5567    0.5838    0.5699       185
+      STREET     0.7000    0.7500    0.7241        56
+
+   micro avg     0.7997    0.8172    0.8083      1187
+   macro avg     0.7039    0.7335    0.7183      1187
+weighted avg     0.8012    0.8172    0.8091      1187
+
+2023-10-13 13:25:13,122 ----------------------------------------------------------------------------------------------------
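
For context, the run logged above fine-tunes a byt5-small historic multilingual backbone on the English topres19th split of HIPE-2022 with learning rate 0.00015, mini-batch size 8 and 10 epochs (see the Training Params and base path in the log). The sketch below approximates that setup with stock Flair components: it substitutes `TransformerWordEmbeddings` for the custom `ByT5Embeddings` wrapper shown in the log, and the corpus-loading arguments and plugin wiring (TensorBoard, linear scheduler) are assumptions rather than the exact original hmBench script.

```python
# Approximate reconstruction, not the original hmBench training script.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Corpus named in the log (6183 train / 680 dev / 2113 test sentences);
# the exact loading arguments are an assumption.
corpus = NER_HIPE_2022(dataset_name="topres19th", language="en")
label_dict = corpus.make_label_dictionary(label_type="ner")

# Stand-in for the ByT5Embeddings wrapper from the log; the backbone name is
# taken from the base path ("hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax").
embeddings = TransformerWordEmbeddings(
    "hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax",
    layers="-1",
    subtoken_pooling="first",
    fine_tune=True,
)

# Matches the logged architecture: no CRF, no RNN, linear head over the 13-tag dictionary.
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,
    use_rnn=False,
    reproject_embeddings=False,
)

# Hyperparameters as logged under "Training Params".
trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-topres19th/en-hmbyt5-preliminary/byt5-small-historic-multilingual-span20-flax-bs8-wsFalse-e10-lr0.00015-poolingfirst-layers-1-crfFalse-2",
    learning_rate=0.00015,
    mini_batch_size=8,
    max_epochs=10,
)
```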