stefan-it committed
Commit 8cc7f3b · 1 Parent(s): 03526da

Upload folder using huggingface_hub
best-model.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fdfacccf16a64ab09dd99a15475e63de752d5b8f99aa93d4d8c768c87ab053dd
+ size 19045922
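The three lines above are a Git LFS pointer: the repository itself stores only the object's hash and byte size, while the ~19 MB checkpoint lives in LFS storage. A minimal sketch of reading such a pointer (hypothetical helper; the pointer text is copied from this commit):

```python
# Parse a Git LFS pointer file into its key/value fields.
# POINTER is copied verbatim from the best-model.pt diff above.
POINTER = """version https://git-lfs.github.com/spec/v1
oid sha256:fdfacccf16a64ab09dd99a15475e63de752d5b8f99aa93d4d8c768c87ab053dd
size 19045922
"""

def parse_lfs_pointer(text: str) -> dict:
    """Each pointer line is 'key value'; split on the first space."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

info = parse_lfs_pointer(POINTER)
print(info["size"])  # checkpoint size in bytes: 19045922
print(info["oid"])   # hash algorithm and digest of the stored object
```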
dev.tsv ADDED
The diff for this file is too large to render. See raw diff
 
loss.tsv ADDED
@@ -0,0 +1,11 @@
+ EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
+ 1 20:41:19 0.0000 0.7888 0.2271 0.2876 0.2443 0.2642 0.1720
+ 2 20:41:51 0.0000 0.2838 0.1987 0.3568 0.3676 0.3621 0.2449
+ 3 20:42:23 0.0000 0.2463 0.1812 0.3776 0.3767 0.3771 0.2550
+ 4 20:42:56 0.0000 0.2227 0.1686 0.3723 0.4502 0.4076 0.2825
+ 5 20:43:28 0.0000 0.2080 0.1608 0.3943 0.4661 0.4272 0.2968
+ 6 20:44:00 0.0000 0.1950 0.1577 0.4155 0.4864 0.4482 0.3134
+ 7 20:44:32 0.0000 0.1860 0.1586 0.4439 0.4695 0.4563 0.3202
+ 8 20:45:04 0.0000 0.1807 0.1574 0.4361 0.4943 0.4634 0.3288
+ 9 20:45:37 0.0000 0.1760 0.1574 0.4460 0.5045 0.4735 0.3374
+ 10 20:46:09 0.0000 0.1746 0.1569 0.4411 0.5079 0.4721 0.3358
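loss.tsv holds one row per epoch; since the trainer saves a checkpoint whenever dev F1 improves, the best epoch is simply the row with the highest DEV_F1 (epoch 9 here, at 0.4735). A minimal sketch of that selection, using a few rows copied from the table above:

```python
# Select the best epoch by dev F1 from loss.tsv-style rows.
# The rows below are copied from the table above (whitespace-separated).
ROWS = """\
EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
1 20:41:19 0.0000 0.7888 0.2271 0.2876 0.2443 0.2642 0.1720
9 20:45:37 0.0000 0.1760 0.1574 0.4460 0.5045 0.4735 0.3374
10 20:46:09 0.0000 0.1746 0.1569 0.4411 0.5079 0.4721 0.3358
"""

header, *lines = ROWS.strip().splitlines()
cols = header.split()
records = [dict(zip(cols, line.split())) for line in lines]

# Highest dev F1 wins; epoch 10 dips slightly, so epoch 9 is kept.
best = max(records, key=lambda r: float(r["DEV_F1"]))
print(best["EPOCH"], best["DEV_F1"])  # -> 9 0.4735
```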
runs/events.out.tfevents.1697661646.46dc0c540dd0.3341.4 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:552587f941e7c4f7974f7ad474f40e5b492373ff91aa2d64079e7e0add3933ed
+ size 1108164
test.tsv ADDED
The diff for this file is too large to render. See raw diff
 
training.log ADDED
@@ -0,0 +1,246 @@
+ 2023-10-18 20:40:46,083 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,083 Model: "SequenceTagger(
+ (embeddings): TransformerWordEmbeddings(
+ (model): BertModel(
+ (embeddings): BertEmbeddings(
+ (word_embeddings): Embedding(32001, 128)
+ (position_embeddings): Embedding(512, 128)
+ (token_type_embeddings): Embedding(2, 128)
+ (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ (encoder): BertEncoder(
+ (layer): ModuleList(
+ (0-1): 2 x BertLayer(
+ (attention): BertAttention(
+ (self): BertSelfAttention(
+ (query): Linear(in_features=128, out_features=128, bias=True)
+ (key): Linear(in_features=128, out_features=128, bias=True)
+ (value): Linear(in_features=128, out_features=128, bias=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ (output): BertSelfOutput(
+ (dense): Linear(in_features=128, out_features=128, bias=True)
+ (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ )
+ (intermediate): BertIntermediate(
+ (dense): Linear(in_features=128, out_features=512, bias=True)
+ (intermediate_act_fn): GELUActivation()
+ )
+ (output): BertOutput(
+ (dense): Linear(in_features=512, out_features=128, bias=True)
+ (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+ (dropout): Dropout(p=0.1, inplace=False)
+ )
+ )
+ )
+ )
+ (pooler): BertPooler(
+ (dense): Linear(in_features=128, out_features=128, bias=True)
+ (activation): Tanh()
+ )
+ )
+ )
+ (locked_dropout): LockedDropout(p=0.5)
+ (linear): Linear(in_features=128, out_features=13, bias=True)
+ (loss_function): CrossEntropyLoss()
+ )"
+ 2023-10-18 20:40:46,083 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,083 MultiCorpus: 7936 train + 992 dev + 992 test sentences
+ - NER_ICDAR_EUROPEANA Corpus: 7936 train + 992 dev + 992 test sentences - /root/.flair/datasets/ner_icdar_europeana/fr
+ 2023-10-18 20:40:46,083 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,083 Train: 7936 sentences
+ 2023-10-18 20:40:46,083 (train_with_dev=False, train_with_test=False)
+ 2023-10-18 20:40:46,083 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,083 Training Params:
+ 2023-10-18 20:40:46,083 - learning_rate: "3e-05"
+ 2023-10-18 20:40:46,084 - mini_batch_size: "4"
+ 2023-10-18 20:40:46,084 - max_epochs: "10"
+ 2023-10-18 20:40:46,084 - shuffle: "True"
+ 2023-10-18 20:40:46,084 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,084 Plugins:
+ 2023-10-18 20:40:46,084 - TensorboardLogger
+ 2023-10-18 20:40:46,084 - LinearScheduler | warmup_fraction: '0.1'
+ 2023-10-18 20:40:46,084 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,084 Final evaluation on model from best epoch (best-model.pt)
+ 2023-10-18 20:40:46,084 - metric: "('micro avg', 'f1-score')"
+ 2023-10-18 20:40:46,084 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,084 Computation:
+ 2023-10-18 20:40:46,084 - compute on device: cuda:0
+ 2023-10-18 20:40:46,084 - embedding storage: none
+ 2023-10-18 20:40:46,084 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,084 Model training base path: "hmbench-icdar/fr-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-2"
+ 2023-10-18 20:40:46,084 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,084 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:40:46,084 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2023-10-18 20:40:49,484 epoch 1 - iter 198/1984 - loss 2.40181665 - time (sec): 3.40 - samples/sec: 4847.36 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-18 20:40:52,929 epoch 1 - iter 396/1984 - loss 2.06051017 - time (sec): 6.85 - samples/sec: 4779.10 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-18 20:40:56,097 epoch 1 - iter 594/1984 - loss 1.65005864 - time (sec): 10.01 - samples/sec: 4974.57 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-18 20:40:59,185 epoch 1 - iter 792/1984 - loss 1.36396891 - time (sec): 13.10 - samples/sec: 5050.84 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-18 20:41:02,224 epoch 1 - iter 990/1984 - loss 1.19110595 - time (sec): 16.14 - samples/sec: 5155.96 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-18 20:41:05,316 epoch 1 - iter 1188/1984 - loss 1.06964597 - time (sec): 19.23 - samples/sec: 5168.26 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-18 20:41:08,345 epoch 1 - iter 1386/1984 - loss 0.97250875 - time (sec): 22.26 - samples/sec: 5200.44 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-18 20:41:11,364 epoch 1 - iter 1584/1984 - loss 0.89537197 - time (sec): 25.28 - samples/sec: 5207.18 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-18 20:41:14,412 epoch 1 - iter 1782/1984 - loss 0.83846567 - time (sec): 28.33 - samples/sec: 5199.51 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-18 20:41:17,456 epoch 1 - iter 1980/1984 - loss 0.79010219 - time (sec): 31.37 - samples/sec: 5217.57 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-18 20:41:17,514 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:41:17,514 EPOCH 1 done: loss 0.7888 - lr: 0.000030
+ 2023-10-18 20:41:19,039 DEV : loss 0.22708812355995178 - f1-score (micro avg) 0.2642
+ 2023-10-18 20:41:19,058 saving best model
+ 2023-10-18 20:41:19,093 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:41:22,126 epoch 2 - iter 198/1984 - loss 0.32835931 - time (sec): 3.03 - samples/sec: 5320.46 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-18 20:41:25,167 epoch 2 - iter 396/1984 - loss 0.29962721 - time (sec): 6.07 - samples/sec: 5482.59 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-18 20:41:28,151 epoch 2 - iter 594/1984 - loss 0.29692658 - time (sec): 9.06 - samples/sec: 5412.42 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-18 20:41:31,207 epoch 2 - iter 792/1984 - loss 0.29657996 - time (sec): 12.11 - samples/sec: 5383.77 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-18 20:41:34,225 epoch 2 - iter 990/1984 - loss 0.28962444 - time (sec): 15.13 - samples/sec: 5372.89 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-18 20:41:37,219 epoch 2 - iter 1188/1984 - loss 0.28843578 - time (sec): 18.13 - samples/sec: 5362.62 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-18 20:41:40,292 epoch 2 - iter 1386/1984 - loss 0.29008589 - time (sec): 21.20 - samples/sec: 5333.12 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-18 20:41:43,095 epoch 2 - iter 1584/1984 - loss 0.28766926 - time (sec): 24.00 - samples/sec: 5415.25 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-18 20:41:46,153 epoch 2 - iter 1782/1984 - loss 0.28669108 - time (sec): 27.06 - samples/sec: 5436.91 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-18 20:41:49,222 epoch 2 - iter 1980/1984 - loss 0.28366818 - time (sec): 30.13 - samples/sec: 5432.08 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-18 20:41:49,279 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:41:49,279 EPOCH 2 done: loss 0.2838 - lr: 0.000027
+ 2023-10-18 20:41:51,481 DEV : loss 0.19872356951236725 - f1-score (micro avg) 0.3621
+ 2023-10-18 20:41:51,500 saving best model
+ 2023-10-18 20:41:51,532 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:41:54,500 epoch 3 - iter 198/1984 - loss 0.24485053 - time (sec): 2.97 - samples/sec: 5641.36 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-18 20:41:57,549 epoch 3 - iter 396/1984 - loss 0.25674337 - time (sec): 6.02 - samples/sec: 5432.56 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-18 20:42:00,576 epoch 3 - iter 594/1984 - loss 0.25987147 - time (sec): 9.04 - samples/sec: 5408.81 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-18 20:42:03,632 epoch 3 - iter 792/1984 - loss 0.25653396 - time (sec): 12.10 - samples/sec: 5357.86 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-18 20:42:06,666 epoch 3 - iter 990/1984 - loss 0.24885921 - time (sec): 15.13 - samples/sec: 5396.16 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-18 20:42:09,707 epoch 3 - iter 1188/1984 - loss 0.25191638 - time (sec): 18.17 - samples/sec: 5341.92 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-18 20:42:12,610 epoch 3 - iter 1386/1984 - loss 0.25739981 - time (sec): 21.08 - samples/sec: 5388.28 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-18 20:42:15,681 epoch 3 - iter 1584/1984 - loss 0.25300334 - time (sec): 24.15 - samples/sec: 5398.44 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-18 20:42:18,737 epoch 3 - iter 1782/1984 - loss 0.24809037 - time (sec): 27.20 - samples/sec: 5403.39 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-18 20:42:21,795 epoch 3 - iter 1980/1984 - loss 0.24646940 - time (sec): 30.26 - samples/sec: 5404.26 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-18 20:42:21,867 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:42:21,867 EPOCH 3 done: loss 0.2463 - lr: 0.000023
+ 2023-10-18 20:42:23,662 DEV : loss 0.18124178051948547 - f1-score (micro avg) 0.3771
+ 2023-10-18 20:42:23,680 saving best model
+ 2023-10-18 20:42:23,714 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:42:26,738 epoch 4 - iter 198/1984 - loss 0.23822861 - time (sec): 3.02 - samples/sec: 5210.29 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-18 20:42:29,815 epoch 4 - iter 396/1984 - loss 0.23209028 - time (sec): 6.10 - samples/sec: 5352.30 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-18 20:42:32,825 epoch 4 - iter 594/1984 - loss 0.22861673 - time (sec): 9.11 - samples/sec: 5254.98 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-18 20:42:35,855 epoch 4 - iter 792/1984 - loss 0.22608576 - time (sec): 12.14 - samples/sec: 5233.89 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-18 20:42:38,904 epoch 4 - iter 990/1984 - loss 0.22959492 - time (sec): 15.19 - samples/sec: 5306.19 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-18 20:42:41,627 epoch 4 - iter 1188/1984 - loss 0.22608383 - time (sec): 17.91 - samples/sec: 5413.99 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-18 20:42:44,722 epoch 4 - iter 1386/1984 - loss 0.22511283 - time (sec): 21.01 - samples/sec: 5393.16 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-18 20:42:47,953 epoch 4 - iter 1584/1984 - loss 0.22675445 - time (sec): 24.24 - samples/sec: 5359.52 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-18 20:42:51,209 epoch 4 - iter 1782/1984 - loss 0.22351402 - time (sec): 27.49 - samples/sec: 5338.20 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-18 20:42:54,429 epoch 4 - iter 1980/1984 - loss 0.22289582 - time (sec): 30.71 - samples/sec: 5326.93 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-18 20:42:54,494 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:42:54,494 EPOCH 4 done: loss 0.2227 - lr: 0.000020
+ 2023-10-18 20:42:56,316 DEV : loss 0.1685991734266281 - f1-score (micro avg) 0.4076
+ 2023-10-18 20:42:56,334 saving best model
+ 2023-10-18 20:42:56,368 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:42:59,420 epoch 5 - iter 198/1984 - loss 0.18916448 - time (sec): 3.05 - samples/sec: 5486.17 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-18 20:43:02,418 epoch 5 - iter 396/1984 - loss 0.19556003 - time (sec): 6.05 - samples/sec: 5364.05 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-18 20:43:05,400 epoch 5 - iter 594/1984 - loss 0.19887775 - time (sec): 9.03 - samples/sec: 5281.04 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-18 20:43:08,412 epoch 5 - iter 792/1984 - loss 0.19819805 - time (sec): 12.04 - samples/sec: 5354.70 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-18 20:43:11,457 epoch 5 - iter 990/1984 - loss 0.20054697 - time (sec): 15.09 - samples/sec: 5333.89 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-18 20:43:14,512 epoch 5 - iter 1188/1984 - loss 0.20228121 - time (sec): 18.14 - samples/sec: 5324.58 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-18 20:43:17,524 epoch 5 - iter 1386/1984 - loss 0.20335431 - time (sec): 21.15 - samples/sec: 5352.50 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-18 20:43:20,519 epoch 5 - iter 1584/1984 - loss 0.20160468 - time (sec): 24.15 - samples/sec: 5381.60 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-18 20:43:23,580 epoch 5 - iter 1782/1984 - loss 0.20444659 - time (sec): 27.21 - samples/sec: 5400.66 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-18 20:43:26,633 epoch 5 - iter 1980/1984 - loss 0.20797252 - time (sec): 30.26 - samples/sec: 5405.93 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-18 20:43:26,699 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:43:26,699 EPOCH 5 done: loss 0.2080 - lr: 0.000017
+ 2023-10-18 20:43:28,526 DEV : loss 0.16076675057411194 - f1-score (micro avg) 0.4272
+ 2023-10-18 20:43:28,544 saving best model
+ 2023-10-18 20:43:28,580 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:43:31,611 epoch 6 - iter 198/1984 - loss 0.18626775 - time (sec): 3.03 - samples/sec: 5385.04 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-18 20:43:34,762 epoch 6 - iter 396/1984 - loss 0.18358758 - time (sec): 6.18 - samples/sec: 5174.46 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-18 20:43:37,867 epoch 6 - iter 594/1984 - loss 0.18721054 - time (sec): 9.29 - samples/sec: 5277.32 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-18 20:43:40,881 epoch 6 - iter 792/1984 - loss 0.18658538 - time (sec): 12.30 - samples/sec: 5288.77 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-18 20:43:43,940 epoch 6 - iter 990/1984 - loss 0.18950152 - time (sec): 15.36 - samples/sec: 5274.00 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-18 20:43:46,978 epoch 6 - iter 1188/1984 - loss 0.18810016 - time (sec): 18.40 - samples/sec: 5298.65 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-18 20:43:49,811 epoch 6 - iter 1386/1984 - loss 0.19049194 - time (sec): 21.23 - samples/sec: 5340.47 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-18 20:43:52,777 epoch 6 - iter 1584/1984 - loss 0.19125360 - time (sec): 24.20 - samples/sec: 5349.24 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-18 20:43:55,574 epoch 6 - iter 1782/1984 - loss 0.19255913 - time (sec): 26.99 - samples/sec: 5394.96 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-18 20:43:58,643 epoch 6 - iter 1980/1984 - loss 0.19515696 - time (sec): 30.06 - samples/sec: 5444.12 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-18 20:43:58,710 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:43:58,710 EPOCH 6 done: loss 0.1950 - lr: 0.000013
+ 2023-10-18 20:44:00,528 DEV : loss 0.1576894223690033 - f1-score (micro avg) 0.4482
+ 2023-10-18 20:44:00,546 saving best model
+ 2023-10-18 20:44:00,581 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:44:03,606 epoch 7 - iter 198/1984 - loss 0.20041330 - time (sec): 3.02 - samples/sec: 5330.01 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-18 20:44:06,304 epoch 7 - iter 396/1984 - loss 0.19185999 - time (sec): 5.72 - samples/sec: 5607.62 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-18 20:44:09,246 epoch 7 - iter 594/1984 - loss 0.19074333 - time (sec): 8.66 - samples/sec: 5651.53 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-18 20:44:12,277 epoch 7 - iter 792/1984 - loss 0.19616294 - time (sec): 11.70 - samples/sec: 5593.71 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-18 20:44:15,367 epoch 7 - iter 990/1984 - loss 0.19168219 - time (sec): 14.79 - samples/sec: 5536.96 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-18 20:44:18,405 epoch 7 - iter 1188/1984 - loss 0.18962249 - time (sec): 17.82 - samples/sec: 5504.98 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-18 20:44:21,422 epoch 7 - iter 1386/1984 - loss 0.18950612 - time (sec): 20.84 - samples/sec: 5474.31 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-18 20:44:24,452 epoch 7 - iter 1584/1984 - loss 0.18837304 - time (sec): 23.87 - samples/sec: 5449.01 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-18 20:44:27,644 epoch 7 - iter 1782/1984 - loss 0.18498108 - time (sec): 27.06 - samples/sec: 5459.72 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-18 20:44:30,700 epoch 7 - iter 1980/1984 - loss 0.18632929 - time (sec): 30.12 - samples/sec: 5433.74 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-18 20:44:30,761 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:44:30,761 EPOCH 7 done: loss 0.1860 - lr: 0.000010
+ 2023-10-18 20:44:32,576 DEV : loss 0.158553346991539 - f1-score (micro avg) 0.4563
+ 2023-10-18 20:44:32,594 saving best model
+ 2023-10-18 20:44:32,627 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:44:35,383 epoch 8 - iter 198/1984 - loss 0.18600477 - time (sec): 2.75 - samples/sec: 6073.68 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-18 20:44:38,367 epoch 8 - iter 396/1984 - loss 0.17748589 - time (sec): 5.74 - samples/sec: 5766.31 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-18 20:44:41,403 epoch 8 - iter 594/1984 - loss 0.17785171 - time (sec): 8.77 - samples/sec: 5787.16 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-18 20:44:44,402 epoch 8 - iter 792/1984 - loss 0.18108144 - time (sec): 11.77 - samples/sec: 5648.04 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-18 20:44:47,271 epoch 8 - iter 990/1984 - loss 0.17884882 - time (sec): 14.64 - samples/sec: 5596.01 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-18 20:44:50,250 epoch 8 - iter 1188/1984 - loss 0.18579308 - time (sec): 17.62 - samples/sec: 5569.41 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-18 20:44:53,279 epoch 8 - iter 1386/1984 - loss 0.18431354 - time (sec): 20.65 - samples/sec: 5527.58 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-18 20:44:56,298 epoch 8 - iter 1584/1984 - loss 0.18224221 - time (sec): 23.67 - samples/sec: 5503.83 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-18 20:44:59,335 epoch 8 - iter 1782/1984 - loss 0.18145094 - time (sec): 26.71 - samples/sec: 5496.95 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-18 20:45:02,392 epoch 8 - iter 1980/1984 - loss 0.18085147 - time (sec): 29.76 - samples/sec: 5496.75 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-18 20:45:02,463 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:45:02,463 EPOCH 8 done: loss 0.1807 - lr: 0.000007
+ 2023-10-18 20:45:04,686 DEV : loss 0.15737873315811157 - f1-score (micro avg) 0.4634
+ 2023-10-18 20:45:04,704 saving best model
+ 2023-10-18 20:45:04,739 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:45:07,834 epoch 9 - iter 198/1984 - loss 0.17913017 - time (sec): 3.09 - samples/sec: 5424.89 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-18 20:45:10,867 epoch 9 - iter 396/1984 - loss 0.18063398 - time (sec): 6.13 - samples/sec: 5637.56 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-18 20:45:13,885 epoch 9 - iter 594/1984 - loss 0.17485929 - time (sec): 9.14 - samples/sec: 5510.69 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-18 20:45:16,911 epoch 9 - iter 792/1984 - loss 0.17546060 - time (sec): 12.17 - samples/sec: 5414.96 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-18 20:45:19,931 epoch 9 - iter 990/1984 - loss 0.17552482 - time (sec): 15.19 - samples/sec: 5424.88 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-18 20:45:23,005 epoch 9 - iter 1188/1984 - loss 0.18049553 - time (sec): 18.26 - samples/sec: 5403.10 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-18 20:45:26,042 epoch 9 - iter 1386/1984 - loss 0.18155530 - time (sec): 21.30 - samples/sec: 5357.94 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-18 20:45:29,095 epoch 9 - iter 1584/1984 - loss 0.17860347 - time (sec): 24.36 - samples/sec: 5354.22 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-18 20:45:32,172 epoch 9 - iter 1782/1984 - loss 0.17832306 - time (sec): 27.43 - samples/sec: 5365.33 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-18 20:45:35,265 epoch 9 - iter 1980/1984 - loss 0.17617574 - time (sec): 30.53 - samples/sec: 5361.27 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-18 20:45:35,330 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:45:35,330 EPOCH 9 done: loss 0.1760 - lr: 0.000003
+ 2023-10-18 20:45:37,172 DEV : loss 0.15737827122211456 - f1-score (micro avg) 0.4735
+ 2023-10-18 20:45:37,192 saving best model
+ 2023-10-18 20:45:37,227 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:45:40,620 epoch 10 - iter 198/1984 - loss 0.18505314 - time (sec): 3.39 - samples/sec: 4622.04 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-18 20:45:43,687 epoch 10 - iter 396/1984 - loss 0.17359687 - time (sec): 6.46 - samples/sec: 5035.29 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-18 20:45:46,745 epoch 10 - iter 594/1984 - loss 0.17583619 - time (sec): 9.52 - samples/sec: 5147.81 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-18 20:45:49,746 epoch 10 - iter 792/1984 - loss 0.17469714 - time (sec): 12.52 - samples/sec: 5204.51 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-18 20:45:52,506 epoch 10 - iter 990/1984 - loss 0.17268153 - time (sec): 15.28 - samples/sec: 5359.97 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-18 20:45:55,438 epoch 10 - iter 1188/1984 - loss 0.17585525 - time (sec): 18.21 - samples/sec: 5413.48 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-18 20:45:58,487 epoch 10 - iter 1386/1984 - loss 0.17518554 - time (sec): 21.26 - samples/sec: 5415.87 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-18 20:46:01,428 epoch 10 - iter 1584/1984 - loss 0.17473003 - time (sec): 24.20 - samples/sec: 5442.09 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-18 20:46:04,477 epoch 10 - iter 1782/1984 - loss 0.17456587 - time (sec): 27.25 - samples/sec: 5406.90 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-18 20:46:07,538 epoch 10 - iter 1980/1984 - loss 0.17442832 - time (sec): 30.31 - samples/sec: 5402.43 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-18 20:46:07,607 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:46:07,607 EPOCH 10 done: loss 0.1746 - lr: 0.000000
+ 2023-10-18 20:46:09,452 DEV : loss 0.15687035024166107 - f1-score (micro avg) 0.4721
+ 2023-10-18 20:46:09,500 ----------------------------------------------------------------------------------------------------
+ 2023-10-18 20:46:09,501 Loading model from best epoch ...
+ 2023-10-18 20:46:09,584 SequenceTagger predicts: Dictionary with 13 tags: O, S-PER, B-PER, E-PER, I-PER, S-LOC, B-LOC, E-LOC, I-LOC, S-ORG, B-ORG, E-ORG, I-ORG
+ 2023-10-18 20:46:11,130
+ Results:
+ - F-score (micro) 0.5343
+ - F-score (macro) 0.3481
+ - Accuracy 0.4039
+
+ By class:
+                precision    recall  f1-score   support
+
+          LOC      0.6647    0.6779    0.6712       655
+          PER      0.2786    0.5022    0.3584       223
+          ORG      0.1000    0.0079    0.0146       127
+
+    micro avg      0.5157    0.5542    0.5343      1005
+    macro avg      0.3478    0.3960    0.3481      1005
+ weighted avg      0.5077    0.5542    0.5188      1005
+
+ 2023-10-18 20:46:11,130 ----------------------------------------------------------------------------------------------------
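As a quick consistency check on the final test numbers in the log: the micro-average F1 is the harmonic mean of the micro-average precision and recall, and plugging in the reported values reproduces the 0.5343 F-score:

```python
# Verify that the reported micro F1 follows from micro precision/recall.
# Values are copied from the "micro avg" row of the report above.
precision, recall = 0.5157, 0.5542

# Harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # -> 0.5343, matching the reported F-score (micro)
```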