Upload folder using huggingface_hub
- best-model.pt +3 -0
- dev.tsv +0 -0
- loss.tsv +11 -0
- runs/events.out.tfevents.1697666365.46dc0c540dd0.3571.0 +3 -0
- test.tsv +0 -0
- training.log +245 -0
best-model.pt ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:94771ea3904dd6ca5704562710f1189c4b6a616f00556ed0f8859734db10e3b1
+size 19045922
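Since best-model.pt is stored with Git LFS, the diff above shows only the three-line pointer file, not the 19 MB of weights. A minimal sketch of reading such a pointer (the `parse_lfs_pointer` helper is illustrative, not part of any Git LFS tooling):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:94771ea3904dd6ca5704562710f1189c4b6a616f00556ed0f8859734db10e3b1
size 19045922
"""
info = parse_lfs_pointer(pointer)
algo, digest = info["oid"].split(":", 1)
# The actual weights live in LFS storage, addressed by this SHA-256 digest.
print(algo, info["size"])
```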
dev.tsv ADDED
The diff for this file is too large to render. See raw diff.
loss.tsv ADDED
@@ -0,0 +1,11 @@
+EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
+1 21:59:51 0.0000 0.9422 0.3453 0.0000 0.0000 0.0000 0.0000
+2 22:00:17 0.0000 0.2217 0.2471 0.5409 0.1777 0.2675 0.1572
+3 22:00:43 0.0000 0.1851 0.2334 0.6388 0.2211 0.3285 0.1991
+4 22:01:08 0.0000 0.1695 0.2071 0.6017 0.3729 0.4605 0.3075
+5 22:01:34 0.0000 0.1572 0.2033 0.6112 0.3719 0.4624 0.3101
+6 22:02:00 0.0000 0.1497 0.1934 0.5889 0.4277 0.4955 0.3402
+7 22:02:26 0.0000 0.1451 0.2012 0.6035 0.4246 0.4985 0.3414
+8 22:02:52 0.0000 0.1412 0.1880 0.5720 0.4680 0.5148 0.3581
+9 22:03:18 0.0000 0.1361 0.1962 0.6006 0.4380 0.5066 0.3481
+10 22:03:44 0.0000 0.1344 0.1904 0.5875 0.4649 0.5190 0.3617
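loss.tsv records one row of metrics per epoch. A small sketch of selecting the best epoch by dev F1 from the rows shown above (whitespace-splitting is an assumption; the actual file may be tab-separated, which `split()` also handles):

```python
# The contents of loss.tsv, as shown in the diff above.
loss_tsv = """\
EPOCH TIMESTAMP LEARNING_RATE TRAIN_LOSS DEV_LOSS DEV_PRECISION DEV_RECALL DEV_F1 DEV_ACCURACY
1 21:59:51 0.0000 0.9422 0.3453 0.0000 0.0000 0.0000 0.0000
2 22:00:17 0.0000 0.2217 0.2471 0.5409 0.1777 0.2675 0.1572
3 22:00:43 0.0000 0.1851 0.2334 0.6388 0.2211 0.3285 0.1991
4 22:01:08 0.0000 0.1695 0.2071 0.6017 0.3729 0.4605 0.3075
5 22:01:34 0.0000 0.1572 0.2033 0.6112 0.3719 0.4624 0.3101
6 22:02:00 0.0000 0.1497 0.1934 0.5889 0.4277 0.4955 0.3402
7 22:02:26 0.0000 0.1451 0.2012 0.6035 0.4246 0.4985 0.3414
8 22:02:52 0.0000 0.1412 0.1880 0.5720 0.4680 0.5148 0.3581
9 22:03:18 0.0000 0.1361 0.1962 0.6006 0.4380 0.5066 0.3481
10 22:03:44 0.0000 0.1344 0.1904 0.5875 0.4649 0.5190 0.3617
"""

rows = [line.split() for line in loss_tsv.strip().splitlines()]
header, data = rows[0], rows[1:]
f1_idx = header.index("DEV_F1")
best = max(data, key=lambda r: float(r[f1_idx]))
print(best[0], best[f1_idx])  # → 10 0.5190
```

Epoch 10 has the highest dev F1, matching the final "saving best model" entry in training.log.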
runs/events.out.tfevents.1697666365.46dc0c540dd0.3571.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:95cc72a1160d12962a26b9a334449f081d2a0e58138af8254b43ba6bbbe54892
+size 808480
test.tsv ADDED
The diff for this file is too large to render. See raw diff.
training.log ADDED
@@ -0,0 +1,245 @@
+2023-10-18 21:59:25,264 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,264 Model: "SequenceTagger(
+  (embeddings): TransformerWordEmbeddings(
+    (model): BertModel(
+      (embeddings): BertEmbeddings(
+        (word_embeddings): Embedding(32001, 128)
+        (position_embeddings): Embedding(512, 128)
+        (token_type_embeddings): Embedding(2, 128)
+        (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+        (dropout): Dropout(p=0.1, inplace=False)
+      )
+      (encoder): BertEncoder(
+        (layer): ModuleList(
+          (0-1): 2 x BertLayer(
+            (attention): BertAttention(
+              (self): BertSelfAttention(
+                (query): Linear(in_features=128, out_features=128, bias=True)
+                (key): Linear(in_features=128, out_features=128, bias=True)
+                (value): Linear(in_features=128, out_features=128, bias=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+              (output): BertSelfOutput(
+                (dense): Linear(in_features=128, out_features=128, bias=True)
+                (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+                (dropout): Dropout(p=0.1, inplace=False)
+              )
+            )
+            (intermediate): BertIntermediate(
+              (dense): Linear(in_features=128, out_features=512, bias=True)
+              (intermediate_act_fn): GELUActivation()
+            )
+            (output): BertOutput(
+              (dense): Linear(in_features=512, out_features=128, bias=True)
+              (LayerNorm): LayerNorm((128,), eps=1e-12, elementwise_affine=True)
+              (dropout): Dropout(p=0.1, inplace=False)
+            )
+          )
+        )
+      )
+      (pooler): BertPooler(
+        (dense): Linear(in_features=128, out_features=128, bias=True)
+        (activation): Tanh()
+      )
+    )
+  )
+  (locked_dropout): LockedDropout(p=0.5)
+  (linear): Linear(in_features=128, out_features=13, bias=True)
+  (loss_function): CrossEntropyLoss()
+)"
+2023-10-18 21:59:25,264 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,264 MultiCorpus: 5777 train + 722 dev + 723 test sentences
+ - NER_ICDAR_EUROPEANA Corpus: 5777 train + 722 dev + 723 test sentences - /root/.flair/datasets/ner_icdar_europeana/nl
+2023-10-18 21:59:25,264 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,264 Train: 5777 sentences
+2023-10-18 21:59:25,264 (train_with_dev=False, train_with_test=False)
+2023-10-18 21:59:25,264 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,264 Training Params:
+2023-10-18 21:59:25,264 - learning_rate: "3e-05"
+2023-10-18 21:59:25,264 - mini_batch_size: "4"
+2023-10-18 21:59:25,264 - max_epochs: "10"
+2023-10-18 21:59:25,264 - shuffle: "True"
+2023-10-18 21:59:25,265 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,265 Plugins:
+2023-10-18 21:59:25,265 - TensorboardLogger
+2023-10-18 21:59:25,265 - LinearScheduler | warmup_fraction: '0.1'
+2023-10-18 21:59:25,265 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,265 Final evaluation on model from best epoch (best-model.pt)
+2023-10-18 21:59:25,265 - metric: "('micro avg', 'f1-score')"
+2023-10-18 21:59:25,265 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,265 Computation:
+2023-10-18 21:59:25,265 - compute on device: cuda:0
+2023-10-18 21:59:25,265 - embedding storage: none
+2023-10-18 21:59:25,265 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,265 Model training base path: "hmbench-icdar/nl-dbmdz/bert-tiny-historic-multilingual-cased-bs4-wsFalse-e10-lr3e-05-poolingfirst-layers-1-crfFalse-1"
+2023-10-18 21:59:25,265 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,265 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:25,265 Logging anything other than scalars to TensorBoard is currently not supported.
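The log reports a LinearScheduler with warmup_fraction '0.1', and the per-iteration `lr` values below ramp from 3e-06 up to the configured 3e-05 over epoch 1 (1445 of 14450 total steps, i.e. exactly 10%), then decay linearly toward zero. A rough sketch of that schedule, assuming a plain linear warmup/decay formula (not necessarily Flair's exact implementation):

```python
PEAK_LR = 3e-5          # learning_rate from the Training Params above
STEPS_PER_EPOCH = 1445  # iterations per epoch, per the log
TOTAL_STEPS = STEPS_PER_EPOCH * 10
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # warmup_fraction 0.1 → 1445 steps

def lr_at(step: int) -> float:
    """Linear warmup to PEAK_LR, then linear decay to zero."""
    if step < WARMUP_STEPS:
        return PEAK_LR * (step + 1) / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

# Warmup spans exactly the first epoch and ends at the configured 3e-05.
print(WARMUP_STEPS, lr_at(WARMUP_STEPS - 1))
```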
+2023-10-18 21:59:28,161 epoch 1 - iter 144/1445 - loss 3.12033290 - time (sec): 2.90 - samples/sec: 5961.15 - lr: 0.000003 - momentum: 0.000000
+2023-10-18 21:59:30,644 epoch 1 - iter 288/1445 - loss 2.73127250 - time (sec): 5.38 - samples/sec: 6644.09 - lr: 0.000006 - momentum: 0.000000
+2023-10-18 21:59:33,079 epoch 1 - iter 432/1445 - loss 2.29391689 - time (sec): 7.81 - samples/sec: 6836.85 - lr: 0.000009 - momentum: 0.000000
+2023-10-18 21:59:35,481 epoch 1 - iter 576/1445 - loss 1.88020693 - time (sec): 10.22 - samples/sec: 6941.42 - lr: 0.000012 - momentum: 0.000000
+2023-10-18 21:59:37,869 epoch 1 - iter 720/1445 - loss 1.56848815 - time (sec): 12.60 - samples/sec: 7065.83 - lr: 0.000015 - momentum: 0.000000
+2023-10-18 21:59:40,300 epoch 1 - iter 864/1445 - loss 1.35275662 - time (sec): 15.03 - samples/sec: 7127.64 - lr: 0.000018 - momentum: 0.000000
+2023-10-18 21:59:42,672 epoch 1 - iter 1008/1445 - loss 1.20771003 - time (sec): 17.41 - samples/sec: 7171.76 - lr: 0.000021 - momentum: 0.000000
+2023-10-18 21:59:45,014 epoch 1 - iter 1152/1445 - loss 1.09910210 - time (sec): 19.75 - samples/sec: 7206.89 - lr: 0.000024 - momentum: 0.000000
+2023-10-18 21:59:47,428 epoch 1 - iter 1296/1445 - loss 1.01552899 - time (sec): 22.16 - samples/sec: 7156.37 - lr: 0.000027 - momentum: 0.000000
+2023-10-18 21:59:49,785 epoch 1 - iter 1440/1445 - loss 0.94420577 - time (sec): 24.52 - samples/sec: 7157.73 - lr: 0.000030 - momentum: 0.000000
+2023-10-18 21:59:49,886 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:49,886 EPOCH 1 done: loss 0.9422 - lr: 0.000030
+2023-10-18 21:59:51,091 DEV : loss 0.3452602028846741 - f1-score (micro avg) 0.0
+2023-10-18 21:59:51,104 ----------------------------------------------------------------------------------------------------
+2023-10-18 21:59:53,490 epoch 2 - iter 144/1445 - loss 0.24685467 - time (sec): 2.39 - samples/sec: 7798.78 - lr: 0.000030 - momentum: 0.000000
+2023-10-18 21:59:55,937 epoch 2 - iter 288/1445 - loss 0.25413853 - time (sec): 4.83 - samples/sec: 7406.13 - lr: 0.000029 - momentum: 0.000000
+2023-10-18 21:59:58,336 epoch 2 - iter 432/1445 - loss 0.25897934 - time (sec): 7.23 - samples/sec: 7446.18 - lr: 0.000029 - momentum: 0.000000
+2023-10-18 22:00:00,740 epoch 2 - iter 576/1445 - loss 0.24813039 - time (sec): 9.63 - samples/sec: 7436.00 - lr: 0.000029 - momentum: 0.000000
+2023-10-18 22:00:03,129 epoch 2 - iter 720/1445 - loss 0.23727449 - time (sec): 12.02 - samples/sec: 7386.31 - lr: 0.000028 - momentum: 0.000000
+2023-10-18 22:00:05,586 epoch 2 - iter 864/1445 - loss 0.23177537 - time (sec): 14.48 - samples/sec: 7423.74 - lr: 0.000028 - momentum: 0.000000
+2023-10-18 22:00:07,945 epoch 2 - iter 1008/1445 - loss 0.23158786 - time (sec): 16.84 - samples/sec: 7342.57 - lr: 0.000028 - momentum: 0.000000
+2023-10-18 22:00:10,271 epoch 2 - iter 1152/1445 - loss 0.22843822 - time (sec): 19.17 - samples/sec: 7315.10 - lr: 0.000027 - momentum: 0.000000
+2023-10-18 22:00:12,662 epoch 2 - iter 1296/1445 - loss 0.22600116 - time (sec): 21.56 - samples/sec: 7317.55 - lr: 0.000027 - momentum: 0.000000
+2023-10-18 22:00:15,121 epoch 2 - iter 1440/1445 - loss 0.22146316 - time (sec): 24.02 - samples/sec: 7319.46 - lr: 0.000027 - momentum: 0.000000
+2023-10-18 22:00:15,193 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:00:15,193 EPOCH 2 done: loss 0.2217 - lr: 0.000027
+2023-10-18 22:00:17,219 DEV : loss 0.24708838760852814 - f1-score (micro avg) 0.2675
+2023-10-18 22:00:17,232 saving best model
+2023-10-18 22:00:17,269 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:00:19,622 epoch 3 - iter 144/1445 - loss 0.19186803 - time (sec): 2.35 - samples/sec: 7626.06 - lr: 0.000026 - momentum: 0.000000
+2023-10-18 22:00:21,975 epoch 3 - iter 288/1445 - loss 0.19554672 - time (sec): 4.71 - samples/sec: 7385.55 - lr: 0.000026 - momentum: 0.000000
+2023-10-18 22:00:24,368 epoch 3 - iter 432/1445 - loss 0.19188111 - time (sec): 7.10 - samples/sec: 7357.27 - lr: 0.000026 - momentum: 0.000000
+2023-10-18 22:00:26,835 epoch 3 - iter 576/1445 - loss 0.18064718 - time (sec): 9.57 - samples/sec: 7401.41 - lr: 0.000025 - momentum: 0.000000
+2023-10-18 22:00:29,206 epoch 3 - iter 720/1445 - loss 0.18242624 - time (sec): 11.94 - samples/sec: 7368.17 - lr: 0.000025 - momentum: 0.000000
+2023-10-18 22:00:31,615 epoch 3 - iter 864/1445 - loss 0.18199558 - time (sec): 14.35 - samples/sec: 7352.76 - lr: 0.000025 - momentum: 0.000000
+2023-10-18 22:00:33,950 epoch 3 - iter 1008/1445 - loss 0.18404106 - time (sec): 16.68 - samples/sec: 7321.89 - lr: 0.000024 - momentum: 0.000000
+2023-10-18 22:00:36,390 epoch 3 - iter 1152/1445 - loss 0.18658269 - time (sec): 19.12 - samples/sec: 7317.07 - lr: 0.000024 - momentum: 0.000000
+2023-10-18 22:00:38,837 epoch 3 - iter 1296/1445 - loss 0.18394355 - time (sec): 21.57 - samples/sec: 7323.81 - lr: 0.000024 - momentum: 0.000000
+2023-10-18 22:00:41,224 epoch 3 - iter 1440/1445 - loss 0.18521142 - time (sec): 23.95 - samples/sec: 7337.40 - lr: 0.000023 - momentum: 0.000000
+2023-10-18 22:00:41,301 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:00:41,301 EPOCH 3 done: loss 0.1851 - lr: 0.000023
+2023-10-18 22:00:43,027 DEV : loss 0.23336394131183624 - f1-score (micro avg) 0.3285
+2023-10-18 22:00:43,040 saving best model
+2023-10-18 22:00:43,076 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:00:45,225 epoch 4 - iter 144/1445 - loss 0.16142706 - time (sec): 2.15 - samples/sec: 8207.22 - lr: 0.000023 - momentum: 0.000000
+2023-10-18 22:00:47,471 epoch 4 - iter 288/1445 - loss 0.16014748 - time (sec): 4.39 - samples/sec: 7846.45 - lr: 0.000023 - momentum: 0.000000
+2023-10-18 22:00:49,864 epoch 4 - iter 432/1445 - loss 0.16892760 - time (sec): 6.79 - samples/sec: 7717.37 - lr: 0.000022 - momentum: 0.000000
+2023-10-18 22:00:52,300 epoch 4 - iter 576/1445 - loss 0.16482026 - time (sec): 9.22 - samples/sec: 7746.37 - lr: 0.000022 - momentum: 0.000000
+2023-10-18 22:00:54,657 epoch 4 - iter 720/1445 - loss 0.16393038 - time (sec): 11.58 - samples/sec: 7650.53 - lr: 0.000022 - momentum: 0.000000
+2023-10-18 22:00:57,127 epoch 4 - iter 864/1445 - loss 0.16779343 - time (sec): 14.05 - samples/sec: 7605.73 - lr: 0.000021 - momentum: 0.000000
+2023-10-18 22:00:59,537 epoch 4 - iter 1008/1445 - loss 0.16750567 - time (sec): 16.46 - samples/sec: 7594.23 - lr: 0.000021 - momentum: 0.000000
+2023-10-18 22:01:01,939 epoch 4 - iter 1152/1445 - loss 0.16658219 - time (sec): 18.86 - samples/sec: 7547.51 - lr: 0.000021 - momentum: 0.000000
+2023-10-18 22:01:04,279 epoch 4 - iter 1296/1445 - loss 0.16602146 - time (sec): 21.20 - samples/sec: 7487.52 - lr: 0.000020 - momentum: 0.000000
+2023-10-18 22:01:06,702 epoch 4 - iter 1440/1445 - loss 0.16955053 - time (sec): 23.63 - samples/sec: 7436.03 - lr: 0.000020 - momentum: 0.000000
+2023-10-18 22:01:06,782 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:01:06,783 EPOCH 4 done: loss 0.1695 - lr: 0.000020
+2023-10-18 22:01:08,517 DEV : loss 0.20714794099330902 - f1-score (micro avg) 0.4605
+2023-10-18 22:01:08,531 saving best model
+2023-10-18 22:01:08,569 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:01:10,990 epoch 5 - iter 144/1445 - loss 0.17135337 - time (sec): 2.42 - samples/sec: 7517.29 - lr: 0.000020 - momentum: 0.000000
+2023-10-18 22:01:13,359 epoch 5 - iter 288/1445 - loss 0.16640659 - time (sec): 4.79 - samples/sec: 7503.63 - lr: 0.000019 - momentum: 0.000000
+2023-10-18 22:01:15,642 epoch 5 - iter 432/1445 - loss 0.16362156 - time (sec): 7.07 - samples/sec: 7310.40 - lr: 0.000019 - momentum: 0.000000
+2023-10-18 22:01:18,062 epoch 5 - iter 576/1445 - loss 0.16395372 - time (sec): 9.49 - samples/sec: 7193.45 - lr: 0.000019 - momentum: 0.000000
+2023-10-18 22:01:20,548 epoch 5 - iter 720/1445 - loss 0.16211208 - time (sec): 11.98 - samples/sec: 7096.31 - lr: 0.000018 - momentum: 0.000000
+2023-10-18 22:01:22,982 epoch 5 - iter 864/1445 - loss 0.15978012 - time (sec): 14.41 - samples/sec: 7198.61 - lr: 0.000018 - momentum: 0.000000
+2023-10-18 22:01:25,474 epoch 5 - iter 1008/1445 - loss 0.15826501 - time (sec): 16.90 - samples/sec: 7191.27 - lr: 0.000018 - momentum: 0.000000
+2023-10-18 22:01:27,874 epoch 5 - iter 1152/1445 - loss 0.15655713 - time (sec): 19.30 - samples/sec: 7210.39 - lr: 0.000017 - momentum: 0.000000
+2023-10-18 22:01:30,281 epoch 5 - iter 1296/1445 - loss 0.15830919 - time (sec): 21.71 - samples/sec: 7226.71 - lr: 0.000017 - momentum: 0.000000
+2023-10-18 22:01:32,711 epoch 5 - iter 1440/1445 - loss 0.15702601 - time (sec): 24.14 - samples/sec: 7273.75 - lr: 0.000017 - momentum: 0.000000
+2023-10-18 22:01:32,787 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:01:32,787 EPOCH 5 done: loss 0.1572 - lr: 0.000017
+2023-10-18 22:01:34,853 DEV : loss 0.20330089330673218 - f1-score (micro avg) 0.4624
+2023-10-18 22:01:34,867 saving best model
+2023-10-18 22:01:34,902 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:01:37,251 epoch 6 - iter 144/1445 - loss 0.14731954 - time (sec): 2.35 - samples/sec: 7252.73 - lr: 0.000016 - momentum: 0.000000
+2023-10-18 22:01:39,669 epoch 6 - iter 288/1445 - loss 0.14810579 - time (sec): 4.77 - samples/sec: 7204.92 - lr: 0.000016 - momentum: 0.000000
+2023-10-18 22:01:42,095 epoch 6 - iter 432/1445 - loss 0.16055762 - time (sec): 7.19 - samples/sec: 7248.03 - lr: 0.000016 - momentum: 0.000000
+2023-10-18 22:01:44,443 epoch 6 - iter 576/1445 - loss 0.15930872 - time (sec): 9.54 - samples/sec: 7170.84 - lr: 0.000015 - momentum: 0.000000
+2023-10-18 22:01:46,848 epoch 6 - iter 720/1445 - loss 0.15428148 - time (sec): 11.95 - samples/sec: 7244.08 - lr: 0.000015 - momentum: 0.000000
+2023-10-18 22:01:49,304 epoch 6 - iter 864/1445 - loss 0.15117637 - time (sec): 14.40 - samples/sec: 7191.84 - lr: 0.000015 - momentum: 0.000000
+2023-10-18 22:01:51,794 epoch 6 - iter 1008/1445 - loss 0.15230089 - time (sec): 16.89 - samples/sec: 7263.19 - lr: 0.000014 - momentum: 0.000000
+2023-10-18 22:01:54,316 epoch 6 - iter 1152/1445 - loss 0.15092891 - time (sec): 19.41 - samples/sec: 7250.15 - lr: 0.000014 - momentum: 0.000000
+2023-10-18 22:01:56,735 epoch 6 - iter 1296/1445 - loss 0.15173662 - time (sec): 21.83 - samples/sec: 7287.27 - lr: 0.000014 - momentum: 0.000000
+2023-10-18 22:01:59,087 epoch 6 - iter 1440/1445 - loss 0.15020089 - time (sec): 24.18 - samples/sec: 7256.78 - lr: 0.000013 - momentum: 0.000000
+2023-10-18 22:01:59,166 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:01:59,166 EPOCH 6 done: loss 0.1497 - lr: 0.000013
+2023-10-18 22:02:00,908 DEV : loss 0.1934184432029724 - f1-score (micro avg) 0.4955
+2023-10-18 22:02:00,921 saving best model
+2023-10-18 22:02:00,960 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:02:03,379 epoch 7 - iter 144/1445 - loss 0.14798022 - time (sec): 2.42 - samples/sec: 6988.07 - lr: 0.000013 - momentum: 0.000000
+2023-10-18 22:02:05,759 epoch 7 - iter 288/1445 - loss 0.15004788 - time (sec): 4.80 - samples/sec: 7345.00 - lr: 0.000013 - momentum: 0.000000
+2023-10-18 22:02:07,869 epoch 7 - iter 432/1445 - loss 0.14819919 - time (sec): 6.91 - samples/sec: 7619.50 - lr: 0.000012 - momentum: 0.000000
+2023-10-18 22:02:09,992 epoch 7 - iter 576/1445 - loss 0.15004725 - time (sec): 9.03 - samples/sec: 7717.58 - lr: 0.000012 - momentum: 0.000000
+2023-10-18 22:02:12,262 epoch 7 - iter 720/1445 - loss 0.14823649 - time (sec): 11.30 - samples/sec: 7662.44 - lr: 0.000012 - momentum: 0.000000
+2023-10-18 22:02:14,748 epoch 7 - iter 864/1445 - loss 0.14691283 - time (sec): 13.79 - samples/sec: 7636.51 - lr: 0.000011 - momentum: 0.000000
+2023-10-18 22:02:17,084 epoch 7 - iter 1008/1445 - loss 0.14491328 - time (sec): 16.12 - samples/sec: 7604.78 - lr: 0.000011 - momentum: 0.000000
+2023-10-18 22:02:19,390 epoch 7 - iter 1152/1445 - loss 0.14664592 - time (sec): 18.43 - samples/sec: 7537.25 - lr: 0.000011 - momentum: 0.000000
+2023-10-18 22:02:21,922 epoch 7 - iter 1296/1445 - loss 0.14689743 - time (sec): 20.96 - samples/sec: 7496.75 - lr: 0.000010 - momentum: 0.000000
+2023-10-18 22:02:24,541 epoch 7 - iter 1440/1445 - loss 0.14542815 - time (sec): 23.58 - samples/sec: 7446.12 - lr: 0.000010 - momentum: 0.000000
+2023-10-18 22:02:24,621 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:02:24,621 EPOCH 7 done: loss 0.1451 - lr: 0.000010
+2023-10-18 22:02:26,376 DEV : loss 0.2011541873216629 - f1-score (micro avg) 0.4985
+2023-10-18 22:02:26,390 saving best model
+2023-10-18 22:02:26,429 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:02:28,755 epoch 8 - iter 144/1445 - loss 0.13892356 - time (sec): 2.33 - samples/sec: 6931.81 - lr: 0.000010 - momentum: 0.000000
+2023-10-18 22:02:31,157 epoch 8 - iter 288/1445 - loss 0.16270806 - time (sec): 4.73 - samples/sec: 7224.99 - lr: 0.000009 - momentum: 0.000000
+2023-10-18 22:02:33,593 epoch 8 - iter 432/1445 - loss 0.14932146 - time (sec): 7.16 - samples/sec: 7397.91 - lr: 0.000009 - momentum: 0.000000
+2023-10-18 22:02:35,992 epoch 8 - iter 576/1445 - loss 0.14416102 - time (sec): 9.56 - samples/sec: 7367.11 - lr: 0.000009 - momentum: 0.000000
+2023-10-18 22:02:38,357 epoch 8 - iter 720/1445 - loss 0.14309548 - time (sec): 11.93 - samples/sec: 7394.65 - lr: 0.000008 - momentum: 0.000000
+2023-10-18 22:02:40,780 epoch 8 - iter 864/1445 - loss 0.13966555 - time (sec): 14.35 - samples/sec: 7431.66 - lr: 0.000008 - momentum: 0.000000
+2023-10-18 22:02:42,922 epoch 8 - iter 1008/1445 - loss 0.13959740 - time (sec): 16.49 - samples/sec: 7470.49 - lr: 0.000008 - momentum: 0.000000
+2023-10-18 22:02:45,311 epoch 8 - iter 1152/1445 - loss 0.13878179 - time (sec): 18.88 - samples/sec: 7479.51 - lr: 0.000007 - momentum: 0.000000
+2023-10-18 22:02:47,754 epoch 8 - iter 1296/1445 - loss 0.13924233 - time (sec): 21.32 - samples/sec: 7414.70 - lr: 0.000007 - momentum: 0.000000
+2023-10-18 22:02:50,270 epoch 8 - iter 1440/1445 - loss 0.14136705 - time (sec): 23.84 - samples/sec: 7374.61 - lr: 0.000007 - momentum: 0.000000
+2023-10-18 22:02:50,344 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:02:50,345 EPOCH 8 done: loss 0.1412 - lr: 0.000007
+2023-10-18 22:02:52,412 DEV : loss 0.18803337216377258 - f1-score (micro avg) 0.5148
+2023-10-18 22:02:52,425 saving best model
+2023-10-18 22:02:52,462 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:02:54,997 epoch 9 - iter 144/1445 - loss 0.12702014 - time (sec): 2.53 - samples/sec: 7692.24 - lr: 0.000006 - momentum: 0.000000
+2023-10-18 22:02:57,360 epoch 9 - iter 288/1445 - loss 0.12009535 - time (sec): 4.90 - samples/sec: 7506.16 - lr: 0.000006 - momentum: 0.000000
+2023-10-18 22:02:59,745 epoch 9 - iter 432/1445 - loss 0.12369073 - time (sec): 7.28 - samples/sec: 7346.44 - lr: 0.000006 - momentum: 0.000000
+2023-10-18 22:03:02,129 epoch 9 - iter 576/1445 - loss 0.13008475 - time (sec): 9.67 - samples/sec: 7319.34 - lr: 0.000005 - momentum: 0.000000
+2023-10-18 22:03:04,591 epoch 9 - iter 720/1445 - loss 0.13416094 - time (sec): 12.13 - samples/sec: 7290.53 - lr: 0.000005 - momentum: 0.000000
+2023-10-18 22:03:07,023 epoch 9 - iter 864/1445 - loss 0.13690020 - time (sec): 14.56 - samples/sec: 7193.92 - lr: 0.000005 - momentum: 0.000000
+2023-10-18 22:03:09,360 epoch 9 - iter 1008/1445 - loss 0.13729848 - time (sec): 16.90 - samples/sec: 7163.42 - lr: 0.000004 - momentum: 0.000000
+2023-10-18 22:03:11,816 epoch 9 - iter 1152/1445 - loss 0.13654477 - time (sec): 19.35 - samples/sec: 7275.35 - lr: 0.000004 - momentum: 0.000000
+2023-10-18 22:03:14,290 epoch 9 - iter 1296/1445 - loss 0.13733817 - time (sec): 21.83 - samples/sec: 7259.12 - lr: 0.000004 - momentum: 0.000000
+2023-10-18 22:03:16,677 epoch 9 - iter 1440/1445 - loss 0.13616533 - time (sec): 24.21 - samples/sec: 7251.88 - lr: 0.000003 - momentum: 0.000000
+2023-10-18 22:03:16,759 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:03:16,759 EPOCH 9 done: loss 0.1361 - lr: 0.000003
+2023-10-18 22:03:18,515 DEV : loss 0.19624893367290497 - f1-score (micro avg) 0.5066
+2023-10-18 22:03:18,528 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:03:20,907 epoch 10 - iter 144/1445 - loss 0.11586494 - time (sec): 2.38 - samples/sec: 7185.66 - lr: 0.000003 - momentum: 0.000000
+2023-10-18 22:03:23,303 epoch 10 - iter 288/1445 - loss 0.13433802 - time (sec): 4.77 - samples/sec: 7102.63 - lr: 0.000003 - momentum: 0.000000
+2023-10-18 22:03:25,719 epoch 10 - iter 432/1445 - loss 0.13187069 - time (sec): 7.19 - samples/sec: 7258.19 - lr: 0.000002 - momentum: 0.000000
+2023-10-18 22:03:28,127 epoch 10 - iter 576/1445 - loss 0.13364114 - time (sec): 9.60 - samples/sec: 7234.28 - lr: 0.000002 - momentum: 0.000000
+2023-10-18 22:03:30,575 epoch 10 - iter 720/1445 - loss 0.13965295 - time (sec): 12.05 - samples/sec: 7334.22 - lr: 0.000002 - momentum: 0.000000
+2023-10-18 22:03:32,991 epoch 10 - iter 864/1445 - loss 0.13925682 - time (sec): 14.46 - samples/sec: 7289.69 - lr: 0.000001 - momentum: 0.000000
+2023-10-18 22:03:35,380 epoch 10 - iter 1008/1445 - loss 0.13652005 - time (sec): 16.85 - samples/sec: 7336.21 - lr: 0.000001 - momentum: 0.000000
+2023-10-18 22:03:37,739 epoch 10 - iter 1152/1445 - loss 0.13504195 - time (sec): 19.21 - samples/sec: 7338.11 - lr: 0.000001 - momentum: 0.000000
+2023-10-18 22:03:40,156 epoch 10 - iter 1296/1445 - loss 0.13339496 - time (sec): 21.63 - samples/sec: 7293.80 - lr: 0.000000 - momentum: 0.000000
+2023-10-18 22:03:42,560 epoch 10 - iter 1440/1445 - loss 0.13422751 - time (sec): 24.03 - samples/sec: 7307.44 - lr: 0.000000 - momentum: 0.000000
+2023-10-18 22:03:42,644 ----------------------------------------------------------------------------------------------------
+2023-10-18 22:03:42,644 EPOCH 10 done: loss 0.1344 - lr: 0.000000
+2023-10-18 22:03:44,403 DEV : loss 0.19043943285942078 - f1-score (micro avg) 0.519
+2023-10-18 22:03:44,417 saving best model
+2023-10-18 22:03:44,484 ----------------------------------------------------------------------------------------------------
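Every per-iteration line above follows the same fixed pattern, so the numeric fields can be pulled out mechanically. A small sketch with a regex written from inspection of the log lines (the pattern is an assumption, not something Flair provides):

```python
import re

# Matches lines like:
# "... epoch 10 - iter 1440/1445 - loss 0.13422751 - time (sec): 24.03 - samples/sec: 7307.44 - lr: 0.000000 ..."
LINE_RE = re.compile(
    r"epoch (?P<epoch>\d+) - iter (?P<iter>\d+)/(?P<total>\d+)"
    r" - loss (?P<loss>[\d.]+) - time \(sec\): (?P<time>[\d.]+)"
    r" - samples/sec: (?P<sps>[\d.]+) - lr: (?P<lr>[\d.]+)"
)

line = ("2023-10-18 22:03:42,560 epoch 10 - iter 1440/1445 - loss 0.13422751 "
        "- time (sec): 24.03 - samples/sec: 7307.44 - lr: 0.000000 - momentum: 0.000000")
m = LINE_RE.search(line)
print(m["epoch"], m["iter"], m["loss"])  # → 10 1440 0.13422751
```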
+2023-10-18 22:03:44,484 Loading model from best epoch ...
+2023-10-18 22:03:44,562 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG
+2023-10-18 22:03:45,832
+Results:
+- F-score (micro) 0.5212
+- F-score (macro) 0.3655
+- Accuracy 0.3643
+
+By class:
+              precision    recall  f1-score   support
+
+         LOC     0.5700    0.6397    0.6029       458
+         PER     0.5911    0.3838    0.4654       482
+         ORG     0.5000    0.0145    0.0282        69
+
+   micro avg     0.5778    0.4747    0.5212      1009
+   macro avg     0.5537    0.3460    0.3655      1009
+weighted avg     0.5753    0.4747    0.4979      1009
+
+2023-10-18 22:03:45,832 ----------------------------------------------------------------------------------------------------
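The averages in the final table follow directly from the per-class numbers: micro F1 is the harmonic mean of the reported micro precision and recall, and macro F1 is the unweighted mean of the three per-class F1 scores. A quick arithmetic check:

```python
# Per-class (precision, recall, f1) from the final evaluation table above.
per_class = {
    "LOC": (0.5700, 0.6397, 0.6029),
    "PER": (0.5911, 0.3838, 0.4654),
    "ORG": (0.5000, 0.0145, 0.0282),
}

# Micro F1 from the reported micro-average precision and recall.
p, r = 0.5778, 0.4747
micro_f1 = 2 * p * r / (p + r)

# Macro F1: unweighted mean of the per-class F1 scores.
macro_f1 = sum(f1 for _, _, f1 in per_class.values()) / len(per_class)

print(round(micro_f1, 4), round(macro_f1, 4))  # → 0.5212 0.3655
```

Both match the reported F-scores, and ORG's near-zero recall (0.0145 over only 69 support) explains the large gap between the micro (0.5212) and macro (0.3655) averages.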