2023-10-13 09:09:28,796 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,797 Model: "SequenceTagger(
(embeddings): TransformerWordEmbeddings(
(model): BertModel(
(embeddings): BertEmbeddings(
(word_embeddings): Embedding(32001, 768)
(position_embeddings): Embedding(512, 768)
(token_type_embeddings): Embedding(2, 768)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(encoder): BertEncoder(
(layer): ModuleList(
(0-11): 12 x BertLayer(
(attention): BertAttention(
(self): BertSelfAttention(
(query): Linear(in_features=768, out_features=768, bias=True)
(key): Linear(in_features=768, out_features=768, bias=True)
(value): Linear(in_features=768, out_features=768, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
)
(output): BertSelfOutput(
(dense): Linear(in_features=768, out_features=768, bias=True)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
(intermediate): BertIntermediate(
(dense): Linear(in_features=768, out_features=3072, bias=True)
(intermediate_act_fn): GELUActivation()
)
(output): BertOutput(
(dense): Linear(in_features=3072, out_features=768, bias=True)
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
(dropout): Dropout(p=0.1, inplace=False)
)
)
)
)
(pooler): BertPooler(
(dense): Linear(in_features=768, out_features=768, bias=True)
(activation): Tanh()
)
)
)
(locked_dropout): LockedDropout(p=0.5)
(linear): Linear(in_features=768, out_features=25, bias=True)
(loss_function): CrossEntropyLoss()
)"
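The dimensions printed in the model summary above (vocab 32001, hidden size 768, 12 layers, intermediate size 3072, 512 positions, 2 token types) are enough to estimate the encoder's parameter count. A minimal sketch of that arithmetic, assuming the standard BERT layout shown in the dump:

```python
# Rough parameter count for the BertModel printed in the log above,
# using only the dimensions visible in the module dump.
HIDDEN, LAYERS, INTER, VOCAB, POS, TYPES = 768, 12, 3072, 32001, 512, 2

def bert_param_count():
    # embedding tables plus their LayerNorm (weight + bias)
    emb = (VOCAB + POS + TYPES) * HIDDEN + 2 * HIDDEN
    # per layer: Q/K/V/output projections (weight + bias each)
    attn = 4 * (HIDDEN * HIDDEN + HIDDEN)
    # feed-forward up- and down-projection
    ffn = HIDDEN * INTER + INTER + INTER * HIDDEN + HIDDEN
    # two LayerNorms per layer (weight + bias each)
    norms = 2 * 2 * HIDDEN
    pooler = HIDDEN * HIDDEN + HIDDEN
    return emb + LAYERS * (attn + ffn + norms) + pooler

print(f"{bert_param_count() / 1e6:.1f}M parameters")  # ~110.6M
```

This lands at roughly 110.6M parameters, the usual BERT-base size with the slightly enlarged 32001-entry vocabulary.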
2023-10-13 09:09:28,797 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,797 MultiCorpus: 1214 train + 266 dev + 251 test sentences
- NER_HIPE_2022 Corpus: 1214 train + 266 dev + 251 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/ajmc/en/with_doc_seperator
2023-10-13 09:09:28,797 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,797 Train: 1214 sentences
2023-10-13 09:09:28,797 (train_with_dev=False, train_with_test=False)
2023-10-13 09:09:28,797 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,797 Training Params:
2023-10-13 09:09:28,797 - learning_rate: "5e-05"
2023-10-13 09:09:28,797 - mini_batch_size: "8"
2023-10-13 09:09:28,797 - max_epochs: "10"
2023-10-13 09:09:28,798 - shuffle: "True"
2023-10-13 09:09:28,798 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,798 Plugins:
2023-10-13 09:09:28,798 - LinearScheduler | warmup_fraction: '0.1'
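The LinearScheduler with warmup_fraction 0.1 explains the learning-rate column in the epoch logs below: 1214 sentences at batch size 8 give 152 iterations per epoch, so over 10 epochs (1520 steps) the first ~152 steps ramp the rate from 0 toward 5e-05, then it decays linearly to 0. A minimal sketch of that shape (an illustrative approximation, not Flair's actual plugin code):

```python
# Linear warmup + linear decay, matching the lr values in the log:
# lr rises to peak_lr over the first 10% of steps, then decays to 0.
def linear_schedule_lr(step, total_steps=1520, peak_lr=5e-5, warmup_fraction=0.1):
    warmup_steps = int(total_steps * warmup_fraction)  # 152 here
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

# Late in epoch 1 (step 150) the rate is still warming up, as the log shows:
print(linear_schedule_lr(150))   # ~0.000049
print(linear_schedule_lr(1520))  # 0.0 at the final step
```

This matches the logged values: lr ≈ 0.000049 at iter 150 of epoch 1 and 0.000000 at the end of epoch 10.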
2023-10-13 09:09:28,798 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,798 Final evaluation on model from best epoch (best-model.pt)
2023-10-13 09:09:28,798 - metric: "('micro avg', 'f1-score')"
2023-10-13 09:09:28,798 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,798 Computation:
2023-10-13 09:09:28,798 - compute on device: cuda:0
2023-10-13 09:09:28,798 - embedding storage: none
2023-10-13 09:09:28,798 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,798 Model training base path: "hmbench-ajmc/en-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
2023-10-13 09:09:28,798 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:28,798 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:29,632 epoch 1 - iter 15/152 - loss 3.35979202 - time (sec): 0.83 - samples/sec: 3885.17 - lr: 0.000005 - momentum: 0.000000
2023-10-13 09:09:30,482 epoch 1 - iter 30/152 - loss 2.97523619 - time (sec): 1.68 - samples/sec: 3790.39 - lr: 0.000010 - momentum: 0.000000
2023-10-13 09:09:31,333 epoch 1 - iter 45/152 - loss 2.29707337 - time (sec): 2.53 - samples/sec: 3837.10 - lr: 0.000014 - momentum: 0.000000
2023-10-13 09:09:32,190 epoch 1 - iter 60/152 - loss 1.92992957 - time (sec): 3.39 - samples/sec: 3791.47 - lr: 0.000019 - momentum: 0.000000
2023-10-13 09:09:33,021 epoch 1 - iter 75/152 - loss 1.69701974 - time (sec): 4.22 - samples/sec: 3765.82 - lr: 0.000024 - momentum: 0.000000
2023-10-13 09:09:33,837 epoch 1 - iter 90/152 - loss 1.54581509 - time (sec): 5.04 - samples/sec: 3701.96 - lr: 0.000029 - momentum: 0.000000
2023-10-13 09:09:34,706 epoch 1 - iter 105/152 - loss 1.38877361 - time (sec): 5.91 - samples/sec: 3708.40 - lr: 0.000034 - momentum: 0.000000
2023-10-13 09:09:35,506 epoch 1 - iter 120/152 - loss 1.27387563 - time (sec): 6.71 - samples/sec: 3693.97 - lr: 0.000039 - momentum: 0.000000
2023-10-13 09:09:36,357 epoch 1 - iter 135/152 - loss 1.18331584 - time (sec): 7.56 - samples/sec: 3648.17 - lr: 0.000044 - momentum: 0.000000
2023-10-13 09:09:37,187 epoch 1 - iter 150/152 - loss 1.09949240 - time (sec): 8.39 - samples/sec: 3651.21 - lr: 0.000049 - momentum: 0.000000
2023-10-13 09:09:37,298 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:37,298 EPOCH 1 done: loss 1.0907 - lr: 0.000049
2023-10-13 09:09:38,168 DEV : loss 0.2957800626754761 - f1-score (micro avg) 0.4885
2023-10-13 09:09:38,175 saving best model
2023-10-13 09:09:38,524 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:39,376 epoch 2 - iter 15/152 - loss 0.25465290 - time (sec): 0.85 - samples/sec: 3632.43 - lr: 0.000049 - momentum: 0.000000
2023-10-13 09:09:40,210 epoch 2 - iter 30/152 - loss 0.27293924 - time (sec): 1.68 - samples/sec: 3555.08 - lr: 0.000049 - momentum: 0.000000
2023-10-13 09:09:41,115 epoch 2 - iter 45/152 - loss 0.23308303 - time (sec): 2.59 - samples/sec: 3561.21 - lr: 0.000048 - momentum: 0.000000
2023-10-13 09:09:41,917 epoch 2 - iter 60/152 - loss 0.22215108 - time (sec): 3.39 - samples/sec: 3559.73 - lr: 0.000048 - momentum: 0.000000
2023-10-13 09:09:42,795 epoch 2 - iter 75/152 - loss 0.20497190 - time (sec): 4.27 - samples/sec: 3604.23 - lr: 0.000047 - momentum: 0.000000
2023-10-13 09:09:43,633 epoch 2 - iter 90/152 - loss 0.19515449 - time (sec): 5.11 - samples/sec: 3623.53 - lr: 0.000047 - momentum: 0.000000
2023-10-13 09:09:44,478 epoch 2 - iter 105/152 - loss 0.19299294 - time (sec): 5.95 - samples/sec: 3663.87 - lr: 0.000046 - momentum: 0.000000
2023-10-13 09:09:45,305 epoch 2 - iter 120/152 - loss 0.19413229 - time (sec): 6.78 - samples/sec: 3646.02 - lr: 0.000046 - momentum: 0.000000
2023-10-13 09:09:46,095 epoch 2 - iter 135/152 - loss 0.18956413 - time (sec): 7.57 - samples/sec: 3636.33 - lr: 0.000045 - momentum: 0.000000
2023-10-13 09:09:46,946 epoch 2 - iter 150/152 - loss 0.17975047 - time (sec): 8.42 - samples/sec: 3650.04 - lr: 0.000045 - momentum: 0.000000
2023-10-13 09:09:47,040 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:47,040 EPOCH 2 done: loss 0.1793 - lr: 0.000045
2023-10-13 09:09:47,982 DEV : loss 0.16208764910697937 - f1-score (micro avg) 0.7581
2023-10-13 09:09:47,989 saving best model
2023-10-13 09:09:48,446 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:49,253 epoch 3 - iter 15/152 - loss 0.06399186 - time (sec): 0.80 - samples/sec: 3747.15 - lr: 0.000044 - momentum: 0.000000
2023-10-13 09:09:50,103 epoch 3 - iter 30/152 - loss 0.08228387 - time (sec): 1.65 - samples/sec: 3672.65 - lr: 0.000043 - momentum: 0.000000
2023-10-13 09:09:50,905 epoch 3 - iter 45/152 - loss 0.08997427 - time (sec): 2.45 - samples/sec: 3675.16 - lr: 0.000043 - momentum: 0.000000
2023-10-13 09:09:51,750 epoch 3 - iter 60/152 - loss 0.08902809 - time (sec): 3.30 - samples/sec: 3663.19 - lr: 0.000042 - momentum: 0.000000
2023-10-13 09:09:52,573 epoch 3 - iter 75/152 - loss 0.10342704 - time (sec): 4.12 - samples/sec: 3634.37 - lr: 0.000042 - momentum: 0.000000
2023-10-13 09:09:53,381 epoch 3 - iter 90/152 - loss 0.09897941 - time (sec): 4.93 - samples/sec: 3661.69 - lr: 0.000041 - momentum: 0.000000
2023-10-13 09:09:54,270 epoch 3 - iter 105/152 - loss 0.09691788 - time (sec): 5.82 - samples/sec: 3700.27 - lr: 0.000041 - momentum: 0.000000
2023-10-13 09:09:55,068 epoch 3 - iter 120/152 - loss 0.09665511 - time (sec): 6.62 - samples/sec: 3724.59 - lr: 0.000040 - momentum: 0.000000
2023-10-13 09:09:55,902 epoch 3 - iter 135/152 - loss 0.09222420 - time (sec): 7.45 - samples/sec: 3691.20 - lr: 0.000040 - momentum: 0.000000
2023-10-13 09:09:56,765 epoch 3 - iter 150/152 - loss 0.08979963 - time (sec): 8.31 - samples/sec: 3689.83 - lr: 0.000039 - momentum: 0.000000
2023-10-13 09:09:56,870 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:56,870 EPOCH 3 done: loss 0.0916 - lr: 0.000039
2023-10-13 09:09:57,771 DEV : loss 0.16738569736480713 - f1-score (micro avg) 0.7862
2023-10-13 09:09:57,777 saving best model
2023-10-13 09:09:58,238 ----------------------------------------------------------------------------------------------------
2023-10-13 09:09:59,074 epoch 4 - iter 15/152 - loss 0.02563417 - time (sec): 0.84 - samples/sec: 3606.35 - lr: 0.000038 - momentum: 0.000000
2023-10-13 09:09:59,951 epoch 4 - iter 30/152 - loss 0.06874663 - time (sec): 1.71 - samples/sec: 3579.22 - lr: 0.000038 - momentum: 0.000000
2023-10-13 09:10:00,832 epoch 4 - iter 45/152 - loss 0.06275029 - time (sec): 2.59 - samples/sec: 3550.94 - lr: 0.000037 - momentum: 0.000000
2023-10-13 09:10:01,669 epoch 4 - iter 60/152 - loss 0.06394701 - time (sec): 3.43 - samples/sec: 3551.45 - lr: 0.000037 - momentum: 0.000000
2023-10-13 09:10:02,520 epoch 4 - iter 75/152 - loss 0.06405276 - time (sec): 4.28 - samples/sec: 3525.09 - lr: 0.000036 - momentum: 0.000000
2023-10-13 09:10:03,394 epoch 4 - iter 90/152 - loss 0.06270428 - time (sec): 5.15 - samples/sec: 3536.04 - lr: 0.000036 - momentum: 0.000000
2023-10-13 09:10:04,202 epoch 4 - iter 105/152 - loss 0.05895619 - time (sec): 5.96 - samples/sec: 3563.71 - lr: 0.000035 - momentum: 0.000000
2023-10-13 09:10:05,036 epoch 4 - iter 120/152 - loss 0.05814145 - time (sec): 6.80 - samples/sec: 3543.97 - lr: 0.000035 - momentum: 0.000000
2023-10-13 09:10:05,908 epoch 4 - iter 135/152 - loss 0.05680622 - time (sec): 7.67 - samples/sec: 3582.90 - lr: 0.000034 - momentum: 0.000000
2023-10-13 09:10:06,773 epoch 4 - iter 150/152 - loss 0.06201324 - time (sec): 8.53 - samples/sec: 3586.91 - lr: 0.000034 - momentum: 0.000000
2023-10-13 09:10:06,883 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:06,883 EPOCH 4 done: loss 0.0617 - lr: 0.000034
2023-10-13 09:10:07,810 DEV : loss 0.16315597295761108 - f1-score (micro avg) 0.82
2023-10-13 09:10:07,817 saving best model
2023-10-13 09:10:08,277 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:09,174 epoch 5 - iter 15/152 - loss 0.05221666 - time (sec): 0.90 - samples/sec: 3724.79 - lr: 0.000033 - momentum: 0.000000
2023-10-13 09:10:10,024 epoch 5 - iter 30/152 - loss 0.04555180 - time (sec): 1.74 - samples/sec: 3537.54 - lr: 0.000032 - momentum: 0.000000
2023-10-13 09:10:10,897 epoch 5 - iter 45/152 - loss 0.04165064 - time (sec): 2.62 - samples/sec: 3617.27 - lr: 0.000032 - momentum: 0.000000
2023-10-13 09:10:11,721 epoch 5 - iter 60/152 - loss 0.03870084 - time (sec): 3.44 - samples/sec: 3651.29 - lr: 0.000031 - momentum: 0.000000
2023-10-13 09:10:12,558 epoch 5 - iter 75/152 - loss 0.03764787 - time (sec): 4.28 - samples/sec: 3630.18 - lr: 0.000031 - momentum: 0.000000
2023-10-13 09:10:13,354 epoch 5 - iter 90/152 - loss 0.03734983 - time (sec): 5.08 - samples/sec: 3677.86 - lr: 0.000030 - momentum: 0.000000
2023-10-13 09:10:14,177 epoch 5 - iter 105/152 - loss 0.03763953 - time (sec): 5.90 - samples/sec: 3665.26 - lr: 0.000030 - momentum: 0.000000
2023-10-13 09:10:15,039 epoch 5 - iter 120/152 - loss 0.03592508 - time (sec): 6.76 - samples/sec: 3695.25 - lr: 0.000029 - momentum: 0.000000
2023-10-13 09:10:15,875 epoch 5 - iter 135/152 - loss 0.03675536 - time (sec): 7.60 - samples/sec: 3667.50 - lr: 0.000029 - momentum: 0.000000
2023-10-13 09:10:16,730 epoch 5 - iter 150/152 - loss 0.04242836 - time (sec): 8.45 - samples/sec: 3632.81 - lr: 0.000028 - momentum: 0.000000
2023-10-13 09:10:16,834 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:16,834 EPOCH 5 done: loss 0.0422 - lr: 0.000028
2023-10-13 09:10:17,750 DEV : loss 0.17263589799404144 - f1-score (micro avg) 0.8308
2023-10-13 09:10:17,757 saving best model
2023-10-13 09:10:18,129 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:18,994 epoch 6 - iter 15/152 - loss 0.02172828 - time (sec): 0.86 - samples/sec: 3865.69 - lr: 0.000027 - momentum: 0.000000
2023-10-13 09:10:19,809 epoch 6 - iter 30/152 - loss 0.02300721 - time (sec): 1.68 - samples/sec: 3623.76 - lr: 0.000027 - momentum: 0.000000
2023-10-13 09:10:20,636 epoch 6 - iter 45/152 - loss 0.02518072 - time (sec): 2.51 - samples/sec: 3665.85 - lr: 0.000026 - momentum: 0.000000
2023-10-13 09:10:21,501 epoch 6 - iter 60/152 - loss 0.02702690 - time (sec): 3.37 - samples/sec: 3762.29 - lr: 0.000026 - momentum: 0.000000
2023-10-13 09:10:22,310 epoch 6 - iter 75/152 - loss 0.02831834 - time (sec): 4.18 - samples/sec: 3723.15 - lr: 0.000025 - momentum: 0.000000
2023-10-13 09:10:23,111 epoch 6 - iter 90/152 - loss 0.03230845 - time (sec): 4.98 - samples/sec: 3723.02 - lr: 0.000025 - momentum: 0.000000
2023-10-13 09:10:23,909 epoch 6 - iter 105/152 - loss 0.03020072 - time (sec): 5.78 - samples/sec: 3670.96 - lr: 0.000024 - momentum: 0.000000
2023-10-13 09:10:24,741 epoch 6 - iter 120/152 - loss 0.02981504 - time (sec): 6.61 - samples/sec: 3686.13 - lr: 0.000024 - momentum: 0.000000
2023-10-13 09:10:25,603 epoch 6 - iter 135/152 - loss 0.03223052 - time (sec): 7.47 - samples/sec: 3690.85 - lr: 0.000023 - momentum: 0.000000
2023-10-13 09:10:26,441 epoch 6 - iter 150/152 - loss 0.03309127 - time (sec): 8.31 - samples/sec: 3681.50 - lr: 0.000022 - momentum: 0.000000
2023-10-13 09:10:26,539 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:26,539 EPOCH 6 done: loss 0.0331 - lr: 0.000022
2023-10-13 09:10:27,448 DEV : loss 0.1962728649377823 - f1-score (micro avg) 0.8504
2023-10-13 09:10:27,454 saving best model
2023-10-13 09:10:27,909 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:28,733 epoch 7 - iter 15/152 - loss 0.01559430 - time (sec): 0.82 - samples/sec: 3429.20 - lr: 0.000022 - momentum: 0.000000
2023-10-13 09:10:29,621 epoch 7 - iter 30/152 - loss 0.01791985 - time (sec): 1.71 - samples/sec: 3440.95 - lr: 0.000021 - momentum: 0.000000
2023-10-13 09:10:30,471 epoch 7 - iter 45/152 - loss 0.02935294 - time (sec): 2.56 - samples/sec: 3477.22 - lr: 0.000021 - momentum: 0.000000
2023-10-13 09:10:31,361 epoch 7 - iter 60/152 - loss 0.02590223 - time (sec): 3.45 - samples/sec: 3591.98 - lr: 0.000020 - momentum: 0.000000
2023-10-13 09:10:32,191 epoch 7 - iter 75/152 - loss 0.02328655 - time (sec): 4.28 - samples/sec: 3549.12 - lr: 0.000020 - momentum: 0.000000
2023-10-13 09:10:33,094 epoch 7 - iter 90/152 - loss 0.02249471 - time (sec): 5.18 - samples/sec: 3547.43 - lr: 0.000019 - momentum: 0.000000
2023-10-13 09:10:33,922 epoch 7 - iter 105/152 - loss 0.02199476 - time (sec): 6.01 - samples/sec: 3517.66 - lr: 0.000019 - momentum: 0.000000
2023-10-13 09:10:34,750 epoch 7 - iter 120/152 - loss 0.02557431 - time (sec): 6.84 - samples/sec: 3529.03 - lr: 0.000018 - momentum: 0.000000
2023-10-13 09:10:35,604 epoch 7 - iter 135/152 - loss 0.02552682 - time (sec): 7.69 - samples/sec: 3549.49 - lr: 0.000017 - momentum: 0.000000
2023-10-13 09:10:36,424 epoch 7 - iter 150/152 - loss 0.02449646 - time (sec): 8.51 - samples/sec: 3592.22 - lr: 0.000017 - momentum: 0.000000
2023-10-13 09:10:36,555 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:36,555 EPOCH 7 done: loss 0.0242 - lr: 0.000017
2023-10-13 09:10:37,477 DEV : loss 0.20093460381031036 - f1-score (micro avg) 0.8281
2023-10-13 09:10:37,483 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:38,286 epoch 8 - iter 15/152 - loss 0.01442852 - time (sec): 0.80 - samples/sec: 3629.98 - lr: 0.000016 - momentum: 0.000000
2023-10-13 09:10:39,150 epoch 8 - iter 30/152 - loss 0.01078923 - time (sec): 1.67 - samples/sec: 3511.76 - lr: 0.000016 - momentum: 0.000000
2023-10-13 09:10:39,985 epoch 8 - iter 45/152 - loss 0.00982792 - time (sec): 2.50 - samples/sec: 3586.61 - lr: 0.000015 - momentum: 0.000000
2023-10-13 09:10:40,829 epoch 8 - iter 60/152 - loss 0.01492448 - time (sec): 3.34 - samples/sec: 3619.99 - lr: 0.000015 - momentum: 0.000000
2023-10-13 09:10:41,687 epoch 8 - iter 75/152 - loss 0.01430336 - time (sec): 4.20 - samples/sec: 3641.00 - lr: 0.000014 - momentum: 0.000000
2023-10-13 09:10:42,500 epoch 8 - iter 90/152 - loss 0.01323106 - time (sec): 5.02 - samples/sec: 3603.54 - lr: 0.000014 - momentum: 0.000000
2023-10-13 09:10:43,329 epoch 8 - iter 105/152 - loss 0.01352657 - time (sec): 5.84 - samples/sec: 3627.39 - lr: 0.000013 - momentum: 0.000000
2023-10-13 09:10:44,252 epoch 8 - iter 120/152 - loss 0.01221659 - time (sec): 6.77 - samples/sec: 3600.36 - lr: 0.000012 - momentum: 0.000000
2023-10-13 09:10:45,121 epoch 8 - iter 135/152 - loss 0.01867664 - time (sec): 7.64 - samples/sec: 3624.95 - lr: 0.000012 - momentum: 0.000000
2023-10-13 09:10:45,982 epoch 8 - iter 150/152 - loss 0.02019338 - time (sec): 8.50 - samples/sec: 3607.96 - lr: 0.000011 - momentum: 0.000000
2023-10-13 09:10:46,079 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:46,079 EPOCH 8 done: loss 0.0200 - lr: 0.000011
2023-10-13 09:10:47,024 DEV : loss 0.20136715471744537 - f1-score (micro avg) 0.8422
2023-10-13 09:10:47,030 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:47,824 epoch 9 - iter 15/152 - loss 0.01472138 - time (sec): 0.79 - samples/sec: 4058.82 - lr: 0.000011 - momentum: 0.000000
2023-10-13 09:10:48,637 epoch 9 - iter 30/152 - loss 0.00849322 - time (sec): 1.61 - samples/sec: 3879.77 - lr: 0.000010 - momentum: 0.000000
2023-10-13 09:10:49,466 epoch 9 - iter 45/152 - loss 0.00695289 - time (sec): 2.44 - samples/sec: 3752.05 - lr: 0.000010 - momentum: 0.000000
2023-10-13 09:10:50,336 epoch 9 - iter 60/152 - loss 0.01413121 - time (sec): 3.30 - samples/sec: 3756.35 - lr: 0.000009 - momentum: 0.000000
2023-10-13 09:10:51,210 epoch 9 - iter 75/152 - loss 0.01376909 - time (sec): 4.18 - samples/sec: 3734.93 - lr: 0.000009 - momentum: 0.000000
2023-10-13 09:10:52,016 epoch 9 - iter 90/152 - loss 0.01288286 - time (sec): 4.99 - samples/sec: 3725.09 - lr: 0.000008 - momentum: 0.000000
2023-10-13 09:10:52,844 epoch 9 - iter 105/152 - loss 0.01332195 - time (sec): 5.81 - samples/sec: 3697.20 - lr: 0.000007 - momentum: 0.000000
2023-10-13 09:10:53,783 epoch 9 - iter 120/152 - loss 0.01263024 - time (sec): 6.75 - samples/sec: 3672.97 - lr: 0.000007 - momentum: 0.000000
2023-10-13 09:10:54,623 epoch 9 - iter 135/152 - loss 0.01344767 - time (sec): 7.59 - samples/sec: 3652.73 - lr: 0.000006 - momentum: 0.000000
2023-10-13 09:10:55,530 epoch 9 - iter 150/152 - loss 0.01364812 - time (sec): 8.50 - samples/sec: 3617.04 - lr: 0.000006 - momentum: 0.000000
2023-10-13 09:10:55,635 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:55,636 EPOCH 9 done: loss 0.0135 - lr: 0.000006
2023-10-13 09:10:56,599 DEV : loss 0.20513303577899933 - f1-score (micro avg) 0.8498
2023-10-13 09:10:56,606 ----------------------------------------------------------------------------------------------------
2023-10-13 09:10:57,463 epoch 10 - iter 15/152 - loss 0.00583894 - time (sec): 0.86 - samples/sec: 3656.04 - lr: 0.000005 - momentum: 0.000000
2023-10-13 09:10:58,338 epoch 10 - iter 30/152 - loss 0.00314485 - time (sec): 1.73 - samples/sec: 3570.93 - lr: 0.000005 - momentum: 0.000000
2023-10-13 09:10:59,163 epoch 10 - iter 45/152 - loss 0.00790455 - time (sec): 2.56 - samples/sec: 3575.98 - lr: 0.000004 - momentum: 0.000000
2023-10-13 09:10:59,994 epoch 10 - iter 60/152 - loss 0.00688247 - time (sec): 3.39 - samples/sec: 3556.84 - lr: 0.000004 - momentum: 0.000000
2023-10-13 09:11:00,831 epoch 10 - iter 75/152 - loss 0.00751489 - time (sec): 4.22 - samples/sec: 3548.25 - lr: 0.000003 - momentum: 0.000000
2023-10-13 09:11:01,692 epoch 10 - iter 90/152 - loss 0.00759445 - time (sec): 5.09 - samples/sec: 3575.24 - lr: 0.000003 - momentum: 0.000000
2023-10-13 09:11:02,555 epoch 10 - iter 105/152 - loss 0.00852360 - time (sec): 5.95 - samples/sec: 3592.63 - lr: 0.000002 - momentum: 0.000000
2023-10-13 09:11:03,425 epoch 10 - iter 120/152 - loss 0.00871039 - time (sec): 6.82 - samples/sec: 3587.89 - lr: 0.000001 - momentum: 0.000000
2023-10-13 09:11:04,315 epoch 10 - iter 135/152 - loss 0.00995946 - time (sec): 7.71 - samples/sec: 3566.60 - lr: 0.000001 - momentum: 0.000000
2023-10-13 09:11:05,188 epoch 10 - iter 150/152 - loss 0.01069417 - time (sec): 8.58 - samples/sec: 3563.27 - lr: 0.000000 - momentum: 0.000000
2023-10-13 09:11:05,294 ----------------------------------------------------------------------------------------------------
2023-10-13 09:11:05,294 EPOCH 10 done: loss 0.0107 - lr: 0.000000
2023-10-13 09:11:06,294 DEV : loss 0.20549806952476501 - f1-score (micro avg) 0.8451
2023-10-13 09:11:06,660 ----------------------------------------------------------------------------------------------------
2023-10-13 09:11:06,661 Loading model from best epoch ...
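The "saving best model" lines above implement a simple rule: after each epoch, checkpoint only when the dev micro-F1 improves, then reload that checkpoint for the final evaluation. A minimal sketch of that selection, using the dev scores copied from this log:

```python
# Pick the epoch with the best dev score, as the trainer does above.
def best_epoch(dev_scores):
    """Return (1-based epoch, score) of the best dev score."""
    return max(enumerate(dev_scores, start=1), key=lambda pair: pair[1])

# Dev micro-F1 per epoch, copied from the log:
scores = [0.4885, 0.7581, 0.7862, 0.82, 0.8308, 0.8504, 0.8281, 0.8422, 0.8498, 0.8451]
print(best_epoch(scores))  # (6, 0.8504) -- epoch 6 is what gets reloaded
```

This is why epochs 7–9 print no "saving best model" line even though epoch 9 scores 0.8498: it never beats epoch 6's 0.8504.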
2023-10-13 09:11:08,009 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-date, B-date, E-date, I-date, S-object, B-object, E-object, I-object
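The 25-tag dictionary above is a BIOES tagset: O plus S-/B-/E-/I- prefixes for each of the six entity types (scope, pers, work, loc, date, object). S- marks a single-token entity; B-/I-/E- mark the begin, inside, and end of a multi-token span. A minimal decoder sketch (not Flair's internal implementation):

```python
# Decode a BIOES tag sequence into (start, end, label) spans.
def bioes_to_spans(tags):
    spans, start = [], None
    for i, tag in enumerate(tags):
        prefix, _, label = tag.partition("-")
        if prefix == "S":                 # single-token entity
            spans.append((i, i, label))
        elif prefix == "B":               # span opens
            start = i
        elif prefix == "E" and start is not None:
            spans.append((start, i, label))  # span closes
            start = None
        elif prefix != "I":               # "O" (or malformed tag) resets
            start = None
    return spans

print(bioes_to_spans(["O", "B-pers", "I-pers", "E-pers", "S-work"]))
# [(1, 3, 'pers'), (4, 4, 'work')]
```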
2023-10-13 09:11:08,869
Results:
- F-score (micro) 0.781
- F-score (macro) 0.476
- Accuracy 0.6553
By class:
              precision    recall  f1-score   support

       scope     0.7186    0.7947    0.7547       151
        work     0.7207    0.8421    0.7767        95
        pers     0.7982    0.9062    0.8488        96
         loc     0.0000    0.0000    0.0000         3
        date     0.0000    0.0000    0.0000         3

   micro avg     0.7416    0.8247    0.7810       348
   macro avg     0.4475    0.5086    0.4760       348
weighted avg     0.7287    0.8247    0.7737       348
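The gap between micro F1 (0.781) and macro F1 (0.476) comes from the two classes with only 3 gold mentions each (loc, date) scoring 0: macro averaging weights them equally with the frequent classes, micro averaging pools all tokens. A sketch reconstructing both from the table, where true-positive and predicted counts are recovered from recall × support and tp / precision (an assumption consistent with the rounded figures):

```python
# class: (true_positives, predicted, support) -- derived from the table above;
# loc/date predicted counts taken as 0 since their precision is 0.0000
# and the micro precision only balances that way.
counts = {
    "scope": (120, 167, 151),
    "work":  (80, 111, 95),
    "pers":  (87, 109, 96),
    "loc":   (0, 0, 3),
    "date":  (0, 0, 3),
}

def f1(p, r):
    return 2 * p * r / (p + r) if p + r else 0.0

tp = sum(c[0] for c in counts.values())      # 287
pred = sum(c[1] for c in counts.values())    # 387
gold = sum(c[2] for c in counts.values())    # 348
micro_f1 = f1(tp / pred, tp / gold)          # pools all mentions

macro_f1 = sum(                              # unweighted mean over classes
    f1(c[0] / c[1] if c[1] else 0.0, c[0] / c[2]) for c in counts.values()
) / len(counts)

print(f"micro {micro_f1:.3f}  macro {macro_f1:.3f}")  # micro 0.781  macro 0.476
```

The two zero-scoring rare classes pull the macro average down by nearly a third while barely moving the micro average, since they contribute only 6 of 348 gold mentions.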
2023-10-13 09:11:08,869 ----------------------------------------------------------------------------------------------------