2023-10-15 22:27:17,165 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,166 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): BertModel(
      (embeddings): BertEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): BertEncoder(
        (layer): ModuleList(
          (0-11): 12 x BertLayer(
            (attention): BertAttention(
              (self): BertSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): BertSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): BertIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): BertOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
      (pooler): BertPooler(
        (dense): Linear(in_features=768, out_features=768, bias=True)
        (activation): Tanh()
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=17, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-15 22:27:17,166 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,167 MultiCorpus: 20847 train + 1123 dev + 3350 test sentences
 - NER_HIPE_2022 Corpus: 20847 train + 1123 dev + 3350 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/newseye/de/with_doc_seperator
2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,167 Train:  20847 sentences
2023-10-15 22:27:17,167         (train_with_dev=False, train_with_test=False)
2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,167 Training Params:
2023-10-15 22:27:17,167  - learning_rate: "5e-05"
2023-10-15 22:27:17,167  - mini_batch_size: "8"
2023-10-15 22:27:17,167  - max_epochs: "10"
2023-10-15 22:27:17,167  - shuffle: "True"
2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,167 Plugins:
2023-10-15 22:27:17,167  - LinearScheduler | warmup_fraction: '0.1'
2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,167 Final evaluation on model from best epoch (best-model.pt)
2023-10-15 22:27:17,167  - metric: "('micro avg', 'f1-score')"
2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,167 Computation:
2023-10-15 22:27:17,167  - compute on device: cuda:0
2023-10-15 22:27:17,167  - embedding storage: none
2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,167 Model training base path: "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5"
2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:17,167 ----------------------------------------------------------------------------------------------------
2023-10-15 22:27:35,255 epoch 1 - iter 260/2606 - loss 1.55162857 - time (sec): 18.09 - samples/sec: 1967.99 - lr: 0.000005 - momentum: 0.000000
2023-10-15 22:27:54,274 epoch 1 - iter 520/2606 - loss 0.96412232 - time (sec): 37.11 - samples/sec: 1964.22 - lr: 0.000010 - momentum: 0.000000
2023-10-15 22:28:12,959 epoch 1 - iter 780/2606 - loss 0.73562218 - time (sec): 55.79 - samples/sec: 1943.27 - lr: 0.000015 - momentum: 0.000000
2023-10-15 22:28:31,659 epoch 1 - iter 1040/2606 - loss 0.61663296 - time (sec): 74.49 - samples/sec: 1932.82 - lr: 0.000020 - momentum: 0.000000
2023-10-15 22:28:50,994 epoch 1 - iter 1300/2606 - loss 0.53611543 - time (sec): 93.83 - samples/sec: 1924.35 - lr: 0.000025 - momentum: 0.000000
2023-10-15 22:29:09,993 epoch 1 - iter 1560/2606 - loss 0.47717725 - time (sec): 112.83 - samples/sec: 1932.36 - lr: 0.000030 - momentum: 0.000000
2023-10-15 22:29:28,212 epoch 1 - iter 1820/2606 - loss 0.43977687 - time (sec): 131.04 - samples/sec: 1946.37 - lr: 0.000035 - momentum: 0.000000
2023-10-15 22:29:47,039 epoch 1 - iter 2080/2606 - loss 0.40953012 - time (sec): 149.87 - samples/sec: 1941.48 - lr: 0.000040 - momentum: 0.000000
2023-10-15 22:30:05,821 epoch 1 - iter 2340/2606 - loss 0.38758796 - time (sec): 168.65 - samples/sec: 1935.49 - lr: 0.000045 - momentum: 0.000000
2023-10-15 22:30:25,851 epoch 1 - iter 2600/2606 - loss 0.36513327 - time (sec): 188.68 - samples/sec: 1941.95 - lr: 0.000050 - momentum: 0.000000
2023-10-15 22:30:26,350 ----------------------------------------------------------------------------------------------------
2023-10-15 22:30:26,350 EPOCH 1 done: loss 0.3646 - lr: 0.000050
2023-10-15 22:30:33,057 DEV : loss 0.12843316793441772 - f1-score (micro avg) 0.316
2023-10-15 22:30:33,086 saving best model
2023-10-15 22:30:33,466 ----------------------------------------------------------------------------------------------------
2023-10-15 22:30:53,080 epoch 2 - iter 260/2606 - loss 0.16328381 - time (sec): 19.61 - samples/sec: 1977.33 - lr: 0.000049 - momentum: 0.000000
2023-10-15 22:31:11,726 epoch 2 - iter 520/2606 - loss 0.15786652 - time (sec): 38.26 - samples/sec: 1960.14 - lr: 0.000049 - momentum: 0.000000
2023-10-15 22:31:30,523 epoch 2 - iter 780/2606 - loss 0.15096452 - time (sec): 57.06 - samples/sec: 1953.36 - lr: 0.000048 - momentum: 0.000000
2023-10-15 22:31:49,690 epoch 2 - iter 1040/2606 - loss 0.15162785 - time (sec): 76.22 - samples/sec: 1951.95 - lr: 0.000048 - momentum: 0.000000
2023-10-15 22:32:08,501 epoch 2 - iter 1300/2606 - loss 0.15521001 - time (sec): 95.03 - samples/sec: 1945.47 - lr: 0.000047 - momentum: 0.000000
2023-10-15 22:32:27,507 epoch 2 - iter 1560/2606 - loss 0.15206366 - time (sec): 114.04 - samples/sec: 1949.84 - lr: 0.000047 - momentum: 0.000000
2023-10-15 22:32:45,650 epoch 2 - iter 1820/2606 - loss 0.15321996 - time (sec): 132.18 - samples/sec: 1952.09 - lr: 0.000046 - momentum: 0.000000
2023-10-15 22:33:05,338 epoch 2 - iter 2080/2606 - loss 0.15196432 - time (sec): 151.87 - samples/sec: 1955.77 - lr: 0.000046 - momentum: 0.000000
2023-10-15 22:33:22,997 epoch 2 - iter 2340/2606 - loss 0.15199041 - time (sec): 169.53 - samples/sec: 1948.27 - lr: 0.000045 - momentum: 0.000000
2023-10-15 22:33:41,251 epoch 2 - iter 2600/2606 - loss 0.15242040 - time (sec): 187.78 - samples/sec: 1953.00 - lr: 0.000044 - momentum: 0.000000
2023-10-15 22:33:41,598 ----------------------------------------------------------------------------------------------------
2023-10-15 22:33:41,598 EPOCH 2 done: loss 0.1525 - lr: 0.000044
2023-10-15 22:33:49,877 DEV : loss 0.15268105268478394 - f1-score (micro avg) 0.3282
2023-10-15 22:33:49,905 saving best model
2023-10-15 22:33:51,198 ----------------------------------------------------------------------------------------------------
2023-10-15 22:34:09,869 epoch 3 - iter 260/2606 - loss 0.13841022 - time (sec): 18.67 - samples/sec: 1935.44 - lr: 0.000044 - momentum: 0.000000
2023-10-15 22:34:27,325 epoch 3 - iter 520/2606 - loss 0.12007742 - time (sec): 36.12 - samples/sec: 1904.26 - lr: 0.000043 - momentum: 0.000000
2023-10-15 22:34:45,146 epoch 3 - iter 780/2606 - loss 0.11917281 - time (sec): 53.94 - samples/sec: 1908.77 - lr: 0.000043 - momentum: 0.000000
2023-10-15 22:35:03,249 epoch 3 - iter 1040/2606 - loss 0.11929376 - time (sec): 72.05 - samples/sec: 1927.33 - lr: 0.000042 - momentum: 0.000000
2023-10-15 22:35:22,165 epoch 3 - iter 1300/2606 - loss 0.11370851 - time (sec): 90.96 - samples/sec: 1935.39 - lr: 0.000042 - momentum: 0.000000
2023-10-15 22:35:41,375 epoch 3 - iter 1560/2606 - loss 0.11243711 - time (sec): 110.17 - samples/sec: 1934.73 - lr: 0.000041 - momentum: 0.000000
2023-10-15 22:36:00,749 epoch 3 - iter 1820/2606 - loss 0.11154229 - time (sec): 129.55 - samples/sec: 1938.83 - lr: 0.000041 - momentum: 0.000000
2023-10-15 22:36:19,809 epoch 3 - iter 2080/2606 - loss 0.11088653 - time (sec): 148.61 - samples/sec: 1934.20 - lr: 0.000040 - momentum: 0.000000
2023-10-15 22:36:39,165 epoch 3 - iter 2340/2606 - loss 0.11020741 - time (sec): 167.96 - samples/sec: 1942.87 - lr: 0.000039 - momentum: 0.000000
2023-10-15 22:36:59,231 epoch 3 - iter 2600/2606 - loss 0.10875412 - time (sec): 188.03 - samples/sec: 1951.27 - lr: 0.000039 - momentum: 0.000000
2023-10-15 22:36:59,592 ----------------------------------------------------------------------------------------------------
2023-10-15 22:36:59,592 EPOCH 3 done: loss 0.1089 - lr: 0.000039
2023-10-15 22:37:07,822 DEV : loss 0.1973780244588852 - f1-score (micro avg) 0.3449
2023-10-15 22:37:07,851 saving best model
2023-10-15 22:37:08,459 ----------------------------------------------------------------------------------------------------
2023-10-15 22:37:26,324 epoch 4 - iter 260/2606 - loss 0.06966027 - time (sec): 17.86 - samples/sec: 1961.16 - lr: 0.000038 - momentum: 0.000000
2023-10-15 22:37:44,366 epoch 4 - iter 520/2606 - loss 0.07834214 - time (sec): 35.91 - samples/sec: 1997.72 - lr: 0.000038 - momentum: 0.000000
2023-10-15 22:38:04,320 epoch 4 - iter 780/2606 - loss 0.07562032 - time (sec): 55.86 - samples/sec: 1973.69 - lr: 0.000037 - momentum: 0.000000
2023-10-15 22:38:24,283 epoch 4 - iter 1040/2606 - loss 0.07734676 - time (sec): 75.82 - samples/sec: 1973.03 - lr: 0.000037 - momentum: 0.000000
2023-10-15 22:38:43,354 epoch 4 - iter 1300/2606 - loss 0.07949064 - time (sec): 94.89 - samples/sec: 1971.18 - lr: 0.000036 - momentum: 0.000000
2023-10-15 22:39:01,902 epoch 4 - iter 1560/2606 - loss 0.08030660 - time (sec): 113.44 - samples/sec: 1963.18 - lr: 0.000036 - momentum: 0.000000
2023-10-15 22:39:21,242 epoch 4 - iter 1820/2606 - loss 0.07936732 - time (sec): 132.78 - samples/sec: 1949.28 - lr: 0.000035 - momentum: 0.000000
2023-10-15 22:39:40,653 epoch 4 - iter 2080/2606 - loss 0.07794236 - time (sec): 152.19 - samples/sec: 1948.31 - lr: 0.000034 - momentum: 0.000000
2023-10-15 22:39:58,840 epoch 4 - iter 2340/2606 - loss 0.07686992 - time (sec): 170.38 - samples/sec: 1942.92 - lr: 0.000034 - momentum: 0.000000
2023-10-15 22:40:17,812 epoch 4 - iter 2600/2606 - loss 0.07640796 - time (sec): 189.35 - samples/sec: 1936.72 - lr: 0.000033 - momentum: 0.000000
2023-10-15 22:40:18,202 ----------------------------------------------------------------------------------------------------
2023-10-15 22:40:18,203 EPOCH 4 done: loss 0.0764 - lr: 0.000033
2023-10-15 22:40:26,731 DEV : loss 0.2757483124732971 - f1-score (micro avg) 0.3322
2023-10-15 22:40:26,762 ----------------------------------------------------------------------------------------------------
2023-10-15 22:40:46,807 epoch 5 - iter 260/2606 - loss 0.05178851 - time (sec): 20.04 - samples/sec: 1921.49 - lr: 0.000033 - momentum: 0.000000
2023-10-15 22:41:05,169 epoch 5 - iter 520/2606 - loss 0.05717423 - time (sec): 38.41 - samples/sec: 1904.14 - lr: 0.000032 - momentum: 0.000000
2023-10-15 22:41:23,526 epoch 5 - iter 780/2606 - loss 0.05742065 - time (sec): 56.76 - samples/sec: 1913.71 - lr: 0.000032 - momentum: 0.000000
2023-10-15 22:41:43,051 epoch 5 - iter 1040/2606 - loss 0.05696528 - time (sec): 76.29 - samples/sec: 1931.89 - lr: 0.000031 - momentum: 0.000000
2023-10-15 22:42:03,807 epoch 5 - iter 1300/2606 - loss 0.05634791 - time (sec): 97.04 - samples/sec: 1912.72 - lr: 0.000031 - momentum: 0.000000
2023-10-15 22:42:22,326 epoch 5 - iter 1560/2606 - loss 0.05592339 - time (sec): 115.56 - samples/sec: 1914.46 - lr: 0.000030 - momentum: 0.000000
2023-10-15 22:42:41,680 epoch 5 - iter 1820/2606 - loss 0.05667760 - time (sec): 134.92 - samples/sec: 1899.98 - lr: 0.000029 - momentum: 0.000000
2023-10-15 22:43:01,527 epoch 5 - iter 2080/2606 - loss 0.05618848 - time (sec): 154.76 - samples/sec: 1902.76 - lr: 0.000029 - momentum: 0.000000
2023-10-15 22:43:19,937 epoch 5 - iter 2340/2606 - loss 0.05550520 - time (sec): 173.17 - samples/sec: 1908.88 - lr: 0.000028 - momentum: 0.000000
2023-10-15 22:43:38,771 epoch 5 - iter 2600/2606 - loss 0.05582007 - time (sec): 192.01 - samples/sec: 1909.87 - lr: 0.000028 - momentum: 0.000000
2023-10-15 22:43:39,185 ----------------------------------------------------------------------------------------------------
2023-10-15 22:43:39,185 EPOCH 5 done: loss 0.0558 - lr: 0.000028
2023-10-15 22:43:47,656 DEV : loss 0.2588236331939697 - f1-score (micro avg) 0.3881
2023-10-15 22:43:47,689 saving best model
2023-10-15 22:43:48,321 ----------------------------------------------------------------------------------------------------
2023-10-15 22:44:07,112 epoch 6 - iter 260/2606 - loss 0.03581917 - time (sec): 18.79 - samples/sec: 1939.80 - lr: 0.000027 - momentum: 0.000000
2023-10-15 22:44:25,901 epoch 6 - iter 520/2606 - loss 0.03857227 - time (sec): 37.58 - samples/sec: 1955.95 - lr: 0.000027 - momentum: 0.000000
2023-10-15 22:44:45,197 epoch 6 - iter 780/2606 - loss 0.03718849 - time (sec): 56.87 - samples/sec: 1938.69 - lr: 0.000026 - momentum: 0.000000
2023-10-15 22:45:04,355 epoch 6 - iter 1040/2606 - loss 0.03795267 - time (sec): 76.03 - samples/sec: 1941.79 - lr: 0.000026 - momentum: 0.000000
2023-10-15 22:45:22,754 epoch 6 - iter 1300/2606 - loss 0.03927938 - time (sec): 94.43 - samples/sec: 1939.52 - lr: 0.000025 - momentum: 0.000000
2023-10-15 22:45:41,969 epoch 6 - iter 1560/2606 - loss 0.04094059 - time (sec): 113.64 - samples/sec: 1940.09 - lr: 0.000024 - momentum: 0.000000
2023-10-15 22:46:02,357 epoch 6 - iter 1820/2606 - loss 0.04042786 - time (sec): 134.03 - samples/sec: 1930.67 - lr: 0.000024 - momentum: 0.000000
2023-10-15 22:46:21,777 epoch 6 - iter 2080/2606 - loss 0.04023301 - time (sec): 153.45 - samples/sec: 1932.81 - lr: 0.000023 - momentum: 0.000000
2023-10-15 22:46:41,120 epoch 6 - iter 2340/2606 - loss 0.04049483 - time (sec): 172.80 - samples/sec: 1926.06 - lr: 0.000023 - momentum: 0.000000
2023-10-15 22:46:59,172 epoch 6 - iter 2600/2606 - loss 0.04034539 - time (sec): 190.85 - samples/sec: 1918.45 - lr: 0.000022 - momentum: 0.000000
2023-10-15 22:46:59,704 ----------------------------------------------------------------------------------------------------
2023-10-15 22:46:59,704 EPOCH 6 done: loss 0.0404 - lr: 0.000022
2023-10-15 22:47:07,922 DEV : loss 0.32239729166030884 - f1-score (micro avg) 0.372
2023-10-15 22:47:07,949 ----------------------------------------------------------------------------------------------------
2023-10-15 22:47:25,980 epoch 7 - iter 260/2606 - loss 0.03480989 - time (sec): 18.03 - samples/sec: 1933.48 - lr: 0.000022 - momentum: 0.000000
2023-10-15 22:47:44,510 epoch 7 - iter 520/2606 - loss 0.02997614 - time (sec): 36.56 - samples/sec: 1954.61 - lr: 0.000021 - momentum: 0.000000
2023-10-15 22:48:04,165 epoch 7 - iter 780/2606 - loss 0.03233103 - time (sec): 56.21 - samples/sec: 1925.02 - lr: 0.000021 - momentum: 0.000000
2023-10-15 22:48:24,145 epoch 7 - iter 1040/2606 - loss 0.03295900 - time (sec): 76.19 - samples/sec: 1934.66 - lr: 0.000020 - momentum: 0.000000
2023-10-15 22:48:43,108 epoch 7 - iter 1300/2606 - loss 0.03274939 - time (sec): 95.16 - samples/sec: 1926.80 - lr: 0.000019 - momentum: 0.000000
2023-10-15 22:49:02,094 epoch 7 - iter 1560/2606 - loss 0.03398439 - time (sec): 114.14 - samples/sec: 1930.83 - lr: 0.000019 - momentum: 0.000000
2023-10-15 22:49:21,082 epoch 7 - iter 1820/2606 - loss 0.03293724 - time (sec): 133.13 - samples/sec: 1940.00 - lr: 0.000018 - momentum: 0.000000
2023-10-15 22:49:40,117 epoch 7 - iter 2080/2606 - loss 0.03190338 - time (sec): 152.17 - samples/sec: 1943.14 - lr: 0.000018 - momentum: 0.000000
2023-10-15 22:49:58,402 epoch 7 - iter 2340/2606 - loss 0.03087084 - time (sec): 170.45 - samples/sec: 1932.03 - lr: 0.000017 - momentum: 0.000000
2023-10-15 22:50:18,500 epoch 7 - iter 2600/2606 - loss 0.03081740 - time (sec): 190.55 - samples/sec: 1922.45 - lr: 0.000017 - momentum: 0.000000
2023-10-15 22:50:19,063 ----------------------------------------------------------------------------------------------------
2023-10-15 22:50:19,063 EPOCH 7 done: loss 0.0308 - lr: 0.000017
2023-10-15 22:50:27,425 DEV : loss 0.4393313229084015 - f1-score (micro avg) 0.3524
2023-10-15 22:50:27,461 ----------------------------------------------------------------------------------------------------
2023-10-15 22:50:47,488 epoch 8 - iter 260/2606 - loss 0.02062903 - time (sec): 20.03 - samples/sec: 1948.84 - lr: 0.000016 - momentum: 0.000000
2023-10-15 22:51:06,776 epoch 8 - iter 520/2606 - loss 0.02059864 - time (sec): 39.31 - samples/sec: 1957.39 - lr: 0.000016 - momentum: 0.000000
2023-10-15 22:51:25,601 epoch 8 - iter 780/2606 - loss 0.02243684 - time (sec): 58.14 - samples/sec: 1942.91 - lr: 0.000015 - momentum: 0.000000
2023-10-15 22:51:44,843 epoch 8 - iter 1040/2606 - loss 0.02218227 - time (sec): 77.38 - samples/sec: 1932.78 - lr: 0.000014 - momentum: 0.000000
2023-10-15 22:52:04,003 epoch 8 - iter 1300/2606 - loss 0.02168277 - time (sec): 96.54 - samples/sec: 1928.59 - lr: 0.000014 - momentum: 0.000000
2023-10-15 22:52:23,397 epoch 8 - iter 1560/2606 - loss 0.02361754 - time (sec): 115.93 - samples/sec: 1929.53 - lr: 0.000013 - momentum: 0.000000
2023-10-15 22:52:41,555 epoch 8 - iter 1820/2606 - loss 0.02366463 - time (sec): 134.09 - samples/sec: 1934.57 - lr: 0.000013 - momentum: 0.000000
2023-10-15 22:53:00,142 epoch 8 - iter 2080/2606 - loss 0.02356939 - time (sec): 152.68 - samples/sec: 1934.05 - lr: 0.000012 - momentum: 0.000000
2023-10-15 22:53:18,836 epoch 8 - iter 2340/2606 - loss 0.02340726 - time (sec): 171.37 - samples/sec: 1924.57 - lr: 0.000012 - momentum: 0.000000
2023-10-15 22:53:37,933 epoch 8 - iter 2600/2606 - loss 0.02280304 - time (sec): 190.47 - samples/sec: 1926.00 - lr: 0.000011 - momentum: 0.000000
2023-10-15 22:53:38,313 ----------------------------------------------------------------------------------------------------
2023-10-15 22:53:38,313 EPOCH 8 done: loss 0.0228 - lr: 0.000011
2023-10-15 22:53:47,439 DEV : loss 0.4546719491481781 - f1-score (micro avg) 0.3671
2023-10-15 22:53:47,466 ----------------------------------------------------------------------------------------------------
2023-10-15 22:54:06,558 epoch 9 - iter 260/2606 - loss 0.01469418 - time (sec): 19.09 - samples/sec: 1971.09 - lr: 0.000011 - momentum: 0.000000
2023-10-15 22:54:24,684 epoch 9 - iter 520/2606 - loss 0.01564107 - time (sec): 37.22 - samples/sec: 1948.48 - lr: 0.000010 - momentum: 0.000000
2023-10-15 22:54:43,192 epoch 9 - iter 780/2606 - loss 0.01644586 - time (sec): 55.72 - samples/sec: 1925.64 - lr: 0.000009 - momentum: 0.000000
2023-10-15 22:55:01,624 epoch 9 - iter 1040/2606 - loss 0.01628689 - time (sec): 74.16 - samples/sec: 1925.25 - lr: 0.000009 - momentum: 0.000000
2023-10-15 22:55:21,440 epoch 9 - iter 1300/2606 - loss 0.01534667 - time (sec): 93.97 - samples/sec: 1929.47 - lr: 0.000008 - momentum: 0.000000
2023-10-15 22:55:40,355 epoch 9 - iter 1560/2606 - loss 0.01439162 - time (sec): 112.89 - samples/sec: 1935.81 - lr: 0.000008 - momentum: 0.000000
2023-10-15 22:55:59,468 epoch 9 - iter 1820/2606 - loss 0.01446621 - time (sec): 132.00 - samples/sec: 1933.75 - lr: 0.000007 - momentum: 0.000000
2023-10-15 22:56:18,466 epoch 9 - iter 2080/2606 - loss 0.01388150 - time (sec): 151.00 - samples/sec: 1938.34 - lr: 0.000007 - momentum: 0.000000
2023-10-15 22:56:38,019 epoch 9 - iter 2340/2606 - loss 0.01387827 - time (sec): 170.55 - samples/sec: 1938.36 - lr: 0.000006 - momentum: 0.000000
2023-10-15 22:56:56,829 epoch 9 - iter 2600/2606 - loss 0.01405118 - time (sec): 189.36 - samples/sec: 1935.71 - lr: 0.000006 - momentum: 0.000000
2023-10-15 22:56:57,290 ----------------------------------------------------------------------------------------------------
2023-10-15 22:56:57,290 EPOCH 9 done: loss 0.0140 - lr: 0.000006
2023-10-15 22:57:06,397 DEV : loss 0.46497806906700134 - f1-score (micro avg) 0.3541
2023-10-15 22:57:06,426 ----------------------------------------------------------------------------------------------------
2023-10-15 22:57:24,867 epoch 10 - iter 260/2606 - loss 0.01004813 - time (sec): 18.44 - samples/sec: 1931.32 - lr: 0.000005 - momentum: 0.000000
2023-10-15 22:57:43,792 epoch 10 - iter 520/2606 - loss 0.01164960 - time (sec): 37.36 - samples/sec: 1912.24 - lr: 0.000004 - momentum: 0.000000
2023-10-15 22:58:02,149 epoch 10 - iter 780/2606 - loss 0.01022217 - time (sec): 55.72 - samples/sec: 1914.91 - lr: 0.000004 - momentum: 0.000000
2023-10-15 22:58:20,621 epoch 10 - iter 1040/2606 - loss 0.01089894 - time (sec): 74.19 - samples/sec: 1922.29 - lr: 0.000003 - momentum: 0.000000
2023-10-15 22:58:39,283 epoch 10 - iter 1300/2606 - loss 0.01043785 - time (sec): 92.86 - samples/sec: 1917.68 - lr: 0.000003 - momentum: 0.000000
2023-10-15 22:58:58,972 epoch 10 - iter 1560/2606 - loss 0.01058729 - time (sec): 112.55 - samples/sec: 1924.74 - lr: 0.000002 - momentum: 0.000000
2023-10-15 22:59:18,245 epoch 10 - iter 1820/2606 - loss 0.01038180 - time (sec): 131.82 - samples/sec: 1931.33 - lr: 0.000002 - momentum: 0.000000
2023-10-15 22:59:38,495 epoch 10 - iter 2080/2606 - loss 0.00998178 - time (sec): 152.07 - samples/sec: 1930.07 - lr: 0.000001 - momentum: 0.000000
2023-10-15 22:59:57,836 epoch 10 - iter 2340/2606 - loss 0.01023802 - time (sec): 171.41 - samples/sec: 1931.92 - lr: 0.000001 - momentum: 0.000000
2023-10-15 23:00:16,128 epoch 10 - iter 2600/2606 - loss 0.01035035 - time (sec): 189.70 - samples/sec: 1933.34 - lr: 0.000000 - momentum: 0.000000
2023-10-15 23:00:16,510 ----------------------------------------------------------------------------------------------------
2023-10-15 23:00:16,510 EPOCH 10 done: loss 0.0103 - lr: 0.000000
2023-10-15 23:00:25,551 DEV : loss 0.4552249312400818 - f1-score (micro avg) 0.3675
2023-10-15 23:00:25,955 ----------------------------------------------------------------------------------------------------
2023-10-15 23:00:25,956 Loading model from best epoch ...
2023-10-15 23:00:27,429 SequenceTagger predicts: Dictionary with 17 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG, S-HumanProd, B-HumanProd, E-HumanProd, I-HumanProd
2023-10-15 23:00:42,906 Results:
- F-score (micro) 0.4196
- F-score (macro) 0.2764
- Accuracy 0.2689

By class:
              precision    recall  f1-score   support

         LOC     0.4960    0.4547    0.4744      1214
         PER     0.3994    0.4567    0.4261       808
         ORG     0.2405    0.1785    0.2049       353
   HumanProd     0.0000    0.0000    0.0000        15

   micro avg     0.4278    0.4117    0.4196      2390
   macro avg     0.2839    0.2725    0.2764      2390
weighted avg     0.4224    0.4117    0.4153      2390

2023-10-15 23:00:42,906 ----------------------------------------------------------------------------------------------------
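
Note: the training script itself is not part of this log. The following is a minimal sketch of how a comparable Flair fine-tuning run could be set up from the hyperparameters recorded above (HIPE-2022 newseye/de corpus, dbmdz/bert-base-historic-multilingual-cased, batch size 8, 10 epochs, lr 5e-05, first-subtoken pooling, last layer only, no CRF). The corpus loader arguments, hidden_size, and label-type name are assumptions, not values taken from the log.

# Sketch only -- reconstructs the logged configuration, not the original hmbench script.
from flair.datasets import NER_HIPE_2022
from flair.embeddings import TransformerWordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# Assumed loader arguments; the log only shows the resolved dataset path.
corpus = NER_HIPE_2022(dataset_name="newseye", language="de")
label_dict = corpus.make_label_dictionary(label_type="ner")

embeddings = TransformerWordEmbeddings(
    model="dbmdz/bert-base-historic-multilingual-cased",
    layers="-1",                # "layers-1" in the run name
    subtoken_pooling="first",   # "poolingfirst" in the run name
    fine_tune=True,
)

tagger = SequenceTagger(
    hidden_size=256,            # unused here since use_rnn=False; value assumed
    embeddings=embeddings,
    tag_dictionary=label_dict,
    tag_type="ner",
    use_crf=False,              # "crfFalse" in the run name
    use_rnn=False,
    reproject_embeddings=False,
)

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-newseye/de-dbmdz/bert-base-historic-multilingual-cased-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5",
    learning_rate=5e-05,
    mini_batch_size=8,
    max_epochs=10,              # fine_tune defaults to a linear scheduler with 0.1 warmup fraction,
)                               # matching the LinearScheduler plugin logged above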