2023-10-17 17:11:32,078 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,079 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 17:11:32,079 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,079 MultiCorpus: 14465 train + 1392 dev + 2432 test sentences
- NER_HIPE_2022 Corpus: 14465 train + 1392 dev + 2432 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/letemps/fr/with_doc_seperator
2023-10-17 17:11:32,079 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,079 Train: 14465 sentences
2023-10-17 17:11:32,079 (train_with_dev=False, train_with_test=False)
2023-10-17 17:11:32,079 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,080 Training Params:
2023-10-17 17:11:32,080 - learning_rate: "5e-05"
2023-10-17 17:11:32,080 - mini_batch_size: "8"
2023-10-17 17:11:32,080 - max_epochs: "10"
2023-10-17 17:11:32,080 - shuffle: "True"
2023-10-17 17:11:32,080 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,080 Plugins:
2023-10-17 17:11:32,080 - TensorboardLogger
2023-10-17 17:11:32,080 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 17:11:32,080 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,080 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 17:11:32,080 - metric: "('micro avg', 'f1-score')"
2023-10-17 17:11:32,080 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,080 Computation:
2023-10-17 17:11:32,080 - compute on device: cuda:0
2023-10-17 17:11:32,080 - embedding storage: none
2023-10-17 17:11:32,080 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,080 Model training base path: "hmbench-letemps/fr-hmteams/teams-base-historic-multilingual-discriminator-bs8-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-4"
2023-10-17 17:11:32,080 ----------------------------------------------------------------------------------------------------
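The training parameters, the LinearScheduler plugin, and the best-model selection logged above map onto Flair's ModelTrainer.fine_tune interface, which in recent Flair versions defaults to AdamW with a linear warmup schedule and warmup_fraction 0.1. A hedged sketch, reusing the corpus and tagger objects from the earlier snippet; the exact Flair version and output-path handling are assumptions.

from flair.trainers import ModelTrainer

# Continues the earlier sketch: `tagger` and `corpus` are defined there.
trainer = ModelTrainer(tagger, corpus)

# Mirrors the logged parameters: lr 5e-5, mini-batch size 8, 10 epochs, shuffling on.
# The run above saves the best dev micro-F1 model as best-model.pt and evaluates it
# again once training ends.
trainer.fine_tune(
    "hmbench-letemps/fr-...",  # shortened; the full base path is logged above
    learning_rate=5e-5,
    mini_batch_size=8,
    max_epochs=10,
    shuffle=True,
)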
2023-10-17 17:11:32,080 ----------------------------------------------------------------------------------------------------
2023-10-17 17:11:32,080 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 17:11:44,801 epoch 1 - iter 180/1809 - loss 1.93561623 - time (sec): 12.72 - samples/sec: 2872.61 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:11:57,763 epoch 1 - iter 360/1809 - loss 1.05326650 - time (sec): 25.68 - samples/sec: 2943.24 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:12:10,514 epoch 1 - iter 540/1809 - loss 0.75155809 - time (sec): 38.43 - samples/sec: 2953.08 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:12:23,512 epoch 1 - iter 720/1809 - loss 0.59410555 - time (sec): 51.43 - samples/sec: 2963.13 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:12:36,258 epoch 1 - iter 900/1809 - loss 0.50080498 - time (sec): 64.18 - samples/sec: 2948.84 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:12:48,969 epoch 1 - iter 1080/1809 - loss 0.43858981 - time (sec): 76.89 - samples/sec: 2957.34 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:13:01,864 epoch 1 - iter 1260/1809 - loss 0.39099624 - time (sec): 89.78 - samples/sec: 2959.58 - lr: 0.000035 - momentum: 0.000000
2023-10-17 17:13:14,940 epoch 1 - iter 1440/1809 - loss 0.35498992 - time (sec): 102.86 - samples/sec: 2964.88 - lr: 0.000040 - momentum: 0.000000
2023-10-17 17:13:27,651 epoch 1 - iter 1620/1809 - loss 0.32835651 - time (sec): 115.57 - samples/sec: 2959.52 - lr: 0.000045 - momentum: 0.000000
2023-10-17 17:13:40,504 epoch 1 - iter 1800/1809 - loss 0.30714743 - time (sec): 128.42 - samples/sec: 2947.39 - lr: 0.000050 - momentum: 0.000000
2023-10-17 17:13:41,091 ----------------------------------------------------------------------------------------------------
2023-10-17 17:13:41,092 EPOCH 1 done: loss 0.3063 - lr: 0.000050
2023-10-17 17:13:46,484 DEV : loss 0.10938248783349991 - f1-score (micro avg) 0.6239
2023-10-17 17:13:46,524 saving best model
2023-10-17 17:13:47,036 ----------------------------------------------------------------------------------------------------
2023-10-17 17:14:00,035 epoch 2 - iter 180/1809 - loss 0.10079900 - time (sec): 13.00 - samples/sec: 2981.06 - lr: 0.000049 - momentum: 0.000000
2023-10-17 17:14:12,944 epoch 2 - iter 360/1809 - loss 0.09271722 - time (sec): 25.91 - samples/sec: 2951.93 - lr: 0.000049 - momentum: 0.000000
2023-10-17 17:14:25,719 epoch 2 - iter 540/1809 - loss 0.08663485 - time (sec): 38.68 - samples/sec: 2966.04 - lr: 0.000048 - momentum: 0.000000
2023-10-17 17:14:38,078 epoch 2 - iter 720/1809 - loss 0.08679926 - time (sec): 51.04 - samples/sec: 2962.84 - lr: 0.000048 - momentum: 0.000000
2023-10-17 17:14:50,907 epoch 2 - iter 900/1809 - loss 0.09124566 - time (sec): 63.87 - samples/sec: 2948.30 - lr: 0.000047 - momentum: 0.000000
2023-10-17 17:15:03,593 epoch 2 - iter 1080/1809 - loss 0.09266770 - time (sec): 76.56 - samples/sec: 2932.10 - lr: 0.000047 - momentum: 0.000000
2023-10-17 17:15:16,588 epoch 2 - iter 1260/1809 - loss 0.09325153 - time (sec): 89.55 - samples/sec: 2929.00 - lr: 0.000046 - momentum: 0.000000
2023-10-17 17:15:29,899 epoch 2 - iter 1440/1809 - loss 0.09224137 - time (sec): 102.86 - samples/sec: 2929.07 - lr: 0.000046 - momentum: 0.000000
2023-10-17 17:15:43,216 epoch 2 - iter 1620/1809 - loss 0.09170900 - time (sec): 116.18 - samples/sec: 2913.77 - lr: 0.000045 - momentum: 0.000000
2023-10-17 17:15:57,487 epoch 2 - iter 1800/1809 - loss 0.09078426 - time (sec): 130.45 - samples/sec: 2897.12 - lr: 0.000044 - momentum: 0.000000
2023-10-17 17:15:58,247 ----------------------------------------------------------------------------------------------------
2023-10-17 17:15:58,248 EPOCH 2 done: loss 0.0907 - lr: 0.000044
2023-10-17 17:16:05,410 DEV : loss 0.1160627156496048 - f1-score (micro avg) 0.6271
2023-10-17 17:16:05,450 saving best model
2023-10-17 17:16:06,018 ----------------------------------------------------------------------------------------------------
2023-10-17 17:16:17,628 epoch 3 - iter 180/1809 - loss 0.06093404 - time (sec): 11.61 - samples/sec: 3292.81 - lr: 0.000044 - momentum: 0.000000
2023-10-17 17:16:29,276 epoch 3 - iter 360/1809 - loss 0.06003849 - time (sec): 23.26 - samples/sec: 3289.97 - lr: 0.000043 - momentum: 0.000000
2023-10-17 17:16:40,839 epoch 3 - iter 540/1809 - loss 0.05954656 - time (sec): 34.82 - samples/sec: 3268.21 - lr: 0.000043 - momentum: 0.000000
2023-10-17 17:16:52,586 epoch 3 - iter 720/1809 - loss 0.06117659 - time (sec): 46.57 - samples/sec: 3255.69 - lr: 0.000042 - momentum: 0.000000
2023-10-17 17:17:04,060 epoch 3 - iter 900/1809 - loss 0.06253581 - time (sec): 58.04 - samples/sec: 3255.61 - lr: 0.000042 - momentum: 0.000000
2023-10-17 17:17:15,528 epoch 3 - iter 1080/1809 - loss 0.06281026 - time (sec): 69.51 - samples/sec: 3261.44 - lr: 0.000041 - momentum: 0.000000
2023-10-17 17:17:27,332 epoch 3 - iter 1260/1809 - loss 0.06335577 - time (sec): 81.31 - samples/sec: 3259.87 - lr: 0.000041 - momentum: 0.000000
2023-10-17 17:17:38,880 epoch 3 - iter 1440/1809 - loss 0.06437289 - time (sec): 92.86 - samples/sec: 3252.54 - lr: 0.000040 - momentum: 0.000000
2023-10-17 17:17:50,414 epoch 3 - iter 1620/1809 - loss 0.06571986 - time (sec): 104.39 - samples/sec: 3248.91 - lr: 0.000039 - momentum: 0.000000
2023-10-17 17:18:02,437 epoch 3 - iter 1800/1809 - loss 0.06565150 - time (sec): 116.42 - samples/sec: 3250.23 - lr: 0.000039 - momentum: 0.000000
2023-10-17 17:18:02,972 ----------------------------------------------------------------------------------------------------
2023-10-17 17:18:02,972 EPOCH 3 done: loss 0.0658 - lr: 0.000039
2023-10-17 17:18:09,303 DEV : loss 0.14074555039405823 - f1-score (micro avg) 0.6223
2023-10-17 17:18:09,344 ----------------------------------------------------------------------------------------------------
2023-10-17 17:18:20,880 epoch 4 - iter 180/1809 - loss 0.03667366 - time (sec): 11.53 - samples/sec: 3245.02 - lr: 0.000038 - momentum: 0.000000
2023-10-17 17:18:32,507 epoch 4 - iter 360/1809 - loss 0.04218888 - time (sec): 23.16 - samples/sec: 3262.25 - lr: 0.000038 - momentum: 0.000000
2023-10-17 17:18:44,274 epoch 4 - iter 540/1809 - loss 0.04768593 - time (sec): 34.93 - samples/sec: 3279.30 - lr: 0.000037 - momentum: 0.000000
2023-10-17 17:18:56,716 epoch 4 - iter 720/1809 - loss 0.04825907 - time (sec): 47.37 - samples/sec: 3197.49 - lr: 0.000037 - momentum: 0.000000
2023-10-17 17:19:07,914 epoch 4 - iter 900/1809 - loss 0.04754011 - time (sec): 58.57 - samples/sec: 3203.52 - lr: 0.000036 - momentum: 0.000000
2023-10-17 17:19:19,903 epoch 4 - iter 1080/1809 - loss 0.04777440 - time (sec): 70.56 - samples/sec: 3221.03 - lr: 0.000036 - momentum: 0.000000
2023-10-17 17:19:31,289 epoch 4 - iter 1260/1809 - loss 0.04719754 - time (sec): 81.94 - samples/sec: 3228.10 - lr: 0.000035 - momentum: 0.000000
2023-10-17 17:19:42,794 epoch 4 - iter 1440/1809 - loss 0.04734516 - time (sec): 93.45 - samples/sec: 3227.79 - lr: 0.000034 - momentum: 0.000000
2023-10-17 17:19:54,449 epoch 4 - iter 1620/1809 - loss 0.04770838 - time (sec): 105.10 - samples/sec: 3240.33 - lr: 0.000034 - momentum: 0.000000
2023-10-17 17:20:06,242 epoch 4 - iter 1800/1809 - loss 0.04788521 - time (sec): 116.90 - samples/sec: 3235.87 - lr: 0.000033 - momentum: 0.000000
2023-10-17 17:20:06,788 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:06,789 EPOCH 4 done: loss 0.0479 - lr: 0.000033
2023-10-17 17:20:13,082 DEV : loss 0.19297103583812714 - f1-score (micro avg) 0.6447
2023-10-17 17:20:13,123 saving best model
2023-10-17 17:20:13,730 ----------------------------------------------------------------------------------------------------
2023-10-17 17:20:25,234 epoch 5 - iter 180/1809 - loss 0.03277920 - time (sec): 11.50 - samples/sec: 3302.89 - lr: 0.000033 - momentum: 0.000000
2023-10-17 17:20:36,730 epoch 5 - iter 360/1809 - loss 0.03524838 - time (sec): 23.00 - samples/sec: 3286.71 - lr: 0.000032 - momentum: 0.000000
2023-10-17 17:20:48,176 epoch 5 - iter 540/1809 - loss 0.03536296 - time (sec): 34.44 - samples/sec: 3271.30 - lr: 0.000032 - momentum: 0.000000
2023-10-17 17:20:59,738 epoch 5 - iter 720/1809 - loss 0.03462499 - time (sec): 46.01 - samples/sec: 3270.53 - lr: 0.000031 - momentum: 0.000000
2023-10-17 17:21:11,201 epoch 5 - iter 900/1809 - loss 0.03564067 - time (sec): 57.47 - samples/sec: 3261.46 - lr: 0.000031 - momentum: 0.000000
2023-10-17 17:21:22,490 epoch 5 - iter 1080/1809 - loss 0.03492180 - time (sec): 68.76 - samples/sec: 3259.96 - lr: 0.000030 - momentum: 0.000000
2023-10-17 17:21:34,269 epoch 5 - iter 1260/1809 - loss 0.03567128 - time (sec): 80.54 - samples/sec: 3266.60 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:21:45,765 epoch 5 - iter 1440/1809 - loss 0.03640100 - time (sec): 92.03 - samples/sec: 3271.15 - lr: 0.000029 - momentum: 0.000000
2023-10-17 17:21:57,633 epoch 5 - iter 1620/1809 - loss 0.03683705 - time (sec): 103.90 - samples/sec: 3266.95 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:22:10,494 epoch 5 - iter 1800/1809 - loss 0.03664054 - time (sec): 116.76 - samples/sec: 3240.02 - lr: 0.000028 - momentum: 0.000000
2023-10-17 17:22:11,124 ----------------------------------------------------------------------------------------------------
2023-10-17 17:22:11,124 EPOCH 5 done: loss 0.0365 - lr: 0.000028
2023-10-17 17:22:18,401 DEV : loss 0.27114248275756836 - f1-score (micro avg) 0.6489
2023-10-17 17:22:18,442 saving best model
2023-10-17 17:22:19,012 ----------------------------------------------------------------------------------------------------
2023-10-17 17:22:30,865 epoch 6 - iter 180/1809 - loss 0.02380828 - time (sec): 11.85 - samples/sec: 3203.90 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:22:42,265 epoch 6 - iter 360/1809 - loss 0.02336456 - time (sec): 23.25 - samples/sec: 3201.60 - lr: 0.000027 - momentum: 0.000000
2023-10-17 17:22:53,931 epoch 6 - iter 540/1809 - loss 0.02458402 - time (sec): 34.92 - samples/sec: 3210.75 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:23:06,012 epoch 6 - iter 720/1809 - loss 0.02370887 - time (sec): 47.00 - samples/sec: 3231.32 - lr: 0.000026 - momentum: 0.000000
2023-10-17 17:23:17,769 epoch 6 - iter 900/1809 - loss 0.02321931 - time (sec): 58.76 - samples/sec: 3223.85 - lr: 0.000025 - momentum: 0.000000
2023-10-17 17:23:30,920 epoch 6 - iter 1080/1809 - loss 0.02428414 - time (sec): 71.91 - samples/sec: 3165.19 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:23:43,365 epoch 6 - iter 1260/1809 - loss 0.02576350 - time (sec): 84.35 - samples/sec: 3117.42 - lr: 0.000024 - momentum: 0.000000
2023-10-17 17:23:56,563 epoch 6 - iter 1440/1809 - loss 0.02577017 - time (sec): 97.55 - samples/sec: 3086.57 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:24:09,657 epoch 6 - iter 1620/1809 - loss 0.02611217 - time (sec): 110.64 - samples/sec: 3067.46 - lr: 0.000023 - momentum: 0.000000
2023-10-17 17:24:22,953 epoch 6 - iter 1800/1809 - loss 0.02567029 - time (sec): 123.94 - samples/sec: 3052.44 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:24:23,520 ----------------------------------------------------------------------------------------------------
2023-10-17 17:24:23,521 EPOCH 6 done: loss 0.0257 - lr: 0.000022
2023-10-17 17:24:29,799 DEV : loss 0.2926296889781952 - f1-score (micro avg) 0.6557
2023-10-17 17:24:29,840 saving best model
2023-10-17 17:24:30,418 ----------------------------------------------------------------------------------------------------
2023-10-17 17:24:41,928 epoch 7 - iter 180/1809 - loss 0.01540372 - time (sec): 11.51 - samples/sec: 3142.49 - lr: 0.000022 - momentum: 0.000000
2023-10-17 17:24:53,530 epoch 7 - iter 360/1809 - loss 0.01373858 - time (sec): 23.11 - samples/sec: 3149.87 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:25:04,891 epoch 7 - iter 540/1809 - loss 0.01476080 - time (sec): 34.47 - samples/sec: 3169.89 - lr: 0.000021 - momentum: 0.000000
2023-10-17 17:25:17,063 epoch 7 - iter 720/1809 - loss 0.01553034 - time (sec): 46.64 - samples/sec: 3178.69 - lr: 0.000020 - momentum: 0.000000
2023-10-17 17:25:31,212 epoch 7 - iter 900/1809 - loss 0.01608482 - time (sec): 60.79 - samples/sec: 3092.52 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:25:44,555 epoch 7 - iter 1080/1809 - loss 0.01625864 - time (sec): 74.14 - samples/sec: 3065.32 - lr: 0.000019 - momentum: 0.000000
2023-10-17 17:25:57,185 epoch 7 - iter 1260/1809 - loss 0.01598109 - time (sec): 86.77 - samples/sec: 3046.68 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:26:10,352 epoch 7 - iter 1440/1809 - loss 0.01620623 - time (sec): 99.93 - samples/sec: 3018.29 - lr: 0.000018 - momentum: 0.000000
2023-10-17 17:26:23,271 epoch 7 - iter 1620/1809 - loss 0.01562169 - time (sec): 112.85 - samples/sec: 3008.43 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:26:35,991 epoch 7 - iter 1800/1809 - loss 0.01568355 - time (sec): 125.57 - samples/sec: 3009.51 - lr: 0.000017 - momentum: 0.000000
2023-10-17 17:26:36,606 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:36,607 EPOCH 7 done: loss 0.0157 - lr: 0.000017
2023-10-17 17:26:43,029 DEV : loss 0.3529641330242157 - f1-score (micro avg) 0.6571
2023-10-17 17:26:43,073 saving best model
2023-10-17 17:26:43,674 ----------------------------------------------------------------------------------------------------
2023-10-17 17:26:56,220 epoch 8 - iter 180/1809 - loss 0.01695089 - time (sec): 12.54 - samples/sec: 2962.85 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:27:08,773 epoch 8 - iter 360/1809 - loss 0.01377195 - time (sec): 25.10 - samples/sec: 2935.92 - lr: 0.000016 - momentum: 0.000000
2023-10-17 17:27:21,487 epoch 8 - iter 540/1809 - loss 0.01220865 - time (sec): 37.81 - samples/sec: 2935.57 - lr: 0.000015 - momentum: 0.000000
2023-10-17 17:27:35,132 epoch 8 - iter 720/1809 - loss 0.01373650 - time (sec): 51.46 - samples/sec: 2917.35 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:27:48,191 epoch 8 - iter 900/1809 - loss 0.01422431 - time (sec): 64.52 - samples/sec: 2914.87 - lr: 0.000014 - momentum: 0.000000
2023-10-17 17:28:01,243 epoch 8 - iter 1080/1809 - loss 0.01318920 - time (sec): 77.57 - samples/sec: 2904.54 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:28:14,364 epoch 8 - iter 1260/1809 - loss 0.01262812 - time (sec): 90.69 - samples/sec: 2907.44 - lr: 0.000013 - momentum: 0.000000
2023-10-17 17:28:27,356 epoch 8 - iter 1440/1809 - loss 0.01230139 - time (sec): 103.68 - samples/sec: 2915.00 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:28:40,048 epoch 8 - iter 1620/1809 - loss 0.01216736 - time (sec): 116.37 - samples/sec: 2911.57 - lr: 0.000012 - momentum: 0.000000
2023-10-17 17:28:53,484 epoch 8 - iter 1800/1809 - loss 0.01174195 - time (sec): 129.81 - samples/sec: 2910.13 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:28:54,159 ----------------------------------------------------------------------------------------------------
2023-10-17 17:28:54,160 EPOCH 8 done: loss 0.0119 - lr: 0.000011
2023-10-17 17:29:01,334 DEV : loss 0.36510923504829407 - f1-score (micro avg) 0.6581
2023-10-17 17:29:01,379 saving best model
2023-10-17 17:29:01,990 ----------------------------------------------------------------------------------------------------
2023-10-17 17:29:14,799 epoch 9 - iter 180/1809 - loss 0.00676309 - time (sec): 12.81 - samples/sec: 2836.96 - lr: 0.000011 - momentum: 0.000000
2023-10-17 17:29:27,626 epoch 9 - iter 360/1809 - loss 0.00654625 - time (sec): 25.63 - samples/sec: 2881.92 - lr: 0.000010 - momentum: 0.000000
2023-10-17 17:29:40,593 epoch 9 - iter 540/1809 - loss 0.00662540 - time (sec): 38.60 - samples/sec: 2901.39 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:29:54,034 epoch 9 - iter 720/1809 - loss 0.00713313 - time (sec): 52.04 - samples/sec: 2892.66 - lr: 0.000009 - momentum: 0.000000
2023-10-17 17:30:07,467 epoch 9 - iter 900/1809 - loss 0.00725837 - time (sec): 65.48 - samples/sec: 2877.00 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:30:20,535 epoch 9 - iter 1080/1809 - loss 0.00748436 - time (sec): 78.54 - samples/sec: 2881.75 - lr: 0.000008 - momentum: 0.000000
2023-10-17 17:30:34,143 epoch 9 - iter 1260/1809 - loss 0.00775191 - time (sec): 92.15 - samples/sec: 2889.46 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:30:46,967 epoch 9 - iter 1440/1809 - loss 0.00761670 - time (sec): 104.97 - samples/sec: 2900.62 - lr: 0.000007 - momentum: 0.000000
2023-10-17 17:30:59,707 epoch 9 - iter 1620/1809 - loss 0.00766050 - time (sec): 117.71 - samples/sec: 2901.81 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:31:13,080 epoch 9 - iter 1800/1809 - loss 0.00746642 - time (sec): 131.09 - samples/sec: 2886.97 - lr: 0.000006 - momentum: 0.000000
2023-10-17 17:31:13,704 ----------------------------------------------------------------------------------------------------
2023-10-17 17:31:13,704 EPOCH 9 done: loss 0.0075 - lr: 0.000006
2023-10-17 17:31:20,092 DEV : loss 0.39079737663269043 - f1-score (micro avg) 0.6541
2023-10-17 17:31:20,133 ----------------------------------------------------------------------------------------------------
2023-10-17 17:31:34,992 epoch 10 - iter 180/1809 - loss 0.00435699 - time (sec): 14.86 - samples/sec: 2533.83 - lr: 0.000005 - momentum: 0.000000
2023-10-17 17:31:49,379 epoch 10 - iter 360/1809 - loss 0.00449193 - time (sec): 29.24 - samples/sec: 2657.93 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:32:03,565 epoch 10 - iter 540/1809 - loss 0.00402430 - time (sec): 43.43 - samples/sec: 2649.88 - lr: 0.000004 - momentum: 0.000000
2023-10-17 17:32:17,158 epoch 10 - iter 720/1809 - loss 0.00509725 - time (sec): 57.02 - samples/sec: 2683.43 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:32:30,887 epoch 10 - iter 900/1809 - loss 0.00529774 - time (sec): 70.75 - samples/sec: 2670.34 - lr: 0.000003 - momentum: 0.000000
2023-10-17 17:32:44,665 epoch 10 - iter 1080/1809 - loss 0.00507294 - time (sec): 84.53 - samples/sec: 2692.78 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:32:57,604 epoch 10 - iter 1260/1809 - loss 0.00509465 - time (sec): 97.47 - samples/sec: 2728.72 - lr: 0.000002 - momentum: 0.000000
2023-10-17 17:33:10,510 epoch 10 - iter 1440/1809 - loss 0.00507500 - time (sec): 110.38 - samples/sec: 2749.44 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:33:23,583 epoch 10 - iter 1620/1809 - loss 0.00502796 - time (sec): 123.45 - samples/sec: 2774.09 - lr: 0.000001 - momentum: 0.000000
2023-10-17 17:33:37,029 epoch 10 - iter 1800/1809 - loss 0.00499215 - time (sec): 136.89 - samples/sec: 2765.41 - lr: 0.000000 - momentum: 0.000000
2023-10-17 17:33:37,688 ----------------------------------------------------------------------------------------------------
2023-10-17 17:33:37,688 EPOCH 10 done: loss 0.0050 - lr: 0.000000
2023-10-17 17:33:43,978 DEV : loss 0.39810895919799805 - f1-score (micro avg) 0.6564
2023-10-17 17:33:44,542 ----------------------------------------------------------------------------------------------------
2023-10-17 17:33:44,544 Loading model from best epoch ...
2023-10-17 17:33:46,262 SequenceTagger predicts: Dictionary with 13 tags: O, S-loc, B-loc, E-loc, I-loc, S-pers, B-pers, E-pers, I-pers, S-org, B-org, E-org, I-org
2023-10-17 17:33:55,163
Results:
- F-score (micro) 0.6707
- F-score (macro) 0.5415
- Accuracy 0.5185
By class:
              precision    recall  f1-score   support

         loc     0.6425    0.8088    0.7161       591
        pers     0.5864    0.7703    0.6659       357
         org     0.3019    0.2025    0.2424        79

   micro avg     0.6074    0.7488    0.6707      1027
   macro avg     0.5102    0.5939    0.5415      1027
weighted avg     0.5968    0.7488    0.6622      1027
2023-10-17 17:33:55,163 ----------------------------------------------------------------------------------------------------
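Once training finishes, the saved best checkpoint can be loaded for inference through the standard Flair API. A brief usage sketch; the path is illustrative and depends on where the training output was stored.

from flair.data import Sentence
from flair.models import SequenceTagger

# Illustrative path; substitute the actual training output directory.
tagger = SequenceTagger.load("hmbench-letemps/fr-.../best-model.pt")

sentence = Sentence("Victor Hugo est arrivé à Paris .")
tagger.predict(sentence)

for entity in sentence.get_spans("ner"):
    print(entity.text, entity.tag, entity.score)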