2023-02-12 15:41:29,581 - INFO - Experiment directory: runs/downstream-watermark-removal
2023-02-12 15:41:29,581 - INFO - Device: cuda
2023-02-12 15:41:29,581 - INFO - Number of devices: 1
2023-02-12 15:41:29,944 - INFO - Size of training set: 28352
2023-02-12 15:41:29,944 - INFO - Size of validation set: 4051
2023-02-12 15:41:29,944 - INFO - Batch size per device: 4
2023-02-12 15:41:29,944 - INFO - Effective batch size: 4
2023-02-12 15:41:31,526 - INFO - Successfully load mpn from ./runs/places-joint/ckpt/step149999.pt
2023-02-12 15:41:31,531 - INFO - Successfully load rin from ./runs/places-joint/ckpt/step149999.pt
2023-02-12 15:41:31,532 - INFO - Successfully load disc from ./runs/places-joint/ckpt/step149999.pt
2023-02-12 15:41:31,534 - INFO - Successfully load pdisc from ./runs/places-joint/ckpt/step149999.pt
2023-02-12 15:41:31,534 - INFO - Resume from runs/downstream-watermark-removal/ckpt/step009999.pt
2023-02-12 15:41:31,627 - INFO - Successfully load mpn from runs/downstream-watermark-removal/ckpt/step009999.pt
2023-02-12 15:41:31,632 - INFO - Successfully load rin from runs/downstream-watermark-removal/ckpt/step009999.pt
2023-02-12 15:41:31,634 - INFO - Successfully load disc from runs/downstream-watermark-removal/ckpt/step009999.pt
2023-02-12 15:41:31,636 - INFO - Successfully load pdisc from runs/downstream-watermark-removal/ckpt/step009999.pt
2023-02-12 15:41:31,669 - INFO - Successfully load optimizers from runs/downstream-watermark-removal/ckpt/step009999.pt
2023-02-12 15:41:31,669 - INFO - Restart training at step 10000
2023-02-12 15:41:31,669 - INFO - Best psnr so far: 28.375944137573242
2023-02-12 15:41:31,671 - INFO - Start training...
2023-02-12 15:42:03,389 - INFO - [Train] step: 10099, loss_adv_disc: -0.573632
2023-02-12 15:42:03,596 - INFO - [Train] step: 10099, loss_mpn: 0.010319, loss_rec: 0.027359, loss_semantic: 0.370544, loss_idmrf: 0.774260, loss_adv_gen: -122.804909
2023-02-12 15:42:32,762 - INFO - [Train] step: 10199, loss_adv_disc: -0.752703
2023-02-12 15:42:32,971 - INFO - [Train] step: 10199, loss_mpn: 0.021178, loss_rec: 0.029023, loss_semantic: 0.415801, loss_idmrf: 1.008709, loss_adv_gen: -128.898590
2023-02-12 15:43:02,229 - INFO - [Train] step: 10299, loss_adv_disc: -0.624492
2023-02-12 15:43:02,439 - INFO - [Train] step: 10299, loss_mpn: 0.015903, loss_rec: 0.024953, loss_semantic: 0.390218, loss_idmrf: 0.604853, loss_adv_gen: -81.564667
2023-02-12 15:43:31,721 - INFO - [Train] step: 10399, loss_adv_disc: -0.534651
2023-02-12 15:43:31,930 - INFO - [Train] step: 10399, loss_mpn: 0.009860, loss_rec: 0.031842, loss_semantic: 0.361394, loss_idmrf: 1.416407, loss_adv_gen: -81.499672
2023-02-12 15:44:01,214 - INFO - [Train] step: 10499, loss_adv_disc: 0.128698
2023-02-12 15:44:01,424 - INFO - [Train] step: 10499, loss_mpn: 0.022621, loss_rec: 0.026604, loss_semantic: 0.383198, loss_idmrf: 0.996032, loss_adv_gen: -173.462540
2023-02-12 15:44:30,713 - INFO - [Train] step: 10599, loss_adv_disc: -0.762986
2023-02-12 15:44:30,923 - INFO - [Train] step: 10599, loss_mpn: 0.010371, loss_rec: 0.021601, loss_semantic: 0.355843, loss_idmrf: 0.626368, loss_adv_gen: -87.495544
2023-02-12 15:45:00,220 - INFO - [Train] step: 10699, loss_adv_disc: -0.799230
2023-02-12 15:45:00,431 - INFO - [Train] step: 10699, loss_mpn: 0.022823, loss_rec: 0.034656, loss_semantic: 0.432447, loss_idmrf: 1.455905, loss_adv_gen: -148.209473
2023-02-12 15:45:29,723 - INFO - [Train] step: 10799, loss_adv_disc: -1.669115
2023-02-12 15:45:29,933 - INFO - [Train] step: 10799, loss_mpn: 0.019033, loss_rec: 0.032545, loss_semantic: 0.419988, loss_idmrf: 2.659751, loss_adv_gen: -113.579796
2023-02-12 15:45:59,232 - INFO - [Train] step: 10899, loss_adv_disc: -0.240895
2023-02-12 15:45:59,442 - INFO - [Train] step: 10899, loss_mpn: 0.008078, loss_rec: 0.018564, loss_semantic: 0.299887, loss_idmrf: 0.776784, loss_adv_gen: -110.333099
2023-02-12 15:46:28,732 - INFO - [Train] step: 10999, loss_adv_disc: -1.609084
2023-02-12 15:46:28,941 - INFO - [Train] step: 10999, loss_mpn: 0.015474, loss_rec: 0.032228, loss_semantic: 0.409391, loss_idmrf: 1.527376, loss_adv_gen: -152.766113
2023-02-12 15:47:06,356 - INFO - [Eval] step: 10999, bce: 0.257489, psnr: 28.320116, ssim: 0.953730
2023-02-12 15:47:37,093 - INFO - [Train] step: 11099, loss_adv_disc: -1.827085
2023-02-12 15:47:37,302 - INFO - [Train] step: 11099, loss_mpn: 0.046034, loss_rec: 0.034839, loss_semantic: 0.449751, loss_idmrf: 1.792729, loss_adv_gen: -104.510086
2023-02-12 15:48:06,631 - INFO - [Train] step: 11199, loss_adv_disc: -1.588022
2023-02-12 15:48:06,841 - INFO - [Train] step: 11199, loss_mpn: 0.023192, loss_rec: 0.030508, loss_semantic: 0.357541, loss_idmrf: 1.278599, loss_adv_gen: -167.189331
2023-02-12 15:48:36,166 - INFO - [Train] step: 11299, loss_adv_disc: 0.126817
2023-02-12 15:48:36,376 - INFO - [Train] step: 11299, loss_mpn: 0.010594, loss_rec: 0.029868, loss_semantic: 0.403277, loss_idmrf: 0.817758, loss_adv_gen: -148.307251
2023-02-12 15:49:05,695 - INFO - [Train] step: 11399, loss_adv_disc: -5.733956
2023-02-12 15:49:05,905 - INFO - [Train] step: 11399, loss_mpn: 0.025480, loss_rec: 0.046885, loss_semantic: 0.485265, loss_idmrf: 1.430085, loss_adv_gen: -135.652435
2023-02-12 15:49:35,210 - INFO - [Train] step: 11499, loss_adv_disc: -0.405542
2023-02-12 15:49:35,419 - INFO - [Train] step: 11499, loss_mpn: 0.017168, loss_rec: 0.014043, loss_semantic: 0.287016, loss_idmrf: 1.553092, loss_adv_gen: -151.363632
2023-02-12 15:50:04,729 - INFO - [Train] step: 11599, loss_adv_disc: -0.672015
2023-02-12 15:50:04,938 - INFO - [Train] step: 11599, loss_mpn: 0.031546, loss_rec: 0.028756, loss_semantic: 0.401363, loss_idmrf: 1.752889, loss_adv_gen: -101.953789
2023-02-12 15:50:34,248 - INFO - [Train] step: 11699, loss_adv_disc: -0.624735
2023-02-12 15:50:34,457 - INFO - [Train] step: 11699, loss_mpn: 0.018015, loss_rec: 0.027845, loss_semantic: 0.394872, loss_idmrf: 1.055130, loss_adv_gen: -150.515442
2023-02-12 15:51:03,766 - INFO - [Train] step: 11799, loss_adv_disc: 1.819307
2023-02-12 15:51:03,975 - INFO - [Train] step: 11799, loss_mpn: 0.015464, loss_rec: 0.035582, loss_semantic: 0.379069, loss_idmrf: 2.861618, loss_adv_gen: -122.459030
2023-02-12 15:51:33,272 - INFO - [Train] step: 11899, loss_adv_disc: -1.051750
2023-02-12 15:51:33,481 - INFO - [Train] step: 11899, loss_mpn: 0.035472, loss_rec: 0.042832, loss_semantic: 0.438833, loss_idmrf: 1.802002, loss_adv_gen: -112.499496
2023-02-12 15:52:02,783 - INFO - [Train] step: 11999, loss_adv_disc: -1.834618
2023-02-12 15:52:02,992 - INFO - [Train] step: 11999, loss_mpn: 0.046120, loss_rec: 0.038236, loss_semantic: 0.390554, loss_idmrf: 1.070025, loss_adv_gen: -160.587997
2023-02-12 15:52:40,183 - INFO - [Eval] step: 11999, bce: 0.294210, psnr: 28.423737, ssim: 0.953786
2023-02-12 15:53:10,757 - INFO - [Train] step: 12099, loss_adv_disc: -0.467648
2023-02-12 15:53:10,967 - INFO - [Train] step: 12099, loss_mpn: 0.018973, loss_rec: 0.018282, loss_semantic: 0.319116, loss_idmrf: 0.968704, loss_adv_gen: -130.123260
2023-02-12 15:53:40,280 - INFO - [Train] step: 12199, loss_adv_disc: -1.703174
2023-02-12 15:53:40,489 - INFO - [Train] step: 12199, loss_mpn: 0.013166, loss_rec: 0.025926, loss_semantic: 0.310816, loss_idmrf: 1.490854, loss_adv_gen: -87.041710
2023-02-12 15:54:09,788 - INFO - [Train] step: 12299, loss_adv_disc: -2.071906
2023-02-12 15:54:09,997 - INFO - [Train] step: 12299, loss_mpn: 0.011900, loss_rec: 0.023009, loss_semantic: 0.361497, loss_idmrf: 1.356379, loss_adv_gen: -97.818542
2023-02-12 15:54:39,307 - INFO - [Train] step: 12399, loss_adv_disc: -1.547906
2023-02-12 15:54:39,517 - INFO - [Train] step: 12399, loss_mpn: 0.031430, loss_rec: 0.039016, loss_semantic: 0.466467, loss_idmrf: 1.551023, loss_adv_gen: -124.634491
2023-02-12 15:55:08,815 - INFO - [Train] step: 12499, loss_adv_disc: -2.076171
2023-02-12 15:55:09,024 - INFO - [Train] step: 12499, loss_mpn: 0.025787, loss_rec: 0.031508, loss_semantic: 0.419648, loss_idmrf: 1.477141, loss_adv_gen: -92.370377
2023-02-12 15:55:38,328 - INFO - [Train] step: 12599, loss_adv_disc: -1.768132
2023-02-12 15:55:38,538 - INFO - [Train] step: 12599, loss_mpn: 0.010386, loss_rec: 0.034599, loss_semantic: 0.419125, loss_idmrf: 1.166921, loss_adv_gen: -133.431656
2023-02-12 15:56:07,831 - INFO - [Train] step: 12699, loss_adv_disc: -2.694801
2023-02-12 15:56:08,040 - INFO - [Train] step: 12699, loss_mpn: 0.033597, loss_rec: 0.041970, loss_semantic: 0.457011, loss_idmrf: 1.751136, loss_adv_gen: -126.347549
2023-02-12 15:56:37,340 - INFO - [Train] step: 12799, loss_adv_disc: 0.018329
2023-02-12 15:56:37,551 - INFO - [Train] step: 12799, loss_mpn: 0.024124, loss_rec: 0.032403, loss_semantic: 0.401399, loss_idmrf: 1.235319, loss_adv_gen: -91.264397
2023-02-12 15:57:06,860 - INFO - [Train] step: 12899, loss_adv_disc: -1.688953
2023-02-12 15:57:07,069 - INFO - [Train] step: 12899, loss_mpn: 0.024907, loss_rec: 0.032172, loss_semantic: 0.405167, loss_idmrf: 1.241795, loss_adv_gen: -99.125114
2023-02-12 15:57:36,378 - INFO - [Train] step: 12999, loss_adv_disc: -1.207177
2023-02-12 15:57:36,587 - INFO - [Train] step: 12999, loss_mpn: 0.016857, loss_rec: 0.023081, loss_semantic: 0.333206, loss_idmrf: 1.260828, loss_adv_gen: -170.703522
2023-02-12 15:58:13,782 - INFO - [Eval] step: 12999, bce: 0.266584, psnr: 28.522163, ssim: 0.954158
2023-02-12 15:58:44,344 - INFO - [Train] step: 13099, loss_adv_disc: 0.384707
2023-02-12 15:58:44,554 - INFO - [Train] step: 13099, loss_mpn: 0.011170, loss_rec: 0.019468, loss_semantic: 0.341580, loss_idmrf: 1.395938, loss_adv_gen: -162.299805
2023-02-12 15:59:13,857 - INFO - [Train] step: 13199, loss_adv_disc: -0.637955
2023-02-12 15:59:14,066 - INFO - [Train] step: 13199, loss_mpn: 0.010275, loss_rec: 0.022119, loss_semantic: 0.369303, loss_idmrf: 1.168316, loss_adv_gen: -97.904114
2023-02-12 15:59:43,376 - INFO - [Train] step: 13299, loss_adv_disc: -0.400550
2023-02-12 15:59:43,585 - INFO - [Train] step: 13299, loss_mpn: 0.011687, loss_rec: 0.012240, loss_semantic: 0.243617, loss_idmrf: 0.481851, loss_adv_gen: -106.272179
2023-02-12 16:00:12,885 - INFO - [Train] step: 13399, loss_adv_disc: -1.855669
2023-02-12 16:00:13,096 - INFO - [Train] step: 13399, loss_mpn: 0.014073, loss_rec: 0.027463, loss_semantic: 0.361384, loss_idmrf: 1.515916, loss_adv_gen: -106.083847
2023-02-12 16:00:42,394 - INFO - [Train] step: 13499, loss_adv_disc: -1.340236
2023-02-12 16:00:42,603 - INFO - [Train] step: 13499, loss_mpn: 0.016790, loss_rec: 0.020464, loss_semantic: 0.314939, loss_idmrf: 1.105718, loss_adv_gen: -119.364258
2023-02-12 16:01:11,909 - INFO - [Train] step: 13599, loss_adv_disc: -1.605150
2023-02-12 16:01:12,119 - INFO - [Train] step: 13599, loss_mpn: 0.036913, loss_rec: 0.031908, loss_semantic: 0.419063, loss_idmrf: 1.360922, loss_adv_gen: -161.308777
2023-02-12 16:01:41,423 - INFO - [Train] step: 13699, loss_adv_disc: -2.649448
2023-02-12 16:01:41,634 - INFO - [Train] step: 13699, loss_mpn: 0.044100, loss_rec: 0.030275, loss_semantic: 0.416397, loss_idmrf: 1.658602, loss_adv_gen: -116.658020
2023-02-12 16:02:10,936 - INFO - [Train] step: 13799, loss_adv_disc: -0.988471
2023-02-12 16:02:11,145 - INFO - [Train] step: 13799, loss_mpn: 0.034739, loss_rec: 0.029279, loss_semantic: 0.382340, loss_idmrf: 0.971351, loss_adv_gen: -150.975433
2023-02-12 16:02:40,442 - INFO - [Train] step: 13899, loss_adv_disc: -2.611970
2023-02-12 16:02:40,651 - INFO - [Train] step: 13899, loss_mpn: 0.023674, loss_rec: 0.028898, loss_semantic: 0.407663, loss_idmrf: 1.427401, loss_adv_gen: -159.901657
2023-02-12 16:03:09,949 - INFO - [Train] step: 13999, loss_adv_disc: -0.316513
2023-02-12 16:03:10,159 - INFO - [Train] step: 13999, loss_mpn: 0.017420, loss_rec: 0.029493, loss_semantic: 0.419786, loss_idmrf: 1.184973, loss_adv_gen: -136.433777
2023-02-12 16:03:47,344 - INFO - [Eval] step: 13999, bce: 0.303972, psnr: 28.459097, ssim: 0.954072
2023-02-12 16:04:17,701 - INFO - [Train] step: 14099, loss_adv_disc: 0.521385
2023-02-12 16:04:17,911 - INFO - [Train] step: 14099, loss_mpn: 0.010085, loss_rec: 0.025302, loss_semantic: 0.358362, loss_idmrf: 1.194657, loss_adv_gen: -136.419891
2023-02-12 16:04:47,224 - INFO - [Train] step: 14199, loss_adv_disc: -0.374627
2023-02-12 16:04:47,434 - INFO - [Train] step: 14199, loss_mpn: 0.025627, loss_rec: 0.015940, loss_semantic: 0.260977, loss_idmrf: 0.729602, loss_adv_gen: -205.061829
2023-02-12 16:05:16,739 - INFO - [Train] step: 14299, loss_adv_disc: -0.613578
2023-02-12 16:05:16,949 - INFO - [Train] step: 14299, loss_mpn: 0.012513, loss_rec: 0.018452, loss_semantic: 0.297812, loss_idmrf: 0.867024, loss_adv_gen: -117.039078
2023-02-12 16:05:46,257 - INFO - [Train] step: 14399, loss_adv_disc: -0.752550
2023-02-12 16:05:46,467 - INFO - [Train] step: 14399, loss_mpn: 0.010786, loss_rec: 0.020313, loss_semantic: 0.317454, loss_idmrf: 0.587289, loss_adv_gen: -146.788406
2023-02-12 16:06:15,774 - INFO - [Train] step: 14499, loss_adv_disc: -2.348324
2023-02-12 16:06:15,983 - INFO - [Train] step: 14499, loss_mpn: 0.009751, loss_rec: 0.029186, loss_semantic: 0.382273, loss_idmrf: 1.368526, loss_adv_gen: -186.162994
2023-02-12 16:06:45,289 - INFO - [Train] step: 14599, loss_adv_disc: -4.208821
2023-02-12 16:06:45,498 - INFO - [Train] step: 14599, loss_mpn: 0.020542, loss_rec: 0.047389, loss_semantic: 0.453684, loss_idmrf: 1.460051, loss_adv_gen: -124.967163
2023-02-12 16:07:14,806 - INFO - [Train] step: 14699, loss_adv_disc: -1.152210
2023-02-12 16:07:15,017 - INFO - [Train] step: 14699, loss_mpn: 0.016993, loss_rec: 0.025116, loss_semantic: 0.355742, loss_idmrf: 1.580167, loss_adv_gen: -120.020020
2023-02-12 16:07:44,332 - INFO - [Train] step: 14799, loss_adv_disc: -0.867544
2023-02-12 16:07:44,543 - INFO - [Train] step: 14799, loss_mpn: 0.026516, loss_rec: 0.026188, loss_semantic: 0.376111, loss_idmrf: 1.591483, loss_adv_gen: -132.943726
2023-02-12 16:08:13,848 - INFO - [Train] step: 14899, loss_adv_disc: -1.832621
2023-02-12 16:08:14,057 - INFO - [Train] step: 14899, loss_mpn: 0.024598, loss_rec: 0.026206, loss_semantic: 0.376802, loss_idmrf: 1.258729, loss_adv_gen: -103.054977
2023-02-12 16:08:43,362 - INFO - [Train] step: 14999, loss_adv_disc: -0.470228
2023-02-12 16:08:43,573 - INFO - [Train] step: 14999, loss_mpn: 0.016009, loss_rec: 0.023309, loss_semantic: 0.347093, loss_idmrf: 1.167062, loss_adv_gen: -129.543365
2023-02-12 16:09:20,763 - INFO - [Eval] step: 14999, bce: 0.308236, psnr: 28.478035, ssim: 0.954528
2023-02-12 16:09:51,239 - INFO - [Train] step: 15099, loss_adv_disc: -3.643587
2023-02-12 16:09:51,448 - INFO - [Train] step: 15099, loss_mpn: 0.017914, loss_rec: 0.037130, loss_semantic: 0.469375, loss_idmrf: 0.786234, loss_adv_gen: -152.553009
2023-02-12 16:10:20,756 - INFO - [Train] step: 15199, loss_adv_disc: -1.558701
2023-02-12 16:10:20,966 - INFO - [Train] step: 15199, loss_mpn: 0.013246, loss_rec: 0.020152, loss_semantic: 0.340950, loss_idmrf: 0.891224, loss_adv_gen: -135.107086
2023-02-12 16:10:50,262 - INFO - [Train] step: 15299, loss_adv_disc: -0.838546
2023-02-12 16:10:50,472 - INFO - [Train] step: 15299, loss_mpn: 0.021482, loss_rec: 0.033843, loss_semantic: 0.446977, loss_idmrf: 1.599546, loss_adv_gen: -160.477463
2023-02-12 16:11:19,776 - INFO - [Train] step: 15399, loss_adv_disc: -1.011172
2023-02-12 16:11:19,985 - INFO - [Train] step: 15399, loss_mpn: 0.021991, loss_rec: 0.026417, loss_semantic: 0.354353, loss_idmrf: 0.977984, loss_adv_gen: -126.219818
2023-02-12 16:11:49,291 - INFO - [Train] step: 15499, loss_adv_disc: -0.555625
2023-02-12 16:11:49,500 - INFO - [Train] step: 15499, loss_mpn: 0.013487, loss_rec: 0.020957, loss_semantic: 0.318727, loss_idmrf: 1.733619, loss_adv_gen: -151.311798
2023-02-12 16:12:18,806 - INFO - [Train] step: 15599, loss_adv_disc: -1.035817
2023-02-12 16:12:19,015 - INFO - [Train] step: 15599, loss_mpn: 0.013551, loss_rec: 0.021167, loss_semantic: 0.354886, loss_idmrf: 0.928748, loss_adv_gen: -122.134354
2023-02-12 16:12:48,326 - INFO - [Train] step: 15699, loss_adv_disc: -2.363843
2023-02-12 16:12:48,536 - INFO - [Train] step: 15699, loss_mpn: 0.019518, loss_rec: 0.024358, loss_semantic: 0.379131, loss_idmrf: 1.181651, loss_adv_gen: -103.208733
2023-02-12 16:13:17,837 - INFO - [Train] step: 15799, loss_adv_disc: -0.818816
2023-02-12 16:13:18,046 - INFO - [Train] step: 15799, loss_mpn: 0.017659, loss_rec: 0.026649, loss_semantic: 0.391756, loss_idmrf: 1.939934, loss_adv_gen: -135.258698
2023-02-12 16:13:47,349 - INFO - [Train] step: 15899, loss_adv_disc: -1.612358
2023-02-12 16:13:47,558 - INFO - [Train] step: 15899, loss_mpn: 0.011624, loss_rec: 0.027462, loss_semantic: 0.385324, loss_idmrf: 0.910483, loss_adv_gen: -129.717789
2023-02-12 16:14:16,865 - INFO - [Train] step: 15999, loss_adv_disc: -1.801970
2023-02-12 16:14:17,076 - INFO - [Train] step: 15999, loss_mpn: 0.034720, loss_rec: 0.032492, loss_semantic: 0.394786, loss_idmrf: 1.137444, loss_adv_gen: -132.660370
2023-02-12 16:14:54,265 - INFO - [Eval] step: 15999, bce: 0.306298, psnr: 28.455198, ssim: 0.954385
2023-02-12 16:15:24,618 - INFO - [Train] step: 16099, loss_adv_disc: -0.523683
2023-02-12 16:15:24,828 - INFO - [Train] step: 16099, loss_mpn: 0.013928, loss_rec: 0.034034, loss_semantic: 0.433492, loss_idmrf: 1.327242, loss_adv_gen: -82.181107
2023-02-12 16:15:54,132 - INFO - [Train] step: 16199, loss_adv_disc: -1.345654
2023-02-12 16:15:54,341 - INFO - [Train] step: 16199, loss_mpn: 0.016082, loss_rec: 0.028009, loss_semantic: 0.397227, loss_idmrf: 0.816519, loss_adv_gen: -131.595657
2023-02-12 16:16:23,646 - INFO - [Train] step: 16299, loss_adv_disc: -2.553635
2023-02-12 16:16:23,855 - INFO - [Train] step: 16299, loss_mpn: 0.020678, loss_rec: 0.027889, loss_semantic: 0.396791, loss_idmrf: 0.867642, loss_adv_gen: -144.310089
2023-02-12 16:16:53,149 - INFO - [Train] step: 16399, loss_adv_disc: -1.215216
2023-02-12 16:16:53,359 - INFO - [Train] step: 16399, loss_mpn: 0.015651, loss_rec: 0.026465, loss_semantic: 0.372941, loss_idmrf: 0.596170, loss_adv_gen: -141.296097
2023-02-12 16:17:22,658 - INFO - [Train] step: 16499, loss_adv_disc: -1.153915
2023-02-12 16:17:22,869 - INFO - [Train] step: 16499, loss_mpn: 0.011120, loss_rec: 0.020798, loss_semantic: 0.302271, loss_idmrf: 1.135478, loss_adv_gen: -156.829041
2023-02-12 16:17:52,172 - INFO - [Train] step: 16599, loss_adv_disc: -0.912775
2023-02-12 16:17:52,383 - INFO - [Train] step: 16599, loss_mpn: 0.008206, loss_rec: 0.018265, loss_semantic: 0.288056, loss_idmrf: 0.732165, loss_adv_gen: -106.054985
2023-02-12 16:18:21,672 - INFO - [Train] step: 16699, loss_adv_disc: -2.081031
2023-02-12 16:18:21,881 - INFO - [Train] step: 16699, loss_mpn: 0.018193, loss_rec: 0.031450, loss_semantic: 0.419401, loss_idmrf: 0.929888, loss_adv_gen: -111.711731
2023-02-12 16:18:51,180 - INFO - [Train] step: 16799, loss_adv_disc: -1.307692
2023-02-12 16:18:51,389 - INFO - [Train] step: 16799, loss_mpn: 0.012063, loss_rec: 0.024626, loss_semantic: 0.336043, loss_idmrf: 1.499988, loss_adv_gen: -175.205460
2023-02-12 16:19:20,703 - INFO - [Train] step: 16899, loss_adv_disc: -1.324519
2023-02-12 16:19:20,912 - INFO - [Train] step: 16899, loss_mpn: 0.016227, loss_rec: 0.031347, loss_semantic: 0.402643, loss_idmrf: 2.323829, loss_adv_gen: -120.771484
2023-02-12 16:19:50,217 - INFO - [Train] step: 16999, loss_adv_disc: -2.507675
2023-02-12 16:19:50,426 - INFO - [Train] step: 16999, loss_mpn: 0.016803, loss_rec: 0.025666, loss_semantic: 0.354471, loss_idmrf: 1.031530, loss_adv_gen: -143.293396
2023-02-12 16:20:27,604 - INFO - [Eval] step: 16999, bce: 0.273737, psnr: 28.555843, ssim: 0.955114
2023-02-12 16:20:58,416 - INFO - [Train] step: 17099, loss_adv_disc: 0.443840
2023-02-12 16:20:58,626 - INFO - [Train] step: 17099, loss_mpn: 0.023819, loss_rec: 0.035763, loss_semantic: 0.446109, loss_idmrf: 1.448572, loss_adv_gen: -135.764587
2023-02-12 16:21:27,925 - INFO - [Train] step: 17199, loss_adv_disc: -1.347090
2023-02-12 16:21:28,137 - INFO - [Train] step: 17199, loss_mpn: 0.012972, loss_rec: 0.021817, loss_semantic: 0.339731, loss_idmrf: 1.089608, loss_adv_gen: -148.485229
2023-02-12 16:21:57,450 - INFO - [Train] step: 17299, loss_adv_disc: -1.612137
2023-02-12 16:21:57,660 - INFO - [Train] step: 17299, loss_mpn: 0.011751, loss_rec: 0.028418, loss_semantic: 0.330752, loss_idmrf: 1.093765, loss_adv_gen: -128.659851
2023-02-12 16:22:26,975 - INFO - [Train] step: 17399, loss_adv_disc: -1.544677
2023-02-12 16:22:27,184 - INFO - [Train] step: 17399, loss_mpn: 0.020075, loss_rec: 0.023185, loss_semantic: 0.295154, loss_idmrf: 0.754979, loss_adv_gen: -121.820137
2023-02-12 16:22:56,493 - INFO - [Train] step: 17499, loss_adv_disc: -0.432313
2023-02-12 16:22:56,702 - INFO - [Train] step: 17499, loss_mpn: 0.014809, loss_rec: 0.022634, loss_semantic: 0.313536, loss_idmrf: 1.329498, loss_adv_gen: -147.561279
2023-02-12 16:23:26,003 - INFO - [Train] step: 17599, loss_adv_disc: -1.115981
2023-02-12 16:23:26,212 - INFO - [Train] step: 17599, loss_mpn: 0.009049, loss_rec: 0.025591, loss_semantic: 0.337981, loss_idmrf: 0.947413, loss_adv_gen: -143.247269
2023-02-12 16:23:55,511 - INFO - [Train] step: 17699, loss_adv_disc: -2.487102
2023-02-12 16:23:55,720 - INFO - [Train] step: 17699, loss_mpn: 0.013790, loss_rec: 0.025868, loss_semantic: 0.363133, loss_idmrf: 1.507466, loss_adv_gen: -111.246292
2023-02-12 16:24:25,024 - INFO - [Train] step: 17799, loss_adv_disc: -0.998510
2023-02-12 16:24:25,233 - INFO - [Train] step: 17799, loss_mpn: 0.019137, loss_rec: 0.031876, loss_semantic: 0.361409, loss_idmrf: 1.181487, loss_adv_gen: -124.277679
2023-02-12 16:24:54,528 - INFO - [Train] step: 17899, loss_adv_disc: -1.453950
2023-02-12 16:24:54,738 - INFO - [Train] step: 17899, loss_mpn: 0.014552, loss_rec: 0.023205, loss_semantic: 0.341911, loss_idmrf: 0.837011, loss_adv_gen: -134.504120
2023-02-12 16:25:24,050 - INFO - [Train] step: 17999, loss_adv_disc: -0.939511
2023-02-12 16:25:24,259 - INFO - [Train] step: 17999, loss_mpn: 0.012760, loss_rec: 0.025366, loss_semantic: 0.373259, loss_idmrf: 0.979126, loss_adv_gen: -151.648438
2023-02-12 16:26:01,440 - INFO - [Eval] step: 17999, bce: 0.251774, psnr: 28.777643, ssim: 0.955392
2023-02-12 16:26:32,027 - INFO - [Train] step: 18099, loss_adv_disc: -2.070312
2023-02-12 16:26:32,236 - INFO - [Train] step: 18099, loss_mpn: 0.004872, loss_rec: 0.020001, loss_semantic: 0.310476, loss_idmrf: 1.048967, loss_adv_gen: -150.932953
2023-02-12 16:27:01,538 - INFO - [Train] step: 18199, loss_adv_disc: -1.143127
2023-02-12 16:27:01,747 - INFO - [Train] step: 18199, loss_mpn: 0.025022, loss_rec: 0.028321, loss_semantic: 0.423840, loss_idmrf: 1.156174, loss_adv_gen: -102.148407
2023-02-12 16:27:31,063 - INFO - [Train] step: 18299, loss_adv_disc: -4.435246
2023-02-12 16:27:31,273 - INFO - [Train] step: 18299, loss_mpn: 0.018011, loss_rec: 0.030718, loss_semantic: 0.427602, loss_idmrf: 0.809859, loss_adv_gen: -186.041138
2023-02-12 16:28:00,568 - INFO - [Train] step: 18399, loss_adv_disc: -3.066000
2023-02-12 16:28:00,778 - INFO - [Train] step: 18399, loss_mpn: 0.011016, loss_rec: 0.030814, loss_semantic: 0.371703, loss_idmrf: 0.757866, loss_adv_gen: -105.622498
2023-02-12 16:28:30,073 - INFO - [Train] step: 18499, loss_adv_disc: -1.114918
2023-02-12 16:28:30,282 - INFO - [Train] step: 18499, loss_mpn: 0.010394, loss_rec: 0.026084, loss_semantic: 0.364294, loss_idmrf: 0.909241, loss_adv_gen: -115.210663
2023-02-12 16:28:59,592 - INFO - [Train] step: 18599, loss_adv_disc: -1.054129
2023-02-12 16:28:59,802 - INFO - [Train] step: 18599, loss_mpn: 0.009915, loss_rec: 0.017679, loss_semantic: 0.280701, loss_idmrf: 0.945293, loss_adv_gen: -198.327484
2023-02-12 16:29:29,101 - INFO - [Train] step: 18699, loss_adv_disc: -1.622249
2023-02-12 16:29:29,311 - INFO - [Train] step: 18699, loss_mpn: 0.014588, loss_rec: 0.022627, loss_semantic: 0.315072, loss_idmrf: 0.948333, loss_adv_gen: -130.181458
2023-02-12 16:29:58,623 - INFO - [Train] step: 18799, loss_adv_disc: -1.277717
2023-02-12 16:29:58,832 - INFO - [Train] step: 18799, loss_mpn: 0.008285, loss_rec: 0.023155, loss_semantic: 0.342413, loss_idmrf: 0.691079, loss_adv_gen: -154.404709
2023-02-12 16:30:28,135 - INFO - [Train] step: 18899, loss_adv_disc: -0.820742
2023-02-12 16:30:28,345 - INFO - [Train] step: 18899, loss_mpn: 0.012449, loss_rec: 0.027330, loss_semantic: 0.399460, loss_idmrf: 0.998687, loss_adv_gen: -198.893234
2023-02-12 16:30:57,650 - INFO - [Train] step: 18999, loss_adv_disc: -1.660756
2023-02-12 16:30:57,859 - INFO - [Train] step: 18999, loss_mpn: 0.023072, loss_rec: 0.028610, loss_semantic: 0.388851, loss_idmrf: 2.183407, loss_adv_gen: -116.730270
2023-02-12 16:31:35,047 - INFO - [Eval] step: 18999, bce: 0.249404, psnr: 28.782251, ssim: 0.955003
2023-02-12 16:32:05,584 - INFO - [Train] step: 19099, loss_adv_disc: -0.901823
2023-02-12 16:32:05,793 - INFO - [Train] step: 19099, loss_mpn: 0.013171, loss_rec: 0.022117, loss_semantic: 0.309392, loss_idmrf: 1.791707, loss_adv_gen: -120.883339
2023-02-12 16:32:35,096 - INFO - [Train] step: 19199, loss_adv_disc: -2.141203
2023-02-12 16:32:35,305 - INFO - [Train] step: 19199, loss_mpn: 0.015092, loss_rec: 0.021426, loss_semantic: 0.324317, loss_idmrf: 0.992624, loss_adv_gen: -87.088943
2023-02-12 16:33:04,621 - INFO - [Train] step: 19299, loss_adv_disc: -0.994007
2023-02-12 16:33:04,831 - INFO - [Train] step: 19299, loss_mpn: 0.010342, loss_rec: 0.022274, loss_semantic: 0.321889, loss_idmrf: 1.285398, loss_adv_gen: -134.916077
2023-02-12 16:33:34,141 - INFO - [Train] step: 19399, loss_adv_disc: -1.172217
2023-02-12 16:33:34,350 - INFO - [Train] step: 19399, loss_mpn: 0.031772, loss_rec: 0.026156, loss_semantic: 0.393020, loss_idmrf: 1.249870, loss_adv_gen: -119.829636
2023-02-12 16:34:03,654 - INFO - [Train] step: 19499, loss_adv_disc: -1.592698
2023-02-12 16:34:03,864 - INFO - [Train] step: 19499, loss_mpn: 0.008814, loss_rec: 0.021568, loss_semantic: 0.314776, loss_idmrf: 1.079767, loss_adv_gen: -189.912231
2023-02-12 16:34:33,164 - INFO - [Train] step: 19599, loss_adv_disc: -2.335361
2023-02-12 16:34:33,373 - INFO - [Train] step: 19599, loss_mpn: 0.015828, loss_rec: 0.028755, loss_semantic: 0.400870, loss_idmrf: 0.821946, loss_adv_gen: -116.378441
2023-02-12 16:35:02,679 - INFO - [Train] step: 19699, loss_adv_disc: -3.652874
2023-02-12 16:35:02,889 - INFO - [Train] step: 19699, loss_mpn: 0.013268, loss_rec: 0.028320, loss_semantic: 0.378278, loss_idmrf: 1.227645, loss_adv_gen: -189.740234
2023-02-12 16:35:32,194 - INFO - [Train] step: 19799, loss_adv_disc: -3.464795
2023-02-12 16:35:32,403 - INFO - [Train] step: 19799, loss_mpn: 0.019603, loss_rec: 0.027852, loss_semantic: 0.389158, loss_idmrf: 1.068079, loss_adv_gen: -110.916656
2023-02-12 16:36:01,720 - INFO - [Train] step: 19899, loss_adv_disc: -1.928282
2023-02-12 16:36:01,931 - INFO - [Train] step: 19899, loss_mpn: 0.012590, loss_rec: 0.024261, loss_semantic: 0.363997, loss_idmrf: 0.816192, loss_adv_gen: -141.078857
2023-02-12 16:36:31,241 - INFO - [Train] step: 19999, loss_adv_disc: -2.433385
2023-02-12 16:36:31,451 - INFO - [Train] step: 19999, loss_mpn: 0.020244, loss_rec: 0.027704, loss_semantic: 0.405578, loss_idmrf: 1.701902, loss_adv_gen: -145.730759
2023-02-12 16:37:08,618 - INFO - [Eval] step: 19999, bce: 0.209174, psnr: 28.818005, ssim: 0.956007
2023-02-12 16:37:39,295 - INFO - [Train] step: 20099, loss_adv_disc: -1.841959
2023-02-12 16:37:39,504 - INFO - [Train] step: 20099, loss_mpn: 0.013280, loss_rec: 0.023714, loss_semantic: 0.338448, loss_idmrf: 0.850038, loss_adv_gen: -134.805756
2023-02-12 16:38:08,805 - INFO - [Train] step: 20199, loss_adv_disc: 1.855919
2023-02-12 16:38:09,014 - INFO - [Train] step: 20199, loss_mpn: 0.026404, loss_rec: 0.031911, loss_semantic: 0.355084, loss_idmrf: 1.424328, loss_adv_gen: -146.292511
2023-02-12 16:38:38,323 - INFO - [Train] step: 20299, loss_adv_disc: -2.067120
2023-02-12 16:38:38,534 - INFO - [Train] step: 20299, loss_mpn: 0.019413, loss_rec: 0.023018, loss_semantic: 0.353827, loss_idmrf: 1.648720, loss_adv_gen: -185.932465
2023-02-12 16:39:07,843 - INFO - [Train] step: 20399, loss_adv_disc: -0.934464
2023-02-12 16:39:08,053 - INFO - [Train] step: 20399, loss_mpn: 0.006587, loss_rec: 0.017788, loss_semantic: 0.290594, loss_idmrf: 0.537718, loss_adv_gen: -150.597672
2023-02-12 16:39:37,357 - INFO - [Train] step: 20499, loss_adv_disc: -1.260324
2023-02-12 16:39:37,567 - INFO - [Train] step: 20499, loss_mpn: 0.016781, loss_rec: 0.030744, loss_semantic: 0.392980, loss_idmrf: 1.363693, loss_adv_gen: -115.129227
2023-02-12 16:40:06,877 - INFO - [Train] step: 20599, loss_adv_disc: -2.582953
2023-02-12 16:40:07,086 - INFO - [Train] step: 20599, loss_mpn: 0.024953, loss_rec: 0.027939, loss_semantic: 0.394067, loss_idmrf: 0.777221, loss_adv_gen: -77.646194
2023-02-12 16:40:36,383 - INFO - [Train] step: 20699, loss_adv_disc: -3.089488
2023-02-12 16:40:36,593 - INFO - [Train] step: 20699, loss_mpn: 0.029470, loss_rec: 0.025282, loss_semantic: 0.334554, loss_idmrf: 1.474221, loss_adv_gen: -127.277504
2023-02-12 16:41:05,889 - INFO - [Train] step: 20799, loss_adv_disc: -0.876586
2023-02-12 16:41:06,098 - INFO - [Train] step: 20799, loss_mpn: 0.009639, loss_rec: 0.017377, loss_semantic: 0.285358, loss_idmrf: 0.991840, loss_adv_gen: -159.339050
2023-02-12 16:41:35,407 - INFO - [Train] step: 20899, loss_adv_disc: -1.457209
2023-02-12 16:41:35,616 - INFO - [Train] step: 20899, loss_mpn: 0.008650, loss_rec: 0.017830, loss_semantic: 0.283788, loss_idmrf: 0.417758, loss_adv_gen: -157.442688
2023-02-12 16:42:04,910 - INFO - [Train] step: 20999, loss_adv_disc: -1.419061
2023-02-12 16:42:05,120 - INFO - [Train] step: 20999, loss_mpn: 0.012357, loss_rec: 0.018674, loss_semantic: 0.279355, loss_idmrf: 0.810399, loss_adv_gen: -120.525650
2023-02-12 16:42:42,298 - INFO - [Eval] step: 20999, bce: 0.252244, psnr: 28.920322, ssim: 0.956192
2023-02-12 16:43:12,884 - INFO - [Train] step: 21099, loss_adv_disc: -3.656307
2023-02-12 16:43:13,093 - INFO - [Train] step: 21099, loss_mpn: 0.036810, loss_rec: 0.028897, loss_semantic: 0.422848, loss_idmrf: 1.096376, loss_adv_gen: -134.792084
2023-02-12 16:43:42,399 - INFO - [Train] step: 21199, loss_adv_disc: -1.268478
2023-02-12 16:43:42,608 - INFO - [Train] step: 21199, loss_mpn: 0.011635, loss_rec: 0.021036, loss_semantic: 0.336162, loss_idmrf: 1.261765, loss_adv_gen: -150.898529
2023-02-12 16:44:11,927 - INFO - [Train] step: 21299, loss_adv_disc: -2.235332
2023-02-12 16:44:12,137 - INFO - [Train] step: 21299, loss_mpn: 0.016775, loss_rec: 0.025687, loss_semantic: 0.365133, loss_idmrf: 0.836298, loss_adv_gen: -120.975540
2023-02-12 16:44:41,436 - INFO - [Train] step: 21399, loss_adv_disc: -1.310328
2023-02-12 16:44:41,645 - INFO - [Train] step: 21399, loss_mpn: 0.015091, loss_rec: 0.018726, loss_semantic: 0.313983, loss_idmrf: 1.089233, loss_adv_gen: -122.337807
2023-02-12 16:45:10,949 - INFO - [Train] step: 21499, loss_adv_disc: -1.005040
2023-02-12 16:45:11,160 - INFO - [Train] step: 21499, loss_mpn: 0.006984, loss_rec: 0.018068, loss_semantic: 0.275090, loss_idmrf: 1.075845, loss_adv_gen: -125.419098
2023-02-12 16:45:40,463 - INFO - [Train] step: 21599, loss_adv_disc: -3.571425
2023-02-12 16:45:40,674 - INFO - [Train] step: 21599, loss_mpn: 0.037772, loss_rec: 0.044806, loss_semantic: 0.504730, loss_idmrf: 2.309431, loss_adv_gen: -139.435516
2023-02-12 16:46:09,968 - INFO - [Train] step: 21699, loss_adv_disc: -2.916445
2023-02-12 16:46:10,177 - INFO - [Train] step: 21699, loss_mpn: 0.013587, loss_rec: 0.024536, loss_semantic: 0.374580, loss_idmrf: 0.640778, loss_adv_gen: -175.977646
2023-02-12 16:46:39,484 - INFO - [Train] step: 21799, loss_adv_disc: -2.589717
2023-02-12 16:46:39,694 - INFO - [Train] step: 21799, loss_mpn: 0.020012, loss_rec: 0.031750, loss_semantic: 0.408765, loss_idmrf: 1.518515, loss_adv_gen: -157.491180
2023-02-12 16:47:09,001 - INFO - [Train] step: 21899, loss_adv_disc: -1.635605
2023-02-12 16:47:09,210 - INFO - [Train] step: 21899, loss_mpn: 0.015952, loss_rec: 0.027732, loss_semantic: 0.386680, loss_idmrf: 1.042697, loss_adv_gen: -186.985092
2023-02-12 16:47:38,497 - INFO - [Train] step: 21999, loss_adv_disc: -8.392606
2023-02-12 16:47:38,706 - INFO - [Train] step: 21999, loss_mpn: 0.107939, loss_rec: 0.044124, loss_semantic: 0.426951, loss_idmrf: 1.802685, loss_adv_gen: -192.524048
2023-02-12 16:48:15,882 - INFO - [Eval] step: 21999, bce: 0.284210, psnr: 28.904793, ssim: 0.955964
2023-02-12 16:48:46,222 - INFO - [Train] step: 22099, loss_adv_disc: -1.746154
2023-02-12 16:48:46,432 - INFO - [Train] step: 22099, loss_mpn: 0.019285, loss_rec: 0.036723, loss_semantic: 0.427170, loss_idmrf: 1.650035, loss_adv_gen: -164.901230
2023-02-12 16:49:15,747 - INFO - [Train] step: 22199, loss_adv_disc: -1.306040
2023-02-12 16:49:15,956 - INFO - [Train] step: 22199, loss_mpn: 0.025310, loss_rec: 0.022228, loss_semantic: 0.356517, loss_idmrf: 1.232680, loss_adv_gen: -159.752747
2023-02-12 16:49:45,259 - INFO - [Train] step: 22299, loss_adv_disc: -2.872841
2023-02-12 16:49:45,467 - INFO - [Train] step: 22299, loss_mpn: 0.010491, loss_rec: 0.028485, loss_semantic: 0.366613, loss_idmrf: 1.538266, loss_adv_gen: -118.306511
2023-02-12 16:50:14,769 - INFO - [Train] step: 22399, loss_adv_disc: -1.093340
2023-02-12 16:50:14,979 - INFO - [Train] step: 22399, loss_mpn: 0.014559, loss_rec: 0.027062, loss_semantic: 0.412688, loss_idmrf: 1.395007, loss_adv_gen: -133.619751
2023-02-12 16:50:44,277 - INFO - [Train] step: 22499, loss_adv_disc: -2.594123
2023-02-12 16:50:44,486 - INFO - [Train] step: 22499, loss_mpn: 0.013136, loss_rec: 0.034545, loss_semantic: 0.410396, loss_idmrf: 1.819280, loss_adv_gen: -176.939453
2023-02-12 16:51:13,787 - INFO - [Train] step: 22599, loss_adv_disc: -0.594784
2023-02-12 16:51:13,996 - INFO - [Train] step: 22599, loss_mpn: 0.014466, loss_rec: 0.016839, loss_semantic: 0.266022, loss_idmrf: 1.078344, loss_adv_gen: -150.209732
2023-02-12 16:51:43,297 - INFO - [Train] step: 22699, loss_adv_disc: -1.318361
2023-02-12 16:51:43,506 - INFO - [Train] step: 22699, loss_mpn: 0.010150, loss_rec: 0.020194, loss_semantic: 0.302510, loss_idmrf: 1.550588, loss_adv_gen: -143.478241
2023-02-12 16:52:12,797 - INFO - [Train] step: 22799, loss_adv_disc: -0.774170
2023-02-12 16:52:13,007 - INFO - [Train] step: 22799, loss_mpn: 0.007393, loss_rec: 0.013815, loss_semantic: 0.271783, loss_idmrf: 0.527916, loss_adv_gen: -135.298737
2023-02-12 16:52:42,315 - INFO - [Train] step: 22899, loss_adv_disc: -1.783115
2023-02-12 16:52:42,526 - INFO - [Train] step: 22899, loss_mpn: 0.016110, loss_rec: 0.019332, loss_semantic: 0.317586, loss_idmrf: 0.727104, loss_adv_gen: -146.352356
2023-02-12 16:53:11,839 - INFO - [Train] step: 22999, loss_adv_disc: -2.031198
2023-02-12 16:53:12,049 - INFO - [Train] step: 22999, loss_mpn: 0.006879, loss_rec: 0.022278, loss_semantic: 0.279931, loss_idmrf: 0.680530, loss_adv_gen: -140.565704
2023-02-12 16:53:49,269 - INFO - [Eval] step: 22999, bce: 0.233620, psnr: 28.940762, ssim: 0.956207
2023-02-12 16:54:19,812 - INFO - [Train] step: 23099, loss_adv_disc: -1.811294
2023-02-12 16:54:20,023 - INFO - [Train] step: 23099, loss_mpn: 0.022722, loss_rec: 0.032051, loss_semantic: 0.389429, loss_idmrf: 1.227663, loss_adv_gen: -113.256638
2023-02-12 16:54:49,328 - INFO - [Train] step: 23199, loss_adv_disc: -1.308440
2023-02-12 16:54:49,537 - INFO - [Train] step: 23199, loss_mpn: 0.011498, loss_rec: 0.019091, loss_semantic: 0.329096, loss_idmrf: 0.807683, loss_adv_gen: -167.257248
2023-02-12 16:55:18,840 - INFO - [Train] step: 23299, loss_adv_disc: -3.865481
2023-02-12 16:55:19,049 - INFO - [Train] step: 23299, loss_mpn: 0.021598, loss_rec: 0.025840, loss_semantic: 
0.344968, loss_idmrf: 1.217728, loss_adv_gen: -150.691925 2023-02-12 16:55:48,345 - INFO - [Train] step: 23399, loss_adv_disc: -2.390714 2023-02-12 16:55:48,554 - INFO - [Train] step: 23399, loss_mpn: 0.018954, loss_rec: 0.026836, loss_semantic: 0.372773, loss_idmrf: 1.250325, loss_adv_gen: -141.416229 2023-02-12 16:56:17,853 - INFO - [Train] step: 23499, loss_adv_disc: -3.273348 2023-02-12 16:56:18,063 - INFO - [Train] step: 23499, loss_mpn: 0.018282, loss_rec: 0.028839, loss_semantic: 0.362737, loss_idmrf: 1.505737, loss_adv_gen: -162.818954 2023-02-12 16:56:47,366 - INFO - [Train] step: 23599, loss_adv_disc: -2.462272 2023-02-12 16:56:47,575 - INFO - [Train] step: 23599, loss_mpn: 0.024494, loss_rec: 0.026901, loss_semantic: 0.354137, loss_idmrf: 1.104522, loss_adv_gen: -141.302185 2023-02-12 16:57:16,873 - INFO - [Train] step: 23699, loss_adv_disc: -0.763700 2023-02-12 16:57:17,082 - INFO - [Train] step: 23699, loss_mpn: 0.015099, loss_rec: 0.020825, loss_semantic: 0.319443, loss_idmrf: 0.718052, loss_adv_gen: -161.518600 2023-02-12 16:57:46,376 - INFO - [Train] step: 23799, loss_adv_disc: -2.089024 2023-02-12 16:57:46,584 - INFO - [Train] step: 23799, loss_mpn: 0.017112, loss_rec: 0.028866, loss_semantic: 0.389991, loss_idmrf: 1.943102, loss_adv_gen: -167.856949 2023-02-12 16:58:15,885 - INFO - [Train] step: 23899, loss_adv_disc: -2.228132 2023-02-12 16:58:16,093 - INFO - [Train] step: 23899, loss_mpn: 0.013112, loss_rec: 0.023315, loss_semantic: 0.341805, loss_idmrf: 0.952730, loss_adv_gen: -161.645691 2023-02-12 16:58:45,399 - INFO - [Train] step: 23999, loss_adv_disc: -2.955951 2023-02-12 16:58:45,608 - INFO - [Train] step: 23999, loss_mpn: 0.023215, loss_rec: 0.024262, loss_semantic: 0.338789, loss_idmrf: 0.893917, loss_adv_gen: -151.165527 2023-02-12 16:59:22,806 - INFO - [Eval] step: 23999, bce: 0.227693, psnr: 28.873348, ssim: 0.956545 2023-02-12 16:59:53,146 - INFO - [Train] step: 24099, loss_adv_disc: 0.729901 2023-02-12 16:59:53,355 - INFO - [Train] 
step: 24099, loss_mpn: 0.011586, loss_rec: 0.028419, loss_semantic: 0.348516, loss_idmrf: 0.913226, loss_adv_gen: -73.398590 2023-02-12 17:00:22,863 - INFO - [Train] step: 24199, loss_adv_disc: -1.865173 2023-02-12 17:00:23,072 - INFO - [Train] step: 24199, loss_mpn: 0.005211, loss_rec: 0.016962, loss_semantic: 0.275092, loss_idmrf: 0.944809, loss_adv_gen: -170.304947 2023-02-12 17:00:52,390 - INFO - [Train] step: 24299, loss_adv_disc: -2.595350 2023-02-12 17:00:52,599 - INFO - [Train] step: 24299, loss_mpn: 0.017710, loss_rec: 0.027222, loss_semantic: 0.361134, loss_idmrf: 1.021314, loss_adv_gen: -127.347702 2023-02-12 17:01:21,901 - INFO - [Train] step: 24399, loss_adv_disc: -1.910214 2023-02-12 17:01:22,110 - INFO - [Train] step: 24399, loss_mpn: 0.005701, loss_rec: 0.019258, loss_semantic: 0.314592, loss_idmrf: 0.992199, loss_adv_gen: -133.050537 2023-02-12 17:01:51,440 - INFO - [Train] step: 24499, loss_adv_disc: -1.782067 2023-02-12 17:01:51,649 - INFO - [Train] step: 24499, loss_mpn: 0.014929, loss_rec: 0.026913, loss_semantic: 0.362223, loss_idmrf: 2.032750, loss_adv_gen: -174.313446 2023-02-12 17:02:20,949 - INFO - [Train] step: 24599, loss_adv_disc: -2.633541 2023-02-12 17:02:21,158 - INFO - [Train] step: 24599, loss_mpn: 0.016451, loss_rec: 0.029286, loss_semantic: 0.428398, loss_idmrf: 1.132536, loss_adv_gen: -180.112793 2023-02-12 17:02:50,472 - INFO - [Train] step: 24699, loss_adv_disc: -1.979819 2023-02-12 17:02:50,681 - INFO - [Train] step: 24699, loss_mpn: 0.015001, loss_rec: 0.025143, loss_semantic: 0.349735, loss_idmrf: 1.616082, loss_adv_gen: -172.063812 2023-02-12 17:03:20,001 - INFO - [Train] step: 24799, loss_adv_disc: -2.323519 2023-02-12 17:03:20,210 - INFO - [Train] step: 24799, loss_mpn: 0.014403, loss_rec: 0.024294, loss_semantic: 0.321397, loss_idmrf: 1.131323, loss_adv_gen: -120.714363 2023-02-12 17:03:49,517 - INFO - [Train] step: 24899, loss_adv_disc: -1.233411 2023-02-12 17:03:49,727 - INFO - [Train] step: 24899, loss_mpn: 0.024732, 
loss_rec: 0.041411, loss_semantic: 0.437810, loss_idmrf: 2.088786, loss_adv_gen: -137.012451 2023-02-12 17:04:19,026 - INFO - [Train] step: 24999, loss_adv_disc: -2.287255 2023-02-12 17:04:19,235 - INFO - [Train] step: 24999, loss_mpn: 0.012211, loss_rec: 0.016496, loss_semantic: 0.273215, loss_idmrf: 0.798228, loss_adv_gen: -187.349533 2023-02-12 17:04:56,407 - INFO - [Eval] step: 24999, bce: 0.217434, psnr: 28.747816, ssim: 0.956415 2023-02-12 17:05:26,876 - INFO - [Train] step: 25099, loss_adv_disc: -1.498417 2023-02-12 17:05:27,084 - INFO - [Train] step: 25099, loss_mpn: 0.008912, loss_rec: 0.024698, loss_semantic: 0.377229, loss_idmrf: 1.032481, loss_adv_gen: -147.471710 2023-02-12 17:05:56,391 - INFO - [Train] step: 25199, loss_adv_disc: -2.891723 2023-02-12 17:05:56,601 - INFO - [Train] step: 25199, loss_mpn: 0.016737, loss_rec: 0.028748, loss_semantic: 0.384448, loss_idmrf: 1.425442, loss_adv_gen: -110.839966 2023-02-12 17:06:25,909 - INFO - [Train] step: 25299, loss_adv_disc: -0.386247 2023-02-12 17:06:26,119 - INFO - [Train] step: 25299, loss_mpn: 0.008405, loss_rec: 0.023324, loss_semantic: 0.325677, loss_idmrf: 1.598768, loss_adv_gen: -164.717896 2023-02-12 17:06:55,427 - INFO - [Train] step: 25399, loss_adv_disc: -2.591228 2023-02-12 17:06:55,636 - INFO - [Train] step: 25399, loss_mpn: 0.020380, loss_rec: 0.028707, loss_semantic: 0.367617, loss_idmrf: 1.145007, loss_adv_gen: -150.130249 2023-02-12 17:07:24,943 - INFO - [Train] step: 25499, loss_adv_disc: -2.601324 2023-02-12 17:07:25,153 - INFO - [Train] step: 25499, loss_mpn: 0.016160, loss_rec: 0.028725, loss_semantic: 0.408707, loss_idmrf: 1.307903, loss_adv_gen: -150.284836 2023-02-12 17:07:54,461 - INFO - [Train] step: 25599, loss_adv_disc: -3.960473 2023-02-12 17:07:54,670 - INFO - [Train] step: 25599, loss_mpn: 0.042515, loss_rec: 0.038495, loss_semantic: 0.474609, loss_idmrf: 2.141218, loss_adv_gen: -146.856583 2023-02-12 17:08:23,980 - INFO - [Train] step: 25699, loss_adv_disc: -1.731003 
2023-02-12 17:08:24,190 - INFO - [Train] step: 25699, loss_mpn: 0.015890, loss_rec: 0.027864, loss_semantic: 0.372621, loss_idmrf: 1.385496, loss_adv_gen: -167.668106 2023-02-12 17:08:53,496 - INFO - [Train] step: 25799, loss_adv_disc: -3.080600 2023-02-12 17:08:53,707 - INFO - [Train] step: 25799, loss_mpn: 0.011337, loss_rec: 0.024306, loss_semantic: 0.335416, loss_idmrf: 0.799093, loss_adv_gen: -145.205933 2023-02-12 17:09:23,015 - INFO - [Train] step: 25899, loss_adv_disc: -3.237796 2023-02-12 17:09:23,224 - INFO - [Train] step: 25899, loss_mpn: 0.022188, loss_rec: 0.026139, loss_semantic: 0.354102, loss_idmrf: 1.123192, loss_adv_gen: -202.245911 2023-02-12 17:09:52,552 - INFO - [Train] step: 25999, loss_adv_disc: -1.719940 2023-02-12 17:09:52,760 - INFO - [Train] step: 25999, loss_mpn: 0.013568, loss_rec: 0.021621, loss_semantic: 0.338723, loss_idmrf: 0.946962, loss_adv_gen: -166.338562 2023-02-12 17:10:29,917 - INFO - [Eval] step: 25999, bce: 0.217176, psnr: 28.862385, ssim: 0.956445 2023-02-12 17:11:00,280 - INFO - [Train] step: 26099, loss_adv_disc: -2.482706 2023-02-12 17:11:00,489 - INFO - [Train] step: 26099, loss_mpn: 0.011296, loss_rec: 0.023331, loss_semantic: 0.338147, loss_idmrf: 1.110683, loss_adv_gen: -163.306931 2023-02-12 17:11:29,795 - INFO - [Train] step: 26199, loss_adv_disc: -0.978846 2023-02-12 17:11:30,004 - INFO - [Train] step: 26199, loss_mpn: 0.008555, loss_rec: 0.015771, loss_semantic: 0.300213, loss_idmrf: 0.859133, loss_adv_gen: -133.072189 2023-02-12 17:11:59,315 - INFO - [Train] step: 26299, loss_adv_disc: -2.406770 2023-02-12 17:11:59,525 - INFO - [Train] step: 26299, loss_mpn: 0.039417, loss_rec: 0.023581, loss_semantic: 0.308855, loss_idmrf: 0.960897, loss_adv_gen: -176.477188 2023-02-12 17:12:28,826 - INFO - [Train] step: 26399, loss_adv_disc: -2.433149 2023-02-12 17:12:29,036 - INFO - [Train] step: 26399, loss_mpn: 0.007664, loss_rec: 0.025552, loss_semantic: 0.329118, loss_idmrf: 1.333155, loss_adv_gen: -160.530304 2023-02-12 
17:12:58,351 - INFO - [Train] step: 26499, loss_adv_disc: -1.197546 2023-02-12 17:12:58,559 - INFO - [Train] step: 26499, loss_mpn: 0.020631, loss_rec: 0.025007, loss_semantic: 0.386383, loss_idmrf: 1.628488, loss_adv_gen: -188.927597 2023-02-12 17:13:27,861 - INFO - [Train] step: 26599, loss_adv_disc: -5.886103 2023-02-12 17:13:28,072 - INFO - [Train] step: 26599, loss_mpn: 0.026936, loss_rec: 0.045931, loss_semantic: 0.476970, loss_idmrf: 1.890998, loss_adv_gen: -153.358917 2023-02-12 17:13:57,376 - INFO - [Train] step: 26699, loss_adv_disc: -1.288116 2023-02-12 17:13:57,585 - INFO - [Train] step: 26699, loss_mpn: 0.020642, loss_rec: 0.025228, loss_semantic: 0.360734, loss_idmrf: 1.322896, loss_adv_gen: -154.646317 2023-02-12 17:14:26,886 - INFO - [Train] step: 26799, loss_adv_disc: -3.372795 2023-02-12 17:14:27,095 - INFO - [Train] step: 26799, loss_mpn: 0.034300, loss_rec: 0.028037, loss_semantic: 0.404747, loss_idmrf: 1.278778, loss_adv_gen: -156.918350 2023-02-12 17:14:56,417 - INFO - [Train] step: 26899, loss_adv_disc: -1.826304 2023-02-12 17:14:56,626 - INFO - [Train] step: 26899, loss_mpn: 0.021970, loss_rec: 0.030406, loss_semantic: 0.371994, loss_idmrf: 1.689493, loss_adv_gen: -144.927139 2023-02-12 17:15:25,923 - INFO - [Train] step: 26999, loss_adv_disc: -2.447597 2023-02-12 17:15:26,132 - INFO - [Train] step: 26999, loss_mpn: 0.018844, loss_rec: 0.027386, loss_semantic: 0.355592, loss_idmrf: 0.811143, loss_adv_gen: -177.419571 2023-02-12 17:16:03,343 - INFO - [Eval] step: 26999, bce: 0.235807, psnr: 28.637211, ssim: 0.956397 2023-02-12 17:16:33,707 - INFO - [Train] step: 27099, loss_adv_disc: -1.882026 2023-02-12 17:16:33,917 - INFO - [Train] step: 27099, loss_mpn: 0.023662, loss_rec: 0.042340, loss_semantic: 0.459490, loss_idmrf: 1.902304, loss_adv_gen: -148.617722 2023-02-12 17:17:03,231 - INFO - [Train] step: 27199, loss_adv_disc: 0.313122 2023-02-12 17:17:03,441 - INFO - [Train] step: 27199, loss_mpn: 0.018258, loss_rec: 0.022931, loss_semantic: 
0.334520, loss_idmrf: 2.197278, loss_adv_gen: -167.865952 2023-02-12 17:17:32,753 - INFO - [Train] step: 27299, loss_adv_disc: -3.068515 2023-02-12 17:17:32,964 - INFO - [Train] step: 27299, loss_mpn: 0.011543, loss_rec: 0.027794, loss_semantic: 0.316708, loss_idmrf: 2.313043, loss_adv_gen: -170.539856 2023-02-12 17:18:02,275 - INFO - [Train] step: 27399, loss_adv_disc: -1.599291 2023-02-12 17:18:02,484 - INFO - [Train] step: 27399, loss_mpn: 0.016099, loss_rec: 0.025766, loss_semantic: 0.386845, loss_idmrf: 1.257993, loss_adv_gen: -175.805573 2023-02-12 17:18:31,789 - INFO - [Train] step: 27499, loss_adv_disc: -2.258552 2023-02-12 17:18:31,998 - INFO - [Train] step: 27499, loss_mpn: 0.007269, loss_rec: 0.021754, loss_semantic: 0.314972, loss_idmrf: 0.773397, loss_adv_gen: -148.134369 2023-02-12 17:19:01,311 - INFO - [Train] step: 27599, loss_adv_disc: -3.566375 2023-02-12 17:19:01,520 - INFO - [Train] step: 27599, loss_mpn: 0.022415, loss_rec: 0.032542, loss_semantic: 0.447897, loss_idmrf: 1.548123, loss_adv_gen: -141.587982 2023-02-12 17:19:30,832 - INFO - [Train] step: 27699, loss_adv_disc: -2.320062 2023-02-12 17:19:31,041 - INFO - [Train] step: 27699, loss_mpn: 0.013773, loss_rec: 0.026881, loss_semantic: 0.395102, loss_idmrf: 1.057364, loss_adv_gen: -169.374649 2023-02-12 17:20:00,348 - INFO - [Train] step: 27799, loss_adv_disc: -1.803896 2023-02-12 17:20:00,557 - INFO - [Train] step: 27799, loss_mpn: 0.008567, loss_rec: 0.023302, loss_semantic: 0.317890, loss_idmrf: 1.018495, loss_adv_gen: -194.033859 2023-02-12 17:20:29,867 - INFO - [Train] step: 27899, loss_adv_disc: -0.175231 2023-02-12 17:20:30,076 - INFO - [Train] step: 27899, loss_mpn: 0.018249, loss_rec: 0.031407, loss_semantic: 0.447779, loss_idmrf: 1.503504, loss_adv_gen: -144.521729 2023-02-12 17:20:59,379 - INFO - [Train] step: 27999, loss_adv_disc: -1.153788 2023-02-12 17:20:59,588 - INFO - [Train] step: 27999, loss_mpn: 0.018410, loss_rec: 0.029667, loss_semantic: 0.356129, loss_idmrf: 1.001561, 
loss_adv_gen: -149.088348 2023-02-12 17:21:36,769 - INFO - [Eval] step: 27999, bce: 0.245350, psnr: 28.910185, ssim: 0.956662 2023-02-12 17:22:07,114 - INFO - [Train] step: 28099, loss_adv_disc: -2.552972 2023-02-12 17:22:07,323 - INFO - [Train] step: 28099, loss_mpn: 0.032058, loss_rec: 0.027175, loss_semantic: 0.427047, loss_idmrf: 1.808927, loss_adv_gen: -197.024048 2023-02-12 17:22:36,642 - INFO - [Train] step: 28199, loss_adv_disc: -2.774353 2023-02-12 17:22:36,851 - INFO - [Train] step: 28199, loss_mpn: 0.015617, loss_rec: 0.023739, loss_semantic: 0.370789, loss_idmrf: 1.317662, loss_adv_gen: -158.977325 2023-02-12 17:23:06,150 - INFO - [Train] step: 28299, loss_adv_disc: -3.636169 2023-02-12 17:23:06,359 - INFO - [Train] step: 28299, loss_mpn: 0.078334, loss_rec: 0.024055, loss_semantic: 0.319711, loss_idmrf: 2.335714, loss_adv_gen: -113.333748 2023-02-12 17:23:35,670 - INFO - [Train] step: 28399, loss_adv_disc: -1.829424 2023-02-12 17:23:35,879 - INFO - [Train] step: 28399, loss_mpn: 0.012427, loss_rec: 0.022979, loss_semantic: 0.341002, loss_idmrf: 2.427568, loss_adv_gen: -196.448471 2023-02-12 17:24:05,197 - INFO - [Train] step: 28499, loss_adv_disc: -1.946328 2023-02-12 17:24:05,406 - INFO - [Train] step: 28499, loss_mpn: 0.003964, loss_rec: 0.021108, loss_semantic: 0.305945, loss_idmrf: 0.565797, loss_adv_gen: -209.550598 2023-02-12 17:24:34,717 - INFO - [Train] step: 28599, loss_adv_disc: -2.240687 2023-02-12 17:24:34,928 - INFO - [Train] step: 28599, loss_mpn: 0.010957, loss_rec: 0.023754, loss_semantic: 0.336603, loss_idmrf: 0.863460, loss_adv_gen: -123.758453 2023-02-12 17:25:04,235 - INFO - [Train] step: 28699, loss_adv_disc: -3.230961 2023-02-12 17:25:04,445 - INFO - [Train] step: 28699, loss_mpn: 0.021533, loss_rec: 0.029686, loss_semantic: 0.391708, loss_idmrf: 1.315190, loss_adv_gen: -146.300156 2023-02-12 17:25:33,768 - INFO - [Train] step: 28799, loss_adv_disc: -2.195852 2023-02-12 17:25:33,977 - INFO - [Train] step: 28799, loss_mpn: 
0.007490, loss_rec: 0.018340, loss_semantic: 0.257990, loss_idmrf: 0.833679, loss_adv_gen: -166.157791 2023-02-12 17:26:03,279 - INFO - [Train] step: 28899, loss_adv_disc: -0.680066 2023-02-12 17:26:03,488 - INFO - [Train] step: 28899, loss_mpn: 0.012889, loss_rec: 0.021147, loss_semantic: 0.326759, loss_idmrf: 1.329344, loss_adv_gen: -174.175812 2023-02-12 17:26:32,794 - INFO - [Train] step: 28999, loss_adv_disc: -1.741491 2023-02-12 17:26:33,003 - INFO - [Train] step: 28999, loss_mpn: 0.011110, loss_rec: 0.023960, loss_semantic: 0.379533, loss_idmrf: 0.734622, loss_adv_gen: -172.853516 2023-02-12 17:27:10,183 - INFO - [Eval] step: 28999, bce: 0.326479, psnr: 29.054596, ssim: 0.956231 2023-02-12 17:27:40,815 - INFO - [Train] step: 29099, loss_adv_disc: -2.452001 2023-02-12 17:27:41,024 - INFO - [Train] step: 29099, loss_mpn: 0.008640, loss_rec: 0.021803, loss_semantic: 0.286660, loss_idmrf: 0.494146, loss_adv_gen: -167.526169 2023-02-12 17:28:10,334 - INFO - [Train] step: 29199, loss_adv_disc: -2.613540 2023-02-12 17:28:10,544 - INFO - [Train] step: 29199, loss_mpn: 0.010937, loss_rec: 0.023518, loss_semantic: 0.338009, loss_idmrf: 1.724309, loss_adv_gen: -155.009048 2023-02-12 17:28:39,865 - INFO - [Train] step: 29299, loss_adv_disc: -2.793272 2023-02-12 17:28:40,075 - INFO - [Train] step: 29299, loss_mpn: 0.017880, loss_rec: 0.023838, loss_semantic: 0.310715, loss_idmrf: 0.966292, loss_adv_gen: -93.657944 2023-02-12 17:29:09,386 - INFO - [Train] step: 29399, loss_adv_disc: -3.374887 2023-02-12 17:29:09,595 - INFO - [Train] step: 29399, loss_mpn: 0.005188, loss_rec: 0.030336, loss_semantic: 0.389112, loss_idmrf: 1.143252, loss_adv_gen: -150.846100 2023-02-12 17:29:38,903 - INFO - [Train] step: 29499, loss_adv_disc: -1.504680 2023-02-12 17:29:39,112 - INFO - [Train] step: 29499, loss_mpn: 0.008265, loss_rec: 0.021538, loss_semantic: 0.322539, loss_idmrf: 0.847658, loss_adv_gen: -152.164841 2023-02-12 17:30:08,427 - INFO - [Train] step: 29599, loss_adv_disc: 
-1.419346 2023-02-12 17:30:08,636 - INFO - [Train] step: 29599, loss_mpn: 0.005204, loss_rec: 0.015426, loss_semantic: 0.252596, loss_idmrf: 0.561830, loss_adv_gen: -174.995392 2023-02-12 17:30:37,937 - INFO - [Train] step: 29699, loss_adv_disc: -1.781726 2023-02-12 17:30:38,147 - INFO - [Train] step: 29699, loss_mpn: 0.007239, loss_rec: 0.019059, loss_semantic: 0.293832, loss_idmrf: 1.135729, loss_adv_gen: -149.783600 2023-02-12 17:31:07,453 - INFO - [Train] step: 29799, loss_adv_disc: -1.680035 2023-02-12 17:31:07,661 - INFO - [Train] step: 29799, loss_mpn: 0.010340, loss_rec: 0.023492, loss_semantic: 0.340636, loss_idmrf: 0.845601, loss_adv_gen: -165.822464 2023-02-12 17:31:36,974 - INFO - [Train] step: 29899, loss_adv_disc: -0.897828 2023-02-12 17:31:37,183 - INFO - [Train] step: 29899, loss_mpn: 0.012099, loss_rec: 0.019235, loss_semantic: 0.316152, loss_idmrf: 0.806648, loss_adv_gen: -159.392731 2023-02-12 17:32:06,484 - INFO - [Train] step: 29999, loss_adv_disc: -2.476839 2023-02-12 17:32:06,693 - INFO - [Train] step: 29999, loss_mpn: 0.007322, loss_rec: 0.021646, loss_semantic: 0.299142, loss_idmrf: 0.653534, loss_adv_gen: -143.732330 2023-02-12 17:32:43,878 - INFO - [Eval] step: 29999, bce: 0.293404, psnr: 29.084879, ssim: 0.956233 2023-02-12 17:33:14,553 - INFO - [Train] step: 30099, loss_adv_disc: -2.295209 2023-02-12 17:33:14,762 - INFO - [Train] step: 30099, loss_mpn: 0.012230, loss_rec: 0.023526, loss_semantic: 0.338702, loss_idmrf: 0.758642, loss_adv_gen: -181.508331 2023-02-12 17:33:44,073 - INFO - [Train] step: 30199, loss_adv_disc: -0.826948 2023-02-12 17:33:44,284 - INFO - [Train] step: 30199, loss_mpn: 0.010710, loss_rec: 0.021615, loss_semantic: 0.373864, loss_idmrf: 1.576367, loss_adv_gen: -187.674469 2023-02-12 17:34:13,606 - INFO - [Train] step: 30299, loss_adv_disc: -2.133293 2023-02-12 17:34:13,815 - INFO - [Train] step: 30299, loss_mpn: 0.016979, loss_rec: 0.027435, loss_semantic: 0.385272, loss_idmrf: 1.185734, loss_adv_gen: -164.756134 
2023-02-12 17:34:43,115 - INFO - [Train] step: 30399, loss_adv_disc: -2.732825 2023-02-12 17:34:43,324 - INFO - [Train] step: 30399, loss_mpn: 0.016938, loss_rec: 0.023776, loss_semantic: 0.372872, loss_idmrf: 0.756852, loss_adv_gen: -199.103821 2023-02-12 17:35:12,630 - INFO - [Train] step: 30499, loss_adv_disc: -1.756166 2023-02-12 17:35:12,840 - INFO - [Train] step: 30499, loss_mpn: 0.012822, loss_rec: 0.021439, loss_semantic: 0.336182, loss_idmrf: 1.944498, loss_adv_gen: -125.164902 2023-02-12 17:35:42,155 - INFO - [Train] step: 30599, loss_adv_disc: -3.763085 2023-02-12 17:35:42,364 - INFO - [Train] step: 30599, loss_mpn: 0.030022, loss_rec: 0.035324, loss_semantic: 0.348464, loss_idmrf: 0.999722, loss_adv_gen: -205.326401 2023-02-12 17:36:11,668 - INFO - [Train] step: 30699, loss_adv_disc: -2.120890 2023-02-12 17:36:11,878 - INFO - [Train] step: 30699, loss_mpn: 0.012736, loss_rec: 0.023642, loss_semantic: 0.339662, loss_idmrf: 1.113874, loss_adv_gen: -167.260910 2023-02-12 17:36:41,185 - INFO - [Train] step: 30799, loss_adv_disc: -1.686772 2023-02-12 17:36:41,395 - INFO - [Train] step: 30799, loss_mpn: 0.007747, loss_rec: 0.017307, loss_semantic: 0.288114, loss_idmrf: 1.101601, loss_adv_gen: -156.756332 2023-02-12 17:37:10,708 - INFO - [Train] step: 30899, loss_adv_disc: -1.261269 2023-02-12 17:37:10,917 - INFO - [Train] step: 30899, loss_mpn: 0.027333, loss_rec: 0.028359, loss_semantic: 0.366937, loss_idmrf: 1.513100, loss_adv_gen: -153.674728 2023-02-12 17:37:40,232 - INFO - [Train] step: 30999, loss_adv_disc: -3.315798 2023-02-12 17:37:40,440 - INFO - [Train] step: 30999, loss_mpn: 0.024642, loss_rec: 0.026408, loss_semantic: 0.354345, loss_idmrf: 1.203915, loss_adv_gen: -181.382324 2023-02-12 17:38:17,636 - INFO - [Eval] step: 30999, bce: 0.303419, psnr: 28.905222, ssim: 0.955708 2023-02-12 17:38:47,980 - INFO - [Train] step: 31099, loss_adv_disc: -0.007547 2023-02-12 17:38:48,190 - INFO - [Train] step: 31099, loss_mpn: 0.019936, loss_rec: 0.016585, 
loss_semantic: 0.308037, loss_idmrf: 1.196816, loss_adv_gen: -165.474762 2023-02-12 17:39:17,511 - INFO - [Train] step: 31199, loss_adv_disc: -3.100771 2023-02-12 17:39:17,721 - INFO - [Train] step: 31199, loss_mpn: 0.047903, loss_rec: 0.033918, loss_semantic: 0.394242, loss_idmrf: 1.596047, loss_adv_gen: -174.689697 2023-02-12 17:39:47,253 - INFO - [Train] step: 31299, loss_adv_disc: -2.970794 2023-02-12 17:39:47,463 - INFO - [Train] step: 31299, loss_mpn: 0.010504, loss_rec: 0.019647, loss_semantic: 0.303357, loss_idmrf: 0.783023, loss_adv_gen: -180.590225 2023-02-12 17:40:16,760 - INFO - [Train] step: 31399, loss_adv_disc: -6.078952 2023-02-12 17:40:16,971 - INFO - [Train] step: 31399, loss_mpn: 0.038989, loss_rec: 0.050333, loss_semantic: 0.449593, loss_idmrf: 2.652514, loss_adv_gen: -125.236641 2023-02-12 17:40:46,260 - INFO - [Train] step: 31499, loss_adv_disc: -1.884539 2023-02-12 17:40:46,470 - INFO - [Train] step: 31499, loss_mpn: 0.010681, loss_rec: 0.023947, loss_semantic: 0.360296, loss_idmrf: 0.808058, loss_adv_gen: -170.917923 2023-02-12 17:41:15,768 - INFO - [Train] step: 31599, loss_adv_disc: -2.956500 2023-02-12 17:41:15,977 - INFO - [Train] step: 31599, loss_mpn: 0.030799, loss_rec: 0.023938, loss_semantic: 0.370240, loss_idmrf: 1.183903, loss_adv_gen: -164.906891 2023-02-12 17:41:45,272 - INFO - [Train] step: 31699, loss_adv_disc: -3.153804 2023-02-12 17:41:45,481 - INFO - [Train] step: 31699, loss_mpn: 0.021414, loss_rec: 0.019947, loss_semantic: 0.322531, loss_idmrf: 0.674489, loss_adv_gen: -93.386604 2023-02-12 17:42:14,774 - INFO - [Train] step: 31799, loss_adv_disc: -1.710007 2023-02-12 17:42:14,984 - INFO - [Train] step: 31799, loss_mpn: 0.036256, loss_rec: 0.028560, loss_semantic: 0.415102, loss_idmrf: 2.383774, loss_adv_gen: -175.761688 2023-02-12 17:42:44,289 - INFO - [Train] step: 31899, loss_adv_disc: -3.047843 2023-02-12 17:42:44,499 - INFO - [Train] step: 31899, loss_mpn: 0.027713, loss_rec: 0.026380, loss_semantic: 0.332514, 
loss_idmrf: 1.533957, loss_adv_gen: -149.624115 2023-02-12 17:43:13,812 - INFO - [Train] step: 31999, loss_adv_disc: -1.858572 2023-02-12 17:43:14,020 - INFO - [Train] step: 31999, loss_mpn: 0.010280, loss_rec: 0.020402, loss_semantic: 0.322862, loss_idmrf: 1.281962, loss_adv_gen: -147.178192 2023-02-12 17:43:51,209 - INFO - [Eval] step: 31999, bce: 0.244124, psnr: 29.033735, ssim: 0.957002 2023-02-12 17:44:21,556 - INFO - [Train] step: 32099, loss_adv_disc: -1.773243 2023-02-12 17:44:21,766 - INFO - [Train] step: 32099, loss_mpn: 0.010236, loss_rec: 0.019495, loss_semantic: 0.324063, loss_idmrf: 0.927464, loss_adv_gen: -157.694138 2023-02-12 17:44:51,072 - INFO - [Train] step: 32199, loss_adv_disc: -2.132874 2023-02-12 17:44:51,281 - INFO - [Train] step: 32199, loss_mpn: 0.011885, loss_rec: 0.022537, loss_semantic: 0.375585, loss_idmrf: 0.882232, loss_adv_gen: -135.762253 2023-02-12 17:45:20,593 - INFO - [Train] step: 32299, loss_adv_disc: -1.886452 2023-02-12 17:45:20,803 - INFO - [Train] step: 32299, loss_mpn: 0.010089, loss_rec: 0.020109, loss_semantic: 0.370543, loss_idmrf: 1.304058, loss_adv_gen: -199.865463 2023-02-12 17:45:50,102 - INFO - [Train] step: 32399, loss_adv_disc: -2.067355 2023-02-12 17:45:50,314 - INFO - [Train] step: 32399, loss_mpn: 0.013049, loss_rec: 0.019836, loss_semantic: 0.319731, loss_idmrf: 0.880733, loss_adv_gen: -159.825104 2023-02-12 17:46:19,619 - INFO - [Train] step: 32499, loss_adv_disc: -0.600443 2023-02-12 17:46:19,828 - INFO - [Train] step: 32499, loss_mpn: 0.006746, loss_rec: 0.021167, loss_semantic: 0.321014, loss_idmrf: 0.948911, loss_adv_gen: -181.750092 2023-02-12 17:46:49,139 - INFO - [Train] step: 32599, loss_adv_disc: -3.762564 2023-02-12 17:46:49,348 - INFO - [Train] step: 32599, loss_mpn: 0.018765, loss_rec: 0.027648, loss_semantic: 0.382336, loss_idmrf: 1.474476, loss_adv_gen: -159.072769 2023-02-12 17:47:18,650 - INFO - [Train] step: 32699, loss_adv_disc: -3.938961 2023-02-12 17:47:18,858 - INFO - [Train] step: 
32699, loss_mpn: 0.018220, loss_rec: 0.037988, loss_semantic: 0.426480, loss_idmrf: 1.077446, loss_adv_gen: -218.430145 2023-02-12 17:47:48,167 - INFO - [Train] step: 32799, loss_adv_disc: -1.820462 2023-02-12 17:47:48,376 - INFO - [Train] step: 32799, loss_mpn: 0.015794, loss_rec: 0.023826, loss_semantic: 0.344492, loss_idmrf: 1.018291, loss_adv_gen: -192.202225 2023-02-12 17:48:17,682 - INFO - [Train] step: 32899, loss_adv_disc: -0.630190 2023-02-12 17:48:17,892 - INFO - [Train] step: 32899, loss_mpn: 0.009499, loss_rec: 0.015845, loss_semantic: 0.258353, loss_idmrf: 0.815678, loss_adv_gen: -181.675018 2023-02-12 17:48:47,199 - INFO - [Train] step: 32999, loss_adv_disc: -2.690881 2023-02-12 17:48:47,408 - INFO - [Train] step: 32999, loss_mpn: 0.029729, loss_rec: 0.040808, loss_semantic: 0.434876, loss_idmrf: 1.766638, loss_adv_gen: -157.622253 2023-02-12 17:49:24,574 - INFO - [Eval] step: 32999, bce: 0.236952, psnr: 29.123842, ssim: 0.957203 2023-02-12 17:49:55,154 - INFO - [Train] step: 33099, loss_adv_disc: -2.034162 2023-02-12 17:49:55,363 - INFO - [Train] step: 33099, loss_mpn: 0.008753, loss_rec: 0.019981, loss_semantic: 0.295725, loss_idmrf: 1.098127, loss_adv_gen: -152.226898 2023-02-12 17:50:24,659 - INFO - [Train] step: 33199, loss_adv_disc: -1.958740 2023-02-12 17:50:24,870 - INFO - [Train] step: 33199, loss_mpn: 0.011206, loss_rec: 0.016534, loss_semantic: 0.280637, loss_idmrf: 0.633732, loss_adv_gen: -159.003998 2023-02-12 17:50:54,176 - INFO - [Train] step: 33299, loss_adv_disc: -1.830784 2023-02-12 17:50:54,386 - INFO - [Train] step: 33299, loss_mpn: 0.010057, loss_rec: 0.023243, loss_semantic: 0.324185, loss_idmrf: 1.492048, loss_adv_gen: -154.222672 2023-02-12 17:51:23,681 - INFO - [Train] step: 33399, loss_adv_disc: -1.288402 2023-02-12 17:51:23,891 - INFO - [Train] step: 33399, loss_mpn: 0.014923, loss_rec: 0.023018, loss_semantic: 0.327985, loss_idmrf: 1.027667, loss_adv_gen: -159.042847 2023-02-12 17:51:53,188 - INFO - [Train] step: 33499, 
loss_adv_disc: -6.701092 2023-02-12 17:51:53,397 - INFO - [Train] step: 33499, loss_mpn: 0.019534, loss_rec: 0.044890, loss_semantic: 0.475802, loss_idmrf: 1.395057, loss_adv_gen: -158.572418 2023-02-12 17:52:22,688 - INFO - [Train] step: 33599, loss_adv_disc: -1.702268 2023-02-12 17:52:22,897 - INFO - [Train] step: 33599, loss_mpn: 0.015045, loss_rec: 0.017588, loss_semantic: 0.291119, loss_idmrf: 0.805030, loss_adv_gen: -145.972595 2023-02-12 17:52:52,192 - INFO - [Train] step: 33699, loss_adv_disc: -2.639381 2023-02-12 17:52:52,402 - INFO - [Train] step: 33699, loss_mpn: 0.016541, loss_rec: 0.020158, loss_semantic: 0.286486, loss_idmrf: 1.200664, loss_adv_gen: -180.444855 2023-02-12 17:53:21,695 - INFO - [Train] step: 33799, loss_adv_disc: -1.218509 2023-02-12 17:53:21,904 - INFO - [Train] step: 33799, loss_mpn: 0.005585, loss_rec: 0.017871, loss_semantic: 0.283142, loss_idmrf: 0.790761, loss_adv_gen: -151.421829 2023-02-12 17:53:51,212 - INFO - [Train] step: 33899, loss_adv_disc: -2.619796 2023-02-12 17:53:51,421 - INFO - [Train] step: 33899, loss_mpn: 0.018898, loss_rec: 0.028334, loss_semantic: 0.404334, loss_idmrf: 1.572336, loss_adv_gen: -183.246170 2023-02-12 17:54:20,713 - INFO - [Train] step: 33999, loss_adv_disc: -2.281010 2023-02-12 17:54:20,922 - INFO - [Train] step: 33999, loss_mpn: 0.012565, loss_rec: 0.024612, loss_semantic: 0.344790, loss_idmrf: 1.011440, loss_adv_gen: -167.908554 2023-02-12 17:54:58,091 - INFO - [Eval] step: 33999, bce: 0.205579, psnr: 29.251085, ssim: 0.957331 2023-02-12 17:55:28,606 - INFO - [Train] step: 34099, loss_adv_disc: -1.240660 2023-02-12 17:55:28,815 - INFO - [Train] step: 34099, loss_mpn: 0.015477, loss_rec: 0.020481, loss_semantic: 0.339956, loss_idmrf: 0.873007, loss_adv_gen: -145.012589 2023-02-12 17:55:58,116 - INFO - [Train] step: 34199, loss_adv_disc: -1.539145 2023-02-12 17:55:58,326 - INFO - [Train] step: 34199, loss_mpn: 0.023260, loss_rec: 0.016593, loss_semantic: 0.298659, loss_idmrf: 0.785321, 
loss_adv_gen: -176.368118 2023-02-12 17:56:27,621 - INFO - [Train] step: 34299, loss_adv_disc: -2.760028 2023-02-12 17:56:27,830 - INFO - [Train] step: 34299, loss_mpn: 0.008323, loss_rec: 0.020988, loss_semantic: 0.324639, loss_idmrf: 0.616013, loss_adv_gen: -157.422699 2023-02-12 17:56:57,126 - INFO - [Train] step: 34399, loss_adv_disc: -2.553612 2023-02-12 17:56:57,335 - INFO - [Train] step: 34399, loss_mpn: 0.024272, loss_rec: 0.023415, loss_semantic: 0.346571, loss_idmrf: 2.065329, loss_adv_gen: -172.307251 2023-02-12 17:57:26,633 - INFO - [Train] step: 34499, loss_adv_disc: -1.682182 2023-02-12 17:57:26,844 - INFO - [Train] step: 34499, loss_mpn: 0.017541, loss_rec: 0.029529, loss_semantic: 0.391527, loss_idmrf: 0.985749, loss_adv_gen: -136.285248 2023-02-12 17:57:56,153 - INFO - [Train] step: 34599, loss_adv_disc: -2.231610 2023-02-12 17:57:56,362 - INFO - [Train] step: 34599, loss_mpn: 0.009918, loss_rec: 0.022532, loss_semantic: 0.311467, loss_idmrf: 1.222375, loss_adv_gen: -146.531326 2023-02-12 17:58:25,655 - INFO - [Train] step: 34699, loss_adv_disc: -1.929838 2023-02-12 17:58:25,864 - INFO - [Train] step: 34699, loss_mpn: 0.011590, loss_rec: 0.023047, loss_semantic: 0.360442, loss_idmrf: 0.897688, loss_adv_gen: -142.923645 2023-02-12 17:58:55,158 - INFO - [Train] step: 34799, loss_adv_disc: -3.194428 2023-02-12 17:58:55,368 - INFO - [Train] step: 34799, loss_mpn: 0.016799, loss_rec: 0.023705, loss_semantic: 0.371475, loss_idmrf: 1.045437, loss_adv_gen: -139.159607 2023-02-12 17:59:24,662 - INFO - [Train] step: 34899, loss_adv_disc: -2.469903 2023-02-12 17:59:24,871 - INFO - [Train] step: 34899, loss_mpn: 0.033517, loss_rec: 0.024883, loss_semantic: 0.361412, loss_idmrf: 1.065504, loss_adv_gen: -139.059204 2023-02-12 17:59:54,175 - INFO - [Train] step: 34999, loss_adv_disc: -2.247408 2023-02-12 17:59:54,385 - INFO - [Train] step: 34999, loss_mpn: 0.023007, loss_rec: 0.030979, loss_semantic: 0.374123, loss_idmrf: 1.299219, loss_adv_gen: -162.767334 
2023-02-12 18:00:31,576 - INFO - [Eval] step: 34999, bce: 0.320766, psnr: 28.822138, ssim: 0.956486 2023-02-12 18:01:02,068 - INFO - [Train] step: 35099, loss_adv_disc: -1.925705 2023-02-12 18:01:02,276 - INFO - [Train] step: 35099, loss_mpn: 0.008340, loss_rec: 0.019603, loss_semantic: 0.292233, loss_idmrf: 0.722772, loss_adv_gen: -150.566864 2023-02-12 18:01:31,578 - INFO - [Train] step: 35199, loss_adv_disc: -2.216438 2023-02-12 18:01:31,787 - INFO - [Train] step: 35199, loss_mpn: 0.010101, loss_rec: 0.018999, loss_semantic: 0.296512, loss_idmrf: 0.717818, loss_adv_gen: -157.691696 2023-02-12 18:02:01,085 - INFO - [Train] step: 35299, loss_adv_disc: -2.085813 2023-02-12 18:02:01,295 - INFO - [Train] step: 35299, loss_mpn: 0.012465, loss_rec: 0.021619, loss_semantic: 0.302058, loss_idmrf: 0.736243, loss_adv_gen: -164.773544 2023-02-12 18:02:30,581 - INFO - [Train] step: 35399, loss_adv_disc: -1.152507 2023-02-12 18:02:30,791 - INFO - [Train] step: 35399, loss_mpn: 0.008907, loss_rec: 0.017051, loss_semantic: 0.305792, loss_idmrf: 1.240918, loss_adv_gen: -183.348083 2023-02-12 18:03:00,082 - INFO - [Train] step: 35499, loss_adv_disc: -3.182268 2023-02-12 18:03:00,292 - INFO - [Train] step: 35499, loss_mpn: 0.021173, loss_rec: 0.028958, loss_semantic: 0.340135, loss_idmrf: 0.814898, loss_adv_gen: -145.432465 2023-02-12 18:03:29,576 - INFO - [Train] step: 35599, loss_adv_disc: -2.632684 2023-02-12 18:03:29,786 - INFO - [Train] step: 35599, loss_mpn: 0.010914, loss_rec: 0.021396, loss_semantic: 0.322619, loss_idmrf: 0.804622, loss_adv_gen: -155.417786 2023-02-12 18:03:59,077 - INFO - [Train] step: 35699, loss_adv_disc: -0.674989 2023-02-12 18:03:59,287 - INFO - [Train] step: 35699, loss_mpn: 0.012684, loss_rec: 0.018897, loss_semantic: 0.314094, loss_idmrf: 0.802836, loss_adv_gen: -122.680061 2023-02-12 18:04:28,588 - INFO - [Train] step: 35799, loss_adv_disc: -1.077418 2023-02-12 18:04:28,798 - INFO - [Train] step: 35799, loss_mpn: 0.010844, loss_rec: 0.017388, 
loss_semantic: 0.293481, loss_idmrf: 0.978365, loss_adv_gen: -150.090057 2023-02-12 18:04:58,105 - INFO - [Train] step: 35899, loss_adv_disc: -1.272169 2023-02-12 18:04:58,314 - INFO - [Train] step: 35899, loss_mpn: 0.005835, loss_rec: 0.015025, loss_semantic: 0.235116, loss_idmrf: 0.707628, loss_adv_gen: -178.820572 2023-02-12 18:05:27,613 - INFO - [Train] step: 35999, loss_adv_disc: -1.505566 2023-02-12 18:05:27,822 - INFO - [Train] step: 35999, loss_mpn: 0.014298, loss_rec: 0.022206, loss_semantic: 0.337595, loss_idmrf: 1.241463, loss_adv_gen: -119.724541 2023-02-12 18:06:05,009 - INFO - [Eval] step: 35999, bce: 0.215667, psnr: 29.208504, ssim: 0.957480 2023-02-12 18:06:35,351 - INFO - [Train] step: 36099, loss_adv_disc: -2.579553 2023-02-12 18:06:35,561 - INFO - [Train] step: 36099, loss_mpn: 0.012997, loss_rec: 0.028975, loss_semantic: 0.367534, loss_idmrf: 0.619056, loss_adv_gen: -132.626465 2023-02-12 18:07:04,872 - INFO - [Train] step: 36199, loss_adv_disc: -2.344000 2023-02-12 18:07:05,081 - INFO - [Train] step: 36199, loss_mpn: 0.019409, loss_rec: 0.026721, loss_semantic: 0.397292, loss_idmrf: 1.140161, loss_adv_gen: -135.823578 2023-02-12 18:07:34,382 - INFO - [Train] step: 36299, loss_adv_disc: -1.334474 2023-02-12 18:07:34,591 - INFO - [Train] step: 36299, loss_mpn: 0.009234, loss_rec: 0.019641, loss_semantic: 0.298558, loss_idmrf: 1.226933, loss_adv_gen: -120.522873 2023-02-12 18:08:03,885 - INFO - [Train] step: 36399, loss_adv_disc: -1.131894 2023-02-12 18:08:04,094 - INFO - [Train] step: 36399, loss_mpn: 0.012937, loss_rec: 0.020012, loss_semantic: 0.315967, loss_idmrf: 1.429564, loss_adv_gen: -130.471527 2023-02-12 18:08:33,388 - INFO - [Train] step: 36499, loss_adv_disc: -2.588792 2023-02-12 18:08:33,598 - INFO - [Train] step: 36499, loss_mpn: 0.007716, loss_rec: 0.021406, loss_semantic: 0.337165, loss_idmrf: 0.819726, loss_adv_gen: -192.262817 2023-02-12 18:09:02,896 - INFO - [Train] step: 36599, loss_adv_disc: -2.801477 2023-02-12 18:09:03,105 - 
INFO - [Train] step: 36599, loss_mpn: 0.013779, loss_rec: 0.028437, loss_semantic: 0.358279, loss_idmrf: 1.790163, loss_adv_gen: -182.666779 2023-02-12 18:09:32,399 - INFO - [Train] step: 36699, loss_adv_disc: -1.444665 2023-02-12 18:09:32,609 - INFO - [Train] step: 36699, loss_mpn: 0.005687, loss_rec: 0.016171, loss_semantic: 0.265918, loss_idmrf: 0.778973, loss_adv_gen: -177.270248 2023-02-12 18:10:01,904 - INFO - [Train] step: 36799, loss_adv_disc: -0.598214 2023-02-12 18:10:02,112 - INFO - [Train] step: 36799, loss_mpn: 0.019448, loss_rec: 0.033833, loss_semantic: 0.386135, loss_idmrf: 2.627398, loss_adv_gen: -121.002739 2023-02-12 18:10:31,403 - INFO - [Train] step: 36899, loss_adv_disc: -1.689862 2023-02-12 18:10:31,614 - INFO - [Train] step: 36899, loss_mpn: 0.036045, loss_rec: 0.051232, loss_semantic: 0.521603, loss_idmrf: 2.444339, loss_adv_gen: -155.355438 2023-02-12 18:11:00,897 - INFO - [Train] step: 36999, loss_adv_disc: -1.665134 2023-02-12 18:11:01,107 - INFO - [Train] step: 36999, loss_mpn: 0.011023, loss_rec: 0.017451, loss_semantic: 0.291547, loss_idmrf: 0.723715, loss_adv_gen: -128.404785 2023-02-12 18:11:38,307 - INFO - [Eval] step: 36999, bce: 0.238638, psnr: 28.968206, ssim: 0.957086 2023-02-12 18:12:08,657 - INFO - [Train] step: 37099, loss_adv_disc: -2.624739 2023-02-12 18:12:08,866 - INFO - [Train] step: 37099, loss_mpn: 0.075043, loss_rec: 0.041531, loss_semantic: 0.354840, loss_idmrf: 2.027423, loss_adv_gen: -165.253250 2023-02-12 18:12:38,163 - INFO - [Train] step: 37199, loss_adv_disc: -0.411377 2023-02-12 18:12:38,372 - INFO - [Train] step: 37199, loss_mpn: 0.024229, loss_rec: 0.034768, loss_semantic: 0.432866, loss_idmrf: 1.988033, loss_adv_gen: -111.484383 2023-02-12 18:13:07,658 - INFO - [Train] step: 37299, loss_adv_disc: -7.693477 2023-02-12 18:13:07,868 - INFO - [Train] step: 37299, loss_mpn: 0.010649, loss_rec: 0.034489, loss_semantic: 0.423346, loss_idmrf: 0.647329, loss_adv_gen: -174.697113 2023-02-12 18:13:37,150 - INFO - 
[Train] step: 37399, loss_adv_disc: -1.210416 2023-02-12 18:13:37,362 - INFO - [Train] step: 37399, loss_mpn: 0.013813, loss_rec: 0.021487, loss_semantic: 0.342859, loss_idmrf: 0.876422, loss_adv_gen: -150.490631 2023-02-12 18:14:06,656 - INFO - [Train] step: 37499, loss_adv_disc: -0.195121 2023-02-12 18:14:06,865 - INFO - [Train] step: 37499, loss_mpn: 0.087205, loss_rec: 0.045215, loss_semantic: 0.474774, loss_idmrf: 2.398547, loss_adv_gen: -115.655479 2023-02-12 18:14:36,162 - INFO - [Train] step: 37599, loss_adv_disc: -0.905542 2023-02-12 18:14:36,371 - INFO - [Train] step: 37599, loss_mpn: 0.005782, loss_rec: 0.019019, loss_semantic: 0.291359, loss_idmrf: 1.198322, loss_adv_gen: -167.739548 2023-02-12 18:15:05,659 - INFO - [Train] step: 37699, loss_adv_disc: -2.423355 2023-02-12 18:15:05,868 - INFO - [Train] step: 37699, loss_mpn: 0.007076, loss_rec: 0.018368, loss_semantic: 0.290402, loss_idmrf: 0.577030, loss_adv_gen: -134.585464 2023-02-12 18:15:35,156 - INFO - [Train] step: 37799, loss_adv_disc: -1.321113 2023-02-12 18:15:35,366 - INFO - [Train] step: 37799, loss_mpn: 0.014285, loss_rec: 0.023402, loss_semantic: 0.329866, loss_idmrf: 1.099209, loss_adv_gen: -153.450317 2023-02-12 18:16:04,665 - INFO - [Train] step: 37899, loss_adv_disc: -0.865566 2023-02-12 18:16:04,875 - INFO - [Train] step: 37899, loss_mpn: 0.010119, loss_rec: 0.016443, loss_semantic: 0.298884, loss_idmrf: 3.041412, loss_adv_gen: -177.334885 2023-02-12 18:16:34,165 - INFO - [Train] step: 37999, loss_adv_disc: -2.521109 2023-02-12 18:16:34,374 - INFO - [Train] step: 37999, loss_mpn: 0.021668, loss_rec: 0.039516, loss_semantic: 0.449243, loss_idmrf: 2.472616, loss_adv_gen: -136.542038 2023-02-12 18:17:11,537 - INFO - [Eval] step: 37999, bce: 0.225955, psnr: 29.120995, ssim: 0.957187 2023-02-12 18:17:41,874 - INFO - [Train] step: 38099, loss_adv_disc: -2.744044 2023-02-12 18:17:42,082 - INFO - [Train] step: 38099, loss_mpn: 0.021230, loss_rec: 0.031108, loss_semantic: 0.408870, loss_idmrf: 
1.305857, loss_adv_gen: -161.483597 2023-02-12 18:18:11,377 - INFO - [Train] step: 38199, loss_adv_disc: -0.561423 2023-02-12 18:18:11,586 - INFO - [Train] step: 38199, loss_mpn: 0.025529, loss_rec: 0.017625, loss_semantic: 0.310408, loss_idmrf: 0.933706, loss_adv_gen: -161.700989 2023-02-12 18:18:40,869 - INFO - [Train] step: 38299, loss_adv_disc: -2.385153 2023-02-12 18:18:41,078 - INFO - [Train] step: 38299, loss_mpn: 0.014295, loss_rec: 0.025686, loss_semantic: 0.367414, loss_idmrf: 1.355175, loss_adv_gen: -152.060501 2023-02-12 18:19:10,574 - INFO - [Train] step: 38399, loss_adv_disc: -2.426915 2023-02-12 18:19:10,783 - INFO - [Train] step: 38399, loss_mpn: 0.013024, loss_rec: 0.030852, loss_semantic: 0.391074, loss_idmrf: 1.009728, loss_adv_gen: -165.416992 2023-02-12 18:19:40,078 - INFO - [Train] step: 38499, loss_adv_disc: -4.689393 2023-02-12 18:19:40,287 - INFO - [Train] step: 38499, loss_mpn: 0.019696, loss_rec: 0.027881, loss_semantic: 0.391156, loss_idmrf: 1.031767, loss_adv_gen: -138.605347 2023-02-12 18:20:09,594 - INFO - [Train] step: 38599, loss_adv_disc: -2.699141 2023-02-12 18:20:09,804 - INFO - [Train] step: 38599, loss_mpn: 0.021303, loss_rec: 0.021669, loss_semantic: 0.310449, loss_idmrf: 1.450065, loss_adv_gen: -141.766159 2023-02-12 18:20:39,092 - INFO - [Train] step: 38699, loss_adv_disc: -1.293295 2023-02-12 18:20:39,301 - INFO - [Train] step: 38699, loss_mpn: 0.019939, loss_rec: 0.023198, loss_semantic: 0.342722, loss_idmrf: 0.967522, loss_adv_gen: -141.349274 2023-02-12 18:21:08,592 - INFO - [Train] step: 38799, loss_adv_disc: -2.370568 2023-02-12 18:21:08,802 - INFO - [Train] step: 38799, loss_mpn: 0.013691, loss_rec: 0.027043, loss_semantic: 0.394423, loss_idmrf: 1.014322, loss_adv_gen: -128.361740 2023-02-12 18:21:38,086 - INFO - [Train] step: 38899, loss_adv_disc: -3.034408 2023-02-12 18:21:38,296 - INFO - [Train] step: 38899, loss_mpn: 0.015817, loss_rec: 0.029135, loss_semantic: 0.386312, loss_idmrf: 1.342609, loss_adv_gen: 
-126.475113 2023-02-12 18:22:07,591 - INFO - [Train] step: 38999, loss_adv_disc: -1.397598 2023-02-12 18:22:07,800 - INFO - [Train] step: 38999, loss_mpn: 0.010588, loss_rec: 0.020827, loss_semantic: 0.331730, loss_idmrf: 1.256760, loss_adv_gen: -153.721786 2023-02-12 18:22:44,966 - INFO - [Eval] step: 38999, bce: 0.246073, psnr: 29.178984, ssim: 0.956290 2023-02-12 18:23:15,321 - INFO - [Train] step: 39099, loss_adv_disc: -2.853390 2023-02-12 18:23:15,532 - INFO - [Train] step: 39099, loss_mpn: 0.010844, loss_rec: 0.024636, loss_semantic: 0.313481, loss_idmrf: 0.797807, loss_adv_gen: -159.673004 2023-02-12 18:23:44,838 - INFO - [Train] step: 39199, loss_adv_disc: -1.216084 2023-02-12 18:23:45,047 - INFO - [Train] step: 39199, loss_mpn: 0.012530, loss_rec: 0.024071, loss_semantic: 0.364845, loss_idmrf: 1.248156, loss_adv_gen: -176.617432 2023-02-12 18:24:14,345 - INFO - [Train] step: 39299, loss_adv_disc: -2.530161 2023-02-12 18:24:14,555 - INFO - [Train] step: 39299, loss_mpn: 0.013298, loss_rec: 0.026537, loss_semantic: 0.358308, loss_idmrf: 1.608152, loss_adv_gen: -135.134796 2023-02-12 18:24:43,842 - INFO - [Train] step: 39399, loss_adv_disc: -4.221055 2023-02-12 18:24:44,053 - INFO - [Train] step: 39399, loss_mpn: 0.016926, loss_rec: 0.029155, loss_semantic: 0.384890, loss_idmrf: 1.177253, loss_adv_gen: -143.907532 2023-02-12 18:25:13,352 - INFO - [Train] step: 39499, loss_adv_disc: -1.787352 2023-02-12 18:25:13,561 - INFO - [Train] step: 39499, loss_mpn: 0.009887, loss_rec: 0.021772, loss_semantic: 0.318555, loss_idmrf: 0.491218, loss_adv_gen: -110.921028 2023-02-12 18:25:42,854 - INFO - [Train] step: 39599, loss_adv_disc: -2.442396 2023-02-12 18:25:43,063 - INFO - [Train] step: 39599, loss_mpn: 0.020922, loss_rec: 0.027451, loss_semantic: 0.385381, loss_idmrf: 1.085233, loss_adv_gen: -173.586121 2023-02-12 18:26:12,357 - INFO - [Train] step: 39699, loss_adv_disc: -1.176004 2023-02-12 18:26:12,567 - INFO - [Train] step: 39699, loss_mpn: 0.006752, loss_rec: 
0.015613, loss_semantic: 0.262376, loss_idmrf: 0.885146, loss_adv_gen: -142.798950 2023-02-12 18:26:41,860 - INFO - [Train] step: 39799, loss_adv_disc: -2.509575 2023-02-12 18:26:42,069 - INFO - [Train] step: 39799, loss_mpn: 0.011692, loss_rec: 0.021853, loss_semantic: 0.347759, loss_idmrf: 0.978138, loss_adv_gen: -140.471588 2023-02-12 18:27:11,371 - INFO - [Train] step: 39899, loss_adv_disc: -1.222441 2023-02-12 18:27:11,580 - INFO - [Train] step: 39899, loss_mpn: 0.018954, loss_rec: 0.041342, loss_semantic: 0.367312, loss_idmrf: 1.244858, loss_adv_gen: -126.972168 2023-02-12 18:27:40,884 - INFO - [Train] step: 39999, loss_adv_disc: -2.574458 2023-02-12 18:27:41,094 - INFO - [Train] step: 39999, loss_mpn: 0.011977, loss_rec: 0.024579, loss_semantic: 0.352384, loss_idmrf: 0.950865, loss_adv_gen: -172.960602 2023-02-12 18:28:18,270 - INFO - [Eval] step: 39999, bce: 0.194767, psnr: 29.095812, ssim: 0.957131 2023-02-12 18:28:48,719 - INFO - [Train] step: 40099, loss_adv_disc: -4.535568 2023-02-12 18:28:48,928 - INFO - [Train] step: 40099, loss_mpn: 0.015236, loss_rec: 0.036578, loss_semantic: 0.443424, loss_idmrf: 0.898106, loss_adv_gen: -125.037216 2023-02-12 18:29:18,226 - INFO - [Train] step: 40199, loss_adv_disc: -1.275886 2023-02-12 18:29:18,435 - INFO - [Train] step: 40199, loss_mpn: 0.028273, loss_rec: 0.028285, loss_semantic: 0.420791, loss_idmrf: 2.383886, loss_adv_gen: -127.612259 2023-02-12 18:29:47,744 - INFO - [Train] step: 40299, loss_adv_disc: -1.292017 2023-02-12 18:29:47,955 - INFO - [Train] step: 40299, loss_mpn: 0.020296, loss_rec: 0.032139, loss_semantic: 0.412999, loss_idmrf: 1.599472, loss_adv_gen: -150.880920 2023-02-12 18:30:17,244 - INFO - [Train] step: 40399, loss_adv_disc: -1.286506 2023-02-12 18:30:17,455 - INFO - [Train] step: 40399, loss_mpn: 0.015423, loss_rec: 0.019935, loss_semantic: 0.316125, loss_idmrf: 1.459979, loss_adv_gen: -133.933685 2023-02-12 18:30:46,761 - INFO - [Train] step: 40499, loss_adv_disc: -2.546759 2023-02-12 
18:30:46,970 - INFO - [Train] step: 40499, loss_mpn: 0.016017, loss_rec: 0.026256, loss_semantic: 0.377436, loss_idmrf: 1.676869, loss_adv_gen: -147.588593 2023-02-12 18:31:16,276 - INFO - [Train] step: 40599, loss_adv_disc: -1.326667 2023-02-12 18:31:16,486 - INFO - [Train] step: 40599, loss_mpn: 0.039922, loss_rec: 0.027237, loss_semantic: 0.383460, loss_idmrf: 1.472036, loss_adv_gen: -87.110321 2023-02-12 18:31:45,795 - INFO - [Train] step: 40699, loss_adv_disc: -3.702699 2023-02-12 18:31:46,004 - INFO - [Train] step: 40699, loss_mpn: 0.015969, loss_rec: 0.034648, loss_semantic: 0.411500, loss_idmrf: 0.839379, loss_adv_gen: -139.590637 2023-02-12 18:32:15,306 - INFO - [Train] step: 40799, loss_adv_disc: -2.090775 2023-02-12 18:32:15,516 - INFO - [Train] step: 40799, loss_mpn: 0.011456, loss_rec: 0.026507, loss_semantic: 0.346328, loss_idmrf: 0.941839, loss_adv_gen: -127.098099 2023-02-12 18:32:44,814 - INFO - [Train] step: 40899, loss_adv_disc: -2.884137 2023-02-12 18:32:45,023 - INFO - [Train] step: 40899, loss_mpn: 0.010164, loss_rec: 0.025798, loss_semantic: 0.337456, loss_idmrf: 0.968100, loss_adv_gen: -124.397766 2023-02-12 18:33:14,316 - INFO - [Train] step: 40999, loss_adv_disc: -1.325020 2023-02-12 18:33:14,526 - INFO - [Train] step: 40999, loss_mpn: 0.013489, loss_rec: 0.023419, loss_semantic: 0.336951, loss_idmrf: 1.080487, loss_adv_gen: -154.497879 2023-02-12 18:33:51,701 - INFO - [Eval] step: 40999, bce: 0.211028, psnr: 29.067577, ssim: 0.957131 2023-02-12 18:34:22,028 - INFO - [Train] step: 41099, loss_adv_disc: -1.966713 2023-02-12 18:34:22,237 - INFO - [Train] step: 41099, loss_mpn: 0.012680, loss_rec: 0.019835, loss_semantic: 0.304152, loss_idmrf: 1.108077, loss_adv_gen: -132.496246 2023-02-12 18:34:51,550 - INFO - [Train] step: 41199, loss_adv_disc: -2.591340 2023-02-12 18:34:51,759 - INFO - [Train] step: 41199, loss_mpn: 0.015032, loss_rec: 0.021413, loss_semantic: 0.314138, loss_idmrf: 0.918010, loss_adv_gen: -132.101868 2023-02-12 
18:35:21,064 - INFO - [Train] step: 41299, loss_adv_disc: -0.449496 2023-02-12 18:35:21,273 - INFO - [Train] step: 41299, loss_mpn: 0.006305, loss_rec: 0.017464, loss_semantic: 0.290435, loss_idmrf: 0.675877, loss_adv_gen: -189.131271 2023-02-12 18:35:50,583 - INFO - [Train] step: 41399, loss_adv_disc: -1.811503 2023-02-12 18:35:50,794 - INFO - [Train] step: 41399, loss_mpn: 0.006530, loss_rec: 0.020799, loss_semantic: 0.320156, loss_idmrf: 0.618132, loss_adv_gen: -147.685303 2023-02-12 18:36:20,096 - INFO - [Train] step: 41499, loss_adv_disc: -3.698590 2023-02-12 18:36:20,305 - INFO - [Train] step: 41499, loss_mpn: 0.009602, loss_rec: 0.027312, loss_semantic: 0.368191, loss_idmrf: 0.554497, loss_adv_gen: -130.560028 2023-02-12 18:36:49,596 - INFO - [Train] step: 41599, loss_adv_disc: -3.010313 2023-02-12 18:36:49,806 - INFO - [Train] step: 41599, loss_mpn: 0.011292, loss_rec: 0.026035, loss_semantic: 0.367295, loss_idmrf: 1.293433, loss_adv_gen: -129.985779 2023-02-12 18:37:19,104 - INFO - [Train] step: 41699, loss_adv_disc: -2.650678 2023-02-12 18:37:19,313 - INFO - [Train] step: 41699, loss_mpn: 0.011438, loss_rec: 0.019516, loss_semantic: 0.311721, loss_idmrf: 0.767802, loss_adv_gen: -102.489609 2023-02-12 18:37:48,604 - INFO - [Train] step: 41799, loss_adv_disc: -0.760895 2023-02-12 18:37:48,813 - INFO - [Train] step: 41799, loss_mpn: 0.004543, loss_rec: 0.014450, loss_semantic: 0.250473, loss_idmrf: 1.218911, loss_adv_gen: -155.222519 2023-02-12 18:38:18,117 - INFO - [Train] step: 41899, loss_adv_disc: -3.305377 2023-02-12 18:38:18,326 - INFO - [Train] step: 41899, loss_mpn: 0.010997, loss_rec: 0.024039, loss_semantic: 0.347023, loss_idmrf: 0.889829, loss_adv_gen: -143.904968 2023-02-12 18:38:47,628 - INFO - [Train] step: 41999, loss_adv_disc: -1.330894 2023-02-12 18:38:47,837 - INFO - [Train] step: 41999, loss_mpn: 0.022094, loss_rec: 0.032639, loss_semantic: 0.408295, loss_idmrf: 1.409587, loss_adv_gen: -113.762909 2023-02-12 18:39:25,019 - INFO - [Eval] 
step: 41999, bce: 0.210949, psnr: 29.142321, ssim: 0.957402 2023-02-12 18:39:55,367 - INFO - [Train] step: 42099, loss_adv_disc: -2.665808 2023-02-12 18:39:55,576 - INFO - [Train] step: 42099, loss_mpn: 0.010887, loss_rec: 0.026553, loss_semantic: 0.385582, loss_idmrf: 1.571114, loss_adv_gen: -151.907104 2023-02-12 18:40:24,878 - INFO - [Train] step: 42199, loss_adv_disc: -3.195778 2023-02-12 18:40:25,087 - INFO - [Train] step: 42199, loss_mpn: 0.008814, loss_rec: 0.023642, loss_semantic: 0.342627, loss_idmrf: 1.361953, loss_adv_gen: -130.199127 2023-02-12 18:40:54,380 - INFO - [Train] step: 42299, loss_adv_disc: -2.699011 2023-02-12 18:40:54,589 - INFO - [Train] step: 42299, loss_mpn: 0.025111, loss_rec: 0.029612, loss_semantic: 0.388090, loss_idmrf: 1.680518, loss_adv_gen: -138.131500 2023-02-12 18:41:23,885 - INFO - [Train] step: 42399, loss_adv_disc: -2.940691 2023-02-12 18:41:24,094 - INFO - [Train] step: 42399, loss_mpn: 0.012269, loss_rec: 0.023819, loss_semantic: 0.333529, loss_idmrf: 0.686632, loss_adv_gen: -134.680328 2023-02-12 18:41:53,393 - INFO - [Train] step: 42499, loss_adv_disc: -2.314073 2023-02-12 18:41:53,602 - INFO - [Train] step: 42499, loss_mpn: 0.017753, loss_rec: 0.024255, loss_semantic: 0.323019, loss_idmrf: 1.560460, loss_adv_gen: -154.048370 2023-02-12 18:42:22,899 - INFO - [Train] step: 42599, loss_adv_disc: -2.526686 2023-02-12 18:42:23,108 - INFO - [Train] step: 42599, loss_mpn: 0.020656, loss_rec: 0.029482, loss_semantic: 0.376596, loss_idmrf: 0.992542, loss_adv_gen: -112.163139 2023-02-12 18:42:52,398 - INFO - [Train] step: 42699, loss_adv_disc: -2.068728 2023-02-12 18:42:52,608 - INFO - [Train] step: 42699, loss_mpn: 0.011803, loss_rec: 0.020583, loss_semantic: 0.322981, loss_idmrf: 0.903103, loss_adv_gen: -116.537788 2023-02-12 18:43:21,896 - INFO - [Train] step: 42799, loss_adv_disc: -0.543986 2023-02-12 18:43:22,105 - INFO - [Train] step: 42799, loss_mpn: 0.017163, loss_rec: 0.030206, loss_semantic: 0.389116, loss_idmrf: 
1.321068, loss_adv_gen: -129.887299 2023-02-12 18:43:51,405 - INFO - [Train] step: 42899, loss_adv_disc: -2.033113 2023-02-12 18:43:51,614 - INFO - [Train] step: 42899, loss_mpn: 0.010556, loss_rec: 0.021970, loss_semantic: 0.322701, loss_idmrf: 0.919145, loss_adv_gen: -100.012039 2023-02-12 18:44:20,907 - INFO - [Train] step: 42999, loss_adv_disc: -1.097463 2023-02-12 18:44:21,116 - INFO - [Train] step: 42999, loss_mpn: 0.005941, loss_rec: 0.017077, loss_semantic: 0.283843, loss_idmrf: 1.092433, loss_adv_gen: -97.320061 2023-02-12 18:44:58,317 - INFO - [Eval] step: 42999, bce: 0.261437, psnr: 29.182854, ssim: 0.956990 2023-02-12 18:45:28,644 - INFO - [Train] step: 43099, loss_adv_disc: -1.896635 2023-02-12 18:45:28,853 - INFO - [Train] step: 43099, loss_mpn: 0.018344, loss_rec: 0.030292, loss_semantic: 0.421648, loss_idmrf: 1.093099, loss_adv_gen: -128.265961 2023-02-12 18:45:58,148 - INFO - [Train] step: 43199, loss_adv_disc: -2.803107 2023-02-12 18:45:58,357 - INFO - [Train] step: 43199, loss_mpn: 0.021354, loss_rec: 0.031753, loss_semantic: 0.368565, loss_idmrf: 1.554116, loss_adv_gen: -113.412605 2023-02-12 18:46:27,661 - INFO - [Train] step: 43299, loss_adv_disc: -4.239297 2023-02-12 18:46:27,870 - INFO - [Train] step: 43299, loss_mpn: 0.020429, loss_rec: 0.031146, loss_semantic: 0.382425, loss_idmrf: 0.766201, loss_adv_gen: -107.834732 2023-02-12 18:46:57,163 - INFO - [Train] step: 43399, loss_adv_disc: -1.784337 2023-02-12 18:46:57,373 - INFO - [Train] step: 43399, loss_mpn: 0.015985, loss_rec: 0.017654, loss_semantic: 0.290674, loss_idmrf: 0.919770, loss_adv_gen: -128.457977 2023-02-12 18:47:26,664 - INFO - [Train] step: 43499, loss_adv_disc: -1.516450 2023-02-12 18:47:26,873 - INFO - [Train] step: 43499, loss_mpn: 0.011174, loss_rec: 0.017259, loss_semantic: 0.292501, loss_idmrf: 1.144227, loss_adv_gen: -127.167282 2023-02-12 18:47:56,174 - INFO - [Train] step: 43599, loss_adv_disc: -1.608698 2023-02-12 18:47:56,384 - INFO - [Train] step: 43599, loss_mpn: 
0.014227, loss_rec: 0.017216, loss_semantic: 0.297073, loss_idmrf: 1.059537, loss_adv_gen: -102.701126 2023-02-12 18:48:25,677 - INFO - [Train] step: 43699, loss_adv_disc: -2.064444 2023-02-12 18:48:25,887 - INFO - [Train] step: 43699, loss_mpn: 0.009232, loss_rec: 0.020965, loss_semantic: 0.321229, loss_idmrf: 1.114394, loss_adv_gen: -134.445740 2023-02-12 18:48:55,186 - INFO - [Train] step: 43799, loss_adv_disc: -2.908204 2023-02-12 18:48:55,395 - INFO - [Train] step: 43799, loss_mpn: 0.032256, loss_rec: 0.026111, loss_semantic: 0.312473, loss_idmrf: 1.004970, loss_adv_gen: -113.739731 2023-02-12 18:49:24,698 - INFO - [Train] step: 43899, loss_adv_disc: -2.565762 2023-02-12 18:49:24,907 - INFO - [Train] step: 43899, loss_mpn: 0.019131, loss_rec: 0.023535, loss_semantic: 0.355505, loss_idmrf: 1.083641, loss_adv_gen: -131.373352 2023-02-12 18:49:54,202 - INFO - [Train] step: 43999, loss_adv_disc: -0.886762 2023-02-12 18:49:54,412 - INFO - [Train] step: 43999, loss_mpn: 0.008522, loss_rec: 0.015679, loss_semantic: 0.284922, loss_idmrf: 0.487373, loss_adv_gen: -118.949120 2023-02-12 18:50:31,584 - INFO - [Eval] step: 43999, bce: 0.194368, psnr: 28.917126, ssim: 0.957285 2023-02-12 18:51:01,924 - INFO - [Train] step: 44099, loss_adv_disc: -2.109013 2023-02-12 18:51:02,135 - INFO - [Train] step: 44099, loss_mpn: 0.007931, loss_rec: 0.020231, loss_semantic: 0.312674, loss_idmrf: 0.736357, loss_adv_gen: -163.130508 2023-02-12 18:51:31,426 - INFO - [Train] step: 44199, loss_adv_disc: -1.809198 2023-02-12 18:51:31,635 - INFO - [Train] step: 44199, loss_mpn: 0.011999, loss_rec: 0.024499, loss_semantic: 0.337322, loss_idmrf: 0.754612, loss_adv_gen: -122.636627 2023-02-12 18:52:00,926 - INFO - [Train] step: 44299, loss_adv_disc: -2.669969 2023-02-12 18:52:01,136 - INFO - [Train] step: 44299, loss_mpn: 0.011659, loss_rec: 0.021793, loss_semantic: 0.313015, loss_idmrf: 1.180753, loss_adv_gen: -71.234894 2023-02-12 18:52:30,430 - INFO - [Train] step: 44399, loss_adv_disc: 
-2.069655 2023-02-12 18:52:30,639 - INFO - [Train] step: 44399, loss_mpn: 0.021003, loss_rec: 0.031595, loss_semantic: 0.433131, loss_idmrf: 2.114752, loss_adv_gen: -104.753006 2023-02-12 18:52:59,924 - INFO - [Train] step: 44499, loss_adv_disc: -1.210405 2023-02-12 18:53:00,134 - INFO - [Train] step: 44499, loss_mpn: 0.010182, loss_rec: 0.016950, loss_semantic: 0.325993, loss_idmrf: 0.896369, loss_adv_gen: -140.877686 2023-02-12 18:53:29,416 - INFO - [Train] step: 44599, loss_adv_disc: -0.991096 2023-02-12 18:53:29,626 - INFO - [Train] step: 44599, loss_mpn: 0.009903, loss_rec: 0.023898, loss_semantic: 0.345652, loss_idmrf: 0.870231, loss_adv_gen: -141.430023 2023-02-12 18:53:58,913 - INFO - [Train] step: 44699, loss_adv_disc: -2.421599 2023-02-12 18:53:59,122 - INFO - [Train] step: 44699, loss_mpn: 0.027594, loss_rec: 0.021111, loss_semantic: 0.296740, loss_idmrf: 2.191382, loss_adv_gen: -141.897797 2023-02-12 18:54:28,420 - INFO - [Train] step: 44799, loss_adv_disc: -2.393608 2023-02-12 18:54:28,630 - INFO - [Train] step: 44799, loss_mpn: 0.026077, loss_rec: 0.026524, loss_semantic: 0.399981, loss_idmrf: 1.215351, loss_adv_gen: -141.662292 2023-02-12 18:54:57,916 - INFO - [Train] step: 44899, loss_adv_disc: -1.943000 2023-02-12 18:54:58,127 - INFO - [Train] step: 44899, loss_mpn: 0.005789, loss_rec: 0.018024, loss_semantic: 0.282490, loss_idmrf: 1.155201, loss_adv_gen: -131.278305 2023-02-12 18:55:27,406 - INFO - [Train] step: 44999, loss_adv_disc: -3.287998 2023-02-12 18:55:27,617 - INFO - [Train] step: 44999, loss_mpn: 0.018217, loss_rec: 0.029792, loss_semantic: 0.371263, loss_idmrf: 0.902070, loss_adv_gen: -107.782036 2023-02-12 18:56:04,791 - INFO - [Eval] step: 44999, bce: 0.283522, psnr: 29.104111, ssim: 0.957416 2023-02-12 18:56:35,241 - INFO - [Train] step: 45099, loss_adv_disc: -1.618816 2023-02-12 18:56:35,451 - INFO - [Train] step: 45099, loss_mpn: 0.019453, loss_rec: 0.023029, loss_semantic: 0.332724, loss_idmrf: 0.997722, loss_adv_gen: -134.765015 
2023-02-12 18:57:04,737 - INFO - [Train] step: 45199, loss_adv_disc: -2.166257 2023-02-12 18:57:04,946 - INFO - [Train] step: 45199, loss_mpn: 0.016220, loss_rec: 0.029939, loss_semantic: 0.354338, loss_idmrf: 1.164202, loss_adv_gen: -138.789719 2023-02-12 18:57:34,232 - INFO - [Train] step: 45299, loss_adv_disc: -2.744868 2023-02-12 18:57:34,442 - INFO - [Train] step: 45299, loss_mpn: 0.006668, loss_rec: 0.023699, loss_semantic: 0.351245, loss_idmrf: 0.852077, loss_adv_gen: -125.358887 2023-02-12 18:58:03,744 - INFO - [Train] step: 45399, loss_adv_disc: -4.544851 2023-02-12 18:58:03,954 - INFO - [Train] step: 45399, loss_mpn: 0.017645, loss_rec: 0.031923, loss_semantic: 0.421283, loss_idmrf: 0.882782, loss_adv_gen: -123.332794 2023-02-12 18:58:33,435 - INFO - [Train] step: 45499, loss_adv_disc: -0.926121 2023-02-12 18:58:33,644 - INFO - [Train] step: 45499, loss_mpn: 0.005050, loss_rec: 0.012667, loss_semantic: 0.232011, loss_idmrf: 0.686866, loss_adv_gen: -119.117294 2023-02-12 18:59:02,936 - INFO - [Train] step: 45599, loss_adv_disc: -0.831104 2023-02-12 18:59:03,146 - INFO - [Train] step: 45599, loss_mpn: 0.006611, loss_rec: 0.014783, loss_semantic: 0.229055, loss_idmrf: 0.813169, loss_adv_gen: -114.963379 2023-02-12 18:59:32,427 - INFO - [Train] step: 45699, loss_adv_disc: -1.295311 2023-02-12 18:59:32,637 - INFO - [Train] step: 45699, loss_mpn: 0.010169, loss_rec: 0.018375, loss_semantic: 0.284788, loss_idmrf: 0.765621, loss_adv_gen: -141.038055 2023-02-12 19:00:01,918 - INFO - [Train] step: 45799, loss_adv_disc: -0.814036 2023-02-12 19:00:02,128 - INFO - [Train] step: 45799, loss_mpn: 0.016879, loss_rec: 0.020380, loss_semantic: 0.335941, loss_idmrf: 1.194481, loss_adv_gen: -140.793274 2023-02-12 19:00:31,412 - INFO - [Train] step: 45899, loss_adv_disc: -3.289245 2023-02-12 19:00:31,622 - INFO - [Train] step: 45899, loss_mpn: 0.008814, loss_rec: 0.030724, loss_semantic: 0.459330, loss_idmrf: 1.103817, loss_adv_gen: -113.740219 2023-02-12 19:01:00,904 - INFO 
- [Train] step: 45999, loss_adv_disc: -1.644310 2023-02-12 19:01:01,113 - INFO - [Train] step: 45999, loss_mpn: 0.005488, loss_rec: 0.015050, loss_semantic: 0.270024, loss_idmrf: 0.989204, loss_adv_gen: -163.489929 2023-02-12 19:01:38,279 - INFO - [Eval] step: 45999, bce: 0.204107, psnr: 29.267492, ssim: 0.957987 2023-02-12 19:02:08,867 - INFO - [Train] step: 46099, loss_adv_disc: -0.558196 2023-02-12 19:02:09,076 - INFO - [Train] step: 46099, loss_mpn: 0.013652, loss_rec: 0.014307, loss_semantic: 0.259496, loss_idmrf: 1.009143, loss_adv_gen: -142.300995 2023-02-12 19:02:38,367 - INFO - [Train] step: 46199, loss_adv_disc: -1.836662 2023-02-12 19:02:38,576 - INFO - [Train] step: 46199, loss_mpn: 0.035741, loss_rec: 0.030679, loss_semantic: 0.402914, loss_idmrf: 2.525560, loss_adv_gen: -145.601227 2023-02-12 19:03:07,868 - INFO - [Train] step: 46299, loss_adv_disc: -1.580151 2023-02-12 19:03:08,077 - INFO - [Train] step: 46299, loss_mpn: 0.012537, loss_rec: 0.022228, loss_semantic: 0.345051, loss_idmrf: 1.168077, loss_adv_gen: -135.664185 2023-02-12 19:03:37,372 - INFO - [Train] step: 46399, loss_adv_disc: -3.798640 2023-02-12 19:03:37,582 - INFO - [Train] step: 46399, loss_mpn: 0.015971, loss_rec: 0.025560, loss_semantic: 0.390543, loss_idmrf: 1.180987, loss_adv_gen: -116.080765 2023-02-12 19:04:06,863 - INFO - [Train] step: 46499, loss_adv_disc: -0.481576 2023-02-12 19:04:07,072 - INFO - [Train] step: 46499, loss_mpn: 0.017854, loss_rec: 0.022975, loss_semantic: 0.327492, loss_idmrf: 0.714899, loss_adv_gen: -92.058472 2023-02-12 19:04:36,368 - INFO - [Train] step: 46599, loss_adv_disc: -2.327599 2023-02-12 19:04:36,577 - INFO - [Train] step: 46599, loss_mpn: 0.017849, loss_rec: 0.027398, loss_semantic: 0.341538, loss_idmrf: 1.298172, loss_adv_gen: -123.930206 2023-02-12 19:05:05,872 - INFO - [Train] step: 46699, loss_adv_disc: -1.269926 2023-02-12 19:05:06,082 - INFO - [Train] step: 46699, loss_mpn: 0.007905, loss_rec: 0.021437, loss_semantic: 0.347730, loss_idmrf: 
0.968422, loss_adv_gen: -144.929443 2023-02-12 19:05:35,367 - INFO - [Train] step: 46799, loss_adv_disc: -1.179983 2023-02-12 19:05:35,576 - INFO - [Train] step: 46799, loss_mpn: 0.022142, loss_rec: 0.024995, loss_semantic: 0.379776, loss_idmrf: 0.902723, loss_adv_gen: -120.207382 2023-02-12 19:06:04,863 - INFO - [Train] step: 46899, loss_adv_disc: -1.672582 2023-02-12 19:06:05,074 - INFO - [Train] step: 46899, loss_mpn: 0.014565, loss_rec: 0.020272, loss_semantic: 0.275730, loss_idmrf: 0.887907, loss_adv_gen: -115.798790 2023-02-12 19:06:34,363 - INFO - [Train] step: 46999, loss_adv_disc: -2.298712 2023-02-12 19:06:34,572 - INFO - [Train] step: 46999, loss_mpn: 0.008810, loss_rec: 0.017283, loss_semantic: 0.278155, loss_idmrf: 1.025754, loss_adv_gen: -139.295609 2023-02-12 19:07:11,744 - INFO - [Eval] step: 46999, bce: 0.253672, psnr: 29.105146, ssim: 0.957524 2023-02-12 19:07:42,080 - INFO - [Train] step: 47099, loss_adv_disc: -5.473932 2023-02-12 19:07:42,289 - INFO - [Train] step: 47099, loss_mpn: 0.015122, loss_rec: 0.030526, loss_semantic: 0.391356, loss_idmrf: 0.526956, loss_adv_gen: -96.352524 2023-02-12 19:08:11,580 - INFO - [Train] step: 47199, loss_adv_disc: -1.417862 2023-02-12 19:08:11,790 - INFO - [Train] step: 47199, loss_mpn: 0.019835, loss_rec: 0.023406, loss_semantic: 0.341328, loss_idmrf: 1.376464, loss_adv_gen: -96.881989 2023-02-12 19:08:41,104 - INFO - [Train] step: 47299, loss_adv_disc: -0.450608 2023-02-12 19:08:41,314 - INFO - [Train] step: 47299, loss_mpn: 0.018413, loss_rec: 0.027004, loss_semantic: 0.369791, loss_idmrf: 1.283139, loss_adv_gen: -139.897705 2023-02-12 19:09:10,604 - INFO - [Train] step: 47399, loss_adv_disc: -1.247165 2023-02-12 19:09:10,812 - INFO - [Train] step: 47399, loss_mpn: 0.011965, loss_rec: 0.018894, loss_semantic: 0.317268, loss_idmrf: 0.687348, loss_adv_gen: -118.171921 2023-02-12 19:09:40,107 - INFO - [Train] step: 47499, loss_adv_disc: -3.375033 2023-02-12 19:09:40,316 - INFO - [Train] step: 47499, loss_mpn: 
0.017998, loss_rec: 0.026264, loss_semantic: 0.379550, loss_idmrf: 1.541121, loss_adv_gen: -126.075363 2023-02-12 19:10:09,608 - INFO - [Train] step: 47599, loss_adv_disc: -1.910500 2023-02-12 19:10:09,817 - INFO - [Train] step: 47599, loss_mpn: 0.022538, loss_rec: 0.033266, loss_semantic: 0.416789, loss_idmrf: 1.432770, loss_adv_gen: -122.585632 2023-02-12 19:10:39,115 - INFO - [Train] step: 47699, loss_adv_disc: -0.333590 2023-02-12 19:10:39,324 - INFO - [Train] step: 47699, loss_mpn: 0.010669, loss_rec: 0.018622, loss_semantic: 0.310768, loss_idmrf: 0.798342, loss_adv_gen: -142.267029 2023-02-12 19:11:08,616 - INFO - [Train] step: 47799, loss_adv_disc: -0.506515 2023-02-12 19:11:08,826 - INFO - [Train] step: 47799, loss_mpn: 0.006181, loss_rec: 0.021344, loss_semantic: 0.335353, loss_idmrf: 1.068593, loss_adv_gen: -124.897713 2023-02-12 19:11:38,129 - INFO - [Train] step: 47899, loss_adv_disc: -2.321455 2023-02-12 19:11:38,339 - INFO - [Train] step: 47899, loss_mpn: 0.014455, loss_rec: 0.025240, loss_semantic: 0.383677, loss_idmrf: 1.107193, loss_adv_gen: -122.216599 2023-02-12 19:12:07,626 - INFO - [Train] step: 47999, loss_adv_disc: -2.556845 2023-02-12 19:12:07,835 - INFO - [Train] step: 47999, loss_mpn: 0.017551, loss_rec: 0.026453, loss_semantic: 0.356316, loss_idmrf: 1.417825, loss_adv_gen: -117.186768 2023-02-12 19:12:45,009 - INFO - [Eval] step: 47999, bce: 0.261834, psnr: 29.265079, ssim: 0.957508 2023-02-12 19:13:15,346 - INFO - [Train] step: 48099, loss_adv_disc: -2.110548 2023-02-12 19:13:15,555 - INFO - [Train] step: 48099, loss_mpn: 0.010889, loss_rec: 0.020481, loss_semantic: 0.303065, loss_idmrf: 0.466718, loss_adv_gen: -138.180313 2023-02-12 19:13:44,854 - INFO - [Train] step: 48199, loss_adv_disc: -2.376373 2023-02-12 19:13:45,063 - INFO - [Train] step: 48199, loss_mpn: 0.007582, loss_rec: 0.020205, loss_semantic: 0.285513, loss_idmrf: 1.043513, loss_adv_gen: -85.389282 2023-02-12 19:14:14,347 - INFO - [Train] step: 48299, loss_adv_disc: 
-1.374611
2023-02-12 19:14:14,556 - INFO - [Train] step: 48299, loss_mpn: 0.007759, loss_rec: 0.021047, loss_semantic: 0.310344, loss_idmrf: 0.933076, loss_adv_gen: -116.241653
2023-02-12 19:14:43,839 - INFO - [Train] step: 48399, loss_adv_disc: -2.311736
2023-02-12 19:14:44,049 - INFO - [Train] step: 48399, loss_mpn: 0.022186, loss_rec: 0.029154, loss_semantic: 0.425822, loss_idmrf: 0.881801, loss_adv_gen: -165.661377
2023-02-12 19:15:13,342 - INFO - [Train] step: 48499, loss_adv_disc: -2.580103
2023-02-12 19:15:13,552 - INFO - [Train] step: 48499, loss_mpn: 0.010942, loss_rec: 0.028834, loss_semantic: 0.405597, loss_idmrf: 2.301194, loss_adv_gen: -107.330780
2023-02-12 19:15:42,841 - INFO - [Train] step: 48599, loss_adv_disc: -0.824483
2023-02-12 19:15:43,051 - INFO - [Train] step: 48599, loss_mpn: 0.012223, loss_rec: 0.019150, loss_semantic: 0.251760, loss_idmrf: 0.694223, loss_adv_gen: -119.733002
2023-02-12 19:16:12,337 - INFO - [Train] step: 48699, loss_adv_disc: -1.411452
2023-02-12 19:16:12,545 - INFO - [Train] step: 48699, loss_mpn: 0.012398, loss_rec: 0.017025, loss_semantic: 0.270724, loss_idmrf: 1.923099, loss_adv_gen: -113.915161
2023-02-12 19:16:41,841 - INFO - [Train] step: 48799, loss_adv_disc: -2.102562
2023-02-12 19:16:42,051 - INFO - [Train] step: 48799, loss_mpn: 0.015094, loss_rec: 0.024003, loss_semantic: 0.344523, loss_idmrf: 0.995992, loss_adv_gen: -118.247231
2023-02-12 19:17:11,336 - INFO - [Train] step: 48899, loss_adv_disc: -1.027884
2023-02-12 19:17:11,545 - INFO - [Train] step: 48899, loss_mpn: 0.014677, loss_rec: 0.025183, loss_semantic: 0.313247, loss_idmrf: 1.630916, loss_adv_gen: -122.107155
2023-02-12 19:17:40,833 - INFO - [Train] step: 48999, loss_adv_disc: -4.285762
2023-02-12 19:17:41,042 - INFO - [Train] step: 48999, loss_mpn: 0.011410, loss_rec: 0.032390, loss_semantic: 0.352806, loss_idmrf: 1.384062, loss_adv_gen: -139.882111
2023-02-12 19:18:18,229 - INFO - [Eval] step: 48999, bce: 0.185750, psnr: 29.401178, ssim: 0.958386
2023-02-12 19:18:48,796 - INFO - [Train] step: 49099, loss_adv_disc: -1.905670
2023-02-12 19:18:49,006 - INFO - [Train] step: 49099, loss_mpn: 0.016775, loss_rec: 0.023903, loss_semantic: 0.322507, loss_idmrf: 0.649161, loss_adv_gen: -112.841377
2023-02-12 19:19:18,304 - INFO - [Train] step: 49199, loss_adv_disc: -2.163928
2023-02-12 19:19:18,513 - INFO - [Train] step: 49199, loss_mpn: 0.004486, loss_rec: 0.019721, loss_semantic: 0.276752, loss_idmrf: 0.822750, loss_adv_gen: -86.521355
2023-02-12 19:19:47,805 - INFO - [Train] step: 49299, loss_adv_disc: -1.868410
2023-02-12 19:19:48,014 - INFO - [Train] step: 49299, loss_mpn: 0.010737, loss_rec: 0.023041, loss_semantic: 0.314100, loss_idmrf: 1.001878, loss_adv_gen: -136.910538
2023-02-12 19:20:17,306 - INFO - [Train] step: 49399, loss_adv_disc: -3.417485
2023-02-12 19:20:17,516 - INFO - [Train] step: 49399, loss_mpn: 0.009672, loss_rec: 0.026141, loss_semantic: 0.342358, loss_idmrf: 0.887028, loss_adv_gen: -97.948143
2023-02-12 19:20:46,808 - INFO - [Train] step: 49499, loss_adv_disc: -1.244874
2023-02-12 19:20:47,017 - INFO - [Train] step: 49499, loss_mpn: 0.022682, loss_rec: 0.024240, loss_semantic: 0.383033, loss_idmrf: 1.708354, loss_adv_gen: -114.859314
2023-02-12 19:21:16,296 - INFO - [Train] step: 49599, loss_adv_disc: -2.553038
2023-02-12 19:21:16,505 - INFO - [Train] step: 49599, loss_mpn: 0.010474, loss_rec: 0.022079, loss_semantic: 0.364744, loss_idmrf: 1.096736, loss_adv_gen: -115.203125
2023-02-12 19:21:45,792 - INFO - [Train] step: 49699, loss_adv_disc: -0.838546
2023-02-12 19:21:46,001 - INFO - [Train] step: 49699, loss_mpn: 0.018769, loss_rec: 0.028353, loss_semantic: 0.408246, loss_idmrf: 1.088397, loss_adv_gen: -131.107605
2023-02-12 19:22:15,283 - INFO - [Train] step: 49799, loss_adv_disc: -2.405115
2023-02-12 19:22:15,493 - INFO - [Train] step: 49799, loss_mpn: 0.024528, loss_rec: 0.025170, loss_semantic: 0.337759, loss_idmrf: 0.678765, loss_adv_gen: -118.921303
2023-02-12 19:22:44,776 - INFO - [Train] step: 49899, loss_adv_disc: -1.891388
2023-02-12 19:22:44,986 - INFO - [Train] step: 49899, loss_mpn: 0.008405, loss_rec: 0.020370, loss_semantic: 0.320439, loss_idmrf: 0.848666, loss_adv_gen: -105.076233
2023-02-12 19:23:14,271 - INFO - [Train] step: 49999, loss_adv_disc: -2.304527
2023-02-12 19:23:14,481 - INFO - [Train] step: 49999, loss_mpn: 0.009999, loss_rec: 0.018110, loss_semantic: 0.288790, loss_idmrf: 1.133276, loss_adv_gen: -90.463264
2023-02-12 19:23:51,646 - INFO - [Eval] step: 49999, bce: 0.237574, psnr: 29.165964, ssim: 0.958170
2023-02-12 19:24:22,091 - INFO - [Train] step: 50099, loss_adv_disc: -2.001171
2023-02-12 19:24:22,302 - INFO - [Train] step: 50099, loss_mpn: 0.047635, loss_rec: 0.026544, loss_semantic: 0.338304, loss_idmrf: 1.879916, loss_adv_gen: -121.646629
2023-02-12 19:24:51,592 - INFO - [Train] step: 50199, loss_adv_disc: -2.856881
2023-02-12 19:24:51,802 - INFO - [Train] step: 50199, loss_mpn: 0.021989, loss_rec: 0.027053, loss_semantic: 0.340269, loss_idmrf: 1.022191, loss_adv_gen: -93.539017
2023-02-12 19:25:21,096 - INFO - [Train] step: 50299, loss_adv_disc: -2.231465
2023-02-12 19:25:21,305 - INFO - [Train] step: 50299, loss_mpn: 0.019300, loss_rec: 0.026914, loss_semantic: 0.345281, loss_idmrf: 0.733344, loss_adv_gen: -95.962479
2023-02-12 19:25:50,594 - INFO - [Train] step: 50399, loss_adv_disc: -0.926012
2023-02-12 19:25:50,803 - INFO - [Train] step: 50399, loss_mpn: 0.031686, loss_rec: 0.033401, loss_semantic: 0.429037, loss_idmrf: 2.494897, loss_adv_gen: -103.491386
2023-02-12 19:26:20,097 - INFO - [Train] step: 50499, loss_adv_disc: -1.408277
2023-02-12 19:26:20,307 - INFO - [Train] step: 50499, loss_mpn: 0.009078, loss_rec: 0.022229, loss_semantic: 0.294701, loss_idmrf: 1.018939, loss_adv_gen: -119.406174
2023-02-12 19:26:49,598 - INFO - [Train] step: 50599, loss_adv_disc: -1.290804
2023-02-12 19:26:49,807 - INFO - [Train] step: 50599, loss_mpn: 0.007198, loss_rec: 0.021514, loss_semantic: 0.311719, loss_idmrf: 0.734038, loss_adv_gen: -108.808784
2023-02-12 19:27:19,101 - INFO - [Train] step: 50699, loss_adv_disc: -1.856999
2023-02-12 19:27:19,312 - INFO - [Train] step: 50699, loss_mpn: 0.017984, loss_rec: 0.028954, loss_semantic: 0.364044, loss_idmrf: 1.311383, loss_adv_gen: -106.070358
2023-02-12 19:27:48,608 - INFO - [Train] step: 50799, loss_adv_disc: -2.041158
2023-02-12 19:27:48,817 - INFO - [Train] step: 50799, loss_mpn: 0.014557, loss_rec: 0.026508, loss_semantic: 0.320085, loss_idmrf: 1.399252, loss_adv_gen: -89.975098
2023-02-12 19:28:18,102 - INFO - [Train] step: 50899, loss_adv_disc: -1.249380
2023-02-12 19:28:18,312 - INFO - [Train] step: 50899, loss_mpn: 0.009905, loss_rec: 0.017467, loss_semantic: 0.313422, loss_idmrf: 1.073002, loss_adv_gen: -92.294327
2023-02-12 19:28:47,613 - INFO - [Train] step: 50999, loss_adv_disc: -2.591040
2023-02-12 19:28:47,823 - INFO - [Train] step: 50999, loss_mpn: 0.005041, loss_rec: 0.020469, loss_semantic: 0.315484, loss_idmrf: 1.245167, loss_adv_gen: -141.676514
2023-02-12 19:29:25,017 - INFO - [Eval] step: 50999, bce: 0.290225, psnr: 29.225630, ssim: 0.958155
2023-02-12 19:29:55,373 - INFO - [Train] step: 51099, loss_adv_disc: -0.296963
2023-02-12 19:29:55,581 - INFO - [Train] step: 51099, loss_mpn: 0.002948, loss_rec: 0.013134, loss_semantic: 0.234765, loss_idmrf: 0.667699, loss_adv_gen: -110.895996
2023-02-12 19:30:24,889 - INFO - [Train] step: 51199, loss_adv_disc: -1.859794
2023-02-12 19:30:25,099 - INFO - [Train] step: 51199, loss_mpn: 0.015886, loss_rec: 0.019499, loss_semantic: 0.322512, loss_idmrf: 0.820203, loss_adv_gen: -51.577698
2023-02-12 19:30:54,391 - INFO - [Train] step: 51299, loss_adv_disc: -0.914640
2023-02-12 19:30:54,602 - INFO - [Train] step: 51299, loss_mpn: 0.008771, loss_rec: 0.018173, loss_semantic: 0.292171, loss_idmrf: 1.109373, loss_adv_gen: -108.367790
2023-02-12 19:31:23,892 - INFO - [Train] step: 51399, loss_adv_disc: -1.505065
2023-02-12 19:31:24,102 - INFO - [Train] step: 51399, loss_mpn: 0.024423, loss_rec: 0.029507, loss_semantic: 0.416032, loss_idmrf: 1.132402, loss_adv_gen: -97.583214
2023-02-12 19:31:53,393 - INFO - [Train] step: 51499, loss_adv_disc: -1.740486
2023-02-12 19:31:53,602 - INFO - [Train] step: 51499, loss_mpn: 0.013225, loss_rec: 0.024096, loss_semantic: 0.355078, loss_idmrf: 0.755905, loss_adv_gen: -78.277763
2023-02-12 19:32:22,874 - INFO - [Train] step: 51599, loss_adv_disc: -2.212143
2023-02-12 19:32:23,083 - INFO - [Train] step: 51599, loss_mpn: 0.006431, loss_rec: 0.021055, loss_semantic: 0.318535, loss_idmrf: 1.391105, loss_adv_gen: -83.720169
2023-02-12 19:32:52,373 - INFO - [Train] step: 51699, loss_adv_disc: -2.427145
2023-02-12 19:32:52,583 - INFO - [Train] step: 51699, loss_mpn: 0.031171, loss_rec: 0.044173, loss_semantic: 0.473553, loss_idmrf: 1.950138, loss_adv_gen: -101.120438
2023-02-12 19:33:21,862 - INFO - [Train] step: 51799, loss_adv_disc: -2.071866
2023-02-12 19:33:22,071 - INFO - [Train] step: 51799, loss_mpn: 0.017028, loss_rec: 0.018801, loss_semantic: 0.314657, loss_idmrf: 1.090089, loss_adv_gen: -139.193634
2023-02-12 19:33:51,364 - INFO - [Train] step: 51899, loss_adv_disc: -1.078505
2023-02-12 19:33:51,573 - INFO - [Train] step: 51899, loss_mpn: 0.009584, loss_rec: 0.022450, loss_semantic: 0.314745, loss_idmrf: 1.011526, loss_adv_gen: -94.887955
2023-02-12 19:34:20,861 - INFO - [Train] step: 51999, loss_adv_disc: -2.614354
2023-02-12 19:34:21,070 - INFO - [Train] step: 51999, loss_mpn: 0.015752, loss_rec: 0.025212, loss_semantic: 0.369025, loss_idmrf: 1.345729, loss_adv_gen: -77.325150
2023-02-12 19:34:58,222 - INFO - [Eval] step: 51999, bce: 0.237634, psnr: 29.204573, ssim: 0.957058
2023-02-12 19:35:28,550 - INFO - [Train] step: 52099, loss_adv_disc: -3.470388
2023-02-12 19:35:28,759 - INFO - [Train] step: 52099, loss_mpn: 0.014506, loss_rec: 0.025014, loss_semantic: 0.380965, loss_idmrf: 1.132447, loss_adv_gen: -103.448784
2023-02-12 19:35:58,055 - INFO - [Train] step: 52199, loss_adv_disc: -2.390130
2023-02-12 19:35:58,264 - INFO - [Train] step: 52199, loss_mpn: 0.005173, loss_rec: 0.020688, loss_semantic: 0.271054, loss_idmrf: 0.777588, loss_adv_gen: -76.866959
2023-02-12 19:36:27,558 - INFO - [Train] step: 52299, loss_adv_disc: -2.220291
2023-02-12 19:36:27,768 - INFO - [Train] step: 52299, loss_mpn: 0.010095, loss_rec: 0.021063, loss_semantic: 0.316981, loss_idmrf: 0.713110, loss_adv_gen: -93.158508
2023-02-12 19:36:57,052 - INFO - [Train] step: 52399, loss_adv_disc: -4.567639
2023-02-12 19:36:57,262 - INFO - [Train] step: 52399, loss_mpn: 0.016775, loss_rec: 0.034063, loss_semantic: 0.413122, loss_idmrf: 1.014567, loss_adv_gen: -69.823128
2023-02-12 19:37:26,553 - INFO - [Train] step: 52499, loss_adv_disc: -1.524404
2023-02-12 19:37:26,762 - INFO - [Train] step: 52499, loss_mpn: 0.009501, loss_rec: 0.020885, loss_semantic: 0.314040, loss_idmrf: 1.669734, loss_adv_gen: -123.930260
2023-02-12 19:37:56,244 - INFO - [Train] step: 52599, loss_adv_disc: -1.203200
2023-02-12 19:37:56,453 - INFO - [Train] step: 52599, loss_mpn: 0.010196, loss_rec: 0.021607, loss_semantic: 0.328519, loss_idmrf: 1.239409, loss_adv_gen: -79.519554
2023-02-12 19:38:25,754 - INFO - [Train] step: 52699, loss_adv_disc: -1.388450
2023-02-12 19:38:25,963 - INFO - [Train] step: 52699, loss_mpn: 0.009088, loss_rec: 0.015277, loss_semantic: 0.258647, loss_idmrf: 0.871864, loss_adv_gen: -112.233932
2023-02-12 19:38:55,242 - INFO - [Train] step: 52799, loss_adv_disc: -1.926094
2023-02-12 19:38:55,451 - INFO - [Train] step: 52799, loss_mpn: 0.012140, loss_rec: 0.021482, loss_semantic: 0.336949, loss_idmrf: 0.705970, loss_adv_gen: -87.094269
2023-02-12 19:39:24,752 - INFO - [Train] step: 52899, loss_adv_disc: -2.244455
2023-02-12 19:39:24,961 - INFO - [Train] step: 52899, loss_mpn: 0.010007, loss_rec: 0.027126, loss_semantic: 0.362129, loss_idmrf: 2.374722, loss_adv_gen: -64.410789
2023-02-12 19:39:54,268 - INFO - [Train] step: 52999, loss_adv_disc: -1.290132
2023-02-12 19:39:54,477 - INFO - [Train] step: 52999, loss_mpn: 0.008291, loss_rec: 0.018043, loss_semantic: 0.305350, loss_idmrf: 0.957197, loss_adv_gen: -88.424088
2023-02-12 19:40:31,670 - INFO - [Eval] step: 52999, bce: 0.212212, psnr: 29.239985, ssim: 0.958323
2023-02-12 19:41:02,008 - INFO - [Train] step: 53099, loss_adv_disc: -2.975368
2023-02-12 19:41:02,217 - INFO - [Train] step: 53099, loss_mpn: 0.012638, loss_rec: 0.032605, loss_semantic: 0.442783, loss_idmrf: 2.058745, loss_adv_gen: -102.036270
2023-02-12 19:41:31,515 - INFO - [Train] step: 53199, loss_adv_disc: -2.679386
2023-02-12 19:41:31,724 - INFO - [Train] step: 53199, loss_mpn: 0.011432, loss_rec: 0.023636, loss_semantic: 0.318617, loss_idmrf: 0.829440, loss_adv_gen: -65.284554
2023-02-12 19:42:01,006 - INFO - [Train] step: 53299, loss_adv_disc: -2.855863
2023-02-12 19:42:01,215 - INFO - [Train] step: 53299, loss_mpn: 0.011267, loss_rec: 0.023803, loss_semantic: 0.361230, loss_idmrf: 1.187395, loss_adv_gen: -120.994247
2023-02-12 19:42:30,529 - INFO - [Train] step: 53399, loss_adv_disc: -3.007375
2023-02-12 19:42:30,740 - INFO - [Train] step: 53399, loss_mpn: 0.009373, loss_rec: 0.023529, loss_semantic: 0.312531, loss_idmrf: 1.221707, loss_adv_gen: -103.945496
2023-02-12 19:43:00,034 - INFO - [Train] step: 53499, loss_adv_disc: -3.264826
2023-02-12 19:43:00,243 - INFO - [Train] step: 53499, loss_mpn: 0.008781, loss_rec: 0.026541, loss_semantic: 0.360730, loss_idmrf: 0.365465, loss_adv_gen: -82.115341
2023-02-12 19:43:29,541 - INFO - [Train] step: 53599, loss_adv_disc: -1.153496
2023-02-12 19:43:29,749 - INFO - [Train] step: 53599, loss_mpn: 0.008889, loss_rec: 0.021301, loss_semantic: 0.296351, loss_idmrf: 1.179010, loss_adv_gen: -67.186821
2023-02-12 19:43:59,034 - INFO - [Train] step: 53699, loss_adv_disc: -0.712955
2023-02-12 19:43:59,244 - INFO - [Train] step: 53699, loss_mpn: 0.010560, loss_rec: 0.015322, loss_semantic: 0.270433, loss_idmrf: 0.775553, loss_adv_gen: -95.843857
2023-02-12 19:44:28,532 - INFO - [Train] step: 53799, loss_adv_disc: -2.854035
2023-02-12 19:44:28,742 - INFO - [Train] step: 53799, loss_mpn: 0.015267, loss_rec: 0.025011, loss_semantic: 0.360371, loss_idmrf: 1.003648, loss_adv_gen: -79.051071
2023-02-12 19:44:58,036 - INFO - [Train] step: 53899, loss_adv_disc: -2.448479
2023-02-12 19:44:58,245 - INFO - [Train] step: 53899, loss_mpn: 0.025840, loss_rec: 0.030666, loss_semantic: 0.446619, loss_idmrf: 1.302438, loss_adv_gen: -74.798492
2023-02-12 19:45:27,539 - INFO - [Train] step: 53999, loss_adv_disc: -4.222551
2023-02-12 19:45:27,748 - INFO - [Train] step: 53999, loss_mpn: 0.027270, loss_rec: 0.026375, loss_semantic: 0.343316, loss_idmrf: 0.869948, loss_adv_gen: -65.655663
2023-02-12 19:46:04,911 - INFO - [Eval] step: 53999, bce: 0.214767, psnr: 29.448793, ssim: 0.958135
2023-02-12 19:46:35,436 - INFO - [Train] step: 54099, loss_adv_disc: -1.364628
2023-02-12 19:46:35,646 - INFO - [Train] step: 54099, loss_mpn: 0.020148, loss_rec: 0.023998, loss_semantic: 0.384975, loss_idmrf: 1.673068, loss_adv_gen: -66.475525
2023-02-12 19:47:04,954 - INFO - [Train] step: 54199, loss_adv_disc: -0.994695
2023-02-12 19:47:05,163 - INFO - [Train] step: 54199, loss_mpn: 0.011587, loss_rec: 0.017862, loss_semantic: 0.283356, loss_idmrf: 0.850865, loss_adv_gen: -74.708221
2023-02-12 19:47:34,461 - INFO - [Train] step: 54299, loss_adv_disc: -1.910192
2023-02-12 19:47:34,670 - INFO - [Train] step: 54299, loss_mpn: 0.013402, loss_rec: 0.019706, loss_semantic: 0.299295, loss_idmrf: 1.083163, loss_adv_gen: -114.929352
2023-02-12 19:48:03,972 - INFO - [Train] step: 54399, loss_adv_disc: -3.115757
2023-02-12 19:48:04,181 - INFO - [Train] step: 54399, loss_mpn: 0.012010, loss_rec: 0.023797, loss_semantic: 0.354180, loss_idmrf: 0.678295, loss_adv_gen: -91.600525
2023-02-12 19:48:33,477 - INFO - [Train] step: 54499, loss_adv_disc: -1.065683
2023-02-12 19:48:33,687 - INFO - [Train] step: 54499, loss_mpn: 0.008158, loss_rec: 0.012532, loss_semantic: 0.237641, loss_idmrf: 0.742880, loss_adv_gen: -98.203590
2023-02-12 19:49:02,982 - INFO - [Train] step: 54599, loss_adv_disc: -1.416939
2023-02-12 19:49:03,192 - INFO - [Train] step: 54599, loss_mpn: 0.007111, loss_rec: 0.020124, loss_semantic: 0.318212, loss_idmrf: 1.238653, loss_adv_gen: -92.879845
2023-02-12 19:49:32,489 - INFO - [Train] step: 54699, loss_adv_disc: -3.411561
2023-02-12 19:49:32,698 - INFO - [Train] step: 54699, loss_mpn: 0.018246, loss_rec: 0.030389, loss_semantic: 0.409694, loss_idmrf: 2.028714, loss_adv_gen: -104.348022
2023-02-12 19:50:01,985 - INFO - [Train] step: 54799, loss_adv_disc: -3.946653
2023-02-12 19:50:02,194 - INFO - [Train] step: 54799, loss_mpn: 0.040976, loss_rec: 0.048790, loss_semantic: 0.528529, loss_idmrf: 1.803460, loss_adv_gen: -111.905914
2023-02-12 19:50:31,479 - INFO - [Train] step: 54899, loss_adv_disc: -1.838722
2023-02-12 19:50:31,688 - INFO - [Train] step: 54899, loss_mpn: 0.016595, loss_rec: 0.014711, loss_semantic: 0.248383, loss_idmrf: 0.660278, loss_adv_gen: -105.343994
2023-02-12 19:51:00,976 - INFO - [Train] step: 54999, loss_adv_disc: -2.886314
2023-02-12 19:51:01,185 - INFO - [Train] step: 54999, loss_mpn: 0.016061, loss_rec: 0.029713, loss_semantic: 0.351803, loss_idmrf: 0.655793, loss_adv_gen: -57.752594
2023-02-12 19:51:38,372 - INFO - [Eval] step: 54999, bce: 0.187451, psnr: 29.354498, ssim: 0.958257
2023-02-12 19:52:08,833 - INFO - [Train] step: 55099, loss_adv_disc: -1.533258
2023-02-12 19:52:09,042 - INFO - [Train] step: 55099, loss_mpn: 0.015485, loss_rec: 0.016981, loss_semantic: 0.301325, loss_idmrf: 0.751906, loss_adv_gen: -105.179306
2023-02-12 19:52:38,349 - INFO - [Train] step: 55199, loss_adv_disc: -3.465842
2023-02-12 19:52:38,558 - INFO - [Train] step: 55199, loss_mpn: 0.012754, loss_rec: 0.030780, loss_semantic: 0.383153, loss_idmrf: 1.355478, loss_adv_gen: -78.447098
2023-02-12 19:53:07,845 - INFO - [Train] step: 55299, loss_adv_disc: -3.589386
2023-02-12 19:53:08,054 - INFO - [Train] step: 55299, loss_mpn: 0.011801, loss_rec: 0.021169, loss_semantic: 0.301335, loss_idmrf: 0.648533, loss_adv_gen: -66.107346
2023-02-12 19:53:37,359 - INFO - [Train] step: 55399, loss_adv_disc: -0.444913
2023-02-12 19:53:37,568 - INFO - [Train] step: 55399, loss_mpn: 0.006815, loss_rec: 0.016888, loss_semantic: 0.291594, loss_idmrf: 1.143174, loss_adv_gen: -57.573685
2023-02-12 19:54:06,861 - INFO - [Train] step: 55499, loss_adv_disc: -0.849530
2023-02-12 19:54:07,070 - INFO - [Train] step: 55499, loss_mpn: 0.035110, loss_rec: 0.023963, loss_semantic: 0.329210, loss_idmrf: 0.966587, loss_adv_gen: -90.655457
2023-02-12 19:54:36,375 - INFO - [Train] step: 55599, loss_adv_disc: -2.464707
2023-02-12 19:54:36,585 - INFO - [Train] step: 55599, loss_mpn: 0.016308, loss_rec: 0.018391, loss_semantic: 0.288575, loss_idmrf: 0.786954, loss_adv_gen: -88.950851
2023-02-12 19:55:05,881 - INFO - [Train] step: 55699, loss_adv_disc: -4.033987
2023-02-12 19:55:06,090 - INFO - [Train] step: 55699, loss_mpn: 0.017409, loss_rec: 0.033368, loss_semantic: 0.404971, loss_idmrf: 1.066720, loss_adv_gen: -98.599152
2023-02-12 19:55:35,399 - INFO - [Train] step: 55799, loss_adv_disc: -3.560221
2023-02-12 19:55:35,607 - INFO - [Train] step: 55799, loss_mpn: 0.020592, loss_rec: 0.027303, loss_semantic: 0.327827, loss_idmrf: 0.871356, loss_adv_gen: -105.081360
2023-02-12 19:56:04,907 - INFO - [Train] step: 55899, loss_adv_disc: -3.305433
2023-02-12 19:56:05,116 - INFO - [Train] step: 55899, loss_mpn: 0.020567, loss_rec: 0.027804, loss_semantic: 0.367570, loss_idmrf: 0.845386, loss_adv_gen: -88.347977
2023-02-12 19:56:34,417 - INFO - [Train] step: 55999, loss_adv_disc: -1.323038
2023-02-12 19:56:34,626 - INFO - [Train] step: 55999, loss_mpn: 0.013041, loss_rec: 0.018861, loss_semantic: 0.312818, loss_idmrf: 1.000875, loss_adv_gen: -57.947937
2023-02-12 19:57:11,790 - INFO - [Eval] step: 55999, bce: 0.187632, psnr: 29.502460, ssim: 0.958651
2023-02-12 19:57:42,338 - INFO - [Train] step: 56099, loss_adv_disc: -2.082334
2023-02-12 19:57:42,547 - INFO - [Train] step: 56099, loss_mpn: 0.007763, loss_rec: 0.019574, loss_semantic: 0.290304, loss_idmrf: 0.708374, loss_adv_gen: -78.708603
2023-02-12 19:58:11,856 - INFO - [Train] step: 56199, loss_adv_disc: -4.655578
2023-02-12 19:58:12,067 - INFO - [Train] step: 56199, loss_mpn: 0.016059, loss_rec: 0.034596, loss_semantic: 0.377282, loss_idmrf: 1.203077, loss_adv_gen: -90.103912
2023-02-12 19:58:41,382 - INFO - [Train] step: 56299, loss_adv_disc: -1.711516
2023-02-12 19:58:41,591 - INFO - [Train] step: 56299, loss_mpn: 0.014362, loss_rec: 0.029764, loss_semantic: 0.339368, loss_idmrf: 1.080772, loss_adv_gen: -73.854462
2023-02-12 19:59:10,899 - INFO - [Train] step: 56399, loss_adv_disc: -0.968657
2023-02-12 19:59:11,107 - INFO - [Train] step: 56399, loss_mpn: 0.003675, loss_rec: 0.014704, loss_semantic: 0.254786, loss_idmrf: 0.513764, loss_adv_gen: -110.156570
2023-02-12 19:59:40,413 - INFO - [Train] step: 56499, loss_adv_disc: -1.634028
2023-02-12 19:59:40,623 - INFO - [Train] step: 56499, loss_mpn: 0.008245, loss_rec: 0.021549, loss_semantic: 0.328980, loss_idmrf: 1.381191, loss_adv_gen: -83.735260
2023-02-12 20:00:09,916 - INFO - [Train] step: 56599, loss_adv_disc: -2.758683
2023-02-12 20:00:10,126 - INFO - [Train] step: 56599, loss_mpn: 0.011223, loss_rec: 0.024394, loss_semantic: 0.345085, loss_idmrf: 1.576615, loss_adv_gen: -81.840546
2023-02-12 20:00:39,413 - INFO - [Train] step: 56699, loss_adv_disc: -2.262849
2023-02-12 20:00:39,623 - INFO - [Train] step: 56699, loss_mpn: 0.015749, loss_rec: 0.021547, loss_semantic: 0.336734, loss_idmrf: 0.806564, loss_adv_gen: -68.612915
2023-02-12 20:01:08,918 - INFO - [Train] step: 56799, loss_adv_disc: -2.795112
2023-02-12 20:01:09,129 - INFO - [Train] step: 56799, loss_mpn: 0.021247, loss_rec: 0.028694, loss_semantic: 0.380349, loss_idmrf: 1.434934, loss_adv_gen: -71.875458
2023-02-12 20:01:38,425 - INFO - [Train] step: 56899, loss_adv_disc: -0.868286
2023-02-12 20:01:38,636 - INFO - [Train] step: 56899, loss_mpn: 0.015993, loss_rec: 0.020003, loss_semantic: 0.331811, loss_idmrf: 1.457388, loss_adv_gen: -58.831833
2023-02-12 20:02:07,926 - INFO - [Train] step: 56999, loss_adv_disc: -0.789754
2023-02-12 20:02:08,134 - INFO - [Train] step: 56999, loss_mpn: 0.016638, loss_rec: 0.021772, loss_semantic: 0.344716, loss_idmrf: 1.508822, loss_adv_gen: -74.623550
2023-02-12 20:02:45,290 - INFO - [Eval] step: 56999, bce: 0.227397, psnr: 29.391455, ssim: 0.958339
2023-02-12 20:03:15,626 - INFO - [Train] step: 57099, loss_adv_disc: -2.761654
2023-02-12 20:03:15,835 - INFO - [Train] step: 57099, loss_mpn: 0.009793, loss_rec: 0.024114, loss_semantic: 0.355766, loss_idmrf: 0.817144, loss_adv_gen: -116.768173
2023-02-12 20:03:45,138 - INFO - [Train] step: 57199, loss_adv_disc: -2.712404
2023-02-12 20:03:45,347 - INFO - [Train] step: 57199, loss_mpn: 0.018073, loss_rec: 0.029034, loss_semantic: 0.364290, loss_idmrf: 1.789201, loss_adv_gen: -60.002716
2023-02-12 20:04:14,656 - INFO - [Train] step: 57299, loss_adv_disc: -1.350332
2023-02-12 20:04:14,866 - INFO - [Train] step: 57299, loss_mpn: 0.010033, loss_rec: 0.023826, loss_semantic: 0.356732, loss_idmrf: 1.198814, loss_adv_gen: -67.250397
2023-02-12 20:04:44,155 - INFO - [Train] step: 57399, loss_adv_disc: -2.416155
2023-02-12 20:04:44,364 - INFO - [Train] step: 57399, loss_mpn: 0.013665, loss_rec: 0.020868, loss_semantic: 0.367269, loss_idmrf: 2.106461, loss_adv_gen: -95.285416
2023-02-12 20:05:13,663 - INFO - [Train] step: 57499, loss_adv_disc: -1.196605
2023-02-12 20:05:13,872 - INFO - [Train] step: 57499, loss_mpn: 0.012793, loss_rec: 0.016853, loss_semantic: 0.280076, loss_idmrf: 0.626047, loss_adv_gen: -40.531693
2023-02-12 20:05:43,172 - INFO - [Train] step: 57599, loss_adv_disc: -2.223887
2023-02-12 20:05:43,382 - INFO - [Train] step: 57599, loss_mpn: 0.014655, loss_rec: 0.022555, loss_semantic: 0.316974, loss_idmrf: 1.277864, loss_adv_gen: -100.258362
2023-02-12 20:06:12,684 - INFO - [Train] step: 57699, loss_adv_disc: -2.962488
2023-02-12 20:06:12,894 - INFO - [Train] step: 57699, loss_mpn: 0.010846, loss_rec: 0.023274, loss_semantic: 0.341277, loss_idmrf: 1.869294, loss_adv_gen: -127.701057
2023-02-12 20:06:42,188 - INFO - [Train] step: 57799, loss_adv_disc: -0.885595
2023-02-12 20:06:42,397 - INFO - [Train] step: 57799, loss_mpn: 0.018584, loss_rec: 0.022804, loss_semantic: 0.375528, loss_idmrf: 1.218095, loss_adv_gen: -93.293213
2023-02-12 20:07:11,693 - INFO - [Train] step: 57899, loss_adv_disc: -1.538133
2023-02-12 20:07:11,902 - INFO - [Train] step: 57899, loss_mpn: 0.013996, loss_rec: 0.020593, loss_semantic: 0.328018, loss_idmrf: 1.170421, loss_adv_gen: -106.219528
2023-02-12 20:07:41,206 - INFO - [Train] step: 57999, loss_adv_disc: -2.256503
2023-02-12 20:07:41,416 - INFO - [Train] step: 57999, loss_mpn: 0.032725, loss_rec: 0.031700, loss_semantic: 0.431627, loss_idmrf: 1.417008, loss_adv_gen: -79.556915
2023-02-12 20:08:18,583 - INFO - [Eval] step: 57999, bce: 0.227314, psnr: 29.398300, ssim: 0.958094
2023-02-12 20:08:48,927 - INFO - [Train] step: 58099, loss_adv_disc: -2.380032
2023-02-12 20:08:49,136 - INFO - [Train] step: 58099, loss_mpn: 0.012436, loss_rec: 0.018568, loss_semantic: 0.307035, loss_idmrf: 1.102854, loss_adv_gen: -77.851387
2023-02-12 20:09:18,435 - INFO - [Train] step: 58199, loss_adv_disc: -1.628753
2023-02-12 20:09:18,646 - INFO - [Train] step: 58199, loss_mpn: 0.014636, loss_rec: 0.018766, loss_semantic: 0.292987, loss_idmrf: 0.811472, loss_adv_gen: -89.222725
2023-02-12 20:09:47,946 - INFO - [Train] step: 58299, loss_adv_disc: -2.451613
2023-02-12 20:09:48,155 - INFO - [Train] step: 58299, loss_mpn: 0.011714, loss_rec: 0.023199, loss_semantic: 0.337949, loss_idmrf: 0.987955, loss_adv_gen: -87.135071
2023-02-12 20:10:17,452 - INFO - [Train] step: 58399, loss_adv_disc: -1.320832
2023-02-12 20:10:17,661 - INFO - [Train] step: 58399, loss_mpn: 0.015208, loss_rec: 0.021054, loss_semantic: 0.318262, loss_idmrf: 0.818400, loss_adv_gen: -93.065689
2023-02-12 20:10:46,942 - INFO - [Train] step: 58499, loss_adv_disc: -2.424925
2023-02-12 20:10:47,152 - INFO - [Train] step: 58499, loss_mpn: 0.019880, loss_rec: 0.020395, loss_semantic: 0.326860, loss_idmrf: 0.725335, loss_adv_gen: -132.200989
2023-02-12 20:11:16,444 - INFO - [Train] step: 58599, loss_adv_disc: -2.163678
2023-02-12 20:11:16,653 - INFO - [Train] step: 58599, loss_mpn: 0.007345, loss_rec: 0.019984, loss_semantic: 0.308711, loss_idmrf: 0.656295, loss_adv_gen: -97.064125
2023-02-12 20:11:45,942 - INFO - [Train] step: 58699, loss_adv_disc: -2.172320
2023-02-12 20:11:46,151 - INFO - [Train] step: 58699, loss_mpn: 0.019341, loss_rec: 0.020081, loss_semantic: 0.301307, loss_idmrf: 1.147762, loss_adv_gen: -69.637413
2023-02-12 20:12:15,450 - INFO - [Train] step: 58799, loss_adv_disc: -2.398038
2023-02-12 20:12:15,659 - INFO - [Train] step: 58799, loss_mpn: 0.012404, loss_rec: 0.024142, loss_semantic: 0.366488, loss_idmrf: 0.639623, loss_adv_gen: -93.978035
2023-02-12 20:12:44,958 - INFO - [Train] step: 58899, loss_adv_disc: -1.669145
2023-02-12 20:12:45,167 - INFO - [Train] step: 58899, loss_mpn: 0.011152, loss_rec: 0.024529, loss_semantic: 0.352448, loss_idmrf: 1.038370, loss_adv_gen: -97.385864
2023-02-12 20:13:14,457 - INFO - [Train] step: 58999, loss_adv_disc: -0.343768
2023-02-12 20:13:14,667 - INFO - [Train] step: 58999, loss_mpn: 0.005202, loss_rec: 0.014032, loss_semantic: 0.272366, loss_idmrf: 1.193827, loss_adv_gen: -81.807922
2023-02-12 20:13:51,847 - INFO - [Eval] step: 58999, bce: 0.222707, psnr: 29.370173, ssim: 0.958634
2023-02-12 20:14:22,187 - INFO - [Train] step: 59099, loss_adv_disc: -2.458357
2023-02-12 20:14:22,396 - INFO - [Train] step: 59099, loss_mpn: 0.014192, loss_rec: 0.023307, loss_semantic: 0.347817, loss_idmrf: 1.364341, loss_adv_gen: -74.272797
2023-02-12 20:14:51,691 - INFO - [Train] step: 59199, loss_adv_disc: -0.302998
2023-02-12 20:14:51,902 - INFO - [Train] step: 59199, loss_mpn: 0.016307, loss_rec: 0.024369, loss_semantic: 0.373546, loss_idmrf: 1.060199, loss_adv_gen: -120.296944
2023-02-12 20:15:21,206 - INFO - [Train] step: 59299, loss_adv_disc: -1.755460
2023-02-12 20:15:21,416 - INFO - [Train] step: 59299, loss_mpn: 0.008886, loss_rec: 0.016614, loss_semantic: 0.262822, loss_idmrf: 0.669115, loss_adv_gen: -53.649994
2023-02-12 20:15:50,712 - INFO - [Train] step: 59399, loss_adv_disc: -1.033675
2023-02-12 20:15:50,922 - INFO - [Train] step: 59399, loss_mpn: 0.010227, loss_rec: 0.024849, loss_semantic: 0.318690, loss_idmrf: 1.509957, loss_adv_gen: -112.534660
2023-02-12 20:16:20,217 - INFO - [Train] step: 59499, loss_adv_disc: -2.955933
2023-02-12 20:16:20,427 - INFO - [Train] step: 59499, loss_mpn: 0.017206, loss_rec: 0.028843, loss_semantic: 0.376678, loss_idmrf: 1.224646, loss_adv_gen: -116.756096
2023-02-12 20:16:49,716 - INFO - [Train] step: 59599, loss_adv_disc: -2.778207
2023-02-12 20:16:49,925 - INFO - [Train] step: 59599, loss_mpn: 0.008924, loss_rec: 0.026379, loss_semantic: 0.347794, loss_idmrf: 1.999091, loss_adv_gen: -94.547531
2023-02-12 20:17:19,425 - INFO - [Train] step: 59699, loss_adv_disc: -1.696153
2023-02-12 20:17:19,635 - INFO - [Train] step: 59699, loss_mpn: 0.006386, loss_rec: 0.014135, loss_semantic: 0.236222, loss_idmrf: 0.958720, loss_adv_gen: -41.851273
2023-02-12 20:17:48,922 - INFO - [Train] step: 59799, loss_adv_disc: -1.955612
2023-02-12 20:17:49,132 - INFO - [Train] step: 59799, loss_mpn: 0.018486, loss_rec: 0.024579, loss_semantic: 0.362311, loss_idmrf: 1.577052, loss_adv_gen: -99.243805
2023-02-12 20:18:18,434 - INFO - [Train] step: 59899, loss_adv_disc: -2.187907
2023-02-12 20:18:18,643 - INFO - [Train] step: 59899, loss_mpn: 0.020480, loss_rec: 0.023331, loss_semantic: 0.372383, loss_idmrf: 1.526654, loss_adv_gen: -81.266197
2023-02-12 20:18:47,942 - INFO - [Train] step: 59999, loss_adv_disc: -1.475896
2023-02-12 20:18:48,151 - INFO - [Train] step: 59999, loss_mpn: 0.019685, loss_rec: 0.020596, loss_semantic: 0.289600, loss_idmrf: 1.476321, loss_adv_gen: -94.476212
2023-02-12 20:19:25,315 - INFO - [Eval] step: 59999, bce: 0.294605, psnr: 29.210405, ssim: 0.958446
2023-02-12 20:19:55,771 - INFO - [Train] step: 60099, loss_adv_disc: -4.445138
2023-02-12 20:19:55,981 - INFO - [Train] step: 60099, loss_mpn: 0.009855, loss_rec: 0.023556, loss_semantic: 0.333127, loss_idmrf: 1.284974, loss_adv_gen: -47.594528
2023-02-12 20:20:25,275 - INFO - [Train] step: 60199, loss_adv_disc: -1.911898
2023-02-12 20:20:25,484 - INFO - [Train] step: 60199, loss_mpn: 0.012971, loss_rec: 0.019928, loss_semantic: 0.312153, loss_idmrf: 1.257880, loss_adv_gen: -66.287033
2023-02-12 20:20:54,776 - INFO - [Train] step: 60299, loss_adv_disc: -2.587398
2023-02-12 20:20:54,985 - INFO - [Train] step: 60299, loss_mpn: 0.018533, loss_rec: 0.025809, loss_semantic: 0.364575, loss_idmrf: 1.504420, loss_adv_gen: -70.975052
2023-02-12 20:21:24,277 - INFO - [Train] step: 60399, loss_adv_disc: -1.135465
2023-02-12 20:21:24,486 - INFO - [Train] step: 60399, loss_mpn: 0.011203, loss_rec: 0.017365, loss_semantic: 0.291439, loss_idmrf: 1.237777, loss_adv_gen: -84.856743
2023-02-12 20:21:53,783 - INFO - [Train] step: 60499, loss_adv_disc: -1.112794
2023-02-12 20:21:53,993 - INFO - [Train] step: 60499, loss_mpn: 0.004172, loss_rec: 0.012766, loss_semantic: 0.237006, loss_idmrf: 0.853736, loss_adv_gen: -101.836243
2023-02-12 20:22:23,302 - INFO - [Train] step: 60599, loss_adv_disc: -0.455096
2023-02-12 20:22:23,512 - INFO - [Train] step: 60599, loss_mpn: 0.014553, loss_rec: 0.020710, loss_semantic: 0.289077, loss_idmrf: 1.196774, loss_adv_gen: -83.260567
2023-02-12 20:22:52,795 - INFO - [Train] step: 60699, loss_adv_disc: -1.318252
2023-02-12 20:22:53,006 - INFO - [Train] step: 60699, loss_mpn: 0.003901, loss_rec: 0.015229, loss_semantic: 0.227469, loss_idmrf: 1.332391, loss_adv_gen: -94.741196
2023-02-12 20:23:22,309 - INFO - [Train] step: 60799, loss_adv_disc: -2.564342
2023-02-12 20:23:22,518 - INFO - [Train] step: 60799, loss_mpn: 0.021271, loss_rec: 0.023038, loss_semantic: 0.343042, loss_idmrf: 2.344107, loss_adv_gen: -55.702553
2023-02-12 20:23:51,813 - INFO - [Train] step: 60899, loss_adv_disc: -1.988332
2023-02-12 20:23:52,022 - INFO - [Train] step: 60899, loss_mpn: 0.011553, loss_rec: 0.022858, loss_semantic: 0.323586, loss_idmrf: 2.686563, loss_adv_gen: -55.407013
2023-02-12 20:24:21,309 - INFO - [Train] step: 60999, loss_adv_disc: -1.780610
2023-02-12 20:24:21,519 - INFO - [Train] step: 60999, loss_mpn: 0.011205, loss_rec: 0.021297, loss_semantic: 0.299008, loss_idmrf: 0.900403, loss_adv_gen: -75.799774
2023-02-12 20:24:58,718 - INFO - [Eval] step: 60999, bce: 0.227459, psnr: 29.360298, ssim: 0.957763
2023-02-12 20:25:29,055 - INFO - [Train] step: 61099, loss_adv_disc: -1.933060
2023-02-12 20:25:29,266 - INFO - [Train] step: 61099, loss_mpn: 0.021871, loss_rec: 0.024718, loss_semantic: 0.355495, loss_idmrf: 1.837337, loss_adv_gen: -35.882965
2023-02-12 20:25:58,559 - INFO - [Train] step: 61199, loss_adv_disc: -1.892022
2023-02-12 20:25:58,768 - INFO - [Train] step: 61199, loss_mpn: 0.027481, loss_rec: 0.028273, loss_semantic: 0.373819, loss_idmrf: 1.440577, loss_adv_gen: -97.507858
2023-02-12 20:26:28,062 - INFO - [Train] step: 61299, loss_adv_disc: -3.394790
2023-02-12 20:26:28,271 - INFO - [Train] step: 61299, loss_mpn: 0.006460, loss_rec: 0.021970, loss_semantic: 0.311478, loss_idmrf: 0.702727, loss_adv_gen: -84.326775
2023-02-12 20:26:57,554 - INFO - [Train] step: 61399, loss_adv_disc: -3.564257
2023-02-12 20:26:57,763 - INFO - [Train] step: 61399, loss_mpn: 0.008657, loss_rec: 0.023075, loss_semantic: 0.300943, loss_idmrf: 0.893494, loss_adv_gen: -86.416351
2023-02-12 20:27:27,055 - INFO - [Train] step: 61499, loss_adv_disc: -1.294622
2023-02-12 20:27:27,266 - INFO - [Train] step: 61499, loss_mpn: 0.014906, loss_rec: 0.026015, loss_semantic: 0.385390, loss_idmrf: 1.396772, loss_adv_gen: -67.483665
2023-02-12 20:27:56,552 - INFO - [Train] step: 61599, loss_adv_disc: -3.446497
2023-02-12 20:27:56,763 - INFO - [Train] step: 61599, loss_mpn: 0.012878, loss_rec: 0.024672, loss_semantic: 0.351953, loss_idmrf: 0.955452, loss_adv_gen: -75.795654
2023-02-12 20:28:26,060 - INFO - [Train] step: 61699, loss_adv_disc: -1.949156
2023-02-12 20:28:26,269 - INFO - [Train] step: 61699, loss_mpn: 0.022376, loss_rec: 0.025792, loss_semantic: 0.399170, loss_idmrf: 2.766810, loss_adv_gen: -55.261978
2023-02-12 20:28:55,559 - INFO - [Train] step: 61799, loss_adv_disc: -1.695251
2023-02-12 20:28:55,768 - INFO - [Train] step: 61799, loss_mpn: 0.007467, loss_rec: 0.017843, loss_semantic: 0.298160, loss_idmrf: 0.966411, loss_adv_gen: -67.085068
2023-02-12 20:29:25,071 - INFO - [Train] step: 61899, loss_adv_disc: -2.691393
2023-02-12 20:29:25,280 - INFO - [Train] step: 61899, loss_mpn: 0.010678, loss_rec: 0.019842, loss_semantic: 0.310016, loss_idmrf: 1.737013, loss_adv_gen: -87.385956
2023-02-12 20:29:54,577 - INFO - [Train] step: 61999, loss_adv_disc: -6.525076
2023-02-12 20:29:54,788 - INFO - [Train] step: 61999, loss_mpn: 0.018335, loss_rec: 0.038500, loss_semantic: 0.458143, loss_idmrf: 1.262014, loss_adv_gen: -86.927711
2023-02-12 20:30:31,954 - INFO - [Eval] step: 61999, bce: 0.207327, psnr: 29.306715, ssim: 0.958894
2023-02-12 20:31:02,284 - INFO - [Train] step: 62099, loss_adv_disc: -2.654778
2023-02-12 20:31:02,494 - INFO - [Train] step: 62099, loss_mpn: 0.012052, loss_rec: 0.022301, loss_semantic: 0.314478, loss_idmrf: 1.095222, loss_adv_gen: -46.431961
2023-02-12 20:31:31,815 - INFO - [Train] step: 62199, loss_adv_disc: -1.044177
2023-02-12 20:31:32,024 - INFO - [Train] step: 62199, loss_mpn: 0.010944, loss_rec: 0.020925, loss_semantic: 0.326426, loss_idmrf: 1.198376, loss_adv_gen: -23.568802
2023-02-12 20:32:01,324 - INFO - [Train] step: 62299, loss_adv_disc: -2.465047
2023-02-12 20:32:01,535 - INFO - [Train] step: 62299, loss_mpn: 0.007034, loss_rec: 0.019141, loss_semantic: 0.278972, loss_idmrf: 0.931390, loss_adv_gen: -71.468369
2023-02-12 20:32:30,836 - INFO - [Train] step: 62399, loss_adv_disc: -2.557241
2023-02-12 20:32:31,045 - INFO - [Train] step: 62399, loss_mpn: 0.021502, loss_rec: 0.024414, loss_semantic: 0.333618, loss_idmrf: 1.053975, loss_adv_gen: -72.943253
2023-02-12 20:33:00,347 - INFO - [Train] step: 62499, loss_adv_disc: -1.476736
2023-02-12 20:33:00,557 - INFO - [Train] step: 62499, loss_mpn: 0.013999, loss_rec: 0.025088, loss_semantic: 0.348540, loss_idmrf: 1.119727, loss_adv_gen: -62.634567
2023-02-12 20:33:29,857 - INFO - [Train] step: 62599, loss_adv_disc: -1.633311
2023-02-12 20:33:30,066 - INFO - [Train] step: 62599, loss_mpn: 0.011216, loss_rec: 0.016783, loss_semantic: 0.287028, loss_idmrf: 0.913877, loss_adv_gen: -67.804581
2023-02-12 20:33:59,367 - INFO - [Train] step: 62699, loss_adv_disc: -0.799928
2023-02-12 20:33:59,577 - INFO - [Train] step: 62699, loss_mpn: 0.015715, loss_rec: 0.020131, loss_semantic: 0.327887, loss_idmrf: 1.319197, loss_adv_gen: -63.491653
2023-02-12 20:34:28,879 - INFO - [Train] step: 62799, loss_adv_disc: -1.870113
2023-02-12 20:34:29,088 - INFO - [Train] step: 62799, loss_mpn: 0.008999, loss_rec: 0.023888, loss_semantic: 0.346071, loss_idmrf: 0.875023, loss_adv_gen: -34.726059
2023-02-12 20:34:58,380 - INFO - [Train] step: 62899, loss_adv_disc: -3.187500
2023-02-12 20:34:58,590 - INFO - [Train] step: 62899, loss_mpn: 0.011187, loss_rec: 0.022208, loss_semantic: 0.338415, loss_idmrf: 1.048016, loss_adv_gen: -57.468338
2023-02-12 20:35:27,888 - INFO - [Train] step: 62999, loss_adv_disc: -2.497921
2023-02-12 20:35:28,098 - INFO - [Train] step: 62999, loss_mpn: 0.008737, loss_rec: 0.018970, loss_semantic: 0.272150, loss_idmrf: 0.492711, loss_adv_gen: -65.404938
2023-02-12 20:36:05,290 - INFO - [Eval] step: 62999, bce: 0.188681, psnr: 29.341843, ssim: 0.958179
2023-02-12 20:36:35,600 - INFO - [Train] step: 63099, loss_adv_disc: -2.439116
2023-02-12 20:36:35,809 - INFO - [Train] step: 63099, loss_mpn: 0.018291, loss_rec: 0.029292, loss_semantic: 0.380904, loss_idmrf: 1.097306, loss_adv_gen: -74.730591
2023-02-12 20:37:05,112 - INFO - [Train] step: 63199, loss_adv_disc: -0.907376
2023-02-12 20:37:05,321 - INFO - [Train] step: 63199, loss_mpn: 0.022502, loss_rec: 0.018327, loss_semantic: 0.347268, loss_idmrf: 1.358313, loss_adv_gen: -84.198227
2023-02-12 20:37:34,622 - INFO - [Train] step: 63299, loss_adv_disc: -2.138724
2023-02-12 20:37:34,831 - INFO - [Train] step: 63299, loss_mpn: 0.008526, loss_rec: 0.023349, loss_semantic: 0.338574, loss_idmrf: 1.171306, loss_adv_gen: -91.489304
2023-02-12 20:38:04,110 - INFO - [Train] step: 63399, loss_adv_disc: -1.983230
2023-02-12 20:38:04,319 - INFO - [Train] step: 63399, loss_mpn: 0.013409, loss_rec: 0.023461, loss_semantic: 0.325338, loss_idmrf: 1.292459, loss_adv_gen: -75.881638
2023-02-12 20:38:33,613 - INFO - [Train] step: 63499, loss_adv_disc: -2.312161
2023-02-12 20:38:33,823 - INFO - [Train] step: 63499, loss_mpn: 0.008897, loss_rec: 0.016473, loss_semantic: 0.281240, loss_idmrf: 0.840100, loss_adv_gen: -93.070404
2023-02-12 20:39:03,123 - INFO - [Train] step: 63599, loss_adv_disc: -1.800785
2023-02-12 20:39:03,335 - INFO - [Train] step: 63599, loss_mpn: 0.012079, loss_rec: 0.024364, loss_semantic: 0.325525, loss_idmrf: 1.194067, loss_adv_gen: -70.242859
2023-02-12 20:39:32,639 - INFO - [Train] step: 63699, loss_adv_disc: -3.595148
2023-02-12 20:39:32,848 - INFO - [Train] step: 63699, loss_mpn: 0.046786, loss_rec: 0.029651, loss_semantic: 0.346434, loss_idmrf: 1.699037, loss_adv_gen: -57.286613
2023-02-12 20:40:02,154 - INFO - [Train] step: 63799, loss_adv_disc: -1.404692
2023-02-12 20:40:02,364 - INFO - [Train] step: 63799, loss_mpn: 0.012211, loss_rec: 0.018689, loss_semantic: 0.266198, loss_idmrf: 1.230947, loss_adv_gen: -57.965668
2023-02-12 20:40:31,670 - INFO - [Train] step: 63899, loss_adv_disc: -3.482700
2023-02-12 20:40:31,879 - INFO - [Train] step: 63899, loss_mpn: 0.006405, loss_rec: 0.023345, loss_semantic: 0.310388, loss_idmrf: 0.734679,
loss_adv_gen: -100.490631 2023-02-12 20:41:01,167 - INFO - [Train] step: 63999, loss_adv_disc: -2.204623 2023-02-12 20:41:01,377 - INFO - [Train] step: 63999, loss_mpn: 0.011057, loss_rec: 0.020058, loss_semantic: 0.327534, loss_idmrf: 0.686581, loss_adv_gen: -82.438423 2023-02-12 20:41:38,545 - INFO - [Eval] step: 63999, bce: 0.240235, psnr: 29.201040, ssim: 0.958906 2023-02-12 20:42:08,866 - INFO - [Train] step: 64099, loss_adv_disc: -4.937222 2023-02-12 20:42:09,076 - INFO - [Train] step: 64099, loss_mpn: 0.030754, loss_rec: 0.038048, loss_semantic: 0.441849, loss_idmrf: 1.286017, loss_adv_gen: -92.748474 2023-02-12 20:42:38,373 - INFO - [Train] step: 64199, loss_adv_disc: -1.606140 2023-02-12 20:42:38,582 - INFO - [Train] step: 64199, loss_mpn: 0.015441, loss_rec: 0.026746, loss_semantic: 0.355930, loss_idmrf: 1.659739, loss_adv_gen: -116.125427 2023-02-12 20:43:07,882 - INFO - [Train] step: 64299, loss_adv_disc: -3.056216 2023-02-12 20:43:08,090 - INFO - [Train] step: 64299, loss_mpn: 0.015795, loss_rec: 0.026726, loss_semantic: 0.356020, loss_idmrf: 1.468450, loss_adv_gen: -83.122009 2023-02-12 20:43:37,382 - INFO - [Train] step: 64399, loss_adv_disc: -0.639578 2023-02-12 20:43:37,593 - INFO - [Train] step: 64399, loss_mpn: 0.011440, loss_rec: 0.019658, loss_semantic: 0.321466, loss_idmrf: 1.209082, loss_adv_gen: -71.584457 2023-02-12 20:44:06,876 - INFO - [Train] step: 64499, loss_adv_disc: -1.749318 2023-02-12 20:44:07,087 - INFO - [Train] step: 64499, loss_mpn: 0.007354, loss_rec: 0.018056, loss_semantic: 0.275412, loss_idmrf: 0.661474, loss_adv_gen: -55.107780 2023-02-12 20:44:36,389 - INFO - [Train] step: 64599, loss_adv_disc: -1.844682 2023-02-12 20:44:36,599 - INFO - [Train] step: 64599, loss_mpn: 0.017200, loss_rec: 0.024419, loss_semantic: 0.373415, loss_idmrf: 1.009859, loss_adv_gen: -32.814606 2023-02-12 20:45:05,887 - INFO - [Train] step: 64699, loss_adv_disc: -2.140878 2023-02-12 20:45:06,096 - INFO - [Train] step: 64699, loss_mpn: 0.015438, 
loss_rec: 0.019848, loss_semantic: 0.326161, loss_idmrf: 1.006193, loss_adv_gen: -51.328781 2023-02-12 20:45:35,391 - INFO - [Train] step: 64799, loss_adv_disc: -4.264380 2023-02-12 20:45:35,600 - INFO - [Train] step: 64799, loss_mpn: 0.042679, loss_rec: 0.057170, loss_semantic: 0.525425, loss_idmrf: 2.598725, loss_adv_gen: -64.887894 2023-02-12 20:46:04,895 - INFO - [Train] step: 64899, loss_adv_disc: -0.744364 2023-02-12 20:46:05,104 - INFO - [Train] step: 64899, loss_mpn: 0.007976, loss_rec: 0.018023, loss_semantic: 0.293331, loss_idmrf: 1.301613, loss_adv_gen: -83.579178 2023-02-12 20:46:34,396 - INFO - [Train] step: 64999, loss_adv_disc: -0.893059 2023-02-12 20:46:34,605 - INFO - [Train] step: 64999, loss_mpn: 0.015532, loss_rec: 0.023543, loss_semantic: 0.347162, loss_idmrf: 1.187006, loss_adv_gen: -95.316208 2023-02-12 20:47:11,786 - INFO - [Eval] step: 64999, bce: 0.177505, psnr: 29.413471, ssim: 0.958802 2023-02-12 20:47:42,232 - INFO - [Train] step: 65099, loss_adv_disc: -4.053308 2023-02-12 20:47:42,441 - INFO - [Train] step: 65099, loss_mpn: 0.008651, loss_rec: 0.024212, loss_semantic: 0.344539, loss_idmrf: 1.976563, loss_adv_gen: -42.739151 2023-02-12 20:48:11,735 - INFO - [Train] step: 65199, loss_adv_disc: -1.908587 2023-02-12 20:48:11,944 - INFO - [Train] step: 65199, loss_mpn: 0.009674, loss_rec: 0.014929, loss_semantic: 0.235875, loss_idmrf: 1.736368, loss_adv_gen: -57.469742 2023-02-12 20:48:41,236 - INFO - [Train] step: 65299, loss_adv_disc: -3.278842 2023-02-12 20:48:41,446 - INFO - [Train] step: 65299, loss_mpn: 0.010772, loss_rec: 0.029426, loss_semantic: 0.330994, loss_idmrf: 1.105828, loss_adv_gen: -60.095535 2023-02-12 20:49:10,743 - INFO - [Train] step: 65399, loss_adv_disc: -2.760396 2023-02-12 20:49:10,952 - INFO - [Train] step: 65399, loss_mpn: 0.037739, loss_rec: 0.019543, loss_semantic: 0.292189, loss_idmrf: 0.612481, loss_adv_gen: -33.662857 2023-02-12 20:49:40,256 - INFO - [Train] step: 65499, loss_adv_disc: -1.081099 2023-02-12 
20:49:40,465 - INFO - [Train] step: 65499, loss_mpn: 0.010962, loss_rec: 0.016273, loss_semantic: 0.293681, loss_idmrf: 0.625786, loss_adv_gen: -50.430061 2023-02-12 20:50:09,769 - INFO - [Train] step: 65599, loss_adv_disc: -2.100213 2023-02-12 20:50:09,980 - INFO - [Train] step: 65599, loss_mpn: 0.035477, loss_rec: 0.031705, loss_semantic: 0.419631, loss_idmrf: 1.971213, loss_adv_gen: -80.661575 2023-02-12 20:50:39,268 - INFO - [Train] step: 65699, loss_adv_disc: -1.228416 2023-02-12 20:50:39,476 - INFO - [Train] step: 65699, loss_mpn: 0.016865, loss_rec: 0.023437, loss_semantic: 0.316175, loss_idmrf: 1.099643, loss_adv_gen: -89.519188 2023-02-12 20:51:08,775 - INFO - [Train] step: 65799, loss_adv_disc: -2.335534 2023-02-12 20:51:08,984 - INFO - [Train] step: 65799, loss_mpn: 0.011851, loss_rec: 0.023671, loss_semantic: 0.318379, loss_idmrf: 1.109860, loss_adv_gen: -80.973808 2023-02-12 20:51:38,281 - INFO - [Train] step: 65899, loss_adv_disc: -0.550847 2023-02-12 20:51:38,490 - INFO - [Train] step: 65899, loss_mpn: 0.007074, loss_rec: 0.016539, loss_semantic: 0.280873, loss_idmrf: 1.305851, loss_adv_gen: -62.654442 2023-02-12 20:52:07,803 - INFO - [Train] step: 65999, loss_adv_disc: -2.268604 2023-02-12 20:52:08,012 - INFO - [Train] step: 65999, loss_mpn: 0.009726, loss_rec: 0.023943, loss_semantic: 0.368609, loss_idmrf: 0.926895, loss_adv_gen: -62.659943 2023-02-12 20:52:45,196 - INFO - [Eval] step: 65999, bce: 0.283951, psnr: 29.513866, ssim: 0.958909 2023-02-12 20:53:15,750 - INFO - [Train] step: 66099, loss_adv_disc: -2.179287 2023-02-12 20:53:15,959 - INFO - [Train] step: 66099, loss_mpn: 0.018736, loss_rec: 0.028687, loss_semantic: 0.374382, loss_idmrf: 1.091718, loss_adv_gen: -46.342430 2023-02-12 20:53:45,257 - INFO - [Train] step: 66199, loss_adv_disc: -2.184360 2023-02-12 20:53:45,468 - INFO - [Train] step: 66199, loss_mpn: 0.011857, loss_rec: 0.019088, loss_semantic: 0.283679, loss_idmrf: 0.918297, loss_adv_gen: -55.028732 2023-02-12 20:54:14,775 - 
INFO - [Train] step: 66299, loss_adv_disc: -0.632387 2023-02-12 20:54:14,984 - INFO - [Train] step: 66299, loss_mpn: 0.014909, loss_rec: 0.026348, loss_semantic: 0.358586, loss_idmrf: 1.339422, loss_adv_gen: -43.674500 2023-02-12 20:54:44,290 - INFO - [Train] step: 66399, loss_adv_disc: -1.317910 2023-02-12 20:54:44,499 - INFO - [Train] step: 66399, loss_mpn: 0.013551, loss_rec: 0.018873, loss_semantic: 0.287108, loss_idmrf: 0.941970, loss_adv_gen: -53.561958 2023-02-12 20:55:13,798 - INFO - [Train] step: 66499, loss_adv_disc: -2.272582 2023-02-12 20:55:14,009 - INFO - [Train] step: 66499, loss_mpn: 0.020320, loss_rec: 0.027766, loss_semantic: 0.355462, loss_idmrf: 1.563582, loss_adv_gen: -63.542458 2023-02-12 20:55:43,303 - INFO - [Train] step: 66599, loss_adv_disc: -3.030508 2023-02-12 20:55:43,513 - INFO - [Train] step: 66599, loss_mpn: 0.012065, loss_rec: 0.027303, loss_semantic: 0.392820, loss_idmrf: 0.783151, loss_adv_gen: -59.651962 2023-02-12 20:56:12,814 - INFO - [Train] step: 66699, loss_adv_disc: -1.808617 2023-02-12 20:56:13,023 - INFO - [Train] step: 66699, loss_mpn: 0.011521, loss_rec: 0.021795, loss_semantic: 0.341325, loss_idmrf: 1.025125, loss_adv_gen: -43.372421 2023-02-12 20:56:42,531 - INFO - [Train] step: 66799, loss_adv_disc: -4.923433 2023-02-12 20:56:42,740 - INFO - [Train] step: 66799, loss_mpn: 0.015149, loss_rec: 0.033825, loss_semantic: 0.429327, loss_idmrf: 1.629720, loss_adv_gen: -90.743362 2023-02-12 20:57:12,034 - INFO - [Train] step: 66899, loss_adv_disc: -1.276895 2023-02-12 20:57:12,244 - INFO - [Train] step: 66899, loss_mpn: 0.019933, loss_rec: 0.023618, loss_semantic: 0.315569, loss_idmrf: 1.226826, loss_adv_gen: -66.616226 2023-02-12 20:57:41,548 - INFO - [Train] step: 66999, loss_adv_disc: -3.050347 2023-02-12 20:57:41,758 - INFO - [Train] step: 66999, loss_mpn: 0.011057, loss_rec: 0.024034, loss_semantic: 0.328491, loss_idmrf: 0.848529, loss_adv_gen: -35.957558 2023-02-12 20:58:18,902 - INFO - [Eval] step: 66999, bce: 
0.234289, psnr: 29.570223, ssim: 0.958701 2023-02-12 20:58:49,470 - INFO - [Train] step: 67099, loss_adv_disc: -3.032548 2023-02-12 20:58:49,679 - INFO - [Train] step: 67099, loss_mpn: 0.008453, loss_rec: 0.022154, loss_semantic: 0.289923, loss_idmrf: 0.767523, loss_adv_gen: -53.847633 2023-02-12 20:59:18,985 - INFO - [Train] step: 67199, loss_adv_disc: -1.827188 2023-02-12 20:59:19,194 - INFO - [Train] step: 67199, loss_mpn: 0.004711, loss_rec: 0.013690, loss_semantic: 0.224299, loss_idmrf: 0.941844, loss_adv_gen: -27.291245 2023-02-12 20:59:48,491 - INFO - [Train] step: 67299, loss_adv_disc: -1.544516 2023-02-12 20:59:48,700 - INFO - [Train] step: 67299, loss_mpn: 0.010019, loss_rec: 0.026462, loss_semantic: 0.368966, loss_idmrf: 0.971092, loss_adv_gen: -61.662125 2023-02-12 21:00:17,992 - INFO - [Train] step: 67399, loss_adv_disc: -1.758093 2023-02-12 21:00:18,201 - INFO - [Train] step: 67399, loss_mpn: 0.012807, loss_rec: 0.021194, loss_semantic: 0.355774, loss_idmrf: 1.602686, loss_adv_gen: -63.539848 2023-02-12 21:00:47,503 - INFO - [Train] step: 67499, loss_adv_disc: -4.292874 2023-02-12 21:00:47,712 - INFO - [Train] step: 67499, loss_mpn: 0.025968, loss_rec: 0.040109, loss_semantic: 0.423033, loss_idmrf: 1.333880, loss_adv_gen: -74.659836 2023-02-12 21:01:17,009 - INFO - [Train] step: 67599, loss_adv_disc: -2.562881 2023-02-12 21:01:17,219 - INFO - [Train] step: 67599, loss_mpn: 0.008786, loss_rec: 0.022146, loss_semantic: 0.312632, loss_idmrf: 0.865675, loss_adv_gen: -76.487762 2023-02-12 21:01:46,512 - INFO - [Train] step: 67699, loss_adv_disc: -2.483669 2023-02-12 21:01:46,721 - INFO - [Train] step: 67699, loss_mpn: 0.012997, loss_rec: 0.018542, loss_semantic: 0.326704, loss_idmrf: 2.078700, loss_adv_gen: -77.548706 2023-02-12 21:02:16,006 - INFO - [Train] step: 67799, loss_adv_disc: -2.507890 2023-02-12 21:02:16,215 - INFO - [Train] step: 67799, loss_mpn: 0.020710, loss_rec: 0.025503, loss_semantic: 0.376440, loss_idmrf: 1.426388, loss_adv_gen: 
-60.328278 2023-02-12 21:02:45,509 - INFO - [Train] step: 67899, loss_adv_disc: -1.448907 2023-02-12 21:02:45,718 - INFO - [Train] step: 67899, loss_mpn: 0.010392, loss_rec: 0.019801, loss_semantic: 0.311277, loss_idmrf: 1.061103, loss_adv_gen: -50.874504 2023-02-12 21:03:15,031 - INFO - [Train] step: 67999, loss_adv_disc: -1.468448 2023-02-12 21:03:15,241 - INFO - [Train] step: 67999, loss_mpn: 0.010515, loss_rec: 0.017678, loss_semantic: 0.276809, loss_idmrf: 0.644167, loss_adv_gen: -69.350494 2023-02-12 21:03:52,396 - INFO - [Eval] step: 67999, bce: 0.182142, psnr: 29.353348, ssim: 0.959181 2023-02-12 21:04:22,709 - INFO - [Train] step: 68099, loss_adv_disc: -1.470368 2023-02-12 21:04:22,918 - INFO - [Train] step: 68099, loss_mpn: 0.012970, loss_rec: 0.020651, loss_semantic: 0.324430, loss_idmrf: 0.735479, loss_adv_gen: -71.755272 2023-02-12 21:04:52,213 - INFO - [Train] step: 68199, loss_adv_disc: -1.750768 2023-02-12 21:04:52,422 - INFO - [Train] step: 68199, loss_mpn: 0.009509, loss_rec: 0.021411, loss_semantic: 0.307829, loss_idmrf: 1.556021, loss_adv_gen: -30.091141 2023-02-12 21:05:21,720 - INFO - [Train] step: 68299, loss_adv_disc: -1.866131 2023-02-12 21:05:21,931 - INFO - [Train] step: 68299, loss_mpn: 0.005745, loss_rec: 0.018067, loss_semantic: 0.292236, loss_idmrf: 0.666921, loss_adv_gen: -70.810120 2023-02-12 21:05:51,220 - INFO - [Train] step: 68399, loss_adv_disc: -3.098092 2023-02-12 21:05:51,430 - INFO - [Train] step: 68399, loss_mpn: 0.033626, loss_rec: 0.030624, loss_semantic: 0.380430, loss_idmrf: 1.165794, loss_adv_gen: -28.666557 2023-02-12 21:06:20,723 - INFO - [Train] step: 68499, loss_adv_disc: -4.046457 2023-02-12 21:06:20,932 - INFO - [Train] step: 68499, loss_mpn: 0.024197, loss_rec: 0.026714, loss_semantic: 0.420435, loss_idmrf: 1.499463, loss_adv_gen: -86.407547 2023-02-12 21:06:50,237 - INFO - [Train] step: 68599, loss_adv_disc: -3.949851 2023-02-12 21:06:50,447 - INFO - [Train] step: 68599, loss_mpn: 0.020998, loss_rec: 0.028578, 
loss_semantic: 0.401368, loss_idmrf: 1.862396, loss_adv_gen: -29.817764 2023-02-12 21:07:19,735 - INFO - [Train] step: 68699, loss_adv_disc: -1.697625 2023-02-12 21:07:19,944 - INFO - [Train] step: 68699, loss_mpn: 0.006644, loss_rec: 0.018489, loss_semantic: 0.303649, loss_idmrf: 2.108049, loss_adv_gen: -87.633308 2023-02-12 21:07:49,234 - INFO - [Train] step: 68799, loss_adv_disc: -1.151481 2023-02-12 21:07:49,443 - INFO - [Train] step: 68799, loss_mpn: 0.016718, loss_rec: 0.030499, loss_semantic: 0.412661, loss_idmrf: 1.504141, loss_adv_gen: -44.044556 2023-02-12 21:08:18,741 - INFO - [Train] step: 68899, loss_adv_disc: -0.867682 2023-02-12 21:08:18,951 - INFO - [Train] step: 68899, loss_mpn: 0.012179, loss_rec: 0.017132, loss_semantic: 0.311864, loss_idmrf: 1.889711, loss_adv_gen: -46.858635 2023-02-12 21:08:48,238 - INFO - [Train] step: 68999, loss_adv_disc: -1.545669 2023-02-12 21:08:48,449 - INFO - [Train] step: 68999, loss_mpn: 0.018208, loss_rec: 0.018937, loss_semantic: 0.307857, loss_idmrf: 1.285715, loss_adv_gen: -78.601456 2023-02-12 21:09:25,617 - INFO - [Eval] step: 68999, bce: 0.313845, psnr: 29.494946, ssim: 0.958817 2023-02-12 21:09:55,947 - INFO - [Train] step: 69099, loss_adv_disc: -3.332678 2023-02-12 21:09:56,156 - INFO - [Train] step: 69099, loss_mpn: 0.008385, loss_rec: 0.024329, loss_semantic: 0.344229, loss_idmrf: 0.733199, loss_adv_gen: -41.880615 2023-02-12 21:10:25,447 - INFO - [Train] step: 69199, loss_adv_disc: -1.373078 2023-02-12 21:10:25,656 - INFO - [Train] step: 69199, loss_mpn: 0.007194, loss_rec: 0.018641, loss_semantic: 0.274255, loss_idmrf: 0.676193, loss_adv_gen: -28.123978 2023-02-12 21:10:54,941 - INFO - [Train] step: 69299, loss_adv_disc: -2.440302 2023-02-12 21:10:55,151 - INFO - [Train] step: 69299, loss_mpn: 0.019131, loss_rec: 0.018331, loss_semantic: 0.268842, loss_idmrf: 0.582062, loss_adv_gen: -63.471359 2023-02-12 21:11:24,445 - INFO - [Train] step: 69399, loss_adv_disc: -0.605531 2023-02-12 21:11:24,654 - INFO - 
[Train] step: 69399, loss_mpn: 0.020446, loss_rec: 0.022491, loss_semantic: 0.330231, loss_idmrf: 1.099785, loss_adv_gen: -48.208572 2023-02-12 21:11:53,938 - INFO - [Train] step: 69499, loss_adv_disc: -4.079074 2023-02-12 21:11:54,147 - INFO - [Train] step: 69499, loss_mpn: 0.020885, loss_rec: 0.029750, loss_semantic: 0.375879, loss_idmrf: 1.673370, loss_adv_gen: -89.729515 2023-02-12 21:12:23,446 - INFO - [Train] step: 69599, loss_adv_disc: -4.225589 2023-02-12 21:12:23,655 - INFO - [Train] step: 69599, loss_mpn: 0.019185, loss_rec: 0.033425, loss_semantic: 0.415669, loss_idmrf: 1.640262, loss_adv_gen: -53.592377 2023-02-12 21:12:52,941 - INFO - [Train] step: 69699, loss_adv_disc: -2.427955 2023-02-12 21:12:53,151 - INFO - [Train] step: 69699, loss_mpn: 0.021050, loss_rec: 0.028781, loss_semantic: 0.426637, loss_idmrf: 1.447016, loss_adv_gen: -29.481674 2023-02-12 21:13:22,445 - INFO - [Train] step: 69799, loss_adv_disc: -2.755979 2023-02-12 21:13:22,654 - INFO - [Train] step: 69799, loss_mpn: 0.006923, loss_rec: 0.021537, loss_semantic: 0.312878, loss_idmrf: 1.160190, loss_adv_gen: -52.102264 2023-02-12 21:13:51,951 - INFO - [Train] step: 69899, loss_adv_disc: -3.879722 2023-02-12 21:13:52,160 - INFO - [Train] step: 69899, loss_mpn: 0.021926, loss_rec: 0.031677, loss_semantic: 0.360229, loss_idmrf: 1.563713, loss_adv_gen: -49.856552 2023-02-12 21:14:21,450 - INFO - [Train] step: 69999, loss_adv_disc: -1.000557 2023-02-12 21:14:21,661 - INFO - [Train] step: 69999, loss_mpn: 0.015333, loss_rec: 0.019876, loss_semantic: 0.321901, loss_idmrf: 0.780718, loss_adv_gen: -22.526367 2023-02-12 21:14:58,854 - INFO - [Eval] step: 69999, bce: 0.201087, psnr: 29.440516, ssim: 0.959623 2023-02-12 21:15:29,297 - INFO - [Train] step: 70099, loss_adv_disc: -0.431820 2023-02-12 21:15:29,506 - INFO - [Train] step: 70099, loss_mpn: 0.022419, loss_rec: 0.023142, loss_semantic: 0.360328, loss_idmrf: 1.072268, loss_adv_gen: -61.172371 2023-02-12 21:15:58,796 - INFO - [Train] step: 
70199, loss_adv_disc: -1.414873 2023-02-12 21:15:59,006 - INFO - [Train] step: 70199, loss_mpn: 0.004839, loss_rec: 0.015951, loss_semantic: 0.289024, loss_idmrf: 0.848902, loss_adv_gen: -34.415863 2023-02-12 21:16:28,293 - INFO - [Train] step: 70299, loss_adv_disc: -3.029736 2023-02-12 21:16:28,502 - INFO - [Train] step: 70299, loss_mpn: 0.019330, loss_rec: 0.030450, loss_semantic: 0.379830, loss_idmrf: 1.073340, loss_adv_gen: -66.002075 2023-02-12 21:16:57,796 - INFO - [Train] step: 70399, loss_adv_disc: -0.315073 2023-02-12 21:16:58,006 - INFO - [Train] step: 70399, loss_mpn: 0.013614, loss_rec: 0.022816, loss_semantic: 0.336758, loss_idmrf: 1.382591, loss_adv_gen: -36.869049 2023-02-12 21:17:27,294 - INFO - [Train] step: 70499, loss_adv_disc: -1.273747 2023-02-12 21:17:27,503 - INFO - [Train] step: 70499, loss_mpn: 0.013135, loss_rec: 0.020346, loss_semantic: 0.303443, loss_idmrf: 0.821364, loss_adv_gen: -42.114380 2023-02-12 21:17:56,788 - INFO - [Train] step: 70599, loss_adv_disc: -2.124272 2023-02-12 21:17:56,998 - INFO - [Train] step: 70599, loss_mpn: 0.009214, loss_rec: 0.020623, loss_semantic: 0.293132, loss_idmrf: 0.719492, loss_adv_gen: -86.270523 2023-02-12 21:18:26,284 - INFO - [Train] step: 70699, loss_adv_disc: -1.098114 2023-02-12 21:18:26,493 - INFO - [Train] step: 70699, loss_mpn: 0.006741, loss_rec: 0.014492, loss_semantic: 0.261432, loss_idmrf: 1.125136, loss_adv_gen: -34.520340 2023-02-12 21:18:55,801 - INFO - [Train] step: 70799, loss_adv_disc: -1.633057 2023-02-12 21:18:56,011 - INFO - [Train] step: 70799, loss_mpn: 0.018513, loss_rec: 0.029835, loss_semantic: 0.431680, loss_idmrf: 1.353910, loss_adv_gen: -79.816216 2023-02-12 21:19:25,295 - INFO - [Train] step: 70899, loss_adv_disc: -1.403503 2023-02-12 21:19:25,504 - INFO - [Train] step: 70899, loss_mpn: 0.027041, loss_rec: 0.025739, loss_semantic: 0.405577, loss_idmrf: 1.415896, loss_adv_gen: 5.157623 2023-02-12 21:19:54,799 - INFO - [Train] step: 70999, loss_adv_disc: -1.944867 
2023-02-12 21:19:55,008 - INFO - [Train] step: 70999, loss_mpn: 0.014358, loss_rec: 0.020800, loss_semantic: 0.336411, loss_idmrf: 0.971768, loss_adv_gen: -85.264381 2023-02-12 21:20:32,199 - INFO - [Eval] step: 70999, bce: 0.220269, psnr: 29.473652, ssim: 0.959555 2023-02-12 21:21:02,527 - INFO - [Train] step: 71099, loss_adv_disc: -3.456838 2023-02-12 21:21:02,736 - INFO - [Train] step: 71099, loss_mpn: 0.012788, loss_rec: 0.027743, loss_semantic: 0.396344, loss_idmrf: 1.527968, loss_adv_gen: -53.587585 2023-02-12 21:21:32,020 - INFO - [Train] step: 71199, loss_adv_disc: -2.780605 2023-02-12 21:21:32,229 - INFO - [Train] step: 71199, loss_mpn: 0.010308, loss_rec: 0.021733, loss_semantic: 0.302761, loss_idmrf: 0.973229, loss_adv_gen: -32.839844 2023-02-12 21:22:01,531 - INFO - [Train] step: 71299, loss_adv_disc: -3.013721 2023-02-12 21:22:01,740 - INFO - [Train] step: 71299, loss_mpn: 0.022005, loss_rec: 0.025937, loss_semantic: 0.316281, loss_idmrf: 1.636544, loss_adv_gen: -56.161957 2023-02-12 21:22:31,038 - INFO - [Train] step: 71399, loss_adv_disc: -1.012286 2023-02-12 21:22:31,247 - INFO - [Train] step: 71399, loss_mpn: 0.011092, loss_rec: 0.029167, loss_semantic: 0.431375, loss_idmrf: 1.083081, loss_adv_gen: -37.305878 2023-02-12 21:23:00,534 - INFO - [Train] step: 71499, loss_adv_disc: -2.524217 2023-02-12 21:23:00,743 - INFO - [Train] step: 71499, loss_mpn: 0.015642, loss_rec: 0.027775, loss_semantic: 0.360311, loss_idmrf: 1.292624, loss_adv_gen: -64.429649 2023-02-12 21:23:30,027 - INFO - [Train] step: 71599, loss_adv_disc: -2.349478 2023-02-12 21:23:30,236 - INFO - [Train] step: 71599, loss_mpn: 0.013973, loss_rec: 0.024379, loss_semantic: 0.368885, loss_idmrf: 1.162121, loss_adv_gen: -91.415802 2023-02-12 21:23:59,536 - INFO - [Train] step: 71699, loss_adv_disc: -1.028637 2023-02-12 21:23:59,746 - INFO - [Train] step: 71699, loss_mpn: 0.008475, loss_rec: 0.020064, loss_semantic: 0.334370, loss_idmrf: 0.990030, loss_adv_gen: -34.212799 2023-02-12 
21:24:29,032 - INFO - [Train] step: 71799, loss_adv_disc: -1.429072 2023-02-12 21:24:29,242 - INFO - [Train] step: 71799, loss_mpn: 0.013389, loss_rec: 0.024276, loss_semantic: 0.321409, loss_idmrf: 1.020213, loss_adv_gen: -13.550598 2023-02-12 21:24:58,530 - INFO - [Train] step: 71899, loss_adv_disc: -1.691852 2023-02-12 21:24:58,739 - INFO - [Train] step: 71899, loss_mpn: 0.014452, loss_rec: 0.026050, loss_semantic: 0.384248, loss_idmrf: 1.716169, loss_adv_gen: -40.509186 2023-02-12 21:25:28,029 - INFO - [Train] step: 71999, loss_adv_disc: -1.373135 2023-02-12 21:25:28,237 - INFO - [Train] step: 71999, loss_mpn: 0.018275, loss_rec: 0.027552, loss_semantic: 0.376156, loss_idmrf: 0.930697, loss_adv_gen: -25.627899 2023-02-12 21:26:05,420 - INFO - [Eval] step: 71999, bce: 0.213143, psnr: 29.476858, ssim: 0.959560 2023-02-12 21:26:35,749 - INFO - [Train] step: 72099, loss_adv_disc: -1.884004 2023-02-12 21:26:35,958 - INFO - [Train] step: 72099, loss_mpn: 0.007285, loss_rec: 0.016476, loss_semantic: 0.292408, loss_idmrf: 0.660436, loss_adv_gen: -43.202438 2023-02-12 21:27:05,250 - INFO - [Train] step: 72199, loss_adv_disc: -0.523182 2023-02-12 21:27:05,459 - INFO - [Train] step: 72199, loss_mpn: 0.005790, loss_rec: 0.016915, loss_semantic: 0.286560, loss_idmrf: 0.431293, loss_adv_gen: -51.906189 2023-02-12 21:27:34,750 - INFO - [Train] step: 72299, loss_adv_disc: -1.657384 2023-02-12 21:27:34,959 - INFO - [Train] step: 72299, loss_mpn: 0.026309, loss_rec: 0.024100, loss_semantic: 0.391826, loss_idmrf: 1.664013, loss_adv_gen: -60.109779 2023-02-12 21:28:04,252 - INFO - [Train] step: 72399, loss_adv_disc: -1.076741 2023-02-12 21:28:04,460 - INFO - [Train] step: 72399, loss_mpn: 0.010305, loss_rec: 0.017147, loss_semantic: 0.321734, loss_idmrf: 2.702593, loss_adv_gen: -69.227234 2023-02-12 21:28:33,738 - INFO - [Train] step: 72499, loss_adv_disc: -2.465907 2023-02-12 21:28:33,949 - INFO - [Train] step: 72499, loss_mpn: 0.014154, loss_rec: 0.025708, loss_semantic: 
0.370025, loss_idmrf: 1.467248, loss_adv_gen: -65.247864 2023-02-12 21:29:03,238 - INFO - [Train] step: 72599, loss_adv_disc: -1.269054 2023-02-12 21:29:03,447 - INFO - [Train] step: 72599, loss_mpn: 0.015361, loss_rec: 0.027228, loss_semantic: 0.345680, loss_idmrf: 1.226199, loss_adv_gen: -48.038681 2023-02-12 21:29:32,737 - INFO - [Train] step: 72699, loss_adv_disc: -0.979583 2023-02-12 21:29:32,948 - INFO - [Train] step: 72699, loss_mpn: 0.005480, loss_rec: 0.014308, loss_semantic: 0.221794, loss_idmrf: 0.883848, loss_adv_gen: -77.211136 2023-02-12 21:30:02,242 - INFO - [Train] step: 72799, loss_adv_disc: -1.368594 2023-02-12 21:30:02,451 - INFO - [Train] step: 72799, loss_mpn: 0.044310, loss_rec: 0.026370, loss_semantic: 0.327070, loss_idmrf: 1.310525, loss_adv_gen: -49.803001 2023-02-12 21:30:31,741 - INFO - [Train] step: 72899, loss_adv_disc: -1.385313 2023-02-12 21:30:31,951 - INFO - [Train] step: 72899, loss_mpn: 0.005758, loss_rec: 0.016514, loss_semantic: 0.252293, loss_idmrf: 0.562531, loss_adv_gen: -31.607414 2023-02-12 21:31:01,237 - INFO - [Train] step: 72999, loss_adv_disc: -2.134885 2023-02-12 21:31:01,446 - INFO - [Train] step: 72999, loss_mpn: 0.012255, loss_rec: 0.018623, loss_semantic: 0.293801, loss_idmrf: 1.523090, loss_adv_gen: -59.181847 2023-02-12 21:31:38,628 - INFO - [Eval] step: 72999, bce: 0.169464, psnr: 29.525417, ssim: 0.959316 2023-02-12 21:32:08,947 - INFO - [Train] step: 73099, loss_adv_disc: -0.635405 2023-02-12 21:32:09,156 - INFO - [Train] step: 73099, loss_mpn: 0.014173, loss_rec: 0.028390, loss_semantic: 0.343728, loss_idmrf: 1.590183, loss_adv_gen: -70.256905 2023-02-12 21:32:38,458 - INFO - [Train] step: 73199, loss_adv_disc: -0.898788 2023-02-12 21:32:38,667 - INFO - [Train] step: 73199, loss_mpn: 0.006542, loss_rec: 0.016354, loss_semantic: 0.273708, loss_idmrf: 1.951222, loss_adv_gen: -29.730362 2023-02-12 21:33:07,950 - INFO - [Train] step: 73299, loss_adv_disc: -2.857425 2023-02-12 21:33:08,161 - INFO - [Train] step: 
73299, loss_mpn: 0.010121, loss_rec: 0.023318, loss_semantic: 0.348987, loss_idmrf: 0.866920, loss_adv_gen: -31.506058 2023-02-12 21:33:37,452 - INFO - [Train] step: 73399, loss_adv_disc: -2.670118 2023-02-12 21:33:37,661 - INFO - [Train] step: 73399, loss_mpn: 0.014232, loss_rec: 0.029233, loss_semantic: 0.378379, loss_idmrf: 0.993713, loss_adv_gen: -1.311707 2023-02-12 21:34:06,944 - INFO - [Train] step: 73499, loss_adv_disc: -1.707668 2023-02-12 21:34:07,153 - INFO - [Train] step: 73499, loss_mpn: 0.006130, loss_rec: 0.015694, loss_semantic: 0.284229, loss_idmrf: 1.003670, loss_adv_gen: -64.978516 2023-02-12 21:34:36,437 - INFO - [Train] step: 73599, loss_adv_disc: -1.060075 2023-02-12 21:34:36,646 - INFO - [Train] step: 73599, loss_mpn: 0.006543, loss_rec: 0.014794, loss_semantic: 0.250263, loss_idmrf: 0.843417, loss_adv_gen: -62.592773 2023-02-12 21:35:05,938 - INFO - [Train] step: 73699, loss_adv_disc: -1.434207 2023-02-12 21:35:06,148 - INFO - [Train] step: 73699, loss_mpn: 0.008778, loss_rec: 0.019124, loss_semantic: 0.298285, loss_idmrf: 0.573199, loss_adv_gen: 10.554688 2023-02-12 21:35:35,651 - INFO - [Train] step: 73799, loss_adv_disc: -3.271966 2023-02-12 21:35:35,861 - INFO - [Train] step: 73799, loss_mpn: 0.007811, loss_rec: 0.025069, loss_semantic: 0.331823, loss_idmrf: 0.988223, loss_adv_gen: -91.417877 2023-02-12 21:36:05,144 - INFO - [Train] step: 73899, loss_adv_disc: -0.887449 2023-02-12 21:36:05,353 - INFO - [Train] step: 73899, loss_mpn: 0.009228, loss_rec: 0.014328, loss_semantic: 0.237118, loss_idmrf: 0.973336, loss_adv_gen: -25.054962 2023-02-12 21:36:34,652 - INFO - [Train] step: 73999, loss_adv_disc: -1.849455 2023-02-12 21:36:34,861 - INFO - [Train] step: 73999, loss_mpn: 0.013257, loss_rec: 0.023812, loss_semantic: 0.353456, loss_idmrf: 1.186634, loss_adv_gen: -39.017883 2023-02-12 21:37:12,033 - INFO - [Eval] step: 73999, bce: 0.175106, psnr: 29.583344, ssim: 0.959993 2023-02-12 21:37:42,516 - INFO - [Train] step: 74099, 
loss_adv_disc: -1.602568 2023-02-12 21:37:42,726 - INFO - [Train] step: 74099, loss_mpn: 0.011670, loss_rec: 0.021626, loss_semantic: 0.379507, loss_idmrf: 2.440815, loss_adv_gen: -46.151924 2023-02-12 21:38:12,000 - INFO - [Train] step: 74199, loss_adv_disc: -1.562775 2023-02-12 21:38:12,211 - INFO - [Train] step: 74199, loss_mpn: 0.015140, loss_rec: 0.027729, loss_semantic: 0.340237, loss_idmrf: 1.321341, loss_adv_gen: -45.090591 2023-02-12 21:38:41,483 - INFO - [Train] step: 74299, loss_adv_disc: -1.466455 2023-02-12 21:38:41,692 - INFO - [Train] step: 74299, loss_mpn: 0.009573, loss_rec: 0.016208, loss_semantic: 0.243085, loss_idmrf: 0.831500, loss_adv_gen: -59.343719 2023-02-12 21:39:10,957 - INFO - [Train] step: 74399, loss_adv_disc: -1.799070 2023-02-12 21:39:11,165 - INFO - [Train] step: 74399, loss_mpn: 0.010457, loss_rec: 0.018303, loss_semantic: 0.279585, loss_idmrf: 0.637411, loss_adv_gen: -36.124840 2023-02-12 21:39:40,433 - INFO - [Train] step: 74499, loss_adv_disc: -0.891940 2023-02-12 21:39:40,642 - INFO - [Train] step: 74499, loss_mpn: 0.006888, loss_rec: 0.014591, loss_semantic: 0.239977, loss_idmrf: 2.974409, loss_adv_gen: -30.407242 2023-02-12 21:40:09,910 - INFO - [Train] step: 74599, loss_adv_disc: -4.381166 2023-02-12 21:40:10,120 - INFO - [Train] step: 74599, loss_mpn: 0.013713, loss_rec: 0.031638, loss_semantic: 0.402833, loss_idmrf: 1.152858, loss_adv_gen: -34.066513 2023-02-12 21:40:39,383 - INFO - [Train] step: 74699, loss_adv_disc: -2.144645 2023-02-12 21:40:39,591 - INFO - [Train] step: 74699, loss_mpn: 0.011164, loss_rec: 0.021494, loss_semantic: 0.313223, loss_idmrf: 1.001765, loss_adv_gen: -27.607224 2023-02-12 21:41:08,840 - INFO - [Train] step: 74799, loss_adv_disc: -1.335196 2023-02-12 21:41:09,049 - INFO - [Train] step: 74799, loss_mpn: 0.012281, loss_rec: 0.022507, loss_semantic: 0.325198, loss_idmrf: 1.320781, loss_adv_gen: -44.054108 2023-02-12 21:41:38,310 - INFO - [Train] step: 74899, loss_adv_disc: -0.737202 2023-02-12 
21:41:38,520 - INFO - [Train] step: 74899, loss_mpn: 0.010384, loss_rec: 0.020744, loss_semantic: 0.313617, loss_idmrf: 0.852506, loss_adv_gen: -20.351273
2023-02-12 21:42:07,783 - INFO - [Train] step: 74999, loss_adv_disc: -1.912043
2023-02-12 21:42:07,992 - INFO - [Train] step: 74999, loss_mpn: 0.010021, loss_rec: 0.019042, loss_semantic: 0.338162, loss_idmrf: 1.209713, loss_adv_gen: -24.191650
2023-02-12 21:42:45,162 - INFO - [Eval] step: 74999, bce: 0.226098, psnr: 29.478821, ssim: 0.959512
2023-02-12 21:43:15,630 - INFO - [Train] step: 75099, loss_adv_disc: -0.847669
2023-02-12 21:43:15,839 - INFO - [Train] step: 75099, loss_mpn: 0.006445, loss_rec: 0.015781, loss_semantic: 0.253824, loss_idmrf: 1.406732, loss_adv_gen: -46.673225
2023-02-12 21:43:45,141 - INFO - [Train] step: 75199, loss_adv_disc: -2.541757
2023-02-12 21:43:45,351 - INFO - [Train] step: 75199, loss_mpn: 0.018399, loss_rec: 0.025520, loss_semantic: 0.378720, loss_idmrf: 1.095098, loss_adv_gen: -61.844482
2023-02-12 21:44:14,669 - INFO - [Train] step: 75299, loss_adv_disc: -1.807647
2023-02-12 21:44:14,878 - INFO - [Train] step: 75299, loss_mpn: 0.023424, loss_rec: 0.022936, loss_semantic: 0.355386, loss_idmrf: 1.373829, loss_adv_gen: -11.285446
2023-02-12 21:44:44,180 - INFO - [Train] step: 75399, loss_adv_disc: -1.380262
2023-02-12 21:44:44,389 - INFO - [Train] step: 75399, loss_mpn: 0.019337, loss_rec: 0.018448, loss_semantic: 0.287480, loss_idmrf: 1.747099, loss_adv_gen: -32.879578
2023-02-12 21:45:13,690 - INFO - [Train] step: 75499, loss_adv_disc: -2.464078
2023-02-12 21:45:13,899 - INFO - [Train] step: 75499, loss_mpn: 0.015781, loss_rec: 0.020887, loss_semantic: 0.320113, loss_idmrf: 0.969004, loss_adv_gen: -8.547363
2023-02-12 21:45:43,177 - INFO - [Train] step: 75599, loss_adv_disc: -1.252559
2023-02-12 21:45:43,386 - INFO - [Train] step: 75599, loss_mpn: 0.004754, loss_rec: 0.015670, loss_semantic: 0.249595, loss_idmrf: 0.506315, loss_adv_gen: -30.307831
2023-02-12 21:46:12,676 - INFO - [Train] step: 75699, loss_adv_disc: -2.356029
2023-02-12 21:46:12,887 - INFO - [Train] step: 75699, loss_mpn: 0.015313, loss_rec: 0.023763, loss_semantic: 0.348951, loss_idmrf: 1.131225, loss_adv_gen: -11.490738
2023-02-12 21:46:42,195 - INFO - [Train] step: 75799, loss_adv_disc: -2.358633
2023-02-12 21:46:42,404 - INFO - [Train] step: 75799, loss_mpn: 0.019427, loss_rec: 0.028120, loss_semantic: 0.392224, loss_idmrf: 1.284557, loss_adv_gen: -43.659256
2023-02-12 21:47:11,706 - INFO - [Train] step: 75899, loss_adv_disc: -2.161181
2023-02-12 21:47:11,915 - INFO - [Train] step: 75899, loss_mpn: 0.008757, loss_rec: 0.019659, loss_semantic: 0.298303, loss_idmrf: 0.463418, loss_adv_gen: -44.445801
2023-02-12 21:47:41,205 - INFO - [Train] step: 75999, loss_adv_disc: -0.898391
2023-02-12 21:47:41,415 - INFO - [Train] step: 75999, loss_mpn: 0.005460, loss_rec: 0.019044, loss_semantic: 0.295422, loss_idmrf: 1.227332, loss_adv_gen: -31.178757
2023-02-12 21:48:18,567 - INFO - [Eval] step: 75999, bce: 0.211443, psnr: 29.422033, ssim: 0.959761
2023-02-12 21:48:48,887 - INFO - [Train] step: 76099, loss_adv_disc: -1.762845
2023-02-12 21:48:49,097 - INFO - [Train] step: 76099, loss_mpn: 0.006320, loss_rec: 0.015622, loss_semantic: 0.278861, loss_idmrf: 1.189617, loss_adv_gen: -58.597244
2023-02-12 21:49:18,390 - INFO - [Train] step: 76199, loss_adv_disc: -0.732783
2023-02-12 21:49:18,598 - INFO - [Train] step: 76199, loss_mpn: 0.025986, loss_rec: 0.031872, loss_semantic: 0.368722, loss_idmrf: 1.491676, loss_adv_gen: -36.562378
2023-02-12 21:49:47,885 - INFO - [Train] step: 76299, loss_adv_disc: -2.303626
2023-02-12 21:49:48,094 - INFO - [Train] step: 76299, loss_mpn: 0.014670, loss_rec: 0.020314, loss_semantic: 0.301468, loss_idmrf: 1.681816, loss_adv_gen: -57.371552
2023-02-12 21:50:17,391 - INFO - [Train] step: 76399, loss_adv_disc: 0.055652
2023-02-12 21:50:17,599 - INFO - [Train] step: 76399, loss_mpn: 0.007622, loss_rec: 0.018077, loss_semantic: 0.326310, loss_idmrf: 1.206865, loss_adv_gen: -26.842865
2023-02-12 21:50:46,887 - INFO - [Train] step: 76499, loss_adv_disc: -2.323917
2023-02-12 21:50:47,097 - INFO - [Train] step: 76499, loss_mpn: 0.007266, loss_rec: 0.020839, loss_semantic: 0.322761, loss_idmrf: 0.880452, loss_adv_gen: 15.684189
2023-02-12 21:51:16,380 - INFO - [Train] step: 76599, loss_adv_disc: -1.293300
2023-02-12 21:51:16,590 - INFO - [Train] step: 76599, loss_mpn: 0.014782, loss_rec: 0.016837, loss_semantic: 0.281380, loss_idmrf: 0.900026, loss_adv_gen: -67.974487
2023-02-12 21:51:45,883 - INFO - [Train] step: 76699, loss_adv_disc: -2.672648
2023-02-12 21:51:46,092 - INFO - [Train] step: 76699, loss_mpn: 0.011081, loss_rec: 0.024339, loss_semantic: 0.327133, loss_idmrf: 1.383722, loss_adv_gen: -56.354614
2023-02-12 21:52:15,381 - INFO - [Train] step: 76799, loss_adv_disc: -1.891977
2023-02-12 21:52:15,590 - INFO - [Train] step: 76799, loss_mpn: 0.032545, loss_rec: 0.028276, loss_semantic: 0.432388, loss_idmrf: 2.386503, loss_adv_gen: -12.806229
2023-02-12 21:52:44,880 - INFO - [Train] step: 76899, loss_adv_disc: 0.180206
2023-02-12 21:52:45,089 - INFO - [Train] step: 76899, loss_mpn: 0.016219, loss_rec: 0.027141, loss_semantic: 0.384469, loss_idmrf: 1.119208, loss_adv_gen: -59.837967
2023-02-12 21:53:14,375 - INFO - [Train] step: 76999, loss_adv_disc: -2.186698
2023-02-12 21:53:14,585 - INFO - [Train] step: 76999, loss_mpn: 0.008247, loss_rec: 0.022318, loss_semantic: 0.325164, loss_idmrf: 0.515373, loss_adv_gen: 13.725266
2023-02-12 21:53:51,762 - INFO - [Eval] step: 76999, bce: 0.269488, psnr: 29.308018, ssim: 0.959410
2023-02-12 21:54:22,091 - INFO - [Train] step: 77099, loss_adv_disc: -0.841599
2023-02-12 21:54:22,300 - INFO - [Train] step: 77099, loss_mpn: 0.013934, loss_rec: 0.024167, loss_semantic: 0.363232, loss_idmrf: 3.929152, loss_adv_gen: -22.377533
2023-02-12 21:54:51,600 - INFO - [Train] step: 77199, loss_adv_disc: -4.953798
2023-02-12 21:54:51,809 - INFO - [Train] step: 77199, loss_mpn: 0.020467, loss_rec: 0.034309, loss_semantic: 0.388925, loss_idmrf: 1.297982, loss_adv_gen: 4.049469
2023-02-12 21:55:21,116 - INFO - [Train] step: 77299, loss_adv_disc: -3.314212
2023-02-12 21:55:21,326 - INFO - [Train] step: 77299, loss_mpn: 0.011570, loss_rec: 0.026237, loss_semantic: 0.365827, loss_idmrf: 1.731829, loss_adv_gen: -19.944855
2023-02-12 21:55:50,620 - INFO - [Train] step: 77399, loss_adv_disc: -2.846416
2023-02-12 21:55:50,829 - INFO - [Train] step: 77399, loss_mpn: 0.022157, loss_rec: 0.027341, loss_semantic: 0.365681, loss_idmrf: 1.296392, loss_adv_gen: -25.670532
2023-02-12 21:56:20,120 - INFO - [Train] step: 77499, loss_adv_disc: -0.969175
2023-02-12 21:56:20,329 - INFO - [Train] step: 77499, loss_mpn: 0.012870, loss_rec: 0.023140, loss_semantic: 0.361856, loss_idmrf: 0.727183, loss_adv_gen: -14.977386
2023-02-12 21:56:49,627 - INFO - [Train] step: 77599, loss_adv_disc: -2.120253
2023-02-12 21:56:49,838 - INFO - [Train] step: 77599, loss_mpn: 0.016091, loss_rec: 0.024959, loss_semantic: 0.360970, loss_idmrf: 1.354609, loss_adv_gen: -2.078445
2023-02-12 21:57:19,138 - INFO - [Train] step: 77699, loss_adv_disc: -1.710395
2023-02-12 21:57:19,347 - INFO - [Train] step: 77699, loss_mpn: 0.021035, loss_rec: 0.027014, loss_semantic: 0.417861, loss_idmrf: 1.550774, loss_adv_gen: -30.157867
2023-02-12 21:57:48,637 - INFO - [Train] step: 77799, loss_adv_disc: -2.706655
2023-02-12 21:57:48,845 - INFO - [Train] step: 77799, loss_mpn: 0.010614, loss_rec: 0.020136, loss_semantic: 0.311300, loss_idmrf: 0.693098, loss_adv_gen: -35.336044
2023-02-12 21:58:18,142 - INFO - [Train] step: 77899, loss_adv_disc: -2.042899
2023-02-12 21:58:18,353 - INFO - [Train] step: 77899, loss_mpn: 0.026713, loss_rec: 0.030419, loss_semantic: 0.401147, loss_idmrf: 1.447123, loss_adv_gen: -29.810501
2023-02-12 21:58:47,634 - INFO - [Train] step: 77999, loss_adv_disc: -0.846563
2023-02-12 21:58:47,843 - INFO - [Train] step: 77999, loss_mpn: 0.010993, loss_rec: 0.011758, loss_semantic: 0.203056, loss_idmrf: 2.170007, loss_adv_gen: -24.297211
2023-02-12 21:59:25,001 - INFO - [Eval] step: 77999, bce: 0.244966, psnr: 29.355749, ssim: 0.959745
2023-02-12 21:59:55,337 - INFO - [Train] step: 78099, loss_adv_disc: -1.869050
2023-02-12 21:59:55,548 - INFO - [Train] step: 78099, loss_mpn: 0.013706, loss_rec: 0.017059, loss_semantic: 0.271400, loss_idmrf: 1.106075, loss_adv_gen: -32.159592
2023-02-12 22:00:24,853 - INFO - [Train] step: 78199, loss_adv_disc: -0.694928
2023-02-12 22:00:25,064 - INFO - [Train] step: 78199, loss_mpn: 0.010231, loss_rec: 0.018603, loss_semantic: 0.303646, loss_idmrf: 1.553169, loss_adv_gen: -10.599823
2023-02-12 22:00:54,365 - INFO - [Train] step: 78299, loss_adv_disc: -1.492590
2023-02-12 22:00:54,574 - INFO - [Train] step: 78299, loss_mpn: 0.029208, loss_rec: 0.043565, loss_semantic: 0.531334, loss_idmrf: 3.110640, loss_adv_gen: 6.624603
2023-02-12 22:01:23,861 - INFO - [Train] step: 78399, loss_adv_disc: -5.917741
2023-02-12 22:01:24,070 - INFO - [Train] step: 78399, loss_mpn: 0.014746, loss_rec: 0.030828, loss_semantic: 0.370716, loss_idmrf: 0.641258, loss_adv_gen: -30.985611
2023-02-12 22:01:53,374 - INFO - [Train] step: 78499, loss_adv_disc: -2.458356
2023-02-12 22:01:53,584 - INFO - [Train] step: 78499, loss_mpn: 0.008769, loss_rec: 0.019324, loss_semantic: 0.308658, loss_idmrf: 1.052042, loss_adv_gen: -12.480179
2023-02-12 22:02:22,867 - INFO - [Train] step: 78599, loss_adv_disc: -3.685525
2023-02-12 22:02:23,076 - INFO - [Train] step: 78599, loss_mpn: 0.039421, loss_rec: 0.031946, loss_semantic: 0.342704, loss_idmrf: 1.548755, loss_adv_gen: -7.087158
2023-02-12 22:02:52,383 - INFO - [Train] step: 78699, loss_adv_disc: -0.522042
2023-02-12 22:02:52,592 - INFO - [Train] step: 78699, loss_mpn: 0.009137, loss_rec: 0.023709, loss_semantic: 0.276450, loss_idmrf: 0.697814, loss_adv_gen: -23.835068
2023-02-12 22:03:21,897 - INFO - [Train] step: 78799, loss_adv_disc: -2.328640
2023-02-12 22:03:22,106 - INFO - [Train] step: 78799, loss_mpn: 0.026196, loss_rec: 0.029264, loss_semantic: 0.370760, loss_idmrf: 1.673305, loss_adv_gen: 4.738907
2023-02-12 22:03:51,410 - INFO - [Train] step: 78899, loss_adv_disc: -2.983714
2023-02-12 22:03:51,621 - INFO - [Train] step: 78899, loss_mpn: 0.012416, loss_rec: 0.025609, loss_semantic: 0.381932, loss_idmrf: 1.316743, loss_adv_gen: -22.635071
2023-02-12 22:04:20,936 - INFO - [Train] step: 78999, loss_adv_disc: -1.434405
2023-02-12 22:04:21,146 - INFO - [Train] step: 78999, loss_mpn: 0.005963, loss_rec: 0.016383, loss_semantic: 0.260136, loss_idmrf: 0.628291, loss_adv_gen: -24.092926
2023-02-12 22:04:58,316 - INFO - [Eval] step: 78999, bce: 0.180264, psnr: 29.433878, ssim: 0.959858
2023-02-12 22:05:28,637 - INFO - [Train] step: 79099, loss_adv_disc: -0.945140
2023-02-12 22:05:28,846 - INFO - [Train] step: 79099, loss_mpn: 0.014133, loss_rec: 0.015775, loss_semantic: 0.278595, loss_idmrf: 1.339719, loss_adv_gen: -10.491653
2023-02-12 22:05:58,131 - INFO - [Train] step: 79199, loss_adv_disc: -0.996181
2023-02-12 22:05:58,340 - INFO - [Train] step: 79199, loss_mpn: 0.005289, loss_rec: 0.018869, loss_semantic: 0.287186, loss_idmrf: 1.125021, loss_adv_gen: -11.777649
2023-02-12 22:06:27,633 - INFO - [Train] step: 79299, loss_adv_disc: -5.420825
2023-02-12 22:06:27,842 - INFO - [Train] step: 79299, loss_mpn: 0.043465, loss_rec: 0.035519, loss_semantic: 0.402694, loss_idmrf: 1.329745, loss_adv_gen: -10.706894
2023-02-12 22:06:57,140 - INFO - [Train] step: 79399, loss_adv_disc: -2.673816
2023-02-12 22:06:57,349 - INFO - [Train] step: 79399, loss_mpn: 0.008349, loss_rec: 0.019968, loss_semantic: 0.313932, loss_idmrf: 0.868498, loss_adv_gen: 3.465012
2023-02-12 22:07:26,647 - INFO - [Train] step: 79499, loss_adv_disc: -1.943445
2023-02-12 22:07:26,856 - INFO - [Train] step: 79499, loss_mpn: 0.009392, loss_rec: 0.016650, loss_semantic: 0.268713, loss_idmrf: 1.130947, loss_adv_gen: -45.081940
2023-02-12 22:07:56,155 - INFO - [Train] step: 79599, loss_adv_disc: -3.704866
2023-02-12 22:07:56,364 - INFO - [Train] step: 79599, loss_mpn: 0.016093, loss_rec: 0.029642, loss_semantic: 0.399155, loss_idmrf: 1.332579, loss_adv_gen: -18.639648
2023-02-12 22:08:25,660 - INFO - [Train] step: 79699, loss_adv_disc: -1.880555
2023-02-12 22:08:25,869 - INFO - [Train] step: 79699, loss_mpn: 0.009652, loss_rec: 0.020044, loss_semantic: 0.331316, loss_idmrf: 0.837624, loss_adv_gen: 2.887909
2023-02-12 22:08:55,168 - INFO - [Train] step: 79799, loss_adv_disc: -2.015677
2023-02-12 22:08:55,378 - INFO - [Train] step: 79799, loss_mpn: 0.017161, loss_rec: 0.023390, loss_semantic: 0.339819, loss_idmrf: 1.030102, loss_adv_gen: -21.604004
2023-02-12 22:09:24,665 - INFO - [Train] step: 79899, loss_adv_disc: -1.833834
2023-02-12 22:09:24,873 - INFO - [Train] step: 79899, loss_mpn: 0.010883, loss_rec: 0.019006, loss_semantic: 0.313928, loss_idmrf: 1.036283, loss_adv_gen: -47.980209
2023-02-12 22:09:54,170 - INFO - [Train] step: 79999, loss_adv_disc: -3.740714
2023-02-12 22:09:54,380 - INFO - [Train] step: 79999, loss_mpn: 0.013000, loss_rec: 0.025038, loss_semantic: 0.373298, loss_idmrf: 0.936110, loss_adv_gen: 0.577438
2023-02-12 22:10:31,564 - INFO - [Eval] step: 79999, bce: 0.205440, psnr: 29.614763, ssim: 0.959764
2023-02-12 22:11:02,256 - INFO - [Train] step: 80099, loss_adv_disc: -1.954846
2023-02-12 22:11:02,465 - INFO - [Train] step: 80099, loss_mpn: 0.013018, loss_rec: 0.024143, loss_semantic: 0.317573, loss_idmrf: 0.981092, loss_adv_gen: -5.573532
2023-02-12 22:11:31,756 - INFO - [Train] step: 80199, loss_adv_disc: -1.932872
2023-02-12 22:11:31,966 - INFO - [Train] step: 80199, loss_mpn: 0.013036, loss_rec: 0.023987, loss_semantic: 0.385737, loss_idmrf: 1.310633, loss_adv_gen: -27.336014
2023-02-12 22:12:01,272 - INFO - [Train] step: 80299, loss_adv_disc: -1.728409
2023-02-12 22:12:01,483 - INFO - [Train] step: 80299, loss_mpn: 0.003626, loss_rec: 0.014644, loss_semantic: 0.259559, loss_idmrf: 0.893902, loss_adv_gen: -13.748108
2023-02-12 22:12:30,760 - INFO - [Train] step: 80399, loss_adv_disc: -1.229165
2023-02-12 22:12:30,969 - INFO - [Train] step: 80399, loss_mpn: 0.010315, loss_rec: 0.020912, loss_semantic: 0.332469, loss_idmrf: 0.912975, loss_adv_gen: -10.223984
2023-02-12 22:13:00,268 - INFO - [Train] step: 80499, loss_adv_disc: -2.142443
2023-02-12 22:13:00,477 - INFO - [Train] step: 80499, loss_mpn: 0.011445, loss_rec: 0.024147, loss_semantic: 0.333957, loss_idmrf: 1.182527, loss_adv_gen: -33.646790
2023-02-12 22:13:29,767 - INFO - [Train] step: 80599, loss_adv_disc: -1.524771
2023-02-12 22:13:29,977 - INFO - [Train] step: 80599, loss_mpn: 0.011521, loss_rec: 0.018568, loss_semantic: 0.312590, loss_idmrf: 1.109116, loss_adv_gen: -6.599350
2023-02-12 22:13:59,262 - INFO - [Train] step: 80699, loss_adv_disc: -3.415359
2023-02-12 22:13:59,471 - INFO - [Train] step: 80699, loss_mpn: 0.017199, loss_rec: 0.022234, loss_semantic: 0.306993, loss_idmrf: 0.666777, loss_adv_gen: 16.524124
2023-02-12 22:14:28,767 - INFO - [Train] step: 80799, loss_adv_disc: -2.240194
2023-02-12 22:14:28,975 - INFO - [Train] step: 80799, loss_mpn: 0.007608, loss_rec: 0.018088, loss_semantic: 0.267332, loss_idmrf: 0.492199, loss_adv_gen: -21.447037
2023-02-12 22:14:58,488 - INFO - [Train] step: 80899, loss_adv_disc: -0.793497
2023-02-12 22:14:58,697 - INFO - [Train] step: 80899, loss_mpn: 0.013266, loss_rec: 0.024574, loss_semantic: 0.347341, loss_idmrf: 1.034722, loss_adv_gen: -3.172165
2023-02-12 22:15:27,986 - INFO - [Train] step: 80999, loss_adv_disc: -1.906720
2023-02-12 22:15:28,195 - INFO - [Train] step: 80999, loss_mpn: 0.017004, loss_rec: 0.023209, loss_semantic: 0.368184, loss_idmrf: 1.295650, loss_adv_gen: -14.581955
2023-02-12 22:16:05,347 - INFO - [Eval] step: 80999, bce: 0.195160, psnr: 29.480499, ssim: 0.960152
2023-02-12 22:16:35,688 - INFO - [Train] step: 81099, loss_adv_disc: -1.384778
2023-02-12 22:16:35,896 - INFO - [Train] step: 81099, loss_mpn: 0.005045, loss_rec: 0.014305, loss_semantic: 0.243195, loss_idmrf: 0.527773, loss_adv_gen: 18.854034
2023-02-12 22:17:05,193 - INFO - [Train] step: 81199, loss_adv_disc: -2.199818
2023-02-12 22:17:05,402 - INFO - [Train] step: 81199, loss_mpn: 0.040804, loss_rec: 0.029125, loss_semantic: 0.384294, loss_idmrf: 1.915502, loss_adv_gen: -1.895004
2023-02-12 22:17:34,696 - INFO - [Train] step: 81299, loss_adv_disc: -2.393034
2023-02-12 22:17:34,907 - INFO - [Train] step: 81299, loss_mpn: 0.016880, loss_rec: 0.021008, loss_semantic: 0.359613, loss_idmrf: 1.491770, loss_adv_gen: 12.128220
2023-02-12 22:18:04,194 - INFO - [Train] step: 81399, loss_adv_disc: -3.299304
2023-02-12 22:18:04,403 - INFO - [Train] step: 81399, loss_mpn: 0.014245, loss_rec: 0.032467, loss_semantic: 0.391594, loss_idmrf: 1.105607, loss_adv_gen: -4.272964
2023-02-12 22:18:33,683 - INFO - [Train] step: 81499, loss_adv_disc: -0.786523
2023-02-12 22:18:33,894 - INFO - [Train] step: 81499, loss_mpn: 0.012494, loss_rec: 0.022089, loss_semantic: 0.316979, loss_idmrf: 1.499170, loss_adv_gen: -4.324432
2023-02-12 22:19:03,180 - INFO - [Train] step: 81599, loss_adv_disc: -1.239787
2023-02-12 22:19:03,391 - INFO - [Train] step: 81599, loss_mpn: 0.011870, loss_rec: 0.021669, loss_semantic: 0.316628, loss_idmrf: 1.241662, loss_adv_gen: -23.222687
2023-02-12 22:19:32,682 - INFO - [Train] step: 81699, loss_adv_disc: -3.129382
2023-02-12 22:19:32,892 - INFO - [Train] step: 81699, loss_mpn: 0.025611, loss_rec: 0.033635, loss_semantic: 0.420178, loss_idmrf: 1.507029, loss_adv_gen: 26.161728
2023-02-12 22:20:02,171 - INFO - [Train] step: 81799, loss_adv_disc: -1.751543
2023-02-12 22:20:02,382 - INFO - [Train] step: 81799, loss_mpn: 0.006118, loss_rec: 0.022196, loss_semantic: 0.301933, loss_idmrf: 0.890265, loss_adv_gen: -1.251083
2023-02-12 22:20:31,660 - INFO - [Train] step: 81899, loss_adv_disc: -2.294317
2023-02-12 22:20:31,868 - INFO - [Train] step: 81899, loss_mpn: 0.010397, loss_rec: 0.023446, loss_semantic: 0.324697, loss_idmrf: 0.527594, loss_adv_gen: 11.055801
2023-02-12 22:21:01,154 - INFO - [Train] step: 81999, loss_adv_disc: -1.653113
2023-02-12 22:21:01,364 - INFO - [Train] step: 81999, loss_mpn: 0.007606, loss_rec: 0.016752, loss_semantic: 0.263431, loss_idmrf: 2.341574, loss_adv_gen: -20.123428
2023-02-12 22:21:38,553 - INFO - [Eval] step: 81999, bce: 0.212853, psnr: 29.343424, ssim: 0.959769
2023-02-12 22:22:08,875 - INFO - [Train] step: 82099, loss_adv_disc: -2.685766
2023-02-12 22:22:09,084 - INFO - [Train] step: 82099, loss_mpn: 0.014007, loss_rec: 0.019701, loss_semantic: 0.310383, loss_idmrf: 1.262186, loss_adv_gen: -1.794891
2023-02-12 22:22:38,372 - INFO - [Train] step: 82199, loss_adv_disc: -1.113030
2023-02-12 22:22:38,582 - INFO - [Train] step: 82199, loss_mpn: 0.021257, loss_rec: 0.026576, loss_semantic: 0.393461, loss_idmrf: 1.336469, loss_adv_gen: -5.483688
2023-02-12 22:23:07,873 - INFO - [Train] step: 82299, loss_adv_disc: -1.548123
2023-02-12 22:23:08,081 - INFO - [Train] step: 82299, loss_mpn: 0.012246, loss_rec: 0.020468, loss_semantic: 0.292406, loss_idmrf: 1.356920, loss_adv_gen: -10.126953
2023-02-12 22:23:37,363 - INFO - [Train] step: 82399, loss_adv_disc: -3.318037
2023-02-12 22:23:37,572 - INFO - [Train] step: 82399, loss_mpn: 0.011543, loss_rec: 0.022613, loss_semantic: 0.352956, loss_idmrf: 1.257604, loss_adv_gen: 10.006912
2023-02-12 22:24:06,857 - INFO - [Train] step: 82499, loss_adv_disc: -2.064742
2023-02-12 22:24:07,067 - INFO - [Train] step: 82499, loss_mpn: 0.012422, loss_rec: 0.021517, loss_semantic: 0.315286, loss_idmrf: 1.129812, loss_adv_gen: -5.061310
2023-02-12 22:24:36,344 - INFO - [Train] step: 82599, loss_adv_disc: 0.172682
2023-02-12 22:24:36,554 - INFO - [Train] step: 82599, loss_mpn: 0.012576, loss_rec: 0.023421, loss_semantic: 0.334479, loss_idmrf: 1.477735, loss_adv_gen: -9.905151
2023-02-12 22:25:05,841 - INFO - [Train] step: 82699, loss_adv_disc: -1.891251
2023-02-12 22:25:06,051 - INFO - [Train] step: 82699, loss_mpn: 0.011026, loss_rec: 0.021462, loss_semantic: 0.314751, loss_idmrf: 0.609500, loss_adv_gen: -25.305420
2023-02-12 22:25:35,332 - INFO - [Train] step: 82799, loss_adv_disc: -1.756757
2023-02-12 22:25:35,542 - INFO - [Train] step: 82799, loss_mpn: 0.009673, loss_rec: 0.022299, loss_semantic: 0.296469, loss_idmrf: 0.604528, loss_adv_gen: 9.052139
2023-02-12 22:26:04,821 - INFO - [Train] step: 82899, loss_adv_disc: -3.912753
2023-02-12 22:26:05,031 - INFO - [Train] step: 82899, loss_mpn: 0.014304, loss_rec: 0.026747, loss_semantic: 0.336144, loss_idmrf: 1.128212, loss_adv_gen: 10.291565
2023-02-12 22:26:34,314 - INFO - [Train] step: 82999, loss_adv_disc: -2.641112
2023-02-12 22:26:34,523 - INFO - [Train] step: 82999, loss_mpn: 0.018135, loss_rec: 0.021465, loss_semantic: 0.326331, loss_idmrf: 1.163360, loss_adv_gen: 12.538666
2023-02-12 22:27:11,691 - INFO - [Eval] step: 82999, bce: 0.259439, psnr: 29.434824, ssim: 0.959572
2023-02-12 22:27:42,020 - INFO - [Train] step: 83099, loss_adv_disc: -1.854239
2023-02-12 22:27:42,229 - INFO - [Train] step: 83099, loss_mpn: 0.013305, loss_rec: 0.024816, loss_semantic: 0.354490, loss_idmrf: 1.155751, loss_adv_gen: -35.695221
2023-02-12 22:28:11,509 - INFO - [Train] step: 83199, loss_adv_disc: -5.303917
2023-02-12 22:28:11,718 - INFO - [Train] step: 83199, loss_mpn: 0.036127, loss_rec: 0.044794, loss_semantic: 0.407443, loss_idmrf: 1.683210, loss_adv_gen: 15.574738
2023-02-12 22:28:41,010 - INFO - [Train] step: 83299, loss_adv_disc: -3.863278
2023-02-12 22:28:41,221 - INFO - [Train] step: 83299, loss_mpn: 0.019194, loss_rec: 0.028987, loss_semantic: 0.355459, loss_idmrf: 2.104625, loss_adv_gen: -29.801331
2023-02-12 22:29:10,497 - INFO - [Train] step: 83399, loss_adv_disc: -0.666235
2023-02-12 22:29:10,706 - INFO - [Train] step: 83399, loss_mpn: 0.031817, loss_rec: 0.032100, loss_semantic: 0.418056, loss_idmrf: 2.055821, loss_adv_gen: -25.406769
2023-02-12 22:29:39,998 - INFO - [Train] step: 83499, loss_adv_disc: -1.871648
2023-02-12 22:29:40,207 - INFO - [Train] step: 83499, loss_mpn: 0.009200, loss_rec: 0.018592, loss_semantic: 0.266195, loss_idmrf: 0.715677, loss_adv_gen: -2.813675
2023-02-12 22:30:09,488 - INFO - [Train] step: 83599, loss_adv_disc: -2.040806
2023-02-12 22:30:09,697 - INFO - [Train] step: 83599, loss_mpn: 0.017686, loss_rec: 0.021136, loss_semantic: 0.316977, loss_idmrf: 1.369921, loss_adv_gen: -5.230347
2023-02-12 22:30:38,994 - INFO - [Train] step: 83699, loss_adv_disc: -2.911713
2023-02-12 22:30:39,203 - INFO - [Train] step: 83699, loss_mpn: 0.013632, loss_rec: 0.026632, loss_semantic: 0.368659, loss_idmrf: 1.713869, loss_adv_gen: -12.311432
2023-02-12 22:31:08,490 - INFO - [Train] step: 83799, loss_adv_disc: -1.573983
2023-02-12 22:31:08,699 - INFO - [Train] step: 83799, loss_mpn: 0.008004, loss_rec: 0.019203, loss_semantic: 0.292114, loss_idmrf: 0.802957, loss_adv_gen: -21.593292
2023-02-12 22:31:37,975 - INFO - [Train] step: 83899, loss_adv_disc: -1.295604
2023-02-12 22:31:38,183 - INFO - [Train] step: 83899, loss_mpn: 0.035844, loss_rec: 0.025547, loss_semantic: 0.303224, loss_idmrf: 1.509167, loss_adv_gen: 8.185944
2023-02-12 22:32:07,474 - INFO - [Train] step: 83999, loss_adv_disc: -1.305321
2023-02-12 22:32:07,683 - INFO - [Train] step: 83999, loss_mpn: 0.014502, loss_rec: 0.028332, loss_semantic: 0.392771, loss_idmrf: 1.674671, loss_adv_gen: -22.083939
2023-02-12 22:32:44,854 - INFO - [Eval] step: 83999, bce: 0.227091, psnr: 29.698452, ssim: 0.960404
2023-02-12 22:33:15,394 - INFO - [Train] step: 84099, loss_adv_disc: -2.061622
2023-02-12 22:33:15,603 - INFO - [Train] step: 84099, loss_mpn: 0.026596, loss_rec: 0.030299, loss_semantic: 0.365827, loss_idmrf: 0.752204, loss_adv_gen: 9.087082
2023-02-12 22:33:44,895 - INFO - [Train] step: 84199, loss_adv_disc: -1.424239
2023-02-12 22:33:45,104 - INFO - [Train] step: 84199, loss_mpn: 0.007965, loss_rec: 0.018912, loss_semantic: 0.292395, loss_idmrf: 1.015371, loss_adv_gen: 16.790253
2023-02-12 22:34:14,397 - INFO - [Train] step: 84299, loss_adv_disc: -2.842542
2023-02-12 22:34:14,607 - INFO - [Train] step: 84299, loss_mpn: 0.012008, loss_rec: 0.021443, loss_semantic: 0.333010, loss_idmrf: 0.797953, loss_adv_gen: 4.505051
2023-02-12 22:34:43,889 - INFO - [Train] step: 84399, loss_adv_disc: -3.406497
2023-02-12 22:34:44,098 - INFO - [Train] step: 84399, loss_mpn: 0.020828, loss_rec: 0.026032, loss_semantic: 0.406164, loss_idmrf: 1.752105, loss_adv_gen: -1.479828
2023-02-12 22:35:13,388 - INFO - [Train] step: 84499, loss_adv_disc: -0.875422
2023-02-12 22:35:13,597 - INFO - [Train] step: 84499, loss_mpn: 0.007535, loss_rec: 0.017292, loss_semantic: 0.278894, loss_idmrf: 0.618171, loss_adv_gen: 26.819199
2023-02-12 22:35:42,889 - INFO - [Train] step: 84599, loss_adv_disc: -1.599960
2023-02-12 22:35:43,099 - INFO - [Train] step: 84599, loss_mpn: 0.016189, loss_rec: 0.021389, loss_semantic: 0.316380, loss_idmrf: 0.945099, loss_adv_gen: 16.799698
2023-02-12 22:36:12,383 - INFO - [Train] step: 84699, loss_adv_disc: -1.210018
2023-02-12 22:36:12,592 - INFO - [Train] step: 84699, loss_mpn: 0.040770, loss_rec: 0.029183, loss_semantic: 0.359197, loss_idmrf: 1.570401, loss_adv_gen: -5.110779
2023-02-12 22:36:41,872 - INFO - [Train] step: 84799, loss_adv_disc: -2.146945
2023-02-12 22:36:42,083 - INFO - [Train] step: 84799, loss_mpn: 0.011774, loss_rec: 0.025158, loss_semantic: 0.351535, loss_idmrf: 0.748179, loss_adv_gen: -36.865891
2023-02-12 22:37:11,367 - INFO - [Train] step: 84899, loss_adv_disc: -2.920732
2023-02-12 22:37:11,576 - INFO - [Train] step: 84899, loss_mpn: 0.017043, loss_rec: 0.020808, loss_semantic: 0.309455, loss_idmrf: 0.685178, loss_adv_gen: 2.887573
2023-02-12 22:37:40,857 - INFO - [Train] step: 84999, loss_adv_disc: -2.183505
2023-02-12 22:37:41,066 - INFO - [Train] step: 84999, loss_mpn: 0.015849, loss_rec: 0.022237, loss_semantic: 0.347815, loss_idmrf: 1.528895, loss_adv_gen: 16.759644
2023-02-12 22:38:18,220 - INFO - [Eval] step: 84999, bce: 0.188199, psnr: 29.375647, ssim: 0.960063
2023-02-12 22:38:48,681 - INFO - [Train] step: 85099, loss_adv_disc: -2.599847
2023-02-12 22:38:48,890 - INFO - [Train] step: 85099, loss_mpn: 0.014380, loss_rec: 0.029810, loss_semantic: 0.415858, loss_idmrf: 1.149213, loss_adv_gen: 7.259567
2023-02-12 22:39:18,173 - INFO - [Train] step: 85199, loss_adv_disc: -3.062160
2023-02-12 22:39:18,383 - INFO - [Train] step: 85199, loss_mpn: 0.011563, loss_rec: 0.023880, loss_semantic: 0.355982, loss_idmrf: 1.135275, loss_adv_gen: 11.056839
2023-02-12 22:39:47,680 - INFO - [Train] step: 85299, loss_adv_disc: -2.362738
2023-02-12 22:39:47,889 - INFO - [Train] step: 85299, loss_mpn: 0.010951, loss_rec: 0.019874, loss_semantic: 0.297069, loss_idmrf: 1.274630, loss_adv_gen: 20.899750
2023-02-12 22:40:17,173 - INFO - [Train] step: 85399, loss_adv_disc: -2.133623
2023-02-12 22:40:17,381 - INFO - [Train] step: 85399, loss_mpn: 0.008740, loss_rec: 0.017253, loss_semantic: 0.286146, loss_idmrf: 0.948331, loss_adv_gen: -36.660095
2023-02-12 22:40:46,666 - INFO - [Train] step: 85499, loss_adv_disc: -2.122717
2023-02-12 22:40:46,874 - INFO - [Train] step: 85499, loss_mpn: 0.014356, loss_rec: 0.022839, loss_semantic: 0.359607, loss_idmrf: 1.582760, loss_adv_gen: 35.417862
2023-02-12 22:41:16,165 - INFO - [Train] step: 85599, loss_adv_disc: -0.524845
2023-02-12 22:41:16,375 - INFO - [Train] step: 85599, loss_mpn: 0.016080, loss_rec: 0.020722, loss_semantic: 0.308988, loss_idmrf: 0.970930, loss_adv_gen: 19.111923
2023-02-12 22:41:45,663 - INFO - [Train] step: 85699, loss_adv_disc: -4.337905
2023-02-12 22:41:45,872 - INFO - [Train] step: 85699, loss_mpn: 0.024659, loss_rec: 0.031992, loss_semantic: 0.417738, loss_idmrf: 0.840500, loss_adv_gen: 4.708832
2023-02-12 22:42:15,164 - INFO - [Train] step: 85799, loss_adv_disc: -1.247869
2023-02-12 22:42:15,374 - INFO - [Train] step: 85799, loss_mpn: 0.010486, loss_rec: 0.014143, loss_semantic: 0.250406, loss_idmrf: 1.057371, loss_adv_gen: 6.306793
2023-02-12 22:42:44,648 - INFO - [Train] step: 85899, loss_adv_disc: -1.866068
2023-02-12 22:42:44,857 - INFO - [Train] step: 85899, loss_mpn: 0.020127, loss_rec: 0.030857, loss_semantic: 0.398664, loss_idmrf: 1.461052, loss_adv_gen: 10.438232
2023-02-12 22:43:14,143 - INFO - [Train] step: 85999, loss_adv_disc: -4.702723
2023-02-12 22:43:14,352 - INFO - [Train] step: 85999, loss_mpn: 0.022697, loss_rec: 0.019282, loss_semantic: 0.315688, loss_idmrf: 2.003258, loss_adv_gen: -1.060745
2023-02-12 22:43:51,506 - INFO - [Eval] step: 85999, bce: 0.226485, psnr: 29.559956, ssim: 0.960104
2023-02-12 22:44:21,834 - INFO - [Train] step: 86099, loss_adv_disc: -0.295491
2023-02-12 22:44:22,043 - INFO - [Train] step: 86099, loss_mpn: 0.007900, loss_rec: 0.012733, loss_semantic: 0.225708, loss_idmrf: 0.745022, loss_adv_gen: -1.513489
2023-02-12 22:44:51,327 - INFO - [Train] step: 86199, loss_adv_disc: -3.232905
2023-02-12 22:44:51,537 - INFO - [Train] step: 86199, loss_mpn: 0.019590, loss_rec: 0.026331, loss_semantic: 0.354718, loss_idmrf: 1.331614, loss_adv_gen: -7.958313
2023-02-12 22:45:20,827 - INFO - [Train] step: 86299, loss_adv_disc: -1.881432
2023-02-12 22:45:21,038 - INFO - [Train] step: 86299, loss_mpn: 0.006868, loss_rec: 0.019031, loss_semantic: 0.274666, loss_idmrf: 0.705134, loss_adv_gen: 18.449005
2023-02-12 22:45:50,321 - INFO - [Train] step: 86399, loss_adv_disc: -1.145922
2023-02-12 22:45:50,531 - INFO - [Train] step: 86399, loss_mpn: 0.006508, loss_rec: 0.016233, loss_semantic: 0.256258, loss_idmrf: 1.016549, loss_adv_gen: 15.945740
2023-02-12 22:46:19,809 - INFO - [Train] step: 86499, loss_adv_disc: -1.173688
2023-02-12 22:46:20,018 - INFO - [Train] step: 86499, loss_mpn: 0.004875, loss_rec: 0.015839, loss_semantic: 0.263512, loss_idmrf: 1.266933, loss_adv_gen: -41.751892
2023-02-12 22:46:49,313 - INFO - [Train] step: 86599, loss_adv_disc: -0.974515
2023-02-12 22:46:49,522 - INFO - [Train] step: 86599, loss_mpn: 0.015530, loss_rec: 0.023802, loss_semantic: 0.346080, loss_idmrf: 1.574933, loss_adv_gen: 23.652710
2023-02-12 22:47:18,799 - INFO - [Train] step: 86699, loss_adv_disc: -2.270790
2023-02-12 22:47:19,008 - INFO - [Train] step: 86699, loss_mpn: 0.017282, loss_rec: 0.032561, loss_semantic: 0.438176, loss_idmrf: 1.683188, loss_adv_gen: 21.022522
2023-02-12 22:47:48,288 - INFO - [Train] step: 86799, loss_adv_disc: -2.770969
2023-02-12 22:47:48,497 - INFO - [Train] step: 86799, loss_mpn: 0.009989, loss_rec: 0.027629, loss_semantic: 0.372735, loss_idmrf: 0.779297, loss_adv_gen: 15.239899
2023-02-12 22:48:17,771 - INFO - [Train] step: 86899, loss_adv_disc: -2.491087
2023-02-12 22:48:17,980 - INFO - [Train] step: 86899, loss_mpn: 0.030613, loss_rec: 0.042206, loss_semantic: 0.423085, loss_idmrf: 1.926254, loss_adv_gen: 0.479675
2023-02-12 22:48:47,268 - INFO - [Train] step: 86999, loss_adv_disc: -2.636595
2023-02-12 22:48:47,477 - INFO - [Train] step: 86999, loss_mpn: 0.010910, loss_rec: 0.025540, loss_semantic: 0.355855, loss_idmrf: 1.191296, loss_adv_gen: 4.286392
2023-02-12 22:49:24,654 - INFO - [Eval] step: 86999, bce: 0.238724, psnr: 29.569613, ssim: 0.959560
2023-02-12 22:49:54,975 - INFO - [Train] step: 87099, loss_adv_disc: -2.033104
2023-02-12 22:49:55,183 - INFO - [Train] step: 87099, loss_mpn: 0.008914, loss_rec: 0.019173, loss_semantic: 0.304040, loss_idmrf: 1.048409, loss_adv_gen: 41.686340
2023-02-12 22:50:24,469 - INFO - [Train] step: 87199, loss_adv_disc: -4.576338
2023-02-12 22:50:24,678 - INFO - [Train] step: 87199, loss_mpn: 0.010562, loss_rec: 0.035617, loss_semantic: 0.402997, loss_idmrf: 1.742866, loss_adv_gen: 27.384201
2023-02-12 22:50:53,959 - INFO - [Train] step: 87299, loss_adv_disc: -1.377123
2023-02-12 22:50:54,167 - INFO - [Train] step: 87299, loss_mpn: 0.021754, loss_rec: 0.030215, loss_semantic: 0.383458, loss_idmrf: 1.370546, loss_adv_gen: -4.719833
2023-02-12 22:51:23,451 - INFO - [Train] step: 87399, loss_adv_disc: -2.165551
2023-02-12 22:51:23,660 - INFO - [Train] step: 87399, loss_mpn: 0.019520, loss_rec: 0.026305, loss_semantic: 0.362676, loss_idmrf: 1.113311, loss_adv_gen: 25.606369
2023-02-12 22:51:52,933 - INFO - [Train] step: 87499, loss_adv_disc: -2.167655
2023-02-12 22:51:53,142 - INFO - [Train] step: 87499, loss_mpn: 0.012374, loss_rec: 0.019891, loss_semantic: 0.301857, loss_idmrf: 0.761809, loss_adv_gen: 15.091827
2023-02-12 22:52:22,416 - INFO - [Train] step: 87599, loss_adv_disc: -1.530239
2023-02-12 22:52:22,626 - INFO - [Train] step: 87599, loss_mpn: 0.008741, loss_rec: 0.017174, loss_semantic: 0.286846, loss_idmrf: 0.562895, loss_adv_gen: -10.686615
2023-02-12 22:52:51,921 - INFO - [Train] step: 87699, loss_adv_disc: -4.102802
2023-02-12 22:52:52,130 - INFO - [Train] step: 87699, loss_mpn: 0.010287, loss_rec: 0.024522, loss_semantic: 0.321317, loss_idmrf: 1.411938, loss_adv_gen: -5.934036
2023-02-12 22:53:21,415 - INFO - [Train] step: 87799, loss_adv_disc: -0.882149
2023-02-12 22:53:21,625 - INFO - [Train] step: 87799, loss_mpn: 0.014064, loss_rec: 0.018791, loss_semantic: 0.277237, loss_idmrf: 0.792930, loss_adv_gen: -16.969391
2023-02-12 22:53:50,912 - INFO - [Train] step: 87899, loss_adv_disc: -3.930111
2023-02-12 22:53:51,122 - INFO - [Train] step: 87899, loss_mpn: 0.026538, loss_rec: 0.028247, loss_semantic: 0.347056, loss_idmrf: 1.235671, loss_adv_gen: 7.555023
2023-02-12 22:54:20,596 - INFO - [Train] step: 87999, loss_adv_disc: -1.799998
2023-02-12 22:54:20,804 - INFO - [Train] step: 87999, loss_mpn: 0.016673, loss_rec: 0.027757, loss_semantic: 0.384795, loss_idmrf: 1.311292, loss_adv_gen: -15.984299
2023-02-12 22:54:57,985 - INFO - [Eval] step: 87999, bce: 0.208852, psnr: 29.609364, ssim: 0.960170
2023-02-12 22:55:28,298 - INFO - [Train] step: 88099, loss_adv_disc: -0.694711
2023-02-12 22:55:28,507 - INFO - [Train] step: 88099, loss_mpn: 0.009299, loss_rec: 0.018521, loss_semantic: 0.271008, loss_idmrf: 1.289457, loss_adv_gen: 8.517593
2023-02-12 22:55:57,789 - INFO - [Train] step: 88199, loss_adv_disc: -1.654199
2023-02-12 22:55:57,997 - INFO - [Train] step: 88199, loss_mpn: 0.011853, loss_rec: 0.018622, loss_semantic: 0.297453, loss_idmrf: 0.845039, loss_adv_gen: 10.536713
2023-02-12 22:56:27,288 - INFO - [Train] step: 88299, loss_adv_disc: -2.048277
2023-02-12 22:56:27,497 - INFO - [Train] step: 88299, loss_mpn: 0.006203, loss_rec: 0.016460, loss_semantic: 0.264777, loss_idmrf: 0.459634, loss_adv_gen: 30.065536
2023-02-12 22:56:56,783 - INFO - [Train] step: 88399, loss_adv_disc: -1.170121
2023-02-12 22:56:56,992 - INFO - [Train] step: 88399, loss_mpn: 0.004985, loss_rec: 0.015100, loss_semantic: 0.253353, loss_idmrf: 0.923472, loss_adv_gen: 32.590820
2023-02-12 22:57:26,273 - INFO - [Train] step: 88499, loss_adv_disc: -0.676622
2023-02-12 22:57:26,482 - INFO - [Train] step: 88499, loss_mpn: 0.009017, loss_rec: 0.015637, loss_semantic: 0.261410, loss_idmrf: 1.039613, loss_adv_gen: 37.004135
2023-02-12 22:57:55,771 - INFO - [Train] step: 88599, loss_adv_disc: -3.199141
2023-02-12 22:57:55,981 - INFO - [Train] step: 88599, loss_mpn: 0.011977, loss_rec: 0.025262, loss_semantic: 0.348552, loss_idmrf: 1.373847, loss_adv_gen: 7.484833
2023-02-12 22:58:25,259 - INFO - [Train] step: 88699, loss_adv_disc: -2.282774
2023-02-12 22:58:25,468 - INFO - [Train] step: 88699, loss_mpn: 0.004897, loss_rec: 0.017821, loss_semantic: 0.285942, loss_idmrf: 0.951750, loss_adv_gen: -16.504700
2023-02-12 22:58:54,739 - INFO - [Train] step: 88799, loss_adv_disc: -1.006118
2023-02-12 22:58:54,949 - INFO - [Train] step: 88799, loss_mpn: 0.013463, loss_rec: 0.026622, loss_semantic: 0.328155, loss_idmrf: 1.289775, loss_adv_gen: 8.089172
2023-02-12 22:59:24,225 - INFO - [Train] step: 88899, loss_adv_disc: -0.012936
2023-02-12 22:59:24,434 - INFO - [Train] step: 88899, loss_mpn: 0.011258, loss_rec: 0.023178, loss_semantic: 0.330777, loss_idmrf: 1.395392, loss_adv_gen: -8.399399
2023-02-12 22:59:53,727 - INFO - [Train] step: 88999, loss_adv_disc: -1.760600
2023-02-12 22:59:53,935 - INFO - [Train] step: 88999, loss_mpn: 0.011513, loss_rec: 0.021993, loss_semantic: 0.317674, loss_idmrf: 0.893112, loss_adv_gen: -12.760788
2023-02-12 23:00:31,110 - INFO - [Eval] step: 88999, bce: 0.192495, psnr: 29.364573, ssim: 0.960369
2023-02-12 23:01:01,433 - INFO - [Train] step: 89099, loss_adv_disc: -2.534625
2023-02-12 23:01:01,642 - INFO - [Train] step: 89099, loss_mpn: 0.009743, loss_rec: 0.024122, loss_semantic: 0.378278, loss_idmrf: 0.856037, loss_adv_gen: 26.906036
2023-02-12 23:01:30,925 - INFO - [Train] step: 89199, loss_adv_disc: -1.590876
2023-02-12 23:01:31,134 - INFO - [Train] step: 89199, loss_mpn: 0.010254, loss_rec: 0.021274, loss_semantic: 0.311404, loss_idmrf: 0.765905, loss_adv_gen: 10.778320
2023-02-12 23:02:00,411 - INFO - [Train] step: 89299, loss_adv_disc: -3.242207
2023-02-12 23:02:00,620 - INFO - [Train] step: 89299, loss_mpn: 0.009426, loss_rec: 0.023970, loss_semantic: 0.346735, loss_idmrf: 0.909902, loss_adv_gen: 37.161652
2023-02-12 23:02:29,901 - INFO - [Train] step: 89399, loss_adv_disc: -1.214535
2023-02-12 23:02:30,109 - INFO - [Train] step: 89399, loss_mpn: 0.012563, loss_rec: 0.020839, loss_semantic: 0.346860, loss_idmrf: 0.786266, loss_adv_gen: 18.001709
2023-02-12 23:02:59,402 - INFO - [Train] step: 89499, loss_adv_disc: -1.577059
2023-02-12 23:02:59,612 - INFO - [Train] step: 89499, loss_mpn: 0.010237, loss_rec: 0.023017, loss_semantic: 0.290351, loss_idmrf: 0.716984, loss_adv_gen: 8.327850
2023-02-12 23:03:28,885 - INFO - [Train] step: 89599, loss_adv_disc: -1.172501
2023-02-12 23:03:29,093 - INFO - [Train] step: 89599, loss_mpn: 0.005797, loss_rec: 0.014376, loss_semantic: 0.236088, loss_idmrf: 1.051965, loss_adv_gen: 5.179504
2023-02-12 23:03:58,380 - INFO - [Train] step: 89699, loss_adv_disc: -1.264019
2023-02-12 23:03:58,589 - INFO - [Train] step: 89699, loss_mpn: 0.006720, loss_rec: 0.019182, loss_semantic: 0.278959, loss_idmrf: 1.156227, loss_adv_gen: 22.859497
2023-02-12 23:04:27,870 - INFO - [Train] step: 89799, loss_adv_disc: -1.543325
2023-02-12 23:04:28,079 - INFO - [Train] step: 89799, loss_mpn: 0.037993, loss_rec: 0.024059, loss_semantic: 0.300190, loss_idmrf: 1.335723, loss_adv_gen: 6.400848
2023-02-12 23:04:57,351 - INFO - [Train] step: 89899, loss_adv_disc: -1.700919
2023-02-12 23:04:57,560 - INFO - [Train] step: 89899, loss_mpn: 0.015497, loss_rec: 0.026579, loss_semantic: 0.353798, loss_idmrf: 1.827608, loss_adv_gen: 45.832916
2023-02-12 23:05:26,830 - INFO - [Train] step: 89999, loss_adv_disc: -2.272173
2023-02-12 23:05:27,038 - INFO - [Train] step: 89999, loss_mpn: 0.015357, loss_rec: 0.024759, loss_semantic: 0.378283, loss_idmrf: 1.403883, loss_adv_gen: 3.761703
2023-02-12 23:06:04,230 - INFO - [Eval] step: 89999, bce: 0.245090, psnr: 29.742336, ssim: 0.959814
2023-02-12 23:06:34,878 - INFO - [Train] step: 90099, loss_adv_disc: -1.275551
2023-02-12 23:06:35,089 - INFO - [Train] step: 90099, loss_mpn: 0.010013, loss_rec: 0.018038, loss_semantic: 0.273438, loss_idmrf: 0.680710, loss_adv_gen: -9.509262
2023-02-12 23:07:04,370 - INFO - [Train] step: 90199, loss_adv_disc: -2.897920
2023-02-12 23:07:04,581 - INFO - [Train] step: 90199, loss_mpn: 0.007949, loss_rec: 0.024239, loss_semantic: 0.335838, loss_idmrf: 0.733817, loss_adv_gen: 16.263672
2023-02-12 23:07:33,870 - INFO - [Train] step: 90299, loss_adv_disc: -3.344249
2023-02-12 23:07:34,079 - INFO - [Train] step: 90299, loss_mpn: 0.019946, loss_rec: 0.023859, loss_semantic: 0.321477, loss_idmrf: 1.387035, loss_adv_gen: 0.904022
2023-02-12 23:08:03,355 - INFO - [Train] step: 90399, loss_adv_disc: -2.814696
2023-02-12 23:08:03,564 - INFO - [Train] step: 90399, loss_mpn: 0.043506, loss_rec: 0.026508, loss_semantic: 0.356641, loss_idmrf: 0.775598, loss_adv_gen: 37.660751
2023-02-12 23:08:32,842 - INFO - [Train] step: 90499, loss_adv_disc: -2.244023
2023-02-12 23:08:33,051 - INFO - [Train] step: 90499, loss_mpn: 0.007977, loss_rec: 0.015630, loss_semantic: 0.257737, loss_idmrf: 1.307551, loss_adv_gen: 27.113403
2023-02-12 23:09:02,328 - INFO - [Train] step: 90599, loss_adv_disc: -2.370589
2023-02-12 23:09:02,536 - INFO - [Train] step: 90599, loss_mpn: 0.008889, loss_rec: 0.020706, loss_semantic: 0.338355, loss_idmrf: 1.680592, loss_adv_gen: 37.242081 2023-02-12 23:09:31,828 - INFO - [Train] step: 90699, loss_adv_disc: -1.682405 2023-02-12 23:09:32,038 - INFO - [Train] step: 90699, loss_mpn: 0.018369, loss_rec: 0.024092, loss_semantic: 0.349372, loss_idmrf: 0.982044, loss_adv_gen: 9.344421 2023-02-12 23:10:01,330 - INFO - [Train] step: 90799, loss_adv_disc: -2.335721 2023-02-12 23:10:01,539 - INFO - [Train] step: 90799, loss_mpn: 0.028429, loss_rec: 0.028547, loss_semantic: 0.398257, loss_idmrf: 1.688701, loss_adv_gen: 6.936996 2023-02-12 23:10:30,827 - INFO - [Train] step: 90899, loss_adv_disc: -1.315806 2023-02-12 23:10:31,037 - INFO - [Train] step: 90899, loss_mpn: 0.014587, loss_rec: 0.019561, loss_semantic: 0.302764, loss_idmrf: 1.179077, loss_adv_gen: 45.661163 2023-02-12 23:11:00,321 - INFO - [Train] step: 90999, loss_adv_disc: -2.068295 2023-02-12 23:11:00,530 - INFO - [Train] step: 90999, loss_mpn: 0.013626, loss_rec: 0.025153, loss_semantic: 0.347251, loss_idmrf: 0.962201, loss_adv_gen: 7.953857 2023-02-12 23:11:37,705 - INFO - [Eval] step: 90999, bce: 0.187079, psnr: 29.633524, ssim: 0.959863 2023-02-12 23:12:08,016 - INFO - [Train] step: 91099, loss_adv_disc: -2.071609 2023-02-12 23:12:08,226 - INFO - [Train] step: 91099, loss_mpn: 0.011025, loss_rec: 0.019732, loss_semantic: 0.304933, loss_idmrf: 0.783460, loss_adv_gen: 32.428528 2023-02-12 23:12:37,519 - INFO - [Train] step: 91199, loss_adv_disc: -2.290299 2023-02-12 23:12:37,727 - INFO - [Train] step: 91199, loss_mpn: 0.012277, loss_rec: 0.021695, loss_semantic: 0.298362, loss_idmrf: 1.158478, loss_adv_gen: 12.822037 2023-02-12 23:13:07,006 - INFO - [Train] step: 91299, loss_adv_disc: -3.255817 2023-02-12 23:13:07,215 - INFO - [Train] step: 91299, loss_mpn: 0.009954, loss_rec: 0.022992, loss_semantic: 0.323687, loss_idmrf: 0.946588, loss_adv_gen: 52.449493 2023-02-12 23:13:36,508 - 
INFO - [Train] step: 91399, loss_adv_disc: -2.859841 2023-02-12 23:13:36,717 - INFO - [Train] step: 91399, loss_mpn: 0.018513, loss_rec: 0.021798, loss_semantic: 0.342798, loss_idmrf: 1.019126, loss_adv_gen: 13.479965 2023-02-12 23:14:05,994 - INFO - [Train] step: 91499, loss_adv_disc: -2.015166 2023-02-12 23:14:06,203 - INFO - [Train] step: 91499, loss_mpn: 0.003618, loss_rec: 0.017131, loss_semantic: 0.263530, loss_idmrf: 0.533798, loss_adv_gen: 5.799530 2023-02-12 23:14:35,479 - INFO - [Train] step: 91599, loss_adv_disc: -2.162205 2023-02-12 23:14:35,689 - INFO - [Train] step: 91599, loss_mpn: 0.013652, loss_rec: 0.019602, loss_semantic: 0.305864, loss_idmrf: 0.934743, loss_adv_gen: 2.177551 2023-02-12 23:15:04,978 - INFO - [Train] step: 91699, loss_adv_disc: -3.595319 2023-02-12 23:15:05,188 - INFO - [Train] step: 91699, loss_mpn: 0.011767, loss_rec: 0.027846, loss_semantic: 0.380262, loss_idmrf: 1.386422, loss_adv_gen: 20.448700 2023-02-12 23:15:34,468 - INFO - [Train] step: 91799, loss_adv_disc: -2.258083 2023-02-12 23:15:34,678 - INFO - [Train] step: 91799, loss_mpn: 0.015338, loss_rec: 0.023363, loss_semantic: 0.345113, loss_idmrf: 1.197403, loss_adv_gen: 18.718170 2023-02-12 23:16:03,964 - INFO - [Train] step: 91899, loss_adv_disc: -1.904118 2023-02-12 23:16:04,175 - INFO - [Train] step: 91899, loss_mpn: 0.011975, loss_rec: 0.019686, loss_semantic: 0.296047, loss_idmrf: 0.774503, loss_adv_gen: 7.820847 2023-02-12 23:16:33,450 - INFO - [Train] step: 91999, loss_adv_disc: -3.374937 2023-02-12 23:16:33,659 - INFO - [Train] step: 91999, loss_mpn: 0.011986, loss_rec: 0.024965, loss_semantic: 0.309911, loss_idmrf: 0.861377, loss_adv_gen: 49.800323 2023-02-12 23:17:10,825 - INFO - [Eval] step: 91999, bce: 0.293027, psnr: 29.445091, ssim: 0.959714 2023-02-12 23:17:41,151 - INFO - [Train] step: 92099, loss_adv_disc: -2.540604 2023-02-12 23:17:41,360 - INFO - [Train] step: 92099, loss_mpn: 0.010452, loss_rec: 0.021151, loss_semantic: 0.299822, loss_idmrf: 0.832366, 
loss_adv_gen: 3.457794 2023-02-12 23:18:10,655 - INFO - [Train] step: 92199, loss_adv_disc: -1.015079 2023-02-12 23:18:10,864 - INFO - [Train] step: 92199, loss_mpn: 0.003949, loss_rec: 0.015347, loss_semantic: 0.272474, loss_idmrf: 0.733950, loss_adv_gen: 19.874664 2023-02-12 23:18:40,140 - INFO - [Train] step: 92299, loss_adv_disc: -1.289920 2023-02-12 23:18:40,351 - INFO - [Train] step: 92299, loss_mpn: 0.004928, loss_rec: 0.014189, loss_semantic: 0.250844, loss_idmrf: 0.847363, loss_adv_gen: 42.136841 2023-02-12 23:19:09,634 - INFO - [Train] step: 92399, loss_adv_disc: -1.404711 2023-02-12 23:19:09,843 - INFO - [Train] step: 92399, loss_mpn: 0.005561, loss_rec: 0.020467, loss_semantic: 0.313894, loss_idmrf: 0.621334, loss_adv_gen: 32.155670 2023-02-12 23:19:39,146 - INFO - [Train] step: 92499, loss_adv_disc: -0.500072 2023-02-12 23:19:39,355 - INFO - [Train] step: 92499, loss_mpn: 0.007052, loss_rec: 0.015991, loss_semantic: 0.269340, loss_idmrf: 1.065239, loss_adv_gen: -4.498840 2023-02-12 23:20:08,632 - INFO - [Train] step: 92599, loss_adv_disc: -2.582837 2023-02-12 23:20:08,843 - INFO - [Train] step: 92599, loss_mpn: 0.015176, loss_rec: 0.021575, loss_semantic: 0.331674, loss_idmrf: 0.629503, loss_adv_gen: 31.273178 2023-02-12 23:20:38,130 - INFO - [Train] step: 92699, loss_adv_disc: -1.053490 2023-02-12 23:20:38,339 - INFO - [Train] step: 92699, loss_mpn: 0.009916, loss_rec: 0.019254, loss_semantic: 0.301340, loss_idmrf: 1.163476, loss_adv_gen: 11.960846 2023-02-12 23:21:07,619 - INFO - [Train] step: 92799, loss_adv_disc: -1.211599 2023-02-12 23:21:07,829 - INFO - [Train] step: 92799, loss_mpn: 0.009589, loss_rec: 0.021135, loss_semantic: 0.326048, loss_idmrf: 0.745327, loss_adv_gen: 39.651550 2023-02-12 23:21:37,131 - INFO - [Train] step: 92899, loss_adv_disc: -2.725896 2023-02-12 23:21:37,340 - INFO - [Train] step: 92899, loss_mpn: 0.004661, loss_rec: 0.016006, loss_semantic: 0.283285, loss_idmrf: 0.879264, loss_adv_gen: -2.264526 2023-02-12 23:22:06,635 
- INFO - [Train] step: 92999, loss_adv_disc: -2.347195 2023-02-12 23:22:06,845 - INFO - [Train] step: 92999, loss_mpn: 0.009404, loss_rec: 0.020496, loss_semantic: 0.312576, loss_idmrf: 0.992978, loss_adv_gen: 15.451508 2023-02-12 23:22:44,016 - INFO - [Eval] step: 92999, bce: 0.183046, psnr: 29.671326, ssim: 0.960319 2023-02-12 23:23:14,327 - INFO - [Train] step: 93099, loss_adv_disc: -1.041051 2023-02-12 23:23:14,538 - INFO - [Train] step: 93099, loss_mpn: 0.003127, loss_rec: 0.014626, loss_semantic: 0.253543, loss_idmrf: 1.043261, loss_adv_gen: 49.892822 2023-02-12 23:23:43,818 - INFO - [Train] step: 93199, loss_adv_disc: -2.792303 2023-02-12 23:23:44,028 - INFO - [Train] step: 93199, loss_mpn: 0.018102, loss_rec: 0.030405, loss_semantic: 0.380928, loss_idmrf: 1.279510, loss_adv_gen: 38.999084 2023-02-12 23:24:13,302 - INFO - [Train] step: 93299, loss_adv_disc: -2.303699 2023-02-12 23:24:13,512 - INFO - [Train] step: 93299, loss_mpn: 0.016762, loss_rec: 0.025169, loss_semantic: 0.375074, loss_idmrf: 1.377540, loss_adv_gen: 9.224335 2023-02-12 23:24:42,787 - INFO - [Train] step: 93399, loss_adv_disc: -2.755570 2023-02-12 23:24:42,997 - INFO - [Train] step: 93399, loss_mpn: 0.020091, loss_rec: 0.028410, loss_semantic: 0.337910, loss_idmrf: 0.907781, loss_adv_gen: 56.145569 2023-02-12 23:25:12,274 - INFO - [Train] step: 93499, loss_adv_disc: -4.863779 2023-02-12 23:25:12,485 - INFO - [Train] step: 93499, loss_mpn: 0.022395, loss_rec: 0.035894, loss_semantic: 0.416698, loss_idmrf: 1.332159, loss_adv_gen: 10.050552 2023-02-12 23:25:41,758 - INFO - [Train] step: 93599, loss_adv_disc: -2.136067 2023-02-12 23:25:41,967 - INFO - [Train] step: 93599, loss_mpn: 0.014663, loss_rec: 0.027929, loss_semantic: 0.349520, loss_idmrf: 1.206887, loss_adv_gen: 39.666718 2023-02-12 23:26:11,249 - INFO - [Train] step: 93699, loss_adv_disc: -2.398403 2023-02-12 23:26:11,458 - INFO - [Train] step: 93699, loss_mpn: 0.014720, loss_rec: 0.020589, loss_semantic: 0.307399, loss_idmrf: 
0.981645, loss_adv_gen: 23.273087 2023-02-12 23:26:40,725 - INFO - [Train] step: 93799, loss_adv_disc: -0.060402 2023-02-12 23:26:40,934 - INFO - [Train] step: 93799, loss_mpn: 0.007159, loss_rec: 0.016991, loss_semantic: 0.264517, loss_idmrf: 1.329045, loss_adv_gen: 46.528992 2023-02-12 23:27:10,211 - INFO - [Train] step: 93899, loss_adv_disc: -1.703609 2023-02-12 23:27:10,420 - INFO - [Train] step: 93899, loss_mpn: 0.013430, loss_rec: 0.017522, loss_semantic: 0.304942, loss_idmrf: 1.066471, loss_adv_gen: 22.988846 2023-02-12 23:27:39,696 - INFO - [Train] step: 93999, loss_adv_disc: -0.001142 2023-02-12 23:27:39,905 - INFO - [Train] step: 93999, loss_mpn: 0.015176, loss_rec: 0.029856, loss_semantic: 0.418469, loss_idmrf: 1.498970, loss_adv_gen: 9.566925 2023-02-12 23:28:17,083 - INFO - [Eval] step: 93999, bce: 0.198958, psnr: 29.756004, ssim: 0.960832 2023-02-12 23:28:47,627 - INFO - [Train] step: 94099, loss_adv_disc: -2.128485 2023-02-12 23:28:47,836 - INFO - [Train] step: 94099, loss_mpn: 0.010245, loss_rec: 0.020525, loss_semantic: 0.280251, loss_idmrf: 0.884868, loss_adv_gen: 6.847519 2023-02-12 23:29:17,133 - INFO - [Train] step: 94199, loss_adv_disc: -1.752343 2023-02-12 23:29:17,343 - INFO - [Train] step: 94199, loss_mpn: 0.013359, loss_rec: 0.022166, loss_semantic: 0.340163, loss_idmrf: 1.277305, loss_adv_gen: 27.151093 2023-02-12 23:29:46,639 - INFO - [Train] step: 94299, loss_adv_disc: -1.992763 2023-02-12 23:29:46,848 - INFO - [Train] step: 94299, loss_mpn: 0.008553, loss_rec: 0.023161, loss_semantic: 0.326350, loss_idmrf: 0.914571, loss_adv_gen: 67.853439 2023-02-12 23:30:16,118 - INFO - [Train] step: 94399, loss_adv_disc: -3.146342 2023-02-12 23:30:16,328 - INFO - [Train] step: 94399, loss_mpn: 0.007586, loss_rec: 0.020714, loss_semantic: 0.313589, loss_idmrf: 1.265472, loss_adv_gen: 36.949203 2023-02-12 23:30:45,602 - INFO - [Train] step: 94499, loss_adv_disc: -2.821059 2023-02-12 23:30:45,812 - INFO - [Train] step: 94499, loss_mpn: 0.008638, 
loss_rec: 0.016544, loss_semantic: 0.256347, loss_idmrf: 1.024493, loss_adv_gen: 48.086670 2023-02-12 23:31:15,092 - INFO - [Train] step: 94599, loss_adv_disc: -1.117393 2023-02-12 23:31:15,301 - INFO - [Train] step: 94599, loss_mpn: 0.015639, loss_rec: 0.020268, loss_semantic: 0.296542, loss_idmrf: 1.169383, loss_adv_gen: 53.799759 2023-02-12 23:31:44,577 - INFO - [Train] step: 94699, loss_adv_disc: -1.854741 2023-02-12 23:31:44,787 - INFO - [Train] step: 94699, loss_mpn: 0.013080, loss_rec: 0.024560, loss_semantic: 0.361756, loss_idmrf: 1.244203, loss_adv_gen: 6.881836 2023-02-12 23:32:14,063 - INFO - [Train] step: 94799, loss_adv_disc: -0.960469 2023-02-12 23:32:14,273 - INFO - [Train] step: 94799, loss_mpn: 0.011669, loss_rec: 0.022967, loss_semantic: 0.343228, loss_idmrf: 1.325891, loss_adv_gen: 49.212692 2023-02-12 23:32:43,560 - INFO - [Train] step: 94899, loss_adv_disc: -1.292549 2023-02-12 23:32:43,769 - INFO - [Train] step: 94899, loss_mpn: 0.030642, loss_rec: 0.014361, loss_semantic: 0.269855, loss_idmrf: 0.920401, loss_adv_gen: 52.103577 2023-02-12 23:33:13,038 - INFO - [Train] step: 94999, loss_adv_disc: -2.231826 2023-02-12 23:33:13,248 - INFO - [Train] step: 94999, loss_mpn: 0.007714, loss_rec: 0.022869, loss_semantic: 0.357592, loss_idmrf: 1.268089, loss_adv_gen: 11.504364 2023-02-12 23:33:50,411 - INFO - [Eval] step: 94999, bce: 0.179405, psnr: 29.463772, ssim: 0.960485 2023-02-12 23:34:21,035 - INFO - [Train] step: 95099, loss_adv_disc: -2.048538 2023-02-12 23:34:21,245 - INFO - [Train] step: 95099, loss_mpn: 0.014624, loss_rec: 0.029005, loss_semantic: 0.365496, loss_idmrf: 1.110644, loss_adv_gen: 27.007690 2023-02-12 23:34:50,535 - INFO - [Train] step: 95199, loss_adv_disc: -0.602155 2023-02-12 23:34:50,744 - INFO - [Train] step: 95199, loss_mpn: 0.010544, loss_rec: 0.020278, loss_semantic: 0.292497, loss_idmrf: 0.856112, loss_adv_gen: 47.462204 2023-02-12 23:35:20,039 - INFO - [Train] step: 95299, loss_adv_disc: -2.774182 2023-02-12 
23:35:20,248 - INFO - [Train] step: 95299, loss_mpn: 0.017154, loss_rec: 0.025260, loss_semantic: 0.343450, loss_idmrf: 1.040106, loss_adv_gen: 38.529861 2023-02-12 23:35:49,529 - INFO - [Train] step: 95399, loss_adv_disc: -2.171955 2023-02-12 23:35:49,738 - INFO - [Train] step: 95399, loss_mpn: 0.023766, loss_rec: 0.021156, loss_semantic: 0.312862, loss_idmrf: 1.226270, loss_adv_gen: 37.191589 2023-02-12 23:36:19,033 - INFO - [Train] step: 95499, loss_adv_disc: -2.719986 2023-02-12 23:36:19,243 - INFO - [Train] step: 95499, loss_mpn: 0.008340, loss_rec: 0.024732, loss_semantic: 0.331353, loss_idmrf: 0.789878, loss_adv_gen: 18.675430 2023-02-12 23:36:48,526 - INFO - [Train] step: 95599, loss_adv_disc: -1.329358 2023-02-12 23:36:48,737 - INFO - [Train] step: 95599, loss_mpn: 0.015731, loss_rec: 0.029795, loss_semantic: 0.408388, loss_idmrf: 2.050893, loss_adv_gen: 8.380768 2023-02-12 23:37:18,021 - INFO - [Train] step: 95699, loss_adv_disc: -3.208468 2023-02-12 23:37:18,231 - INFO - [Train] step: 95699, loss_mpn: 0.010558, loss_rec: 0.020732, loss_semantic: 0.259257, loss_idmrf: 1.036332, loss_adv_gen: 45.046356 2023-02-12 23:37:47,530 - INFO - [Train] step: 95799, loss_adv_disc: -0.442210 2023-02-12 23:37:47,739 - INFO - [Train] step: 95799, loss_mpn: 0.017449, loss_rec: 0.026095, loss_semantic: 0.385863, loss_idmrf: 1.690787, loss_adv_gen: 53.987885 2023-02-12 23:38:17,035 - INFO - [Train] step: 95899, loss_adv_disc: -0.186369 2023-02-12 23:38:17,245 - INFO - [Train] step: 95899, loss_mpn: 0.009149, loss_rec: 0.016653, loss_semantic: 0.272815, loss_idmrf: 0.807927, loss_adv_gen: 6.424866 2023-02-12 23:38:46,532 - INFO - [Train] step: 95999, loss_adv_disc: -1.563838 2023-02-12 23:38:46,741 - INFO - [Train] step: 95999, loss_mpn: 0.013820, loss_rec: 0.022911, loss_semantic: 0.352683, loss_idmrf: 1.374703, loss_adv_gen: 38.104218 2023-02-12 23:39:23,921 - INFO - [Eval] step: 95999, bce: 0.228350, psnr: 29.458632, ssim: 0.960102 2023-02-12 23:39:54,232 - INFO - 
[Train] step: 96099, loss_adv_disc: -1.585813 2023-02-12 23:39:54,441 - INFO - [Train] step: 96099, loss_mpn: 0.027471, loss_rec: 0.033706, loss_semantic: 0.418054, loss_idmrf: 1.540161, loss_adv_gen: 37.535004 2023-02-12 23:40:23,720 - INFO - [Train] step: 96199, loss_adv_disc: -1.669062 2023-02-12 23:40:23,930 - INFO - [Train] step: 96199, loss_mpn: 0.004917, loss_rec: 0.017337, loss_semantic: 0.277722, loss_idmrf: 0.619569, loss_adv_gen: 35.490295 2023-02-12 23:40:53,197 - INFO - [Train] step: 96299, loss_adv_disc: -2.517097 2023-02-12 23:40:53,406 - INFO - [Train] step: 96299, loss_mpn: 0.017094, loss_rec: 0.025608, loss_semantic: 0.363232, loss_idmrf: 1.506509, loss_adv_gen: 29.749695 2023-02-12 23:41:22,676 - INFO - [Train] step: 96399, loss_adv_disc: -2.641939 2023-02-12 23:41:22,886 - INFO - [Train] step: 96399, loss_mpn: 0.005378, loss_rec: 0.021124, loss_semantic: 0.294077, loss_idmrf: 0.771866, loss_adv_gen: 65.291779 2023-02-12 23:41:52,154 - INFO - [Train] step: 96499, loss_adv_disc: -2.651909 2023-02-12 23:41:52,363 - INFO - [Train] step: 96499, loss_mpn: 0.015055, loss_rec: 0.025713, loss_semantic: 0.299258, loss_idmrf: 1.297437, loss_adv_gen: 44.744171 2023-02-12 23:42:21,640 - INFO - [Train] step: 96599, loss_adv_disc: -1.599800 2023-02-12 23:42:21,850 - INFO - [Train] step: 96599, loss_mpn: 0.010056, loss_rec: 0.021373, loss_semantic: 0.310605, loss_idmrf: 0.768300, loss_adv_gen: 23.980850 2023-02-12 23:42:51,140 - INFO - [Train] step: 96699, loss_adv_disc: -1.547398 2023-02-12 23:42:51,349 - INFO - [Train] step: 96699, loss_mpn: 0.012895, loss_rec: 0.027761, loss_semantic: 0.381098, loss_idmrf: 1.310478, loss_adv_gen: 67.316452 2023-02-12 23:43:20,634 - INFO - [Train] step: 96799, loss_adv_disc: -3.116447 2023-02-12 23:43:20,842 - INFO - [Train] step: 96799, loss_mpn: 0.017303, loss_rec: 0.025813, loss_semantic: 0.318578, loss_idmrf: 0.601287, loss_adv_gen: 55.097473 2023-02-12 23:43:50,132 - INFO - [Train] step: 96899, loss_adv_disc: -1.961031 
2023-02-12 23:43:50,342 - INFO - [Train] step: 96899, loss_mpn: 0.012569, loss_rec: 0.021533, loss_semantic: 0.322357, loss_idmrf: 0.964167, loss_adv_gen: 28.988785 2023-02-12 23:44:19,627 - INFO - [Train] step: 96999, loss_adv_disc: -4.282440 2023-02-12 23:44:19,837 - INFO - [Train] step: 96999, loss_mpn: 0.016039, loss_rec: 0.035718, loss_semantic: 0.403399, loss_idmrf: 0.914110, loss_adv_gen: 61.213837 2023-02-12 23:44:57,028 - INFO - [Eval] step: 96999, bce: 0.215446, psnr: 29.699530, ssim: 0.960522 2023-02-12 23:45:27,344 - INFO - [Train] step: 97099, loss_adv_disc: -0.691589 2023-02-12 23:45:27,553 - INFO - [Train] step: 97099, loss_mpn: 0.007498, loss_rec: 0.016572, loss_semantic: 0.287448, loss_idmrf: 0.622034, loss_adv_gen: 53.815521 2023-02-12 23:45:56,839 - INFO - [Train] step: 97199, loss_adv_disc: -1.668211 2023-02-12 23:45:57,048 - INFO - [Train] step: 97199, loss_mpn: 0.019504, loss_rec: 0.026513, loss_semantic: 0.368916, loss_idmrf: 1.089405, loss_adv_gen: 48.248535 2023-02-12 23:46:26,327 - INFO - [Train] step: 97299, loss_adv_disc: -3.880022 2023-02-12 23:46:26,538 - INFO - [Train] step: 97299, loss_mpn: 0.017803, loss_rec: 0.022062, loss_semantic: 0.318014, loss_idmrf: 1.070219, loss_adv_gen: 12.505768 2023-02-12 23:46:55,830 - INFO - [Train] step: 97399, loss_adv_disc: -1.811782 2023-02-12 23:46:56,040 - INFO - [Train] step: 97399, loss_mpn: 0.022062, loss_rec: 0.019412, loss_semantic: 0.327529, loss_idmrf: 1.836595, loss_adv_gen: 28.556015 2023-02-12 23:47:25,323 - INFO - [Train] step: 97499, loss_adv_disc: -0.547901 2023-02-12 23:47:25,531 - INFO - [Train] step: 97499, loss_mpn: 0.023745, loss_rec: 0.018963, loss_semantic: 0.329990, loss_idmrf: 0.897504, loss_adv_gen: 54.324600 2023-02-12 23:47:54,812 - INFO - [Train] step: 97599, loss_adv_disc: -2.867587 2023-02-12 23:47:55,020 - INFO - [Train] step: 97599, loss_mpn: 0.010858, loss_rec: 0.027676, loss_semantic: 0.385914, loss_idmrf: 1.295514, loss_adv_gen: 29.263199 2023-02-12 23:48:24,305 - 
INFO - [Train] step: 97699, loss_adv_disc: -0.488182 2023-02-12 23:48:24,513 - INFO - [Train] step: 97699, loss_mpn: 0.020038, loss_rec: 0.026957, loss_semantic: 0.339315, loss_idmrf: 1.419404, loss_adv_gen: 63.036064 2023-02-12 23:48:53,800 - INFO - [Train] step: 97799, loss_adv_disc: -1.449921 2023-02-12 23:48:54,010 - INFO - [Train] step: 97799, loss_mpn: 0.023232, loss_rec: 0.025346, loss_semantic: 0.351984, loss_idmrf: 0.879842, loss_adv_gen: 26.288635 2023-02-12 23:49:23,293 - INFO - [Train] step: 97899, loss_adv_disc: -3.485407 2023-02-12 23:49:23,503 - INFO - [Train] step: 97899, loss_mpn: 0.017024, loss_rec: 0.027141, loss_semantic: 0.357046, loss_idmrf: 1.321779, loss_adv_gen: 57.368179 2023-02-12 23:49:52,797 - INFO - [Train] step: 97999, loss_adv_disc: -1.654136 2023-02-12 23:49:53,006 - INFO - [Train] step: 97999, loss_mpn: 0.010536, loss_rec: 0.022336, loss_semantic: 0.299986, loss_idmrf: 1.231412, loss_adv_gen: 69.623734 2023-02-12 23:50:30,184 - INFO - [Eval] step: 97999, bce: 0.237629, psnr: 29.505785, ssim: 0.960687 2023-02-12 23:51:00,512 - INFO - [Train] step: 98099, loss_adv_disc: -1.442266 2023-02-12 23:51:00,721 - INFO - [Train] step: 98099, loss_mpn: 0.014729, loss_rec: 0.018707, loss_semantic: 0.285661, loss_idmrf: 1.413507, loss_adv_gen: 27.607925 2023-02-12 23:51:30,007 - INFO - [Train] step: 98199, loss_adv_disc: -0.699554 2023-02-12 23:51:30,216 - INFO - [Train] step: 98199, loss_mpn: 0.008678, loss_rec: 0.019201, loss_semantic: 0.313276, loss_idmrf: 1.220963, loss_adv_gen: 12.763428 2023-02-12 23:51:59,498 - INFO - [Train] step: 98299, loss_adv_disc: -2.186783 2023-02-12 23:51:59,706 - INFO - [Train] step: 98299, loss_mpn: 0.015549, loss_rec: 0.019932, loss_semantic: 0.279974, loss_idmrf: 1.000920, loss_adv_gen: 25.407150 2023-02-12 23:52:28,984 - INFO - [Train] step: 98399, loss_adv_disc: -1.185582 2023-02-12 23:52:29,194 - INFO - [Train] step: 98399, loss_mpn: 0.006607, loss_rec: 0.020756, loss_semantic: 0.326948, loss_idmrf: 
0.992291, loss_adv_gen: 49.981812 2023-02-12 23:52:58,476 - INFO - [Train] step: 98499, loss_adv_disc: -4.890911 2023-02-12 23:52:58,685 - INFO - [Train] step: 98499, loss_mpn: 0.041936, loss_rec: 0.041513, loss_semantic: 0.418352, loss_idmrf: 2.071633, loss_adv_gen: 50.534378 2023-02-12 23:53:27,962 - INFO - [Train] step: 98599, loss_adv_disc: -0.982718 2023-02-12 23:53:28,171 - INFO - [Train] step: 98599, loss_mpn: 0.011519, loss_rec: 0.021339, loss_semantic: 0.340193, loss_idmrf: 0.802767, loss_adv_gen: 49.656952 2023-02-12 23:53:57,442 - INFO - [Train] step: 98699, loss_adv_disc: -2.470068 2023-02-12 23:53:57,653 - INFO - [Train] step: 98699, loss_mpn: 0.009266, loss_rec: 0.022244, loss_semantic: 0.307537, loss_idmrf: 0.550980, loss_adv_gen: 56.911133 2023-02-12 23:54:26,939 - INFO - [Train] step: 98799, loss_adv_disc: -2.472844 2023-02-12 23:54:27,148 - INFO - [Train] step: 98799, loss_mpn: 0.010700, loss_rec: 0.022809, loss_semantic: 0.320249, loss_idmrf: 0.558384, loss_adv_gen: 30.458893 2023-02-12 23:54:56,434 - INFO - [Train] step: 98899, loss_adv_disc: -0.323264 2023-02-12 23:54:56,643 - INFO - [Train] step: 98899, loss_mpn: 0.022181, loss_rec: 0.030286, loss_semantic: 0.400485, loss_idmrf: 1.537602, loss_adv_gen: 25.057892 2023-02-12 23:55:25,903 - INFO - [Train] step: 98999, loss_adv_disc: -2.423753 2023-02-12 23:55:26,112 - INFO - [Train] step: 98999, loss_mpn: 0.016031, loss_rec: 0.024652, loss_semantic: 0.332759, loss_idmrf: 1.044863, loss_adv_gen: 74.834183 2023-02-12 23:56:03,278 - INFO - [Eval] step: 98999, bce: 0.197481, psnr: 29.744192, ssim: 0.960677 2023-02-12 23:56:33,591 - INFO - [Train] step: 99099, loss_adv_disc: -2.563568 2023-02-12 23:56:33,800 - INFO - [Train] step: 99099, loss_mpn: 0.012097, loss_rec: 0.024356, loss_semantic: 0.324642, loss_idmrf: 0.618830, loss_adv_gen: 43.534302 2023-02-12 23:57:03,094 - INFO - [Train] step: 99199, loss_adv_disc: -1.088324 2023-02-12 23:57:03,304 - INFO - [Train] step: 99199, loss_mpn: 0.008971, 
loss_rec: 0.016285, loss_semantic: 0.268190, loss_idmrf: 0.798352, loss_adv_gen: 41.851166 2023-02-12 23:57:32,578 - INFO - [Train] step: 99299, loss_adv_disc: -2.281595 2023-02-12 23:57:32,786 - INFO - [Train] step: 99299, loss_mpn: 0.023757, loss_rec: 0.039521, loss_semantic: 0.458058, loss_idmrf: 2.212524, loss_adv_gen: 47.994659 2023-02-12 23:58:02,067 - INFO - [Train] step: 99399, loss_adv_disc: -1.648328 2023-02-12 23:58:02,276 - INFO - [Train] step: 99399, loss_mpn: 0.011929, loss_rec: 0.025152, loss_semantic: 0.366852, loss_idmrf: 1.709585, loss_adv_gen: 52.345490 2023-02-12 23:58:31,564 - INFO - [Train] step: 99499, loss_adv_disc: -1.694670 2023-02-12 23:58:31,773 - INFO - [Train] step: 99499, loss_mpn: 0.013792, loss_rec: 0.017483, loss_semantic: 0.298034, loss_idmrf: 0.985321, loss_adv_gen: 42.897842 2023-02-12 23:59:01,058 - INFO - [Train] step: 99599, loss_adv_disc: -2.640099 2023-02-12 23:59:01,267 - INFO - [Train] step: 99599, loss_mpn: 0.005901, loss_rec: 0.016573, loss_semantic: 0.254065, loss_idmrf: 1.657698, loss_adv_gen: 47.017471 2023-02-12 23:59:30,547 - INFO - [Train] step: 99699, loss_adv_disc: -1.975363 2023-02-12 23:59:30,756 - INFO - [Train] step: 99699, loss_mpn: 0.010214, loss_rec: 0.017157, loss_semantic: 0.284289, loss_idmrf: 0.652470, loss_adv_gen: 76.310410 2023-02-13 00:00:00,036 - INFO - [Train] step: 99799, loss_adv_disc: -2.733145 2023-02-13 00:00:00,245 - INFO - [Train] step: 99799, loss_mpn: 0.029892, loss_rec: 0.022588, loss_semantic: 0.315824, loss_idmrf: 0.843975, loss_adv_gen: 57.948166 2023-02-13 00:00:29,527 - INFO - [Train] step: 99899, loss_adv_disc: -1.716365 2023-02-13 00:00:29,737 - INFO - [Train] step: 99899, loss_mpn: 0.023169, loss_rec: 0.030597, loss_semantic: 0.413842, loss_idmrf: 1.465317, loss_adv_gen: 49.350250 2023-02-13 00:00:59,018 - INFO - [Train] step: 99999, loss_adv_disc: -1.994717 2023-02-13 00:00:59,227 - INFO - [Train] step: 99999, loss_mpn: 0.011920, loss_rec: 0.018178, loss_semantic: 0.280914, 
loss_idmrf: 0.890173, loss_adv_gen: 48.812149 2023-02-13 00:01:36,378 - INFO - [Eval] step: 99999, bce: 0.235990, psnr: 29.709747, ssim: 0.960834 2023-02-13 00:01:37,545 - INFO - End of training
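The `[Eval]` records above follow a fixed `key: value` layout (`bce`, `psnr`, `ssim` every 1000 steps), so the evaluation history can be recovered mechanically from the raw log. A minimal sketch, assuming only the record format visible in this log; `parse_eval_records` is a hypothetical helper, not part of the original training code:

```python
import re

# Matches the [Eval] record format seen in this log (an assumption based on
# the lines above, not a documented interface of the training script).
EVAL_RE = re.compile(
    r"\[Eval\] step: (\d+), bce: ([\d.]+), psnr: ([\d.]+), ssim: ([\d.]+)"
)

def parse_eval_records(log_text: str):
    """Return one dict per [Eval] record found in log_text."""
    return [
        {"step": int(s), "bce": float(b), "psnr": float(p), "ssim": float(m)}
        for s, b, p, m in EVAL_RE.findall(log_text)
    ]

if __name__ == "__main__":
    sample = (
        "2023-02-12 22:54:57,985 - INFO - [Eval] step: 87999, "
        "bce: 0.208852, psnr: 29.609364, ssim: 0.960170\n"
        "2023-02-13 00:01:36,378 - INFO - [Eval] step: 99999, "
        "bce: 0.235990, psnr: 29.709747, ssim: 0.960834\n"
    )
    records = parse_eval_records(sample)
    best = max(records, key=lambda r: r["psnr"])
    print(best["step"], best["psnr"])  # 99999 29.709747
```

Because the regex keys off the `[Eval]` tag rather than line boundaries, it works equally well on the flattened log text or on a properly line-broken copy.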