2023-02-12 14:28:40,985 - INFO - Experiment directory: runs/downstream-watermark-removal
2023-02-12 14:28:40,985 - INFO - Device: cuda
2023-02-12 14:28:40,985 - INFO - Number of devices: 1
2023-02-12 14:28:41,349 - INFO - Size of training set: 28352
2023-02-12 14:28:41,349 - INFO - Size of validation set: 4051
2023-02-12 14:28:41,349 - INFO - Batch size per device: 4
2023-02-12 14:28:41,349 - INFO - Effective batch size: 4
2023-02-12 14:28:42,908 - INFO - Successfully load mpn from ./runs/places-joint/ckpt/step149999.pt
2023-02-12 14:28:42,913 - INFO - Successfully load rin from ./runs/places-joint/ckpt/step149999.pt
2023-02-12 14:28:42,914 - INFO - Successfully load disc from ./runs/places-joint/ckpt/step149999.pt
2023-02-12 14:28:42,916 - INFO - Successfully load pdisc from ./runs/places-joint/ckpt/step149999.pt
2023-02-12 14:28:42,918 - INFO - Start training...
2023-02-12 14:29:14,414 - INFO - [Train] step: 99, loss_adv_disc: 1.299828
2023-02-12 14:29:14,621 - INFO - [Train] step: 99, loss_mpn: 0.037874, loss_rec: 0.041837, loss_semantic: 0.520840, loss_idmrf: 1.336823, loss_adv_gen: -128.098953
2023-02-12 14:29:43,591 - INFO - [Train] step: 199, loss_adv_disc: -1.303024
2023-02-12 14:29:43,800 - INFO - [Train] step: 199, loss_mpn: 0.036802, loss_rec: 0.040308, loss_semantic: 0.500949, loss_idmrf: 1.170398, loss_adv_gen: -188.719955
2023-02-12 14:30:12,854 - INFO - [Train] step: 299, loss_adv_disc: 0.880611
2023-02-12 14:30:13,062 - INFO - [Train] step: 299, loss_mpn: 0.022952, loss_rec: 0.032488, loss_semantic: 0.472941, loss_idmrf: 0.716021, loss_adv_gen: -164.942017
2023-02-12 14:30:42,137 - INFO - [Train] step: 399, loss_adv_disc: 1.241273
2023-02-12 14:30:42,346 - INFO - [Train] step: 399, loss_mpn: 0.020866, loss_rec: 0.041395, loss_semantic: 0.461653, loss_idmrf: 1.742797, loss_adv_gen: -94.545654
2023-02-12 14:31:11,414 - INFO - [Train] step: 499, loss_adv_disc: -4.018159
2023-02-12 14:31:11,624 - INFO - [Train] step: 499, loss_mpn: 0.042017, loss_rec: 0.038558, loss_semantic: 0.466815, loss_idmrf: 1.169562, loss_adv_gen: -143.637222
2023-02-12 14:31:40,708 - INFO - [Train] step: 599, loss_adv_disc: 0.942124
2023-02-12 14:31:40,916 - INFO - [Train] step: 599, loss_mpn: 0.012776, loss_rec: 0.029491, loss_semantic: 0.450419, loss_idmrf: 0.787979, loss_adv_gen: -97.693977
2023-02-12 14:32:10,001 - INFO - [Train] step: 699, loss_adv_disc: 3.814938
2023-02-12 14:32:10,211 - INFO - [Train] step: 699, loss_mpn: 0.028852, loss_rec: 0.043598, loss_semantic: 0.497849, loss_idmrf: 1.739692, loss_adv_gen: -155.648834
2023-02-12 14:32:39,303 - INFO - [Train] step: 799, loss_adv_disc: -1.610433
2023-02-12 14:32:39,512 - INFO - [Train] step: 799, loss_mpn: 0.029309, loss_rec: 0.045206, loss_semantic: 0.510565, loss_idmrf: 3.046881, loss_adv_gen: -96.266266
2023-02-12 14:33:08,610 - INFO - [Train] step: 899, loss_adv_disc: 1.118233
2023-02-12 14:33:08,819 - INFO - [Train] step: 899, loss_mpn: 0.012930, loss_rec: 0.022580, loss_semantic: 0.349494, loss_idmrf: 0.902996, loss_adv_gen: -88.348045
2023-02-12 14:33:37,905 - INFO - [Train] step: 999, loss_adv_disc: 0.263744
2023-02-12 14:33:38,114 - INFO - [Train] step: 999, loss_mpn: 0.023001, loss_rec: 0.043346, loss_semantic: 0.504102, loss_idmrf: 2.317935, loss_adv_gen: -102.413116
2023-02-12 14:34:15,528 - INFO - [Eval] step: 999, bce: 0.338886, psnr: 26.837254, ssim: 0.944074
2023-02-12 14:34:46,223 - INFO - [Train] step: 1099, loss_adv_disc: -2.614538
2023-02-12 14:34:46,432 - INFO - [Train] step: 1099, loss_mpn: 0.049012, loss_rec: 0.041178, loss_semantic: 0.505731, loss_idmrf: 1.993219, loss_adv_gen: -61.293377
2023-02-12 14:35:15,523 - INFO - [Train] step: 1199, loss_adv_disc: -3.433872
2023-02-12 14:35:15,731 - INFO - [Train] step: 1199, loss_mpn: 0.033315, loss_rec: 0.036747, loss_semantic: 0.420346, loss_idmrf: 1.440817, loss_adv_gen: -119.410286
2023-02-12 14:35:44,803 - INFO - [Train] step: 1299, loss_adv_disc: -1.207434
2023-02-12 14:35:45,012 - INFO - [Train] step: 1299, loss_mpn: 0.015608, loss_rec: 0.038091, loss_semantic: 0.484080, loss_idmrf: 0.916523, loss_adv_gen: -110.312073
2023-02-12 14:36:14,077 - INFO - [Train] step: 1399, loss_adv_disc: -5.074930
2023-02-12 14:36:14,286 - INFO - [Train] step: 1399, loss_mpn: 0.040863, loss_rec: 0.052102, loss_semantic: 0.532582, loss_idmrf: 1.521286, loss_adv_gen: -111.052315
2023-02-12 14:36:43,350 - INFO - [Train] step: 1499, loss_adv_disc: -1.698674
2023-02-12 14:36:43,558 - INFO - [Train] step: 1499, loss_mpn: 0.019298, loss_rec: 0.017522, loss_semantic: 0.335085, loss_idmrf: 1.877787, loss_adv_gen: -124.831696
2023-02-12 14:37:12,635 - INFO - [Train] step: 1599, loss_adv_disc: 0.078364
2023-02-12 14:37:12,844 - INFO - [Train] step: 1599, loss_mpn: 0.032950, loss_rec: 0.033342, loss_semantic: 0.452769, loss_idmrf: 1.889498, loss_adv_gen: -102.988449
2023-02-12 14:37:41,908 - INFO - [Train] step: 1699, loss_adv_disc: 3.190553
2023-02-12 14:37:42,117 - INFO - [Train] step: 1699, loss_mpn: 0.024712, loss_rec: 0.037171, loss_semantic: 0.477129, loss_idmrf: 1.474838, loss_adv_gen: -107.239792
2023-02-12 14:38:11,183 - INFO - [Train] step: 1799, loss_adv_disc: -0.911062
2023-02-12 14:38:11,391 - INFO - [Train] step: 1799, loss_mpn: 0.018543, loss_rec: 0.032870, loss_semantic: 0.416096, loss_idmrf: 2.942292, loss_adv_gen: -97.946312
2023-02-12 14:38:40,450 - INFO - [Train] step: 1899, loss_adv_disc: -2.992178
2023-02-12 14:38:40,658 - INFO - [Train] step: 1899, loss_mpn: 0.044249, loss_rec: 0.053050, loss_semantic: 0.493338, loss_idmrf: 1.900263, loss_adv_gen: -102.013123
2023-02-12 14:39:09,716 - INFO - [Train] step: 1999, loss_adv_disc: 1.107126
2023-02-12 14:39:09,925 - INFO - [Train] step: 1999, loss_mpn: 0.049591, loss_rec: 0.039920, loss_semantic: 0.413558, loss_idmrf: 1.118119, loss_adv_gen: -138.656616
2023-02-12 14:39:47,107 - INFO - [Eval] step: 1999, bce: 0.373344, psnr: 27.043365, ssim: 0.946821
2023-02-12 14:40:17,430 - INFO - [Train] step: 2099, loss_adv_disc: -2.116822
2023-02-12 14:40:17,638 - INFO - [Train] step: 2099, loss_mpn: 0.022930, loss_rec: 0.023332, loss_semantic: 0.383731, loss_idmrf: 1.122571, loss_adv_gen: -93.001617
2023-02-12 14:40:46,705 - INFO - [Train] step: 2199, loss_adv_disc: 1.641629
2023-02-12 14:40:46,913 - INFO - [Train] step: 2199, loss_mpn: 0.015344, loss_rec: 0.028494, loss_semantic: 0.344031, loss_idmrf: 1.670086, loss_adv_gen: -85.842712
2023-02-12 14:41:15,969 - INFO - [Train] step: 2299, loss_adv_disc: -1.291130
2023-02-12 14:41:16,177 - INFO - [Train] step: 2299, loss_mpn: 0.013970, loss_rec: 0.028013, loss_semantic: 0.418684, loss_idmrf: 1.546552, loss_adv_gen: -87.390312
2023-02-12 14:41:45,234 - INFO - [Train] step: 2399, loss_adv_disc: 2.109378
2023-02-12 14:41:45,443 - INFO - [Train] step: 2399, loss_mpn: 0.038823, loss_rec: 0.049957, loss_semantic: 0.547463, loss_idmrf: 1.906920, loss_adv_gen: -140.577087
2023-02-12 14:42:14,492 - INFO - [Train] step: 2499, loss_adv_disc: -2.110101
2023-02-12 14:42:14,700 - INFO - [Train] step: 2499, loss_mpn: 0.028055, loss_rec: 0.036691, loss_semantic: 0.471377, loss_idmrf: 1.584231, loss_adv_gen: -66.208374
2023-02-12 14:42:43,743 - INFO - [Train] step: 2599, loss_adv_disc: 1.007918
2023-02-12 14:42:43,951 - INFO - [Train] step: 2599, loss_mpn: 0.011720, loss_rec: 0.039426, loss_semantic: 0.476948, loss_idmrf: 1.416666, loss_adv_gen: -108.188354
2023-02-12 14:43:12,995 - INFO - [Train] step: 2699, loss_adv_disc: 0.309885
2023-02-12 14:43:13,203 - INFO - [Train] step: 2699, loss_mpn: 0.043115, loss_rec: 0.048996, loss_semantic: 0.506540, loss_idmrf: 1.823579, loss_adv_gen: -79.640060
2023-02-12 14:43:42,259 - INFO - [Train] step: 2799, loss_adv_disc: 1.769973
2023-02-12 14:43:42,470 - INFO - [Train] step: 2799, loss_mpn: 0.029426, loss_rec: 0.040913, loss_semantic: 0.455353, loss_idmrf: 1.369126, loss_adv_gen: -82.098961
2023-02-12 14:44:11,518 - INFO - [Train] step: 2899, loss_adv_disc: -2.826468
2023-02-12 14:44:11,727 - INFO - [Train] step: 2899, loss_mpn: 0.024427, loss_rec: 0.036017, loss_semantic: 0.450512, loss_idmrf: 1.423253, loss_adv_gen: -119.713104
2023-02-12 14:44:40,782 - INFO - [Train] step: 2999, loss_adv_disc: -0.511095
2023-02-12 14:44:40,992 - INFO - [Train] step: 2999, loss_mpn: 0.019680, loss_rec: 0.028514, loss_semantic: 0.381640, loss_idmrf: 1.594770, loss_adv_gen: -121.859711
2023-02-12 14:45:18,178 - INFO - [Eval] step: 2999, bce: 0.325142, psnr: 27.303087, ssim: 0.947815
2023-02-12 14:45:48,502 - INFO - [Train] step: 3099, loss_adv_disc: 0.043045
2023-02-12 14:45:48,711 - INFO - [Train] step: 3099, loss_mpn: 0.016258, loss_rec: 0.021658, loss_semantic: 0.378096, loss_idmrf: 1.504422, loss_adv_gen: -198.545273
2023-02-12 14:46:17,756 - INFO - [Train] step: 3199, loss_adv_disc: -1.038017
2023-02-12 14:46:17,965 - INFO - [Train] step: 3199, loss_mpn: 0.012933, loss_rec: 0.026565, loss_semantic: 0.426411, loss_idmrf: 1.404650, loss_adv_gen: -96.168388
2023-02-12 14:46:47,018 - INFO - [Train] step: 3299, loss_adv_disc: 0.046620
2023-02-12 14:46:47,226 - INFO - [Train] step: 3299, loss_mpn: 0.011906, loss_rec: 0.014265, loss_semantic: 0.269762, loss_idmrf: 0.559204, loss_adv_gen: -99.662216
2023-02-12 14:47:16,291 - INFO - [Train] step: 3399, loss_adv_disc: -0.021643
2023-02-12 14:47:16,499 - INFO - [Train] step: 3399, loss_mpn: 0.017540, loss_rec: 0.029993, loss_semantic: 0.392757, loss_idmrf: 1.623783, loss_adv_gen: -72.550812
2023-02-12 14:47:45,556 - INFO - [Train] step: 3499, loss_adv_disc: 0.026034
2023-02-12 14:47:45,764 - INFO - [Train] step: 3499, loss_mpn: 0.024654, loss_rec: 0.025382, loss_semantic: 0.367133, loss_idmrf: 1.421693, loss_adv_gen: -106.288788
2023-02-12 14:48:14,815 - INFO - [Train] step: 3599, loss_adv_disc: -0.349758
2023-02-12 14:48:15,023 - INFO - [Train] step: 3599, loss_mpn: 0.032561, loss_rec: 0.033780, loss_semantic: 0.448922, loss_idmrf: 1.408511, loss_adv_gen: -133.010773
2023-02-12 14:48:44,080 - INFO - [Train] step: 3699, loss_adv_disc: -3.845224
2023-02-12 14:48:44,288 - INFO - [Train] step: 3699, loss_mpn: 0.046263, loss_rec: 0.035030, loss_semantic: 0.449073, loss_idmrf: 1.693964, loss_adv_gen: -101.824783
2023-02-12 14:49:13,351 - INFO - [Train] step: 3799, loss_adv_disc: 0.691240
2023-02-12 14:49:13,559 - INFO - [Train] step: 3799, loss_mpn: 0.033147, loss_rec: 0.032454, loss_semantic: 0.412963, loss_idmrf: 1.078437, loss_adv_gen: -133.369232
2023-02-12 14:49:42,612 - INFO - [Train] step: 3899, loss_adv_disc: -1.092527
2023-02-12 14:49:42,820 - INFO - [Train] step: 3899, loss_mpn: 0.030574, loss_rec: 0.035086, loss_semantic: 0.450594, loss_idmrf: 1.633320, loss_adv_gen: -157.005127
2023-02-12 14:50:11,886 - INFO - [Train] step: 3999, loss_adv_disc: 0.675812
2023-02-12 14:50:12,095 - INFO - [Train] step: 3999, loss_mpn: 0.021050, loss_rec: 0.033965, loss_semantic: 0.449186, loss_idmrf: 1.248645, loss_adv_gen: -97.598892
2023-02-12 14:50:49,252 - INFO - [Eval] step: 3999, bce: 0.346579, psnr: 27.542393, ssim: 0.949793
2023-02-12 14:51:19,568 - INFO - [Train] step: 4099, loss_adv_disc: 0.768379
2023-02-12 14:51:19,777 - INFO - [Train] step: 4099, loss_mpn: 0.013409, loss_rec: 0.029262, loss_semantic: 0.393055, loss_idmrf: 1.478358, loss_adv_gen: -109.265968
2023-02-12 14:51:48,839 - INFO - [Train] step: 4199, loss_adv_disc: -0.045594
2023-02-12 14:51:49,047 - INFO - [Train] step: 4199, loss_mpn: 0.024171, loss_rec: 0.017396, loss_semantic: 0.286983, loss_idmrf: 0.789566, loss_adv_gen: -148.535095
2023-02-12 14:52:18,101 - INFO - [Train] step: 4299, loss_adv_disc: -0.430947
2023-02-12 14:52:18,310 - INFO - [Train] step: 4299, loss_mpn: 0.014691, loss_rec: 0.021662, loss_semantic: 0.326292, loss_idmrf: 0.946436, loss_adv_gen: -91.638748
2023-02-12 14:52:47,360 - INFO - [Train] step: 4399, loss_adv_disc: -0.410857
2023-02-12 14:52:47,568 - INFO - [Train] step: 4399, loss_mpn: 0.011707, loss_rec: 0.021423, loss_semantic: 0.334674, loss_idmrf: 0.605388, loss_adv_gen: -137.055222
2023-02-12 14:53:16,621 - INFO - [Train] step: 4499, loss_adv_disc: -0.533930
2023-02-12 14:53:16,829 - INFO - [Train] step: 4499, loss_mpn: 0.012462, loss_rec: 0.036778, loss_semantic: 0.428064, loss_idmrf: 1.709253, loss_adv_gen: -201.091476
2023-02-12 14:53:45,873 - INFO - [Train] step: 4599, loss_adv_disc: -1.762612
2023-02-12 14:53:46,082 - INFO - [Train] step: 4599, loss_mpn: 0.026616, loss_rec: 0.052115, loss_semantic: 0.480837, loss_idmrf: 1.606442, loss_adv_gen: -97.646194
2023-02-12 14:54:15,132 - INFO - [Train] step: 4699, loss_adv_disc: -0.790706
2023-02-12 14:54:15,340 - INFO - [Train] step: 4699, loss_mpn: 0.018837, loss_rec: 0.028844, loss_semantic: 0.392761, loss_idmrf: 1.836121, loss_adv_gen: -98.939705
2023-02-12 14:54:44,391 - INFO - [Train] step: 4799, loss_adv_disc: -0.965508
2023-02-12 14:54:44,601 - INFO - [Train] step: 4799, loss_mpn: 0.032487, loss_rec: 0.035681, loss_semantic: 0.437911, loss_idmrf: 1.694472, loss_adv_gen: -81.396034
2023-02-12 14:55:13,651 - INFO - [Train] step: 4899, loss_adv_disc: -1.041402
2023-02-12 14:55:13,860 - INFO - [Train] step: 4899, loss_mpn: 0.028312, loss_rec: 0.028561, loss_semantic: 0.396117, loss_idmrf: 1.250476, loss_adv_gen: -97.100685
2023-02-12 14:55:42,913 - INFO - [Train] step: 4999, loss_adv_disc: 0.711361
2023-02-12 14:55:43,122 - INFO - [Train] step: 4999, loss_mpn: 0.021475, loss_rec: 0.026028, loss_semantic: 0.379468, loss_idmrf: 1.322870, loss_adv_gen: -115.331512
2023-02-12 14:56:20,330 - INFO - [Eval] step: 4999, bce: 0.351477, psnr: 27.780710, ssim: 0.950434
2023-02-12 14:56:50,783 - INFO - [Train] step: 5099, loss_adv_disc: -2.985783
2023-02-12 14:56:50,991 - INFO - [Train] step: 5099, loss_mpn: 0.019004, loss_rec: 0.041138, loss_semantic: 0.508738, loss_idmrf: 0.819962, loss_adv_gen: -182.782623
2023-02-12 14:57:20,046 - INFO - [Train] step: 5199, loss_adv_disc: -1.645151
2023-02-12 14:57:20,256 - INFO - [Train] step: 5199, loss_mpn: 0.016225, loss_rec: 0.025021, loss_semantic: 0.370635, loss_idmrf: 0.914956, loss_adv_gen: -103.670837
2023-02-12 14:57:49,304 - INFO - [Train] step: 5299, loss_adv_disc: -0.446883
2023-02-12 14:57:49,512 - INFO - [Train] step: 5299, loss_mpn: 0.025473, loss_rec: 0.035687, loss_semantic: 0.464093, loss_idmrf: 1.667971, loss_adv_gen: -153.179428
2023-02-12 14:58:18,560 - INFO - [Train] step: 5399, loss_adv_disc: 0.041118
2023-02-12 14:58:18,767 - INFO - [Train] step: 5399, loss_mpn: 0.022868, loss_rec: 0.027386, loss_semantic: 0.379084, loss_idmrf: 1.039361, loss_adv_gen: -118.277161
2023-02-12 14:58:47,819 - INFO - [Train] step: 5499, loss_adv_disc: -0.695383
2023-02-12 14:58:48,028 - INFO - [Train] step: 5499, loss_mpn: 0.017166, loss_rec: 0.023030, loss_semantic: 0.343284, loss_idmrf: 1.867031, loss_adv_gen: -132.320648
2023-02-12 14:59:17,075 - INFO - [Train] step: 5599, loss_adv_disc: -0.396028
2023-02-12 14:59:17,283 - INFO - [Train] step: 5599, loss_mpn: 0.016136, loss_rec: 0.024782, loss_semantic: 0.387950, loss_idmrf: 1.010486, loss_adv_gen: -124.445305
2023-02-12 14:59:46,331 - INFO - [Train] step: 5699, loss_adv_disc: -2.364127
2023-02-12 14:59:46,539 - INFO - [Train] step: 5699, loss_mpn: 0.022420, loss_rec: 0.027926, loss_semantic: 0.412665, loss_idmrf: 1.334894, loss_adv_gen: -62.740250
2023-02-12 15:00:15,585 - INFO - [Train] step: 5799, loss_adv_disc: -0.426686
2023-02-12 15:00:15,793 - INFO - [Train] step: 5799, loss_mpn: 0.020284, loss_rec: 0.029912, loss_semantic: 0.416979, loss_idmrf: 2.073402, loss_adv_gen: -149.003220
2023-02-12 15:00:44,848 - INFO - [Train] step: 5899, loss_adv_disc: -0.774378
2023-02-12 15:00:45,057 - INFO - [Train] step: 5899, loss_mpn: 0.013465, loss_rec: 0.031138, loss_semantic: 0.415459, loss_idmrf: 0.909214, loss_adv_gen: -113.032188
2023-02-12 15:01:14,101 - INFO - [Train] step: 5999, loss_adv_disc: -0.389311
2023-02-12 15:01:14,312 - INFO - [Train] step: 5999, loss_mpn: 0.039259, loss_rec: 0.035546, loss_semantic: 0.422057, loss_idmrf: 1.271151, loss_adv_gen: -117.240402
2023-02-12 15:01:51,501 - INFO - [Eval] step: 5999, bce: 0.340868, psnr: 27.871330, ssim: 0.950925
2023-02-12 15:02:21,828 - INFO - [Train] step: 6099, loss_adv_disc: -0.167378
2023-02-12 15:02:22,037 - INFO - [Train] step: 6099, loss_mpn: 0.016576, loss_rec: 0.036952, loss_semantic: 0.467681, loss_idmrf: 1.342181, loss_adv_gen: -104.226501
2023-02-12 15:02:51,091 - INFO - [Train] step: 6199, loss_adv_disc: -0.683909
2023-02-12 15:02:51,300 - INFO - [Train] step: 6199, loss_mpn: 0.018274, loss_rec: 0.032154, loss_semantic: 0.425463, loss_idmrf: 0.816652, loss_adv_gen: -86.657425
2023-02-12 15:03:20,351 - INFO - [Train] step: 6299, loss_adv_disc: -1.493726
2023-02-12 15:03:20,560 - INFO - [Train] step: 6299, loss_mpn: 0.023145, loss_rec: 0.033411, loss_semantic: 0.432057, loss_idmrf: 0.897825, loss_adv_gen: -139.223816
2023-02-12 15:03:49,616 - INFO - [Train] step: 6399, loss_adv_disc: -0.277531
2023-02-12 15:03:49,824 - INFO - [Train] step: 6399, loss_mpn: 0.017375, loss_rec: 0.029168, loss_semantic: 0.402245, loss_idmrf: 0.622658, loss_adv_gen: -142.869843
2023-02-12 15:04:18,892 - INFO - [Train] step: 6499, loss_adv_disc: -0.235311
2023-02-12 15:04:19,101 - INFO - [Train] step: 6499, loss_mpn: 0.012269, loss_rec: 0.022133, loss_semantic: 0.319729, loss_idmrf: 1.260930, loss_adv_gen: -171.630920
2023-02-12 15:04:48,160 - INFO - [Train] step: 6599, loss_adv_disc: -0.469994
2023-02-12 15:04:48,370 - INFO - [Train] step: 6599, loss_mpn: 0.008784, loss_rec: 0.019133, loss_semantic: 0.306887, loss_idmrf: 0.801923, loss_adv_gen: -64.075722
2023-02-12 15:05:17,426 - INFO - [Train] step: 6699, loss_adv_disc: -1.466449
2023-02-12 15:05:17,634 - INFO - [Train] step: 6699, loss_mpn: 0.021287, loss_rec: 0.035671, loss_semantic: 0.454560, loss_idmrf: 1.030252, loss_adv_gen: -120.588272
2023-02-12 15:05:46,695 - INFO - [Train] step: 6799, loss_adv_disc: 0.515251
2023-02-12 15:05:46,904 - INFO - [Train] step: 6799, loss_mpn: 0.013280, loss_rec: 0.027942, loss_semantic: 0.360047, loss_idmrf: 1.654778, loss_adv_gen: -180.273392
2023-02-12 15:06:15,963 - INFO - [Train] step: 6899, loss_adv_disc: -0.489713
2023-02-12 15:06:16,171 - INFO - [Train] step: 6899, loss_mpn: 0.018902, loss_rec: 0.038339, loss_semantic: 0.442717, loss_idmrf: 2.431321, loss_adv_gen: -115.624718
2023-02-12 15:06:45,239 - INFO - [Train] step: 6999, loss_adv_disc: -1.769455
2023-02-12 15:06:45,447 - INFO - [Train] step: 6999, loss_mpn: 0.019381, loss_rec: 0.028560, loss_semantic: 0.383751, loss_idmrf: 1.163507, loss_adv_gen: -153.030273
2023-02-12 15:07:22,635 - INFO - [Eval] step: 6999, bce: 0.306903, psnr: 28.013441, ssim: 0.951795
2023-02-12 15:07:53,174 - INFO - [Train] step: 7099, loss_adv_disc: 1.659884
2023-02-12 15:07:53,383 - INFO - [Train] step: 7099, loss_mpn: 0.028287, loss_rec: 0.037931, loss_semantic: 0.468431, loss_idmrf: 1.497301, loss_adv_gen: -132.254868
2023-02-12 15:08:22,426 - INFO - [Train] step: 7199, loss_adv_disc: 0.472226
2023-02-12 15:08:22,636 - INFO - [Train] step: 7199, loss_mpn: 0.013172, loss_rec: 0.024137, loss_semantic: 0.364234, loss_idmrf: 1.162045, loss_adv_gen: -132.014572
2023-02-12 15:08:51,699 - INFO - [Train] step: 7299, loss_adv_disc: -0.581740
2023-02-12 15:08:51,908 - INFO - [Train] step: 7299, loss_mpn: 0.012158, loss_rec: 0.031973, loss_semantic: 0.362457, loss_idmrf: 1.268194, loss_adv_gen: -137.127380
2023-02-12 15:09:20,969 - INFO - [Train] step: 7399, loss_adv_disc: -0.978511
2023-02-12 15:09:21,178 - INFO - [Train] step: 7399, loss_mpn: 0.022330, loss_rec: 0.024040, loss_semantic: 0.309038, loss_idmrf: 0.792379, loss_adv_gen: -106.908493
2023-02-12 15:09:50,228 - INFO - [Train] step: 7499, loss_adv_disc: -0.073679
2023-02-12 15:09:50,436 - INFO - [Train] step: 7499, loss_mpn: 0.018063, loss_rec: 0.022740, loss_semantic: 0.319768, loss_idmrf: 1.349909, loss_adv_gen: -165.113327
2023-02-12 15:10:19,481 - INFO - [Train] step: 7599, loss_adv_disc: 0.049674
2023-02-12 15:10:19,689 - INFO - [Train] step: 7599, loss_mpn: 0.010805, loss_rec: 0.025964, loss_semantic: 0.350963, loss_idmrf: 0.961297, loss_adv_gen: -113.559540
2023-02-12 15:10:48,747 - INFO - [Train] step: 7699, loss_adv_disc: -0.608275
2023-02-12 15:10:48,955 - INFO - [Train] step: 7699, loss_mpn: 0.016553, loss_rec: 0.027370, loss_semantic: 0.380454, loss_idmrf: 1.702928, loss_adv_gen: -108.241295
2023-02-12 15:11:18,013 - INFO - [Train] step: 7799, loss_adv_disc: 1.413847
2023-02-12 15:11:18,221 - INFO - [Train] step: 7799, loss_mpn: 0.021216, loss_rec: 0.033353, loss_semantic: 0.390046, loss_idmrf: 1.304851, loss_adv_gen: -118.235794
2023-02-12 15:11:47,280 - INFO - [Train] step: 7899, loss_adv_disc: -0.615377
2023-02-12 15:11:47,489 - INFO - [Train] step: 7899, loss_mpn: 0.016149, loss_rec: 0.024272, loss_semantic: 0.358250, loss_idmrf: 0.852178, loss_adv_gen: -133.236221
2023-02-12 15:12:16,555 - INFO - [Train] step: 7999, loss_adv_disc: -0.062408
2023-02-12 15:12:16,764 - INFO - [Train] step: 7999, loss_mpn: 0.013659, loss_rec: 0.026106, loss_semantic: 0.386249, loss_idmrf: 1.021867, loss_adv_gen: -147.716049
2023-02-12 15:12:53,951 - INFO - [Eval] step: 7999, bce: 0.288697, psnr: 28.163181, ssim: 0.952470
2023-02-12 15:13:24,273 - INFO - [Train] step: 8099, loss_adv_disc: -1.120922
2023-02-12 15:13:24,481 - INFO - [Train] step: 8099, loss_mpn: 0.004955, loss_rec: 0.020915, loss_semantic: 0.322940, loss_idmrf: 1.106715, loss_adv_gen: -151.821106
2023-02-12 15:13:53,543 - INFO - [Train] step: 8199, loss_adv_disc: 0.029411
2023-02-12 15:13:53,753 - INFO - [Train] step: 8199, loss_mpn: 0.026964, loss_rec: 0.028958, loss_semantic: 0.445343, loss_idmrf: 1.209411, loss_adv_gen: -102.843552
2023-02-12 15:14:22,805 - INFO - [Train] step: 8299, loss_adv_disc: -4.315375
2023-02-12 15:14:23,012 - INFO - [Train] step: 8299, loss_mpn: 0.019549, loss_rec: 0.034941, loss_semantic: 0.462220, loss_idmrf: 0.890771, loss_adv_gen: -154.584610
2023-02-12 15:14:52,068 - INFO - [Train] step: 8399, loss_adv_disc: -1.995814
2023-02-12 15:14:52,276 - INFO - [Train] step: 8399, loss_mpn: 0.012006, loss_rec: 0.034098, loss_semantic: 0.396413, loss_idmrf: 0.811635, loss_adv_gen: -96.562088
2023-02-12 15:15:21,323 - INFO - [Train] step: 8499, loss_adv_disc: 0.820693
2023-02-12 15:15:21,532 - INFO - [Train] step: 8499, loss_mpn: 0.013124, loss_rec: 0.029301, loss_semantic: 0.393588, loss_idmrf: 1.013259, loss_adv_gen: -89.326385
2023-02-12 15:15:50,592 - INFO - [Train] step: 8599, loss_adv_disc: 0.393147
2023-02-12 15:15:50,800 - INFO - [Train] step: 8599, loss_mpn: 0.011680, loss_rec: 0.017792, loss_semantic: 0.300831, loss_idmrf: 0.998240, loss_adv_gen: -184.119385
2023-02-12 15:16:19,842 - INFO - [Train] step: 8699, loss_adv_disc: -0.531485
2023-02-12 15:16:20,051 - INFO - [Train] step: 8699, loss_mpn: 0.015221, loss_rec: 0.025195, loss_semantic: 0.341359, loss_idmrf: 1.062933, loss_adv_gen: -110.207268
2023-02-12 15:16:49,123 - INFO - [Train] step: 8799, loss_adv_disc: -0.491042
2023-02-12 15:16:49,332 - INFO - [Train] step: 8799, loss_mpn: 0.010106, loss_rec: 0.024656, loss_semantic: 0.360912, loss_idmrf: 0.767988, loss_adv_gen: -151.354294
2023-02-12 15:17:18,385 - INFO - [Train] step: 8899, loss_adv_disc: -0.419860
2023-02-12 15:17:18,594 - INFO - [Train] step: 8899, loss_mpn: 0.013824, loss_rec: 0.030028, loss_semantic: 0.417085, loss_idmrf: 1.085957, loss_adv_gen: -207.431549
2023-02-12 15:17:47,642 - INFO - [Train] step: 8999, loss_adv_disc: -0.628583
2023-02-12 15:17:47,850 - INFO - [Train] step: 8999, loss_mpn: 0.024702, loss_rec: 0.037819, loss_semantic: 0.416768, loss_idmrf: 2.331512, loss_adv_gen: -108.836723
2023-02-12 15:18:25,047 - INFO - [Eval] step: 8999, bce: 0.302010, psnr: 28.254166, ssim: 0.952631
2023-02-12 15:18:55,405 - INFO - [Train] step: 9099, loss_adv_disc: 0.632391
2023-02-12 15:18:55,616 - INFO - [Train] step: 9099, loss_mpn: 0.014931, loss_rec: 0.024709, loss_semantic: 0.325625, loss_idmrf: 1.853235, loss_adv_gen: -115.922569
2023-02-12 15:19:24,673 - INFO - [Train] step: 9199, loss_adv_disc: -1.815902
2023-02-12 15:19:24,881 - INFO - [Train] step: 9199, loss_mpn: 0.016712, loss_rec: 0.022784, loss_semantic: 0.345748, loss_idmrf: 1.113174, loss_adv_gen: -51.955341
2023-02-12 15:19:53,938 - INFO - [Train] step: 9299, loss_adv_disc: -0.156092
2023-02-12 15:19:54,147 - INFO - [Train] step: 9299, loss_mpn: 0.011803, loss_rec: 0.023657, loss_semantic: 0.339971, loss_idmrf: 1.318946, loss_adv_gen: -127.398438
2023-02-12 15:20:23,205 - INFO - [Train] step: 9399, loss_adv_disc: -0.345600
2023-02-12 15:20:23,415 - INFO - [Train] step: 9399, loss_mpn: 0.036090, loss_rec: 0.027977, loss_semantic: 0.409671, loss_idmrf: 1.271996, loss_adv_gen: -96.280853
2023-02-12 15:20:52,463 - INFO - [Train] step: 9499, loss_adv_disc: -0.154040
2023-02-12 15:20:52,671 - INFO - [Train] step: 9499, loss_mpn: 0.008822, loss_rec: 0.023087, loss_semantic: 0.333166, loss_idmrf: 1.134625, loss_adv_gen: -172.469971
2023-02-12 15:21:21,728 - INFO - [Train] step: 9599, loss_adv_disc: -2.609488
2023-02-12 15:21:21,937 - INFO - [Train] step: 9599, loss_mpn: 0.018622, loss_rec: 0.031683, loss_semantic: 0.426192, loss_idmrf: 0.870782, loss_adv_gen: -63.231483
2023-02-12 15:21:50,997 - INFO - [Train] step: 9699, loss_adv_disc: -1.726371
2023-02-12 15:21:51,205 - INFO - [Train] step: 9699, loss_mpn: 0.016741, loss_rec: 0.030167, loss_semantic: 0.398618, loss_idmrf: 1.257529, loss_adv_gen: -196.563248
2023-02-12 15:22:20,258 - INFO - [Train] step: 9799, loss_adv_disc: -2.347822
2023-02-12 15:22:20,466 - INFO - [Train] step: 9799, loss_mpn: 0.021154, loss_rec: 0.028975, loss_semantic: 0.403203, loss_idmrf: 0.936794, loss_adv_gen: -92.555832
2023-02-12 15:22:49,519 - INFO - [Train] step: 9899, loss_adv_disc: -0.370060
2023-02-12 15:22:49,728 - INFO - [Train] step: 9899, loss_mpn: 0.013475, loss_rec: 0.026137, loss_semantic: 0.385814, loss_idmrf: 0.893062, loss_adv_gen: -135.118530
2023-02-12 15:23:18,785 - INFO - [Train] step: 9999, loss_adv_disc: -1.345161
2023-02-12 15:23:18,993 - INFO - [Train] step: 9999, loss_mpn: 0.022469, loss_rec: 0.028983, loss_semantic: 0.421686, loss_idmrf: 1.735447, loss_adv_gen: -143.855499
2023-02-12 15:23:56,181 - INFO - [Eval] step: 9999, bce: 0.237329, psnr: 28.375944, ssim: 0.953470
2023-02-12 15:23:57,541 - INFO - End of training
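The `[Eval]` entries above follow a fixed format (`[Eval] step: N, bce: ..., psnr: ..., ssim: ...`), so the validation curve can be pulled out of the raw log with a short regex. The helper below is a minimal sketch for illustration; `parse_eval_lines` is a hypothetical name and is not part of the training code that produced this log.

```python
import re

# Matches the "[Eval] step: N, bce: ..., psnr: ..., ssim: ..." lines
# emitted once per evaluation pass in the log above.
EVAL_RE = re.compile(
    r"\[Eval\] step: (\d+), bce: ([\d.]+), psnr: ([\d.]+), ssim: ([\d.]+)"
)

def parse_eval_lines(log_text: str) -> list[tuple[int, float, float, float]]:
    """Return (step, bce, psnr, ssim) tuples for every eval entry found."""
    return [
        (int(step), float(bce), float(psnr), float(ssim))
        for step, bce, psnr, ssim in EVAL_RE.findall(log_text)
    ]

sample = (
    "2023-02-12 15:23:56,181 - INFO - [Eval] step: 9999, "
    "bce: 0.237329, psnr: 28.375944, ssim: 0.953470"
)
print(parse_eval_lines(sample))
```

Feeding the whole log file through `parse_eval_lines` yields one tuple per eval step (999, 1999, ..., 9999), which is convenient for plotting PSNR/SSIM against training progress.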