vall-e_libritts/libritts-r/log/log-train-2024-08-06-08-06-14-6
2024-08-06 08:06:14,313 INFO [trainer.py:870] (6/8) Training started
2024-08-06 08:06:14,315 INFO [trainer.py:889] (6/8) Device: cuda:6
2024-08-06 08:06:14,315 INFO [trainer.py:890] (6/8) {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': 0, 'log_interval': 100, 'reset_interval': 200, 'valid_interval': 2000, 'env_info': {'k2-version': '1.24.3', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '279b0c87015a615b81b147251814d737a548f397', 'k2-git-date': 'Wed May 24 22:24:09 2023', 'lhotse-version': '1.26.0', 'torch-version': '2.0.1+cu118', 'torch-cuda-available': True, 'torch-cuda-version': '11.8', 'python-version': '3.10', 'icefall-git-branch': None, 'icefall-git-sha1': None, 'icefall-git-date': None, 'icefall-path': '/workspace/icefall_llm', 'k2-path': '/usr/local/lib/python3.10/dist-packages/k2/__init__.py', 'lhotse-path': '/usr/local/lib/python3.10/dist-packages/lhotse/__init__.py', 'hostname': '6867463', 'IP address': '0.104.202.7'}, 'world_size': 8, 'master_port': 12354, 'tensorboard': True, 'num_epochs': 20, 'start_epoch': 1, 'start_batch': 0, 'exp_dir': PosixPath('exp/valle'), 'optimizer_name': 'ScaledAdam', 'scheduler_name': 'Eden', 'base_lr': 0.03, 'warmup_steps': 200, 'seed': 42, 'inf_check': False, 'save_every_n': 20000, 'keep_last_k': 20, 'average_period': 0, 'accumulate_grad_steps': 1, 'dtype': 'bfloat16', 'filter_min_duration': 0.5, 'filter_max_duration': 14.0, 'train_stage': 1, 'visualize': False, 'oom_check': False, 'model_name': 'valle', 'decoder_dim': 1024, 'nhead': 16, 'num_decoder_layers': 12, 'scale_factor': 1.0, 'norm_first': True, 'add_prenet': False, 'prefix_mode': 1, 'share_embedding': True, 'prepend_bos': False, 'num_quantizers': 8, 'scaling_xformers': False, 'manifest_dir': PosixPath('data/tokenized'), 'max_duration': 320, 'bucketing_sampler': True, 'num_buckets': 6, 'concatenate_cuts': False, 'duration_factor': 1.0, 'gap': 0.1, 'on_the_fly_feats': False, 'shuffle': True, 'buffer_size': 40000, 'shuffle_buffer_size': 100000, 'drop_last': False, 'return_cuts': True, 'num_workers': 8, 'enable_spec_aug': False, 'spec_aug_time_warp_factor': 80, 'input_strategy': 'PrecomputedFeatures', 'dataset': 'libritts', 'text_tokens': 'data/tokenized/unique_text_tokens.k2symbols', 'sampling_rate': 24000}
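Among the hyperparameters above, filter_min_duration 0.5 and filter_max_duration 14.0 control which cuts are kept for training. A minimal lhotse sketch of that kind of filtering (the manifest path is an assumption, not the recipe's exact file):

```python
from lhotse import CutSet

# Keep only cuts between 0.5 s and 14.0 s, mirroring filter_min_duration /
# filter_max_duration in the hyperparameter dump above (path is illustrative).
cuts = CutSet.from_file("data/tokenized/cuts_train.jsonl.gz")
cuts = cuts.filter(lambda c: 0.5 <= c.duration <= 14.0)
```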
2024-08-06 08:06:14,315 INFO [trainer.py:892] (6/8) About to create model
2024-08-06 08:06:15,003 INFO [trainer.py:899] (6/8) Number of model parameters: 367386628
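The parameter count logged here is the standard PyTorch tally; a generic sketch (not the recipe's exact code):

```python
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    """Generic PyTorch tally behind the figure logged above (367,386,628 here)."""
    return sum(p.numel() for p in model.parameters())
```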
2024-08-06 08:06:16,221 INFO [trainer.py:914] (6/8) Using DDP
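"Using DDP" is PyTorch's DistributedDataParallel over the 8 ranks listed in the hyperparameters; a generic sketch of the wrapping step on one rank (process-group setup simplified, not the recipe's exact code):

```python
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def wrap_ddp(model: torch.nn.Module) -> DDP:
    # Generic wrapping for one of the 8 ranks; this log comes from rank 6 (cuda:6).
    dist.init_process_group("nccl")
    rank = dist.get_rank()
    torch.cuda.set_device(rank)
    return DDP(model.to(rank), device_ids=[rank])
```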
2024-08-06 08:06:19,152 INFO [datamodule.py:427] (6/8) About to get train cuts
2024-08-06 08:06:19,154 INFO [datamodule.py:434] (6/8) About to get dev cuts
2024-08-06 08:06:19,155 INFO [datamodule.py:292] (6/8) Disable SpecAugment
2024-08-06 08:06:19,155 INFO [datamodule.py:294] (6/8) About to create train dataset
2024-08-06 08:06:19,155 INFO [datamodule.py:323] (6/8) Using DynamicBucketingSampler
2024-08-06 08:06:19,762 INFO [datamodule.py:344] (6/8) About to create train dataloader
2024-08-06 08:06:19,762 INFO [datamodule.py:367] (6/8) About to create dev dataset
2024-08-06 08:06:20,082 INFO [datamodule.py:388] (6/8) About to create dev dataloader
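The sampler and loader messages above correspond to lhotse's dynamic bucketing; a sketch using the values from the hyperparameter dump (max_duration 320, num_buckets 6, buffer_size 40000, num_workers 8), with the dataset object left as a placeholder:

```python
from lhotse import CutSet
from lhotse.dataset import DynamicBucketingSampler
from torch.utils.data import DataLoader, Dataset

def make_train_dl(train_cuts: CutSet, train_dataset: Dataset) -> DataLoader:
    # Batches are assembled by total audio duration (320 s) rather than a fixed size.
    sampler = DynamicBucketingSampler(
        train_cuts,
        max_duration=320,
        shuffle=True,
        num_buckets=6,
        drop_last=False,
        buffer_size=40000,
    )
    # Lhotse samplers emit whole batches of cuts, hence batch_size=None.
    return DataLoader(train_dataset, sampler=sampler, batch_size=None, num_workers=8)
```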
2024-08-06 08:08:02,121 INFO [trainer.py:765] (6/8) Epoch 1, batch 100, train_loss[loss=4.388, ArTop10Accuracy=0.4801, over 14610.00 frames. ], tot_loss[loss=5.055, ArTop10Accuracy=0.3726, over 4770.95 frames. ], batch size: 63, lr: 2.25e-02
2024-08-06 08:09:28,827 INFO [trainer.py:765] (6/8) Epoch 1, batch 200, train_loss[loss=3.986, ArTop10Accuracy=0.5533, over 13587.00 frames. ], tot_loss[loss=4.49, ArTop10Accuracy=0.4676, over 7758.92 frames. ], batch size: 34, lr: 3.00e-02
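The learning rates in the first two logged batches (2.25e-02 at batch 100, 3.00e-02 at batch 200) are consistent with a linear warmup from half of base_lr (0.03) over the configured 200 warmup steps, assuming an Eden-style warmup factor of 0.5 + 0.5 * step / warmup_steps and decay terms still near 1 this early; a quick check:

```python
BASE_LR, WARMUP_STEPS = 0.03, 200  # from the hyperparameter dump above

def approx_lr(step: int) -> float:
    # Assumed Eden-style warmup factor; polynomial decay terms are ~1 this early.
    factor = 1.0 if step >= WARMUP_STEPS else 0.5 + 0.5 * step / WARMUP_STEPS
    return BASE_LR * factor

print(f"{approx_lr(100):.2e}")  # 2.25e-02, as logged at batch 100
print(f"{approx_lr(200):.2e}")  # 3.00e-02, as logged at batch 200
```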
2024-08-06 08:10:52,429 INFO [trainer.py:765] (6/8) Epoch 1, batch 300, train_loss[loss=3.87, ArTop10Accuracy=0.5696, over 14733.00 frames. ], tot_loss[loss=4.219, ArTop10Accuracy=0.5124, over 9377.57 frames. ], batch size: 45, lr: 3.00e-02
2024-08-06 08:12:12,699 INFO [trainer.py:765] (6/8) Epoch 1, batch 400, train_loss[loss=3.622, ArTop10Accuracy=0.6205, over 10365.00 frames. ], tot_loss[loss=4.032, ArTop10Accuracy=0.5444, over 10277.76 frames. ], batch size: 14, lr: 3.00e-02
2024-08-06 08:13:40,049 INFO [trainer.py:765] (6/8) Epoch 1, batch 500, train_loss[loss=3.739, ArTop10Accuracy=0.5912, over 12429.00 frames. ], tot_loss[loss=3.887, ArTop10Accuracy=0.5694, over 10839.94 frames. ], batch size: 22, lr: 2.99e-02
2024-08-06 08:15:00,242 INFO [trainer.py:765] (6/8) Epoch 1, batch 600, train_loss[loss=3.558, ArTop10Accuracy=0.6331, over 11361.00 frames. ], tot_loss[loss=3.772, ArTop10Accuracy=0.5898, over 11361.50 frames. ], batch size: 18, lr: 2.99e-02
2024-08-06 08:16:26,423 INFO [trainer.py:765] (6/8) Epoch 1, batch 700, train_loss[loss=3.451, ArTop10Accuracy=0.6472, over 10158.00 frames. ], tot_loss[loss=3.689, ArTop10Accuracy=0.6045, over 11517.12 frames. ], batch size: 12, lr: 2.99e-02
2024-08-06 08:17:43,017 INFO [trainer.py:765] (6/8) Epoch 1, batch 800, train_loss[loss=3.352, ArTop10Accuracy=0.6725, over 9489.00 frames. ], tot_loss[loss=3.627, ArTop10Accuracy=0.6162, over 11639.17 frames. ], batch size: 11, lr: 2.98e-02
2024-08-06 08:18:56,150 INFO [trainer.py:765] (6/8) Epoch 1, batch 900, train_loss[loss=3.437, ArTop10Accuracy=0.6495, over 12990.00 frames. ], tot_loss[loss=3.567, ArTop10Accuracy=0.6269, over 11692.94 frames. ], batch size: 27, lr: 2.98e-02
2024-08-06 08:20:12,862 INFO [trainer.py:765] (6/8) Epoch 1, batch 1000, train_loss[loss=3.39, ArTop10Accuracy=0.6602, over 12999.00 frames. ], tot_loss[loss=3.525, ArTop10Accuracy=0.6346, over 11885.92 frames. ], batch size: 27, lr: 2.97e-02
2024-08-06 08:20:13,538 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 9.300e+01 1.871e+02 2.675e+02 4.030e+02 9.119e+03, threshold=5.351e+02, percent-clipped=0.0
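These optim.py lines summarize the distribution of recent gradient norms: the five numbers read like min, 25th, 50th, 75th percentile, and max, and the logged threshold equals Clipping_scale (2.0) times the median (2 x 2.675e+02 = 5.351e+02 here). An illustrative way to produce such a summary (stand-in data, not the recipe's implementation):

```python
import numpy as np

# Stand-in data: a buffer of recent per-batch gradient norms.
recent_grad_norms = np.random.lognormal(mean=5.5, sigma=0.6, size=200)

# Min, quartiles, and max, as in the "grad-norm quartiles" log line above.
quartiles = np.percentile(recent_grad_norms, [0, 25, 50, 75, 100])
# Assumed relation seen in the log: threshold = Clipping_scale * median.
threshold = 2.0 * np.median(recent_grad_norms)
percent_clipped = 100.0 * np.mean(recent_grad_norms > threshold)
print(quartiles, threshold, percent_clipped)
```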
2024-08-06 08:21:29,154 INFO [trainer.py:765] (6/8) Epoch 1, batch 1100, train_loss[loss=3.509, ArTop10Accuracy=0.6415, over 13734.00 frames. ], tot_loss[loss=3.494, ArTop10Accuracy=0.6401, over 11960.16 frames. ], batch size: 34, lr: 2.96e-02
2024-08-06 08:22:45,411 INFO [trainer.py:765] (6/8) Epoch 1, batch 1200, train_loss[loss=3.423, ArTop10Accuracy=0.6589, over 11565.00 frames. ], tot_loss[loss=3.464, ArTop10Accuracy=0.6459, over 11864.25 frames. ], batch size: 101, lr: 2.96e-02
2024-08-06 08:23:45,150 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 08:25:36,237 INFO [trainer.py:765] (6/8) Epoch 2, batch 100, train_loss[loss=3.402, ArTop10Accuracy=0.6551, over 14169.00 frames. ], tot_loss[loss=3.421, ArTop10Accuracy=0.6534, over 4752.27 frames. ], batch size: 62, lr: 2.90e-02
2024-08-06 08:26:58,956 INFO [trainer.py:765] (6/8) Epoch 2, batch 200, train_loss[loss=3.363, ArTop10Accuracy=0.6631, over 13458.00 frames. ], tot_loss[loss=3.391, ArTop10Accuracy=0.6587, over 7750.75 frames. ], batch size: 34, lr: 2.89e-02
2024-08-06 08:28:25,532 INFO [trainer.py:765] (6/8) Epoch 2, batch 300, train_loss[loss=3.304, ArTop10Accuracy=0.6729, over 14391.00 frames. ], tot_loss[loss=3.375, ArTop10Accuracy=0.6616, over 9369.14 frames. ], batch size: 45, lr: 2.89e-02
2024-08-06 08:29:48,636 INFO [trainer.py:765] (6/8) Epoch 2, batch 400, train_loss[loss=3.287, ArTop10Accuracy=0.6818, over 10422.00 frames. ], tot_loss[loss=3.355, ArTop10Accuracy=0.6658, over 10264.80 frames. ], batch size: 14, lr: 2.88e-02
2024-08-06 08:31:22,901 INFO [trainer.py:765] (6/8) Epoch 2, batch 500, train_loss[loss=3.302, ArTop10Accuracy=0.6806, over 12180.00 frames. ], tot_loss[loss=3.337, ArTop10Accuracy=0.6695, over 10824.54 frames. ], batch size: 22, lr: 2.87e-02
2024-08-06 08:32:45,686 INFO [trainer.py:765] (6/8) Epoch 2, batch 600, train_loss[loss=3.304, ArTop10Accuracy=0.6782, over 11436.00 frames. ], tot_loss[loss=3.329, ArTop10Accuracy=0.6711, over 11355.80 frames. ], batch size: 18, lr: 2.86e-02
2024-08-06 08:34:13,582 INFO [trainer.py:765] (6/8) Epoch 2, batch 700, train_loss[loss=3.079, ArTop10Accuracy=0.7212, over 9345.00 frames. ], tot_loss[loss=3.324, ArTop10Accuracy=0.6721, over 11500.32 frames. ], batch size: 11, lr: 2.85e-02
2024-08-06 08:34:31,174 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 08:34:40,888 INFO [trainer.py:811] (6/8) Epoch 2, validation: loss=3.277, ArTop10Accuracy=0.6803, over 1827537.00 frames.
2024-08-06 08:34:40,889 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 29113MB
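The memory figure is the kind of value torch.cuda.max_memory_allocated returns, converted to MB; a short sketch:

```python
import torch

# Peak memory allocated on this rank's device (cuda:6), reported in MB as above.
peak_mb = torch.cuda.max_memory_allocated(torch.device("cuda:6")) // (1024 * 1024)
print(f"Maximum memory allocated so far is {peak_mb}MB")
```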
2024-08-06 08:34:41,699 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 7.953e+01 1.592e+02 2.200e+02 3.344e+02 2.949e+03, threshold=4.400e+02, percent-clipped=8.6
2024-08-06 08:35:39,876 INFO [trainer.py:765] (6/8) Epoch 2, batch 800, train_loss[loss=3.286, ArTop10Accuracy=0.6826, over 10155.00 frames. ], tot_loss[loss=3.321, ArTop10Accuracy=0.6728, over 11631.80 frames. ], batch size: 12, lr: 2.84e-02
2024-08-06 08:36:56,370 INFO [trainer.py:765] (6/8) Epoch 2, batch 900, train_loss[loss=3.287, ArTop10Accuracy=0.6823, over 13350.00 frames. ], tot_loss[loss=3.309, ArTop10Accuracy=0.6751, over 11666.99 frames. ], batch size: 28, lr: 2.83e-02
2024-08-06 08:38:10,510 INFO [trainer.py:765] (6/8) Epoch 2, batch 1000, train_loss[loss=3.288, ArTop10Accuracy=0.6815, over 13026.00 frames. ], tot_loss[loss=3.3, ArTop10Accuracy=0.6765, over 11879.59 frames. ], batch size: 27, lr: 2.82e-02
2024-08-06 08:39:25,058 INFO [trainer.py:765] (6/8) Epoch 2, batch 1100, train_loss[loss=3.25, ArTop10Accuracy=0.6835, over 13587.00 frames. ], tot_loss[loss=3.296, ArTop10Accuracy=0.6776, over 11944.38 frames. ], batch size: 34, lr: 2.81e-02
2024-08-06 08:40:38,219 INFO [trainer.py:765] (6/8) Epoch 2, batch 1200, train_loss[loss=3.286, ArTop10Accuracy=0.6777, over 11184.00 frames. ], tot_loss[loss=3.286, ArTop10Accuracy=0.6793, over 11847.29 frames. ], batch size: 101, lr: 2.80e-02
2024-08-06 08:41:38,257 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 08:43:36,648 INFO [trainer.py:765] (6/8) Epoch 3, batch 100, train_loss[loss=3.289, ArTop10Accuracy=0.6722, over 14634.00 frames. ], tot_loss[loss=3.251, ArTop10Accuracy=0.6851, over 4755.45 frames. ], batch size: 62, lr: 2.67e-02
2024-08-06 08:45:10,500 INFO [trainer.py:765] (6/8) Epoch 3, batch 200, train_loss[loss=3.181, ArTop10Accuracy=0.7005, over 13515.00 frames. ], tot_loss[loss=3.221, ArTop10Accuracy=0.6909, over 7745.42 frames. ], batch size: 34, lr: 2.66e-02
2024-08-06 08:46:29,257 INFO [trainer.py:765] (6/8) Epoch 3, batch 300, train_loss[loss=3.261, ArTop10Accuracy=0.6856, over 14103.00 frames. ], tot_loss[loss=3.206, ArTop10Accuracy=0.6938, over 9364.34 frames. ], batch size: 44, lr: 2.64e-02
2024-08-06 08:48:04,218 INFO [trainer.py:765] (6/8) Epoch 3, batch 400, train_loss[loss=3.105, ArTop10Accuracy=0.7125, over 10353.00 frames. ], tot_loss[loss=3.191, ArTop10Accuracy=0.6967, over 10275.21 frames. ], batch size: 14, lr: 2.63e-02
2024-08-06 08:48:40,881 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 9.282e+01 1.561e+02 1.981e+02 2.686e+02 1.768e+03, threshold=3.962e+02, percent-clipped=7.6
2024-08-06 08:49:25,541 INFO [trainer.py:765] (6/8) Epoch 3, batch 500, train_loss[loss=3.089, ArTop10Accuracy=0.7152, over 12294.00 frames. ], tot_loss[loss=3.171, ArTop10Accuracy=0.7005, over 10826.07 frames. ], batch size: 22, lr: 2.62e-02
2024-08-06 08:51:00,476 INFO [trainer.py:765] (6/8) Epoch 3, batch 600, train_loss[loss=3.138, ArTop10Accuracy=0.7107, over 11346.00 frames. ], tot_loss[loss=3.155, ArTop10Accuracy=0.7036, over 11363.23 frames. ], batch size: 18, lr: 2.61e-02
2024-08-06 08:52:31,618 INFO [trainer.py:765] (6/8) Epoch 3, batch 700, train_loss[loss=3.117, ArTop10Accuracy=0.7064, over 9513.00 frames. ], tot_loss[loss=3.144, ArTop10Accuracy=0.7058, over 11502.94 frames. ], batch size: 11, lr: 2.60e-02
2024-08-06 08:53:57,388 INFO [trainer.py:765] (6/8) Epoch 3, batch 800, train_loss[loss=3.005, ArTop10Accuracy=0.7328, over 10665.00 frames. ], tot_loss[loss=3.136, ArTop10Accuracy=0.7073, over 11636.91 frames. ], batch size: 13, lr: 2.59e-02
2024-08-06 08:55:15,117 INFO [trainer.py:765] (6/8) Epoch 3, batch 900, train_loss[loss=3.093, ArTop10Accuracy=0.7182, over 13086.00 frames. ], tot_loss[loss=3.119, ArTop10Accuracy=0.7107, over 11682.79 frames. ], batch size: 27, lr: 2.57e-02
2024-08-06 08:56:31,557 INFO [trainer.py:765] (6/8) Epoch 3, batch 1000, train_loss[loss=2.997, ArTop10Accuracy=0.7294, over 12915.00 frames. ], tot_loss[loss=3.109, ArTop10Accuracy=0.7126, over 11878.15 frames. ], batch size: 27, lr: 2.56e-02
2024-08-06 08:57:46,506 INFO [trainer.py:765] (6/8) Epoch 3, batch 1100, train_loss[loss=3.086, ArTop10Accuracy=0.7154, over 13635.00 frames. ], tot_loss[loss=3.102, ArTop10Accuracy=0.7138, over 11951.69 frames. ], batch size: 34, lr: 2.55e-02
2024-08-06 08:59:01,399 INFO [trainer.py:765] (6/8) Epoch 3, batch 1200, train_loss[loss=3.159, ArTop10Accuracy=0.7025, over 11697.00 frames. ], tot_loss[loss=3.094, ArTop10Accuracy=0.7154, over 11858.17 frames. ], batch size: 101, lr: 2.54e-02
2024-08-06 09:00:02,076 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 09:01:50,740 INFO [trainer.py:765] (6/8) Epoch 4, batch 100, train_loss[loss=3.078, ArTop10Accuracy=0.7206, over 14586.00 frames. ], tot_loss[loss=3.077, ArTop10Accuracy=0.7175, over 4762.22 frames. ], batch size: 62, lr: 2.38e-02
2024-08-06 09:02:52,858 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 09:03:02,384 INFO [trainer.py:811] (6/8) Epoch 4, validation: loss=2.997, ArTop10Accuracy=0.7338, over 1827537.00 frames.
2024-08-06 09:03:02,385 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 29374MB
2024-08-06 09:03:03,364 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.072e+02 1.499e+02 1.782e+02 2.273e+02 1.100e+03, threshold=3.565e+02, percent-clipped=4.7
2024-08-06 09:03:29,274 INFO [trainer.py:765] (6/8) Epoch 4, batch 200, train_loss[loss=3.044, ArTop10Accuracy=0.7246, over 13827.00 frames. ], tot_loss[loss=3.049, ArTop10Accuracy=0.7232, over 7750.55 frames. ], batch size: 35, lr: 2.37e-02
2024-08-06 09:05:01,732 INFO [trainer.py:765] (6/8) Epoch 4, batch 300, train_loss[loss=3.137, ArTop10Accuracy=0.7043, over 14121.00 frames. ], tot_loss[loss=3.041, ArTop10Accuracy=0.7249, over 9386.87 frames. ], batch size: 44, lr: 2.36e-02
2024-08-06 09:06:28,150 INFO [trainer.py:765] (6/8) Epoch 4, batch 400, train_loss[loss=3.036, ArTop10Accuracy=0.7249, over 10518.00 frames. ], tot_loss[loss=3.037, ArTop10Accuracy=0.7257, over 10309.40 frames. ], batch size: 14, lr: 2.34e-02
2024-08-06 09:08:01,925 INFO [trainer.py:765] (6/8) Epoch 4, batch 500, train_loss[loss=2.881, ArTop10Accuracy=0.7551, over 12159.00 frames. ], tot_loss[loss=3.028, ArTop10Accuracy=0.7275, over 10832.97 frames. ], batch size: 22, lr: 2.33e-02
2024-08-06 09:09:28,540 INFO [trainer.py:765] (6/8) Epoch 4, batch 600, train_loss[loss=2.984, ArTop10Accuracy=0.7324, over 11529.00 frames. ], tot_loss[loss=3.025, ArTop10Accuracy=0.728, over 11364.42 frames. ], batch size: 18, lr: 2.32e-02
2024-08-06 09:10:59,865 INFO [trainer.py:765] (6/8) Epoch 4, batch 700, train_loss[loss=2.895, ArTop10Accuracy=0.7564, over 10038.00 frames. ], tot_loss[loss=3.024, ArTop10Accuracy=0.7284, over 11504.85 frames. ], batch size: 12, lr: 2.31e-02
2024-08-06 09:12:17,513 INFO [trainer.py:765] (6/8) Epoch 4, batch 800, train_loss[loss=2.829, ArTop10Accuracy=0.7753, over 10317.00 frames. ], tot_loss[loss=3.021, ArTop10Accuracy=0.729, over 11618.34 frames. ], batch size: 12, lr: 2.30e-02
2024-08-06 09:13:33,212 INFO [trainer.py:765] (6/8) Epoch 4, batch 900, train_loss[loss=2.953, ArTop10Accuracy=0.7436, over 13008.00 frames. ], tot_loss[loss=3.013, ArTop10Accuracy=0.7307, over 11683.53 frames. ], batch size: 27, lr: 2.29e-02
2024-08-06 09:14:47,520 INFO [trainer.py:765] (6/8) Epoch 4, batch 1000, train_loss[loss=3.061, ArTop10Accuracy=0.7249, over 12897.00 frames. ], tot_loss[loss=3.013, ArTop10Accuracy=0.7306, over 11877.18 frames. ], batch size: 27, lr: 2.28e-02
2024-08-06 09:16:02,982 INFO [trainer.py:765] (6/8) Epoch 4, batch 1100, train_loss[loss=3.017, ArTop10Accuracy=0.7294, over 13554.00 frames. ], tot_loss[loss=3.015, ArTop10Accuracy=0.7302, over 11959.80 frames. ], batch size: 34, lr: 2.26e-02
2024-08-06 09:16:53,291 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.100e+02 1.440e+02 1.636e+02 1.968e+02 7.702e+02, threshold=3.273e+02, percent-clipped=1.3
2024-08-06 09:17:18,344 INFO [trainer.py:765] (6/8) Epoch 4, batch 1200, train_loss[loss=3.105, ArTop10Accuracy=0.712, over 12048.00 frames. ], tot_loss[loss=3.01, ArTop10Accuracy=0.7309, over 11874.78 frames. ], batch size: 103, lr: 2.25e-02
2024-08-06 09:18:17,022 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 09:20:17,171 INFO [trainer.py:765] (6/8) Epoch 5, batch 100, train_loss[loss=3.024, ArTop10Accuracy=0.729, over 14520.00 frames. ], tot_loss[loss=2.998, ArTop10Accuracy=0.7323, over 4763.86 frames. ], batch size: 62, lr: 2.10e-02
2024-08-06 09:21:52,296 INFO [trainer.py:765] (6/8) Epoch 5, batch 200, train_loss[loss=2.932, ArTop10Accuracy=0.7488, over 13890.00 frames. ], tot_loss[loss=2.978, ArTop10Accuracy=0.7364, over 7748.36 frames. ], batch size: 34, lr: 2.09e-02
2024-08-06 09:23:19,241 INFO [trainer.py:765] (6/8) Epoch 5, batch 300, train_loss[loss=2.922, ArTop10Accuracy=0.7481, over 14118.00 frames. ], tot_loss[loss=2.969, ArTop10Accuracy=0.7383, over 9371.93 frames. ], batch size: 45, lr: 2.08e-02
2024-08-06 09:24:53,537 INFO [trainer.py:765] (6/8) Epoch 5, batch 400, train_loss[loss=2.855, ArTop10Accuracy=0.7614, over 10344.00 frames. ], tot_loss[loss=2.97, ArTop10Accuracy=0.7382, over 10282.75 frames. ], batch size: 14, lr: 2.07e-02
2024-08-06 09:26:19,418 INFO [trainer.py:765] (6/8) Epoch 5, batch 500, train_loss[loss=2.962, ArTop10Accuracy=0.7411, over 12372.00 frames. ], tot_loss[loss=2.966, ArTop10Accuracy=0.739, over 10822.00 frames. ], batch size: 22, lr: 2.06e-02
2024-08-06 09:27:49,538 INFO [trainer.py:765] (6/8) Epoch 5, batch 600, train_loss[loss=2.859, ArTop10Accuracy=0.7623, over 11424.00 frames. ], tot_loss[loss=2.963, ArTop10Accuracy=0.7396, over 11344.24 frames. ], batch size: 18, lr: 2.05e-02
2024-08-06 09:29:21,670 INFO [trainer.py:765] (6/8) Epoch 5, batch 700, train_loss[loss=2.829, ArTop10Accuracy=0.766, over 10110.00 frames. ], tot_loss[loss=2.964, ArTop10Accuracy=0.7394, over 11498.36 frames. ], batch size: 12, lr: 2.04e-02
2024-08-06 09:30:44,694 INFO [trainer.py:765] (6/8) Epoch 5, batch 800, train_loss[loss=2.993, ArTop10Accuracy=0.7335, over 10182.00 frames. ], tot_loss[loss=2.968, ArTop10Accuracy=0.7386, over 11628.28 frames. ], batch size: 12, lr: 2.03e-02
2024-08-06 09:31:51,240 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 09:32:00,762 INFO [trainer.py:811] (6/8) Epoch 5, validation: loss=2.926, ArTop10Accuracy=0.7466, over 1827537.00 frames.
2024-08-06 09:32:00,763 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 29654MB
2024-08-06 09:32:01,709 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.060e+02 1.349e+02 1.525e+02 1.806e+02 1.007e+03, threshold=3.049e+02, percent-clipped=2.3
2024-08-06 09:32:10,554 INFO [trainer.py:765] (6/8) Epoch 5, batch 900, train_loss[loss=2.987, ArTop10Accuracy=0.7291, over 13041.00 frames. ], tot_loss[loss=2.961, ArTop10Accuracy=0.7401, over 11691.73 frames. ], batch size: 27, lr: 2.02e-02
2024-08-06 09:33:27,323 INFO [trainer.py:765] (6/8) Epoch 5, batch 1000, train_loss[loss=2.989, ArTop10Accuracy=0.7325, over 13020.00 frames. ], tot_loss[loss=2.961, ArTop10Accuracy=0.74, over 11890.41 frames. ], batch size: 27, lr: 2.01e-02
2024-08-06 09:34:42,300 INFO [trainer.py:765] (6/8) Epoch 5, batch 1100, train_loss[loss=2.954, ArTop10Accuracy=0.7405, over 13821.00 frames. ], tot_loss[loss=2.964, ArTop10Accuracy=0.7393, over 11936.71 frames. ], batch size: 34, lr: 2.00e-02
2024-08-06 09:35:56,331 INFO [trainer.py:765] (6/8) Epoch 5, batch 1200, train_loss[loss=3.119, ArTop10Accuracy=0.7086, over 12636.00 frames. ], tot_loss[loss=2.961, ArTop10Accuracy=0.7402, over 11856.38 frames. ], batch size: 101, lr: 1.99e-02
2024-08-06 09:36:55,716 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 09:38:52,665 INFO [trainer.py:765] (6/8) Epoch 6, batch 100, train_loss[loss=2.955, ArTop10Accuracy=0.7445, over 14772.00 frames. ], tot_loss[loss=2.951, ArTop10Accuracy=0.7416, over 4751.62 frames. ], batch size: 62, lr: 1.85e-02
2024-08-06 09:40:19,834 INFO [trainer.py:765] (6/8) Epoch 6, batch 200, train_loss[loss=2.913, ArTop10Accuracy=0.7489, over 13596.00 frames. ], tot_loss[loss=2.936, ArTop10Accuracy=0.7445, over 7733.30 frames. ], batch size: 34, lr: 1.84e-02
2024-08-06 09:41:52,965 INFO [trainer.py:765] (6/8) Epoch 6, batch 300, train_loss[loss=2.968, ArTop10Accuracy=0.7397, over 13965.00 frames. ], tot_loss[loss=2.929, ArTop10Accuracy=0.7459, over 9366.45 frames. ], batch size: 44, lr: 1.83e-02
2024-08-06 09:43:17,828 INFO [trainer.py:765] (6/8) Epoch 6, batch 400, train_loss[loss=2.856, ArTop10Accuracy=0.7616, over 10935.00 frames. ], tot_loss[loss=2.925, ArTop10Accuracy=0.7471, over 10286.23 frames. ], batch size: 15, lr: 1.83e-02
2024-08-06 09:44:54,128 INFO [trainer.py:765] (6/8) Epoch 6, batch 500, train_loss[loss=2.918, ArTop10Accuracy=0.7506, over 12210.00 frames. ], tot_loss[loss=2.923, ArTop10Accuracy=0.7475, over 10852.19 frames. ], batch size: 22, lr: 1.82e-02
2024-08-06 09:46:22,873 INFO [trainer.py:765] (6/8) Epoch 6, batch 600, train_loss[loss=2.782, ArTop10Accuracy=0.7736, over 11358.00 frames. ], tot_loss[loss=2.92, ArTop10Accuracy=0.7481, over 11364.24 frames. ], batch size: 18, lr: 1.81e-02
2024-08-06 09:46:37,219 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.012e+02 1.339e+02 1.480e+02 1.701e+02 7.506e+02, threshold=2.959e+02, percent-clipped=1.1
2024-08-06 09:47:57,870 INFO [trainer.py:765] (6/8) Epoch 6, batch 700, train_loss[loss=2.855, ArTop10Accuracy=0.7598, over 10134.00 frames. ], tot_loss[loss=2.925, ArTop10Accuracy=0.7472, over 11502.69 frames. ], batch size: 12, lr: 1.80e-02
2024-08-06 09:49:15,955 INFO [trainer.py:765] (6/8) Epoch 6, batch 800, train_loss[loss=2.854, ArTop10Accuracy=0.759, over 10095.00 frames. ], tot_loss[loss=2.926, ArTop10Accuracy=0.7469, over 11620.14 frames. ], batch size: 12, lr: 1.79e-02
2024-08-06 09:50:32,135 INFO [trainer.py:765] (6/8) Epoch 6, batch 900, train_loss[loss=2.932, ArTop10Accuracy=0.7518, over 13065.00 frames. ], tot_loss[loss=2.922, ArTop10Accuracy=0.7477, over 11670.94 frames. ], batch size: 27, lr: 1.78e-02
2024-08-06 09:51:47,299 INFO [trainer.py:765] (6/8) Epoch 6, batch 1000, train_loss[loss=2.987, ArTop10Accuracy=0.735, over 12957.00 frames. ], tot_loss[loss=2.926, ArTop10Accuracy=0.7468, over 11889.76 frames. ], batch size: 27, lr: 1.77e-02
2024-08-06 09:53:00,921 INFO [trainer.py:765] (6/8) Epoch 6, batch 1100, train_loss[loss=2.839, ArTop10Accuracy=0.7588, over 13614.00 frames. ], tot_loss[loss=2.93, ArTop10Accuracy=0.7459, over 11947.39 frames. ], batch size: 34, lr: 1.77e-02
2024-08-06 09:54:14,337 INFO [trainer.py:765] (6/8) Epoch 6, batch 1200, train_loss[loss=3.013, ArTop10Accuracy=0.7295, over 12051.00 frames. ], tot_loss[loss=2.929, ArTop10Accuracy=0.746, over 11869.99 frames. ], batch size: 101, lr: 1.76e-02
2024-08-06 09:55:13,008 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 09:57:06,699 INFO [trainer.py:765] (6/8) Epoch 7, batch 100, train_loss[loss=2.878, ArTop10Accuracy=0.7565, over 14559.00 frames. ], tot_loss[loss=2.912, ArTop10Accuracy=0.7495, over 4765.49 frames. ], batch size: 63, lr: 1.64e-02
2024-08-06 09:58:39,426 INFO [trainer.py:765] (6/8) Epoch 7, batch 200, train_loss[loss=2.901, ArTop10Accuracy=0.7472, over 13467.00 frames. ], tot_loss[loss=2.9, ArTop10Accuracy=0.7514, over 7756.69 frames. ], batch size: 34, lr: 1.64e-02
2024-08-06 10:00:06,083 INFO [trainer.py:765] (6/8) Epoch 7, batch 300, train_loss[loss=2.904, ArTop10Accuracy=0.7493, over 14169.00 frames. ], tot_loss[loss=2.896, ArTop10Accuracy=0.7525, over 9366.34 frames. ], batch size: 44, lr: 1.63e-02
2024-08-06 10:00:40,509 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 10:00:50,245 INFO [trainer.py:811] (6/8) Epoch 7, validation: loss=2.88, ArTop10Accuracy=0.7554, over 1827537.00 frames.
2024-08-06 10:00:50,246 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 29654MB
2024-08-06 10:00:50,976 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.002e+02 1.286e+02 1.429e+02 1.605e+02 1.020e+03, threshold=2.857e+02, percent-clipped=1.5
2024-08-06 10:01:49,115 INFO [trainer.py:765] (6/8) Epoch 7, batch 400, train_loss[loss=2.873, ArTop10Accuracy=0.7615, over 10770.00 frames. ], tot_loss[loss=2.894, ArTop10Accuracy=0.7531, over 10290.52 frames. ], batch size: 15, lr: 1.62e-02
2024-08-06 10:03:21,455 INFO [trainer.py:765] (6/8) Epoch 7, batch 500, train_loss[loss=2.785, ArTop10Accuracy=0.779, over 12291.00 frames. ], tot_loss[loss=2.89, ArTop10Accuracy=0.7536, over 10870.88 frames. ], batch size: 22, lr: 1.61e-02
2024-08-06 10:04:51,881 INFO [trainer.py:765] (6/8) Epoch 7, batch 600, train_loss[loss=2.849, ArTop10Accuracy=0.7606, over 11370.00 frames. ], tot_loss[loss=2.892, ArTop10Accuracy=0.753, over 11381.86 frames. ], batch size: 18, lr: 1.61e-02
2024-08-06 10:06:25,112 INFO [trainer.py:765] (6/8) Epoch 7, batch 700, train_loss[loss=2.794, ArTop10Accuracy=0.7743, over 10017.00 frames. ], tot_loss[loss=2.896, ArTop10Accuracy=0.7523, over 11532.56 frames. ], batch size: 12, lr: 1.60e-02
2024-08-06 10:07:46,948 INFO [trainer.py:765] (6/8) Epoch 7, batch 800, train_loss[loss=2.775, ArTop10Accuracy=0.7791, over 10104.00 frames. ], tot_loss[loss=2.896, ArTop10Accuracy=0.7525, over 11648.59 frames. ], batch size: 12, lr: 1.59e-02
2024-08-06 10:09:02,822 INFO [trainer.py:765] (6/8) Epoch 7, batch 900, train_loss[loss=2.816, ArTop10Accuracy=0.7693, over 13011.00 frames. ], tot_loss[loss=2.89, ArTop10Accuracy=0.7538, over 11702.50 frames. ], batch size: 27, lr: 1.59e-02
2024-08-06 10:10:19,636 INFO [trainer.py:765] (6/8) Epoch 7, batch 1000, train_loss[loss=2.9, ArTop10Accuracy=0.7526, over 12774.00 frames. ], tot_loss[loss=2.897, ArTop10Accuracy=0.7523, over 11906.77 frames. ], batch size: 27, lr: 1.58e-02
2024-08-06 10:11:35,208 INFO [trainer.py:765] (6/8) Epoch 7, batch 1100, train_loss[loss=2.894, ArTop10Accuracy=0.7517, over 13650.00 frames. ], tot_loss[loss=2.902, ArTop10Accuracy=0.7512, over 11965.81 frames. ], batch size: 34, lr: 1.57e-02
2024-08-06 10:12:48,204 INFO [trainer.py:765] (6/8) Epoch 7, batch 1200, train_loss[loss=3.006, ArTop10Accuracy=0.7278, over 12288.00 frames. ], tot_loss[loss=2.898, ArTop10Accuracy=0.752, over 11848.72 frames. ], batch size: 101, lr: 1.57e-02
2024-08-06 10:13:46,782 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 10:15:03,600 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.017e+02 1.283e+02 1.410e+02 1.601e+02 1.017e+03, threshold=2.820e+02, percent-clipped=0.9
2024-08-06 10:15:40,820 INFO [trainer.py:765] (6/8) Epoch 8, batch 100, train_loss[loss=2.916, ArTop10Accuracy=0.7516, over 14502.00 frames. ], tot_loss[loss=2.882, ArTop10Accuracy=0.7548, over 4756.16 frames. ], batch size: 62, lr: 1.47e-02
2024-08-06 10:17:12,861 INFO [trainer.py:765] (6/8) Epoch 8, batch 200, train_loss[loss=2.881, ArTop10Accuracy=0.7545, over 13539.00 frames. ], tot_loss[loss=2.873, ArTop10Accuracy=0.7563, over 7759.48 frames. ], batch size: 34, lr: 1.46e-02
2024-08-06 10:18:37,898 INFO [trainer.py:765] (6/8) Epoch 8, batch 300, train_loss[loss=2.96, ArTop10Accuracy=0.7408, over 13821.00 frames. ], tot_loss[loss=2.871, ArTop10Accuracy=0.7571, over 9369.67 frames. ], batch size: 44, lr: 1.46e-02
2024-08-06 10:20:06,341 INFO [trainer.py:765] (6/8) Epoch 8, batch 400, train_loss[loss=2.908, ArTop10Accuracy=0.7466, over 10806.00 frames. ], tot_loss[loss=2.868, ArTop10Accuracy=0.7576, over 10268.57 frames. ], batch size: 15, lr: 1.45e-02
2024-08-06 10:21:32,411 INFO [trainer.py:765] (6/8) Epoch 8, batch 500, train_loss[loss=2.894, ArTop10Accuracy=0.7548, over 12519.00 frames. ], tot_loss[loss=2.863, ArTop10Accuracy=0.7589, over 10821.20 frames. ], batch size: 23, lr: 1.45e-02
2024-08-06 10:23:00,974 INFO [trainer.py:765] (6/8) Epoch 8, batch 600, train_loss[loss=2.724, ArTop10Accuracy=0.786, over 11475.00 frames. ], tot_loss[loss=2.864, ArTop10Accuracy=0.7585, over 11354.19 frames. ], batch size: 18, lr: 1.44e-02
2024-08-06 10:24:37,788 INFO [trainer.py:765] (6/8) Epoch 8, batch 700, train_loss[loss=2.836, ArTop10Accuracy=0.7659, over 10110.00 frames. ], tot_loss[loss=2.87, ArTop10Accuracy=0.7572, over 11488.52 frames. ], batch size: 12, lr: 1.43e-02
2024-08-06 10:25:56,085 INFO [trainer.py:765] (6/8) Epoch 8, batch 800, train_loss[loss=2.815, ArTop10Accuracy=0.7729, over 10086.00 frames. ], tot_loss[loss=2.873, ArTop10Accuracy=0.7568, over 11614.89 frames. ], batch size: 12, lr: 1.43e-02
2024-08-06 10:27:12,245 INFO [trainer.py:765] (6/8) Epoch 8, batch 900, train_loss[loss=2.778, ArTop10Accuracy=0.7739, over 13353.00 frames. ], tot_loss[loss=2.868, ArTop10Accuracy=0.758, over 11678.16 frames. ], batch size: 28, lr: 1.42e-02
2024-08-06 10:28:25,263 INFO [trainer.py:765] (6/8) Epoch 8, batch 1000, train_loss[loss=2.856, ArTop10Accuracy=0.7623, over 12663.00 frames. ], tot_loss[loss=2.871, ArTop10Accuracy=0.7573, over 11880.24 frames. ], batch size: 27, lr: 1.42e-02
2024-08-06 10:29:07,155 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 10:29:16,831 INFO [trainer.py:811] (6/8) Epoch 8, validation: loss=2.858, ArTop10Accuracy=0.7594, over 1827537.00 frames.
2024-08-06 10:29:16,831 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 29654MB
2024-08-06 10:29:17,491 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.032e+02 1.275e+02 1.390e+02 1.547e+02 3.717e+02, threshold=2.781e+02, percent-clipped=0.7
2024-08-06 10:29:51,731 INFO [trainer.py:765] (6/8) Epoch 8, batch 1100, train_loss[loss=2.932, ArTop10Accuracy=0.7465, over 13584.00 frames. ], tot_loss[loss=2.877, ArTop10Accuracy=0.7559, over 11946.18 frames. ], batch size: 34, lr: 1.41e-02
2024-08-06 10:31:05,946 INFO [trainer.py:765] (6/8) Epoch 8, batch 1200, train_loss[loss=2.927, ArTop10Accuracy=0.7476, over 12045.00 frames. ], tot_loss[loss=2.877, ArTop10Accuracy=0.7557, over 11857.21 frames. ], batch size: 101, lr: 1.40e-02
2024-08-06 10:32:05,758 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 10:34:01,256 INFO [trainer.py:765] (6/8) Epoch 9, batch 100, train_loss[loss=2.9, ArTop10Accuracy=0.7515, over 14604.00 frames. ], tot_loss[loss=2.86, ArTop10Accuracy=0.7584, over 4783.71 frames. ], batch size: 62, lr: 1.32e-02
2024-08-06 10:35:31,773 INFO [trainer.py:765] (6/8) Epoch 9, batch 200, train_loss[loss=2.829, ArTop10Accuracy=0.7662, over 13380.00 frames. ], tot_loss[loss=2.851, ArTop10Accuracy=0.7607, over 7756.80 frames. ], batch size: 34, lr: 1.32e-02
2024-08-06 10:36:57,928 INFO [trainer.py:765] (6/8) Epoch 9, batch 300, train_loss[loss=2.847, ArTop10Accuracy=0.7628, over 14202.00 frames. ], tot_loss[loss=2.847, ArTop10Accuracy=0.7615, over 9387.52 frames. ], batch size: 44, lr: 1.31e-02
2024-08-06 10:38:32,697 INFO [trainer.py:765] (6/8) Epoch 9, batch 400, train_loss[loss=2.716, ArTop10Accuracy=0.7936, over 10851.00 frames. ], tot_loss[loss=2.846, ArTop10Accuracy=0.7619, over 10294.73 frames. ], batch size: 15, lr: 1.31e-02
2024-08-06 10:39:59,256 INFO [trainer.py:765] (6/8) Epoch 9, batch 500, train_loss[loss=2.8, ArTop10Accuracy=0.7729, over 12210.00 frames. ], tot_loss[loss=2.842, ArTop10Accuracy=0.7626, over 10867.43 frames. ], batch size: 22, lr: 1.30e-02
2024-08-06 10:41:29,691 INFO [trainer.py:765] (6/8) Epoch 9, batch 600, train_loss[loss=2.781, ArTop10Accuracy=0.7675, over 11394.00 frames. ], tot_loss[loss=2.844, ArTop10Accuracy=0.7622, over 11375.14 frames. ], batch size: 18, lr: 1.30e-02
2024-08-06 10:42:58,441 INFO [trainer.py:765] (6/8) Epoch 9, batch 700, train_loss[loss=2.918, ArTop10Accuracy=0.7417, over 9318.00 frames. ], tot_loss[loss=2.848, ArTop10Accuracy=0.7614, over 11500.36 frames. ], batch size: 11, lr: 1.29e-02
2024-08-06 10:44:02,952 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.039e+02 1.253e+02 1.352e+02 1.493e+02 7.010e+02, threshold=2.704e+02, percent-clipped=0.6
2024-08-06 10:44:19,670 INFO [trainer.py:765] (6/8) Epoch 9, batch 800, train_loss[loss=2.715, ArTop10Accuracy=0.7933, over 10152.00 frames. ], tot_loss[loss=2.849, ArTop10Accuracy=0.7614, over 11616.55 frames. ], batch size: 12, lr: 1.29e-02
2024-08-06 10:45:35,720 INFO [trainer.py:765] (6/8) Epoch 9, batch 900, train_loss[loss=2.749, ArTop10Accuracy=0.7809, over 13059.00 frames. ], tot_loss[loss=2.843, ArTop10Accuracy=0.7623, over 11671.85 frames. ], batch size: 27, lr: 1.28e-02
2024-08-06 10:46:51,272 INFO [trainer.py:765] (6/8) Epoch 9, batch 1000, train_loss[loss=2.793, ArTop10Accuracy=0.7717, over 13047.00 frames. ], tot_loss[loss=2.848, ArTop10Accuracy=0.7618, over 11855.39 frames. ], batch size: 27, lr: 1.28e-02
2024-08-06 10:48:06,247 INFO [trainer.py:765] (6/8) Epoch 9, batch 1100, train_loss[loss=2.91, ArTop10Accuracy=0.7485, over 13737.00 frames. ], tot_loss[loss=2.854, ArTop10Accuracy=0.7603, over 11939.27 frames. ], batch size: 34, lr: 1.28e-02
2024-08-06 10:49:21,054 INFO [trainer.py:765] (6/8) Epoch 9, batch 1200, train_loss[loss=2.971, ArTop10Accuracy=0.7364, over 11847.00 frames. ], tot_loss[loss=2.857, ArTop10Accuracy=0.7598, over 11871.90 frames. ], batch size: 101, lr: 1.27e-02
2024-08-06 10:50:22,739 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 10:52:12,325 INFO [trainer.py:765] (6/8) Epoch 10, batch 100, train_loss[loss=2.875, ArTop10Accuracy=0.7539, over 14637.00 frames. ], tot_loss[loss=2.843, ArTop10Accuracy=0.762, over 4761.86 frames. ], batch size: 63, lr: 1.20e-02
2024-08-06 10:53:44,585 INFO [trainer.py:765] (6/8) Epoch 10, batch 200, train_loss[loss=2.909, ArTop10Accuracy=0.7444, over 13647.00 frames. ], tot_loss[loss=2.835, ArTop10Accuracy=0.7635, over 7759.99 frames. ], batch size: 34, lr: 1.20e-02
2024-08-06 10:55:08,089 INFO [trainer.py:765] (6/8) Epoch 10, batch 300, train_loss[loss=2.861, ArTop10Accuracy=0.761, over 14100.00 frames. ], tot_loss[loss=2.827, ArTop10Accuracy=0.765, over 9360.91 frames. ], batch size: 44, lr: 1.19e-02
2024-08-06 10:56:41,175 INFO [trainer.py:765] (6/8) Epoch 10, batch 400, train_loss[loss=2.692, ArTop10Accuracy=0.7867, over 10296.00 frames. ], tot_loss[loss=2.823, ArTop10Accuracy=0.766, over 10289.78 frames. ], batch size: 14, lr: 1.19e-02
2024-08-06 10:58:04,937 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 10:58:14,559 INFO [trainer.py:811] (6/8) Epoch 10, validation: loss=2.842, ArTop10Accuracy=0.7624, over 1827537.00 frames.
2024-08-06 10:58:14,560 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 29654MB
2024-08-06 10:58:15,572 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.035e+02 1.228e+02 1.320e+02 1.458e+02 6.096e+02, threshold=2.641e+02, percent-clipped=0.6
2024-08-06 10:58:15,576 INFO [trainer.py:765] (6/8) Epoch 10, batch 500, train_loss[loss=2.872, ArTop10Accuracy=0.7575, over 12288.00 frames. ], tot_loss[loss=2.823, ArTop10Accuracy=0.7662, over 10829.88 frames. ], batch size: 22, lr: 1.19e-02
2024-08-06 10:59:42,814 INFO [trainer.py:765] (6/8) Epoch 10, batch 600, train_loss[loss=2.743, ArTop10Accuracy=0.776, over 11544.00 frames. ], tot_loss[loss=2.825, ArTop10Accuracy=0.7657, over 11345.47 frames. ], batch size: 18, lr: 1.18e-02
2024-08-06 11:01:18,107 INFO [trainer.py:765] (6/8) Epoch 10, batch 700, train_loss[loss=2.692, ArTop10Accuracy=0.7882, over 10173.00 frames. ], tot_loss[loss=2.828, ArTop10Accuracy=0.765, over 11518.39 frames. ], batch size: 12, lr: 1.18e-02
2024-08-06 11:02:36,917 INFO [trainer.py:765] (6/8) Epoch 10, batch 800, train_loss[loss=2.795, ArTop10Accuracy=0.7671, over 9525.00 frames. ], tot_loss[loss=2.83, ArTop10Accuracy=0.7647, over 11631.64 frames. ], batch size: 11, lr: 1.17e-02
2024-08-06 11:03:51,211 INFO [trainer.py:765] (6/8) Epoch 10, batch 900, train_loss[loss=2.774, ArTop10Accuracy=0.7739, over 12909.00 frames. ], tot_loss[loss=2.828, ArTop10Accuracy=0.7651, over 11689.97 frames. ], batch size: 27, lr: 1.17e-02
2024-08-06 11:05:06,351 INFO [trainer.py:765] (6/8) Epoch 10, batch 1000, train_loss[loss=2.859, ArTop10Accuracy=0.7609, over 12915.00 frames. ], tot_loss[loss=2.834, ArTop10Accuracy=0.7639, over 11887.16 frames. ], batch size: 27, lr: 1.17e-02
2024-08-06 11:06:21,722 INFO [trainer.py:765] (6/8) Epoch 10, batch 1100, train_loss[loss=2.86, ArTop10Accuracy=0.7558, over 13509.00 frames. ], tot_loss[loss=2.837, ArTop10Accuracy=0.7633, over 11950.67 frames. ], batch size: 34, lr: 1.16e-02
2024-08-06 11:07:34,772 INFO [trainer.py:765] (6/8) Epoch 10, batch 1200, train_loss[loss=2.952, ArTop10Accuracy=0.7372, over 12240.00 frames. ], tot_loss[loss=2.841, ArTop10Accuracy=0.7625, over 11878.92 frames. ], batch size: 102, lr: 1.16e-02
2024-08-06 11:08:33,387 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 11:10:29,954 INFO [trainer.py:765] (6/8) Epoch 11, batch 100, train_loss[loss=2.866, ArTop10Accuracy=0.7539, over 14262.00 frames. ], tot_loss[loss=2.82, ArTop10Accuracy=0.7664, over 4767.89 frames. ], batch size: 62, lr: 1.10e-02
2024-08-06 11:12:04,673 INFO [trainer.py:765] (6/8) Epoch 11, batch 200, train_loss[loss=2.809, ArTop10Accuracy=0.7704, over 13536.00 frames. ], tot_loss[loss=2.812, ArTop10Accuracy=0.7679, over 7753.99 frames. ], batch size: 34, lr: 1.10e-02
2024-08-06 11:12:22,825 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 9.884e+01 1.240e+02 1.333e+02 1.457e+02 6.939e+02, threshold=2.667e+02, percent-clipped=0.1
2024-08-06 11:13:31,547 INFO [trainer.py:765] (6/8) Epoch 11, batch 300, train_loss[loss=2.827, ArTop10Accuracy=0.7662, over 13899.00 frames. ], tot_loss[loss=2.808, ArTop10Accuracy=0.7685, over 9366.37 frames. ], batch size: 44, lr: 1.09e-02
2024-08-06 11:15:03,268 INFO [trainer.py:765] (6/8) Epoch 11, batch 400, train_loss[loss=2.802, ArTop10Accuracy=0.7696, over 10353.00 frames. ], tot_loss[loss=2.807, ArTop10Accuracy=0.7689, over 10271.71 frames. ], batch size: 14, lr: 1.09e-02
2024-08-06 11:16:29,636 INFO [trainer.py:765] (6/8) Epoch 11, batch 500, train_loss[loss=2.814, ArTop10Accuracy=0.7663, over 12087.00 frames. ], tot_loss[loss=2.801, ArTop10Accuracy=0.7703, over 10832.24 frames. ], batch size: 22, lr: 1.09e-02
2024-08-06 11:18:00,516 INFO [trainer.py:765] (6/8) Epoch 11, batch 600, train_loss[loss=2.82, ArTop10Accuracy=0.7657, over 11532.00 frames. ], tot_loss[loss=2.805, ArTop10Accuracy=0.7695, over 11361.59 frames. ], batch size: 18, lr: 1.08e-02
2024-08-06 11:19:34,513 INFO [trainer.py:765] (6/8) Epoch 11, batch 700, train_loss[loss=2.577, ArTop10Accuracy=0.8116, over 10242.00 frames. ], tot_loss[loss=2.806, ArTop10Accuracy=0.7693, over 11536.07 frames. ], batch size: 12, lr: 1.08e-02
2024-08-06 11:20:55,482 INFO [trainer.py:765] (6/8) Epoch 11, batch 800, train_loss[loss=2.719, ArTop10Accuracy=0.7877, over 9489.00 frames. ], tot_loss[loss=2.811, ArTop10Accuracy=0.7682, over 11637.50 frames. ], batch size: 11, lr: 1.07e-02
2024-08-06 11:22:13,704 INFO [trainer.py:765] (6/8) Epoch 11, batch 900, train_loss[loss=2.831, ArTop10Accuracy=0.7682, over 12921.00 frames. ], tot_loss[loss=2.809, ArTop10Accuracy=0.7688, over 11674.79 frames. ], batch size: 27, lr: 1.07e-02
2024-08-06 11:23:31,798 INFO [trainer.py:765] (6/8) Epoch 11, batch 1000, train_loss[loss=2.826, ArTop10Accuracy=0.7626, over 12708.00 frames. ], tot_loss[loss=2.816, ArTop10Accuracy=0.7673, over 11858.76 frames. ], batch size: 27, lr: 1.07e-02
2024-08-06 11:24:46,901 INFO [trainer.py:765] (6/8) Epoch 11, batch 1100, train_loss[loss=2.864, ArTop10Accuracy=0.759, over 13605.00 frames. ], tot_loss[loss=2.826, ArTop10Accuracy=0.7654, over 11944.73 frames. ], batch size: 34, lr: 1.06e-02
2024-08-06 11:26:00,732 INFO [trainer.py:765] (6/8) Epoch 11, batch 1200, train_loss[loss=2.94, ArTop10Accuracy=0.7391, over 12612.00 frames. ], tot_loss[loss=2.826, ArTop10Accuracy=0.7655, over 11881.65 frames. ], batch size: 101, lr: 1.06e-02
2024-08-06 11:26:15,846 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 11:26:25,556 INFO [trainer.py:811] (6/8) Epoch 11, validation: loss=2.831, ArTop10Accuracy=0.7643, over 1827537.00 frames.
2024-08-06 11:26:25,557 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 30358MB
2024-08-06 11:26:26,184 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.029e+02 1.251e+02 1.335e+02 1.441e+02 2.942e+02, threshold=2.669e+02, percent-clipped=0.1
2024-08-06 11:27:09,581 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 11:29:03,451 INFO [trainer.py:765] (6/8) Epoch 12, batch 100, train_loss[loss=2.865, ArTop10Accuracy=0.7524, over 14409.00 frames. ], tot_loss[loss=2.809, ArTop10Accuracy=0.7679, over 4752.94 frames. ], batch size: 62, lr: 1.01e-02
2024-08-06 11:30:30,674 INFO [trainer.py:765] (6/8) Epoch 12, batch 200, train_loss[loss=2.826, ArTop10Accuracy=0.7657, over 13659.00 frames. ], tot_loss[loss=2.801, ArTop10Accuracy=0.7698, over 7751.77 frames. ], batch size: 34, lr: 1.01e-02
2024-08-06 11:31:57,655 INFO [trainer.py:765] (6/8) Epoch 12, batch 300, train_loss[loss=2.881, ArTop10Accuracy=0.7502, over 14427.00 frames. ], tot_loss[loss=2.795, ArTop10Accuracy=0.7709, over 9386.69 frames. ], batch size: 45, lr: 1.01e-02
2024-08-06 11:33:30,738 INFO [trainer.py:765] (6/8) Epoch 12, batch 400, train_loss[loss=2.72, ArTop10Accuracy=0.7858, over 10209.00 frames. ], tot_loss[loss=2.791, ArTop10Accuracy=0.7717, over 10294.20 frames. ], batch size: 14, lr: 1.00e-02
2024-08-06 11:34:55,733 INFO [trainer.py:765] (6/8) Epoch 12, batch 500, train_loss[loss=2.798, ArTop10Accuracy=0.7727, over 12255.00 frames. ], tot_loss[loss=2.787, ArTop10Accuracy=0.7727, over 10843.06 frames. ], batch size: 22, lr: 1.00e-02
2024-08-06 11:36:29,361 INFO [trainer.py:765] (6/8) Epoch 12, batch 600, train_loss[loss=2.749, ArTop10Accuracy=0.7803, over 12000.00 frames. ], tot_loss[loss=2.79, ArTop10Accuracy=0.7723, over 11370.61 frames. ], batch size: 19, lr: 9.97e-03
2024-08-06 11:38:00,344 INFO [trainer.py:765] (6/8) Epoch 12, batch 700, train_loss[loss=2.655, ArTop10Accuracy=0.8034, over 10221.00 frames. ], tot_loss[loss=2.797, ArTop10Accuracy=0.7707, over 11515.54 frames. ], batch size: 12, lr: 9.93e-03
2024-08-06 11:39:23,611 INFO [trainer.py:765] (6/8) Epoch 12, batch 800, train_loss[loss=2.715, ArTop10Accuracy=0.7911, over 10062.00 frames. ], tot_loss[loss=2.8, ArTop10Accuracy=0.77, over 11640.74 frames. ], batch size: 12, lr: 9.90e-03
2024-08-06 11:40:39,889 INFO [trainer.py:765] (6/8) Epoch 12, batch 900, train_loss[loss=2.857, ArTop10Accuracy=0.756, over 12738.00 frames. ], tot_loss[loss=2.797, ArTop10Accuracy=0.7708, over 11687.97 frames. ], batch size: 27, lr: 9.87e-03
2024-08-06 11:41:13,995 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.041e+02 1.248e+02 1.348e+02 1.459e+02 5.540e+02, threshold=2.695e+02, percent-clipped=0.3
2024-08-06 11:41:56,188 INFO [trainer.py:765] (6/8) Epoch 12, batch 1000, train_loss[loss=2.799, ArTop10Accuracy=0.7682, over 12924.00 frames. ], tot_loss[loss=2.799, ArTop10Accuracy=0.7705, over 11894.35 frames. ], batch size: 27, lr: 9.85e-03
2024-08-06 11:43:14,320 INFO [trainer.py:765] (6/8) Epoch 12, batch 1100, train_loss[loss=2.824, ArTop10Accuracy=0.7649, over 13695.00 frames. ], tot_loss[loss=2.804, ArTop10Accuracy=0.7697, over 11970.40 frames. ], batch size: 34, lr: 9.82e-03
2024-08-06 11:44:26,156 INFO [trainer.py:765] (6/8) Epoch 12, batch 1200, train_loss[loss=2.943, ArTop10Accuracy=0.7438, over 11631.00 frames. ], tot_loss[loss=2.806, ArTop10Accuracy=0.7694, over 11876.83 frames. ], batch size: 101, lr: 9.79e-03
2024-08-06 11:45:26,431 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 11:47:26,600 INFO [trainer.py:765] (6/8) Epoch 13, batch 100, train_loss[loss=2.817, ArTop10Accuracy=0.7696, over 14379.00 frames. ], tot_loss[loss=2.798, ArTop10Accuracy=0.77, over 4763.63 frames. ], batch size: 62, lr: 9.37e-03
2024-08-06 11:48:54,778 INFO [trainer.py:765] (6/8) Epoch 13, batch 200, train_loss[loss=2.781, ArTop10Accuracy=0.78, over 13704.00 frames. ], tot_loss[loss=2.788, ArTop10Accuracy=0.7723, over 7763.99 frames. ], batch size: 34, lr: 9.34e-03
2024-08-06 11:50:20,515 INFO [trainer.py:765] (6/8) Epoch 13, batch 300, train_loss[loss=2.859, ArTop10Accuracy=0.7575, over 14160.00 frames. ], tot_loss[loss=2.779, ArTop10Accuracy=0.7741, over 9394.50 frames. ], batch size: 44, lr: 9.31e-03
2024-08-06 11:51:48,764 INFO [trainer.py:765] (6/8) Epoch 13, batch 400, train_loss[loss=2.823, ArTop10Accuracy=0.7634, over 10341.00 frames. ], tot_loss[loss=2.776, ArTop10Accuracy=0.7748, over 10317.47 frames. ], batch size: 14, lr: 9.28e-03
2024-08-06 11:53:13,406 INFO [trainer.py:765] (6/8) Epoch 13, batch 500, train_loss[loss=2.745, ArTop10Accuracy=0.7847, over 12111.00 frames. ], tot_loss[loss=2.774, ArTop10Accuracy=0.7749, over 10867.95 frames. ], batch size: 22, lr: 9.26e-03
2024-08-06 11:54:52,222 INFO [trainer.py:765] (6/8) Epoch 13, batch 600, train_loss[loss=2.773, ArTop10Accuracy=0.7749, over 11472.00 frames. ], tot_loss[loss=2.781, ArTop10Accuracy=0.7737, over 11389.84 frames. ], batch size: 18, lr: 9.23e-03
2024-08-06 11:55:47,079 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 11:55:56,834 INFO [trainer.py:811] (6/8) Epoch 13, validation: loss=2.824, ArTop10Accuracy=0.7662, over 1827537.00 frames.
2024-08-06 11:55:56,835 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 32996MB
2024-08-06 11:55:57,712 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.064e+02 1.255e+02 1.343e+02 1.452e+02 4.888e+02, threshold=2.687e+02, percent-clipped=0.1
2024-08-06 11:56:28,464 INFO [trainer.py:765] (6/8) Epoch 13, batch 700, train_loss[loss=2.755, ArTop10Accuracy=0.7743, over 9240.00 frames. ], tot_loss[loss=2.785, ArTop10Accuracy=0.7732, over 11529.27 frames. ], batch size: 11, lr: 9.20e-03
2024-08-06 11:57:46,683 INFO [trainer.py:765] (6/8) Epoch 13, batch 800, train_loss[loss=2.678, ArTop10Accuracy=0.7995, over 10161.00 frames. ], tot_loss[loss=2.787, ArTop10Accuracy=0.7728, over 11666.26 frames. ], batch size: 12, lr: 9.18e-03
2024-08-06 11:59:03,286 INFO [trainer.py:765] (6/8) Epoch 13, batch 900, train_loss[loss=2.735, ArTop10Accuracy=0.7828, over 12972.00 frames. ], tot_loss[loss=2.78, ArTop10Accuracy=0.7743, over 11696.54 frames. ], batch size: 27, lr: 9.15e-03
2024-08-06 12:00:19,174 INFO [trainer.py:765] (6/8) Epoch 13, batch 1000, train_loss[loss=2.802, ArTop10Accuracy=0.7653, over 13188.00 frames. ], tot_loss[loss=2.784, ArTop10Accuracy=0.7734, over 11875.95 frames. ], batch size: 28, lr: 9.13e-03
2024-08-06 12:01:34,881 INFO [trainer.py:765] (6/8) Epoch 13, batch 1100, train_loss[loss=2.776, ArTop10Accuracy=0.7804, over 13716.00 frames. ], tot_loss[loss=2.791, ArTop10Accuracy=0.7721, over 11956.17 frames. ], batch size: 34, lr: 9.10e-03
2024-08-06 12:02:48,662 INFO [trainer.py:765] (6/8) Epoch 13, batch 1200, train_loss[loss=2.953, ArTop10Accuracy=0.7397, over 13203.00 frames. ], tot_loss[loss=2.793, ArTop10Accuracy=0.7716, over 11874.72 frames. ], batch size: 101, lr: 9.08e-03
2024-08-06 12:03:48,490 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 12:05:45,334 INFO [trainer.py:765] (6/8) Epoch 14, batch 100, train_loss[loss=2.79, ArTop10Accuracy=0.7743, over 14433.00 frames. ], tot_loss[loss=2.776, ArTop10Accuracy=0.7745, over 4759.85 frames. ], batch size: 62, lr: 8.71e-03
2024-08-06 12:07:16,604 INFO [trainer.py:765] (6/8) Epoch 14, batch 200, train_loss[loss=2.769, ArTop10Accuracy=0.779, over 13692.00 frames. ], tot_loss[loss=2.771, ArTop10Accuracy=0.7753, over 7748.44 frames. ], batch size: 34, lr: 8.69e-03
2024-08-06 12:08:44,311 INFO [trainer.py:765] (6/8) Epoch 14, batch 300, train_loss[loss=2.83, ArTop10Accuracy=0.7658, over 14106.00 frames. ], tot_loss[loss=2.765, ArTop10Accuracy=0.7766, over 9362.74 frames. ], batch size: 44, lr: 8.66e-03
2024-08-06 12:10:01,130 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.072e+02 1.266e+02 1.374e+02 1.483e+02 6.480e+02, threshold=2.748e+02, percent-clipped=0.2
2024-08-06 12:10:10,227 INFO [trainer.py:765] (6/8) Epoch 14, batch 400, train_loss[loss=2.692, ArTop10Accuracy=0.7942, over 10272.00 frames. ], tot_loss[loss=2.76, ArTop10Accuracy=0.7778, over 10277.08 frames. ], batch size: 14, lr: 8.64e-03
2024-08-06 12:11:36,151 INFO [trainer.py:765] (6/8) Epoch 14, batch 500, train_loss[loss=2.784, ArTop10Accuracy=0.774, over 12231.00 frames. ], tot_loss[loss=2.76, ArTop10Accuracy=0.7779, over 10848.71 frames. ], batch size: 22, lr: 8.62e-03
2024-08-06 12:13:05,994 INFO [trainer.py:765] (6/8) Epoch 14, batch 600, train_loss[loss=2.734, ArTop10Accuracy=0.782, over 11403.00 frames. ], tot_loss[loss=2.761, ArTop10Accuracy=0.7779, over 11369.47 frames. ], batch size: 18, lr: 8.59e-03
2024-08-06 12:14:38,553 INFO [trainer.py:765] (6/8) Epoch 14, batch 700, train_loss[loss=2.72, ArTop10Accuracy=0.7841, over 9363.00 frames. ], tot_loss[loss=2.765, ArTop10Accuracy=0.777, over 11518.27 frames. ], batch size: 11, lr: 8.57e-03
2024-08-06 12:15:58,070 INFO [trainer.py:765] (6/8) Epoch 14, batch 800, train_loss[loss=2.736, ArTop10Accuracy=0.7816, over 10083.00 frames. ], tot_loss[loss=2.769, ArTop10Accuracy=0.7761, over 11641.03 frames. ], batch size: 12, lr: 8.55e-03
2024-08-06 12:17:12,866 INFO [trainer.py:765] (6/8) Epoch 14, batch 900, train_loss[loss=2.893, ArTop10Accuracy=0.7504, over 12873.00 frames. ], tot_loss[loss=2.766, ArTop10Accuracy=0.7767, over 11702.17 frames. ], batch size: 27, lr: 8.52e-03
2024-08-06 12:18:29,614 INFO [trainer.py:765] (6/8) Epoch 14, batch 1000, train_loss[loss=2.769, ArTop10Accuracy=0.775, over 12948.00 frames. ], tot_loss[loss=2.771, ArTop10Accuracy=0.7759, over 11886.54 frames. ], batch size: 27, lr: 8.50e-03
2024-08-06 12:19:45,377 INFO [trainer.py:765] (6/8) Epoch 14, batch 1100, train_loss[loss=2.792, ArTop10Accuracy=0.7721, over 13494.00 frames. ], tot_loss[loss=2.779, ArTop10Accuracy=0.7743, over 11923.88 frames. ], batch size: 34, lr: 8.48e-03
2024-08-06 12:20:59,279 INFO [trainer.py:765] (6/8) Epoch 14, batch 1200, train_loss[loss=2.957, ArTop10Accuracy=0.7376, over 13002.00 frames. ], tot_loss[loss=2.779, ArTop10Accuracy=0.7743, over 11856.84 frames. ], batch size: 104, lr: 8.46e-03
2024-08-06 12:21:58,343 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 12:23:51,961 INFO [trainer.py:765] (6/8) Epoch 15, batch 100, train_loss[loss=2.806, ArTop10Accuracy=0.7697, over 14721.00 frames. ], tot_loss[loss=2.764, ArTop10Accuracy=0.7764, over 4782.17 frames. ], batch size: 62, lr: 8.14e-03
2024-08-06 12:24:00,599 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 12:24:10,290 INFO [trainer.py:811] (6/8) Epoch 15, validation: loss=2.819, ArTop10Accuracy=0.7675, over 1827537.00 frames.
2024-08-06 12:24:10,291 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 32996MB
2024-08-06 12:24:11,094 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.080e+02 1.284e+02 1.371e+02 1.488e+02 4.667e+02, threshold=2.743e+02, percent-clipped=0.2
2024-08-06 12:25:29,987 INFO [trainer.py:765] (6/8) Epoch 15, batch 200, train_loss[loss=2.735, ArTop10Accuracy=0.784, over 13680.00 frames. ], tot_loss[loss=2.757, ArTop10Accuracy=0.7782, over 7762.01 frames. ], batch size: 34, lr: 8.12e-03
2024-08-06 12:26:58,695 INFO [trainer.py:765] (6/8) Epoch 15, batch 300, train_loss[loss=2.759, ArTop10Accuracy=0.778, over 14037.00 frames. ], tot_loss[loss=2.754, ArTop10Accuracy=0.7789, over 9372.04 frames. ], batch size: 44, lr: 8.09e-03
2024-08-06 12:28:28,533 INFO [trainer.py:765] (6/8) Epoch 15, batch 400, train_loss[loss=2.646, ArTop10Accuracy=0.7968, over 10359.00 frames. ], tot_loss[loss=2.752, ArTop10Accuracy=0.7791, over 10297.47 frames. ], batch size: 14, lr: 8.07e-03
2024-08-06 12:29:54,031 INFO [trainer.py:765] (6/8) Epoch 15, batch 500, train_loss[loss=2.737, ArTop10Accuracy=0.7873, over 12321.00 frames. ], tot_loss[loss=2.746, ArTop10Accuracy=0.7803, over 10856.16 frames. ], batch size: 22, lr: 8.05e-03
2024-08-06 12:31:23,292 INFO [trainer.py:765] (6/8) Epoch 15, batch 600, train_loss[loss=2.682, ArTop10Accuracy=0.7952, over 11307.00 frames. ], tot_loss[loss=2.749, ArTop10Accuracy=0.7797, over 11384.85 frames. ], batch size: 18, lr: 8.03e-03
2024-08-06 12:32:53,175 INFO [trainer.py:765] (6/8) Epoch 15, batch 700, train_loss[loss=2.718, ArTop10Accuracy=0.7914, over 10119.00 frames. ], tot_loss[loss=2.752, ArTop10Accuracy=0.7794, over 11527.71 frames. ], batch size: 12, lr: 8.01e-03
2024-08-06 12:34:18,254 INFO [trainer.py:765] (6/8) Epoch 15, batch 800, train_loss[loss=2.788, ArTop10Accuracy=0.7694, over 10101.00 frames. ], tot_loss[loss=2.756, ArTop10Accuracy=0.7787, over 11637.91 frames. ], batch size: 12, lr: 7.99e-03
2024-08-06 12:35:34,726 INFO [trainer.py:765] (6/8) Epoch 15, batch 900, train_loss[loss=2.783, ArTop10Accuracy=0.7685, over 13059.00 frames. ], tot_loss[loss=2.751, ArTop10Accuracy=0.7798, over 11686.68 frames. ], batch size: 27, lr: 7.97e-03
2024-08-06 12:36:50,540 INFO [trainer.py:765] (6/8) Epoch 15, batch 1000, train_loss[loss=2.65, ArTop10Accuracy=0.8008, over 12771.00 frames. ], tot_loss[loss=2.755, ArTop10Accuracy=0.779, over 11872.69 frames. ], batch size: 27, lr: 7.95e-03
2024-08-06 12:38:05,178 INFO [trainer.py:765] (6/8) Epoch 15, batch 1100, train_loss[loss=2.793, ArTop10Accuracy=0.7725, over 13560.00 frames. ], tot_loss[loss=2.765, ArTop10Accuracy=0.777, over 11943.50 frames. ], batch size: 34, lr: 7.93e-03
2024-08-06 12:38:12,841 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.080e+02 1.293e+02 1.379e+02 1.467e+02 2.824e+02, threshold=2.759e+02, percent-clipped=0.1
2024-08-06 12:39:18,788 INFO [trainer.py:765] (6/8) Epoch 15, batch 1200, train_loss[loss=2.871, ArTop10Accuracy=0.7557, over 12357.00 frames. ], tot_loss[loss=2.766, ArTop10Accuracy=0.7769, over 11881.76 frames. ], batch size: 101, lr: 7.91e-03
2024-08-06 12:40:18,514 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 12:42:17,618 INFO [trainer.py:765] (6/8) Epoch 16, batch 100, train_loss[loss=2.841, ArTop10Accuracy=0.7601, over 14646.00 frames. ], tot_loss[loss=2.746, ArTop10Accuracy=0.7803, over 4751.87 frames. ], batch size: 62, lr: 7.63e-03
2024-08-06 12:43:49,564 INFO [trainer.py:765] (6/8) Epoch 16, batch 200, train_loss[loss=2.797, ArTop10Accuracy=0.7643, over 13659.00 frames. ], tot_loss[loss=2.74, ArTop10Accuracy=0.7815, over 7742.92 frames. ], batch size: 34, lr: 7.61e-03
2024-08-06 12:45:18,501 INFO [trainer.py:765] (6/8) Epoch 16, batch 300, train_loss[loss=2.836, ArTop10Accuracy=0.7623, over 14430.00 frames. ], tot_loss[loss=2.739, ArTop10Accuracy=0.7816, over 9375.84 frames. ], batch size: 44, lr: 7.59e-03
2024-08-06 12:46:45,208 INFO [trainer.py:765] (6/8) Epoch 16, batch 400, train_loss[loss=2.658, ArTop10Accuracy=0.8016, over 10086.00 frames. ], tot_loss[loss=2.739, ArTop10Accuracy=0.7816, over 10295.80 frames. ], batch size: 14, lr: 7.58e-03
2024-08-06 12:48:16,310 INFO [trainer.py:765] (6/8) Epoch 16, batch 500, train_loss[loss=2.604, ArTop10Accuracy=0.8112, over 12246.00 frames. ], tot_loss[loss=2.732, ArTop10Accuracy=0.7829, over 10853.76 frames. ], batch size: 22, lr: 7.56e-03
2024-08-06 12:49:46,641 INFO [trainer.py:765] (6/8) Epoch 16, batch 600, train_loss[loss=2.725, ArTop10Accuracy=0.7851, over 11598.00 frames. ], tot_loss[loss=2.738, ArTop10Accuracy=0.782, over 11364.55 frames. ], batch size: 18, lr: 7.54e-03
2024-08-06 12:51:23,681 INFO [trainer.py:765] (6/8) Epoch 16, batch 700, train_loss[loss=2.59, ArTop10Accuracy=0.8143, over 10077.00 frames. ], tot_loss[loss=2.742, ArTop10Accuracy=0.781, over 11512.75 frames. ], batch size: 12, lr: 7.52e-03
2024-08-06 12:52:43,500 INFO [trainer.py:765] (6/8) Epoch 16, batch 800, train_loss[loss=2.67, ArTop10Accuracy=0.7927, over 9582.00 frames. ], tot_loss[loss=2.746, ArTop10Accuracy=0.7802, over 11628.57 frames. ], batch size: 11, lr: 7.51e-03
2024-08-06 12:53:06,015 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 12:53:15,497 INFO [trainer.py:811] (6/8) Epoch 16, validation: loss=2.816, ArTop10Accuracy=0.7678, over 1827537.00 frames.
2024-08-06 12:53:15,497 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 32996MB
2024-08-06 12:53:16,186 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.112e+02 1.291e+02 1.391e+02 1.487e+02 3.459e+02, threshold=2.783e+02, percent-clipped=0.1
2024-08-06 12:54:06,482 INFO [trainer.py:765] (6/8) Epoch 16, batch 900, train_loss[loss=2.788, ArTop10Accuracy=0.772, over 12915.00 frames. ], tot_loss[loss=2.741, ArTop10Accuracy=0.7813, over 11680.34 frames. ], batch size: 27, lr: 7.49e-03
2024-08-06 12:55:19,791 INFO [trainer.py:765] (6/8) Epoch 16, batch 1000, train_loss[loss=2.737, ArTop10Accuracy=0.7785, over 12723.00 frames. ], tot_loss[loss=2.748, ArTop10Accuracy=0.7801, over 11873.29 frames. ], batch size: 27, lr: 7.47e-03
2024-08-06 12:56:33,162 INFO [trainer.py:765] (6/8) Epoch 16, batch 1100, train_loss[loss=2.79, ArTop10Accuracy=0.7745, over 13440.00 frames. ], tot_loss[loss=2.758, ArTop10Accuracy=0.7782, over 11956.98 frames. ], batch size: 34, lr: 7.45e-03
2024-08-06 12:57:48,485 INFO [trainer.py:765] (6/8) Epoch 16, batch 1200, train_loss[loss=2.865, ArTop10Accuracy=0.7548, over 12549.00 frames. ], tot_loss[loss=2.755, ArTop10Accuracy=0.7788, over 11849.71 frames. ], batch size: 101, lr: 7.44e-03
2024-08-06 12:58:48,420 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
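Note: ArTop10Accuracy is the fraction of frames whose ground-truth token for the autoregressive (first) codebook appears among the model's 10 highest-scoring candidates. A generic top-k accuracy over per-frame logits captures the idea; this is a sketch, not necessarily the exact metric code in trainer.py.

import torch

def top_k_accuracy(logits, targets, k=10, ignore_index=-100):
    # logits:  (num_frames, vocab_size) AR predictions for the first codebook
    # targets: (num_frames,) ground-truth codec token ids
    mask = targets != ignore_index                 # skip padded frames
    topk = logits[mask].topk(k, dim=-1).indices    # (frames, k)
    hits = (topk == targets[mask].unsqueeze(-1)).any(dim=-1)
    return hits.float().mean().item()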
2024-08-06 13:00:47,899 INFO [trainer.py:765] (6/8) Epoch 17, batch 100, train_loss[loss=2.875, ArTop10Accuracy=0.76, over 14706.00 frames. ], tot_loss[loss=2.742, ArTop10Accuracy=0.7804, over 4765.55 frames. ], batch size: 62, lr: 7.18e-03
2024-08-06 13:02:19,302 INFO [trainer.py:765] (6/8) Epoch 17, batch 200, train_loss[loss=2.777, ArTop10Accuracy=0.7749, over 13542.00 frames. ], tot_loss[loss=2.734, ArTop10Accuracy=0.7822, over 7755.64 frames. ], batch size: 34, lr: 7.17e-03
2024-08-06 13:03:45,515 INFO [trainer.py:765] (6/8) Epoch 17, batch 300, train_loss[loss=2.828, ArTop10Accuracy=0.7632, over 14058.00 frames. ], tot_loss[loss=2.732, ArTop10Accuracy=0.7828, over 9393.18 frames. ], batch size: 44, lr: 7.15e-03
2024-08-06 13:05:21,759 INFO [trainer.py:765] (6/8) Epoch 17, batch 400, train_loss[loss=2.758, ArTop10Accuracy=0.7779, over 10314.00 frames. ], tot_loss[loss=2.73, ArTop10Accuracy=0.7832, over 10294.26 frames. ], batch size: 14, lr: 7.14e-03
2024-08-06 13:06:47,020 INFO [trainer.py:765] (6/8) Epoch 17, batch 500, train_loss[loss=2.681, ArTop10Accuracy=0.7888, over 12366.00 frames. ], tot_loss[loss=2.725, ArTop10Accuracy=0.7843, over 10838.45 frames. ], batch size: 22, lr: 7.12e-03
2024-08-06 13:07:39,878 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.140e+02 1.293e+02 1.386e+02 1.488e+02 3.253e+02, threshold=2.772e+02, percent-clipped=0.1
2024-08-06 13:08:22,687 INFO [trainer.py:765] (6/8) Epoch 17, batch 600, train_loss[loss=2.668, ArTop10Accuracy=0.794, over 11466.00 frames. ], tot_loss[loss=2.728, ArTop10Accuracy=0.7836, over 11379.70 frames. ], batch size: 18, lr: 7.10e-03
2024-08-06 13:09:54,834 INFO [trainer.py:765] (6/8) Epoch 17, batch 700, train_loss[loss=2.621, ArTop10Accuracy=0.8043, over 10122.00 frames. ], tot_loss[loss=2.734, ArTop10Accuracy=0.7823, over 11526.17 frames. ], batch size: 12, lr: 7.09e-03
2024-08-06 13:11:19,479 INFO [trainer.py:765] (6/8) Epoch 17, batch 800, train_loss[loss=2.704, ArTop10Accuracy=0.79, over 9333.00 frames. ], tot_loss[loss=2.737, ArTop10Accuracy=0.7818, over 11640.17 frames. ], batch size: 11, lr: 7.07e-03
2024-08-06 13:12:35,668 INFO [trainer.py:765] (6/8) Epoch 17, batch 900, train_loss[loss=2.734, ArTop10Accuracy=0.7787, over 12672.00 frames. ], tot_loss[loss=2.735, ArTop10Accuracy=0.7826, over 11668.05 frames. ], batch size: 27, lr: 7.06e-03
2024-08-06 13:13:53,060 INFO [trainer.py:765] (6/8) Epoch 17, batch 1000, train_loss[loss=2.713, ArTop10Accuracy=0.7853, over 12852.00 frames. ], tot_loss[loss=2.74, ArTop10Accuracy=0.7818, over 11869.79 frames. ], batch size: 27, lr: 7.04e-03
2024-08-06 13:15:08,483 INFO [trainer.py:765] (6/8) Epoch 17, batch 1100, train_loss[loss=2.706, ArTop10Accuracy=0.7856, over 13716.00 frames. ], tot_loss[loss=2.744, ArTop10Accuracy=0.781, over 11962.45 frames. ], batch size: 34, lr: 7.02e-03
2024-08-06 13:16:22,387 INFO [trainer.py:765] (6/8) Epoch 17, batch 1200, train_loss[loss=2.909, ArTop10Accuracy=0.7491, over 12246.00 frames. ], tot_loss[loss=2.744, ArTop10Accuracy=0.781, over 11873.02 frames. ], batch size: 101, lr: 7.01e-03
2024-08-06 13:17:22,043 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
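Note: in each report, train_loss[...] refers to the current batch, while tot_loss[...] is a smoothed aggregate: its fractional "over N frames" count climbs from about 4,750 at batch 100 toward roughly the average batch's frame count (~11,900), which is the behaviour of an exponential moving average over (loss x frames, frames) pairs. A sketch with an assumed smoothing constant of 1/200:

class SmoothedStats:
    # Exponential moving average over (loss * frames, frames); with
    # alpha = 1/200 the 'over N frames' counts of tot_loss in this log are
    # reproduced closely (the exact smoothing constant is an assumption).
    def __init__(self, alpha=1.0 / 200):
        self.alpha = alpha
        self.loss_frames = 0.0
        self.frames = 0.0

    def update(self, batch_loss, num_frames):
        self.loss_frames = (1 - self.alpha) * self.loss_frames + self.alpha * batch_loss * num_frames
        self.frames = (1 - self.alpha) * self.frames + self.alpha * num_frames

    @property
    def value(self):
        return self.loss_frames / max(self.frames, 1e-8)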
2024-08-06 13:19:15,993 INFO [trainer.py:765] (6/8) Epoch 18, batch 100, train_loss[loss=2.776, ArTop10Accuracy=0.7781, over 14277.00 frames. ], tot_loss[loss=2.73, ArTop10Accuracy=0.783, over 4750.54 frames. ], batch size: 62, lr: 6.78e-03
2024-08-06 13:20:46,600 INFO [trainer.py:765] (6/8) Epoch 18, batch 200, train_loss[loss=2.672, ArTop10Accuracy=0.7921, over 13611.00 frames. ], tot_loss[loss=2.726, ArTop10Accuracy=0.7837, over 7748.30 frames. ], batch size: 34, lr: 6.77e-03
2024-08-06 13:21:55,104 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 13:22:04,751 INFO [trainer.py:811] (6/8) Epoch 18, validation: loss=2.817, ArTop10Accuracy=0.768, over 1827537.00 frames.
2024-08-06 13:22:04,752 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 32996MB
2024-08-06 13:22:05,473 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.131e+02 1.323e+02 1.409e+02 1.514e+02 3.209e+02, threshold=2.818e+02, percent-clipped=0.1
2024-08-06 13:22:26,581 INFO [trainer.py:765] (6/8) Epoch 18, batch 300, train_loss[loss=2.838, ArTop10Accuracy=0.7638, over 13896.00 frames. ], tot_loss[loss=2.72, ArTop10Accuracy=0.7851, over 9377.58 frames. ], batch size: 44, lr: 6.76e-03
2024-08-06 13:23:57,930 INFO [trainer.py:765] (6/8) Epoch 18, batch 400, train_loss[loss=2.641, ArTop10Accuracy=0.8014, over 10359.00 frames. ], tot_loss[loss=2.715, ArTop10Accuracy=0.7864, over 10291.94 frames. ], batch size: 14, lr: 6.74e-03
2024-08-06 13:25:34,013 INFO [trainer.py:765] (6/8) Epoch 18, batch 500, train_loss[loss=2.668, ArTop10Accuracy=0.7913, over 12363.00 frames. ], tot_loss[loss=2.707, ArTop10Accuracy=0.7878, over 10844.48 frames. ], batch size: 22, lr: 6.73e-03
2024-08-06 13:27:00,634 INFO [trainer.py:765] (6/8) Epoch 18, batch 600, train_loss[loss=2.764, ArTop10Accuracy=0.7797, over 11328.00 frames. ], tot_loss[loss=2.715, ArTop10Accuracy=0.7862, over 11349.84 frames. ], batch size: 18, lr: 6.71e-03
2024-08-06 13:28:33,582 INFO [trainer.py:765] (6/8) Epoch 18, batch 700, train_loss[loss=2.591, ArTop10Accuracy=0.8194, over 10095.00 frames. ], tot_loss[loss=2.722, ArTop10Accuracy=0.7846, over 11508.81 frames. ], batch size: 12, lr: 6.70e-03
2024-08-06 13:29:54,986 INFO [trainer.py:765] (6/8) Epoch 18, batch 800, train_loss[loss=2.675, ArTop10Accuracy=0.7975, over 10242.00 frames. ], tot_loss[loss=2.727, ArTop10Accuracy=0.7839, over 11641.75 frames. ], batch size: 12, lr: 6.68e-03
2024-08-06 13:31:12,519 INFO [trainer.py:765] (6/8) Epoch 18, batch 900, train_loss[loss=2.736, ArTop10Accuracy=0.785, over 12867.00 frames. ], tot_loss[loss=2.723, ArTop10Accuracy=0.7847, over 11684.78 frames. ], batch size: 27, lr: 6.67e-03
2024-08-06 13:32:26,552 INFO [trainer.py:765] (6/8) Epoch 18, batch 1000, train_loss[loss=2.675, ArTop10Accuracy=0.7963, over 13053.00 frames. ], tot_loss[loss=2.728, ArTop10Accuracy=0.7837, over 11887.14 frames. ], batch size: 27, lr: 6.66e-03
2024-08-06 13:33:41,498 INFO [trainer.py:765] (6/8) Epoch 18, batch 1100, train_loss[loss=2.771, ArTop10Accuracy=0.7765, over 13674.00 frames. ], tot_loss[loss=2.737, ArTop10Accuracy=0.7822, over 11950.20 frames. ], batch size: 34, lr: 6.64e-03
2024-08-06 13:34:54,674 INFO [trainer.py:765] (6/8) Epoch 18, batch 1200, train_loss[loss=2.864, ArTop10Accuracy=0.756, over 12729.00 frames. ], tot_loss[loss=2.737, ArTop10Accuracy=0.7823, over 11875.52 frames. ], batch size: 103, lr: 6.63e-03
2024-08-06 13:35:51,064 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.124e+02 1.340e+02 1.433e+02 1.533e+02 2.444e+02, threshold=2.867e+02, percent-clipped=0.0
2024-08-06 13:35:54,178 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
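Note: batch sizes in these lines swing from about 11 to over 100 while the per-batch frame counts stay near 10-14k, which is what a duration-capped, bucketing sampler produces: batches of long utterances hold few cuts, batches of short ones hold many. An illustrative duration-capped grouping (a simplification, not lhotse's actual DynamicBucketingSampler, which also buckets by length and shuffles):

def duration_capped_batches(cut_durations, max_batch_duration):
    # Cap each batch's total duration so batch *size* varies inversely with
    # utterance length while total frames per batch stay roughly constant.
    batch, batch_dur = [], 0.0
    for dur in cut_durations:
        if batch and batch_dur + dur > max_batch_duration:
            yield batch
            batch, batch_dur = [], 0.0
        batch.append(dur)
        batch_dur += dur
    if batch:
        yield batch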
2024-08-06 13:37:48,625 INFO [trainer.py:765] (6/8) Epoch 19, batch 100, train_loss[loss=2.803, ArTop10Accuracy=0.7693, over 14697.00 frames. ], tot_loss[loss=2.725, ArTop10Accuracy=0.784, over 4767.13 frames. ], batch size: 62, lr: 6.43e-03
2024-08-06 13:39:23,257 INFO [trainer.py:765] (6/8) Epoch 19, batch 200, train_loss[loss=2.731, ArTop10Accuracy=0.7833, over 13434.00 frames. ], tot_loss[loss=2.714, ArTop10Accuracy=0.7859, over 7768.01 frames. ], batch size: 34, lr: 6.41e-03
2024-08-06 13:40:48,360 INFO [trainer.py:765] (6/8) Epoch 19, batch 300, train_loss[loss=2.752, ArTop10Accuracy=0.7792, over 14064.00 frames. ], tot_loss[loss=2.709, ArTop10Accuracy=0.7868, over 9391.40 frames. ], batch size: 44, lr: 6.40e-03
2024-08-06 13:42:21,067 INFO [trainer.py:765] (6/8) Epoch 19, batch 400, train_loss[loss=2.602, ArTop10Accuracy=0.8087, over 10113.00 frames. ], tot_loss[loss=2.704, ArTop10Accuracy=0.7881, over 10296.90 frames. ], batch size: 14, lr: 6.39e-03
2024-08-06 13:43:44,955 INFO [trainer.py:765] (6/8) Epoch 19, batch 500, train_loss[loss=2.709, ArTop10Accuracy=0.7881, over 12324.00 frames. ], tot_loss[loss=2.702, ArTop10Accuracy=0.7884, over 10837.31 frames. ], batch size: 22, lr: 6.37e-03
2024-08-06 13:45:16,682 INFO [trainer.py:765] (6/8) Epoch 19, batch 600, train_loss[loss=2.716, ArTop10Accuracy=0.7878, over 11298.00 frames. ], tot_loss[loss=2.709, ArTop10Accuracy=0.7873, over 11358.08 frames. ], batch size: 18, lr: 6.36e-03
2024-08-06 13:46:48,324 INFO [trainer.py:765] (6/8) Epoch 19, batch 700, train_loss[loss=2.64, ArTop10Accuracy=0.7995, over 9423.00 frames. ], tot_loss[loss=2.713, ArTop10Accuracy=0.7867, over 11509.57 frames. ], batch size: 11, lr: 6.35e-03
2024-08-06 13:48:11,884 INFO [trainer.py:765] (6/8) Epoch 19, batch 800, train_loss[loss=2.655, ArTop10Accuracy=0.7969, over 9987.00 frames. ], tot_loss[loss=2.718, ArTop10Accuracy=0.7858, over 11618.78 frames. ], batch size: 12, lr: 6.34e-03
2024-08-06 13:49:27,259 INFO [trainer.py:765] (6/8) Epoch 19, batch 900, train_loss[loss=2.658, ArTop10Accuracy=0.7996, over 12774.00 frames. ], tot_loss[loss=2.717, ArTop10Accuracy=0.7859, over 11675.63 frames. ], batch size: 27, lr: 6.32e-03
2024-08-06 13:50:40,655 INFO [trainer.py:803] (6/8) Computing validation loss
2024-08-06 13:50:50,537 INFO [trainer.py:811] (6/8) Epoch 19, validation: loss=2.818, ArTop10Accuracy=0.7679, over 1827537.00 frames.
2024-08-06 13:50:50,537 INFO [trainer.py:814] (6/8) Maximum memory allocated so far is 33001MB
2024-08-06 13:50:51,489 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.161e+02 1.371e+02 1.455e+02 1.550e+02 3.697e+02, threshold=2.909e+02, percent-clipped=0.2
2024-08-06 13:50:52,915 INFO [trainer.py:765] (6/8) Epoch 19, batch 1000, train_loss[loss=2.717, ArTop10Accuracy=0.7874, over 13323.00 frames. ], tot_loss[loss=2.718, ArTop10Accuracy=0.7857, over 11862.01 frames. ], batch size: 28, lr: 6.31e-03
2024-08-06 13:52:08,265 INFO [trainer.py:765] (6/8) Epoch 19, batch 1100, train_loss[loss=2.755, ArTop10Accuracy=0.7818, over 13548.00 frames. ], tot_loss[loss=2.724, ArTop10Accuracy=0.7847, over 11965.46 frames. ], batch size: 34, lr: 6.30e-03
2024-08-06 13:53:22,311 INFO [trainer.py:765] (6/8) Epoch 19, batch 1200, train_loss[loss=2.826, ArTop10Accuracy=0.7675, over 12309.00 frames. ], tot_loss[loss=2.726, ArTop10Accuracy=0.7843, over 11867.08 frames. ], batch size: 101, lr: 6.28e-03
2024-08-06 13:54:21,695 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 13:56:12,904 INFO [trainer.py:765] (6/8) Epoch 20, batch 100, train_loss[loss=2.797, ArTop10Accuracy=0.7733, over 14463.00 frames. ], tot_loss[loss=2.713, ArTop10Accuracy=0.7859, over 4772.15 frames. ], batch size: 62, lr: 6.10e-03
2024-08-06 13:57:42,494 INFO [trainer.py:765] (6/8) Epoch 20, batch 200, train_loss[loss=2.71, ArTop10Accuracy=0.793, over 13584.00 frames. ], tot_loss[loss=2.708, ArTop10Accuracy=0.7872, over 7731.54 frames. ], batch size: 34, lr: 6.09e-03
2024-08-06 13:59:15,430 INFO [trainer.py:765] (6/8) Epoch 20, batch 300, train_loss[loss=2.815, ArTop10Accuracy=0.7639, over 14337.00 frames. ], tot_loss[loss=2.703, ArTop10Accuracy=0.7882, over 9366.66 frames. ], batch size: 44, lr: 6.08e-03
2024-08-06 14:00:44,356 INFO [trainer.py:765] (6/8) Epoch 20, batch 400, train_loss[loss=2.476, ArTop10Accuracy=0.8339, over 10269.00 frames. ], tot_loss[loss=2.698, ArTop10Accuracy=0.7895, over 10281.80 frames. ], batch size: 14, lr: 6.07e-03
2024-08-06 14:02:14,854 INFO [trainer.py:765] (6/8) Epoch 20, batch 500, train_loss[loss=2.669, ArTop10Accuracy=0.7987, over 12246.00 frames. ], tot_loss[loss=2.695, ArTop10Accuracy=0.7901, over 10844.32 frames. ], batch size: 22, lr: 6.06e-03
2024-08-06 14:03:40,855 INFO [trainer.py:765] (6/8) Epoch 20, batch 600, train_loss[loss=2.728, ArTop10Accuracy=0.7828, over 11280.00 frames. ], tot_loss[loss=2.698, ArTop10Accuracy=0.7897, over 11360.98 frames. ], batch size: 18, lr: 6.04e-03
2024-08-06 14:05:13,864 INFO [trainer.py:765] (6/8) Epoch 20, batch 700, train_loss[loss=2.634, ArTop10Accuracy=0.8074, over 9540.00 frames. ], tot_loss[loss=2.701, ArTop10Accuracy=0.7889, over 11505.93 frames. ], batch size: 11, lr: 6.03e-03
2024-08-06 14:05:30,791 INFO [optim.py:386] (6/8) Clipping_scale=2.0, grad-norm quartiles 1.180e+02 1.365e+02 1.456e+02 1.550e+02 3.525e+02, threshold=2.913e+02, percent-clipped=0.1
2024-08-06 14:06:34,509 INFO [trainer.py:765] (6/8) Epoch 20, batch 800, train_loss[loss=2.617, ArTop10Accuracy=0.8054, over 9417.00 frames. ], tot_loss[loss=2.708, ArTop10Accuracy=0.7877, over 11629.12 frames. ], batch size: 11, lr: 6.02e-03
2024-08-06 14:07:50,944 INFO [trainer.py:765] (6/8) Epoch 20, batch 900, train_loss[loss=2.679, ArTop10Accuracy=0.7932, over 12960.00 frames. ], tot_loss[loss=2.702, ArTop10Accuracy=0.7887, over 11692.16 frames. ], batch size: 27, lr: 6.01e-03
2024-08-06 14:09:07,172 INFO [trainer.py:765] (6/8) Epoch 20, batch 1000, train_loss[loss=2.667, ArTop10Accuracy=0.7966, over 12981.00 frames. ], tot_loss[loss=2.707, ArTop10Accuracy=0.7878, over 11875.75 frames. ], batch size: 27, lr: 6.00e-03
2024-08-06 14:10:21,209 INFO [trainer.py:765] (6/8) Epoch 20, batch 1100, train_loss[loss=2.763, ArTop10Accuracy=0.7709, over 13563.00 frames. ], tot_loss[loss=2.715, ArTop10Accuracy=0.786, over 11962.47 frames. ], batch size: 34, lr: 5.99e-03
2024-08-06 14:11:37,812 INFO [trainer.py:765] (6/8) Epoch 20, batch 1200, train_loss[loss=2.868, ArTop10Accuracy=0.7602, over 12078.00 frames. ], tot_loss[loss=2.715, ArTop10Accuracy=0.7859, over 11889.77 frames. ], batch size: 101, lr: 5.98e-03
2024-08-06 14:12:37,393 INFO [trainer.py:650] (6/8) Reaches end of dataloader.
2024-08-06 14:12:37,395 INFO [trainer.py:1069] (6/8) Done!
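Note: once training finishes, the per-batch lines above are easy to turn into loss/accuracy curves. A small parser tailored to this log's line format; the file name in the usage comment is hypothetical.

import re

LINE_RE = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), "
    r"train_loss\[loss=(?P<loss>[\d.]+), ArTop10Accuracy=(?P<acc>[\d.]+)"
)

def parse_train_lines(path):
    # Yield (epoch, batch, train_loss, ArTop10Accuracy) from lines like the
    # ones above; validation and grad-norm lines are simply skipped.
    with open(path, encoding="utf-8") as f:
        for line in f:
            m = LINE_RE.search(line)
            if m:
                yield (int(m["epoch"]), int(m["batch"]),
                       float(m["loss"]), float(m["acc"]))

# Usage (hypothetical file name):
# points = list(parse_train_lines("train.log"))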