2023-02-26 08:23:18,928 32k INFO {'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 32000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'meril': 0}, 'model_dir': './logs\\32k'}
2023-02-26 08:23:18,928 32k WARNING K:\AI\so-vits-svc-32k is not a git repository, therefore hash value comparison will be ignored.
2023-02-26 08:23:34,708 32k INFO emb_g.weight is not in the checkpoint
2023-02-26 08:23:34,818 32k INFO Loaded checkpoint './logs\32k\G_0.pth' (iteration 1)
2023-02-26 08:23:35,956 32k INFO Loaded checkpoint './logs\32k\D_0.pth' (iteration 1)
2023-02-26 08:24:07,398 32k INFO Train Epoch: 1 [0%]
2023-02-26 08:24:07,399 32k INFO [6.396151542663574, 2.3192086219787598, 11.437691688537598, 50.04783248901367, 12.884584426879883, 0, 0.0001]
2023-02-26 08:24:14,320 32k INFO Saving model and optimizer state at iteration 1 to ./logs\32k\G_0.pth
2023-02-26 08:24:29,149 32k INFO Saving model and optimizer state at iteration 1 to ./logs\32k\D_0.pth
2023-02-26 08:26:57,094 32k INFO ====> Epoch: 1
2023-02-26 08:27:48,937 32k INFO Train Epoch: 2 [21%]
2023-02-26 08:27:48,937 32k INFO [2.6703596115112305, 2.1580634117126465, 11.49161434173584, 22.269886016845703, 1.027299404144287, 200, 9.99875e-05]
2023-02-26 08:29:40,443 32k INFO ====> Epoch: 2
2023-02-26 08:31:02,090 32k INFO Train Epoch: 3 [42%]
2023-02-26 08:31:02,091 32k INFO [2.5759048461914062, 2.196209192276001, 13.65181827545166, 22.816627502441406, 0.9087493419647217, 400, 9.99750015625e-05]
2023-02-26 08:32:23,299 32k INFO ====> Epoch: 3
2023-02-26 08:34:14,621 32k INFO Train Epoch: 4 [64%]
2023-02-26 08:34:14,621 32k INFO [2.3012750148773193, 2.5668628215789795, 8.850838661193848, 16.783226013183594, 1.1684110164642334, 600, 9.996250468730469e-05]
2023-02-26 08:35:05,788 32k INFO ====> Epoch: 4
2023-02-26 08:37:27,454 32k INFO Train Epoch: 5 [85%]
2023-02-26 08:37:27,455 32k INFO [2.5453319549560547, 2.3678677082061768, 11.264457702636719, 20.05609130859375, 1.1281580924987793, 800, 9.995000937421877e-05]
2023-02-26 08:37:48,604 32k INFO ====> Epoch: 5
2023-02-26 08:40:30,764 32k INFO ====> Epoch: 6
2023-02-26 08:41:00,695 32k INFO Train Epoch: 7 [6%]
2023-02-26 08:41:00,696 32k INFO [2.663148880004883, 2.078253746032715, 7.356672286987305, 15.606974601745605, 0.9389564990997314, 1000, 9.99250234335941e-05]
2023-02-26 08:41:05,158 32k INFO Saving model and optimizer state at iteration 7 to ./logs\32k\G_1000.pth
2023-02-26 08:41:18,568 32k INFO Saving model and optimizer state at iteration 7 to ./logs\32k\D_1000.pth
2023-02-26 08:43:34,949 32k INFO ====> Epoch: 7
2023-02-26
08:44:35,017 32k INFO Train Epoch: 8 [27%] 2023-02-26 08:44:35,017 32k INFO [2.3168065547943115, 2.6603329181671143, 8.887020111083984, 17.153154373168945, 1.0433679819107056, 1200, 9.991253280566489e-05] 2023-02-26 08:46:17,519 32k INFO ====> Epoch: 8 2023-02-26 08:47:47,469 32k INFO Train Epoch: 9 [48%] 2023-02-26 08:47:47,469 32k INFO [2.3874289989471436, 2.434887170791626, 11.490910530090332, 21.34469985961914, 0.8404753804206848, 1400, 9.990004373906418e-05] 2023-02-26 08:49:00,098 32k INFO ====> Epoch: 9 2023-02-26 08:51:00,214 32k INFO Train Epoch: 10 [70%] 2023-02-26 08:51:00,214 32k INFO [2.1876630783081055, 2.5636796951293945, 12.61434555053711, 23.038005828857422, 0.9913359880447388, 1600, 9.98875562335968e-05] 2023-02-26 08:51:42,829 32k INFO ====> Epoch: 10 2023-02-26 08:54:12,770 32k INFO Train Epoch: 11 [91%] 2023-02-26 08:54:12,771 32k INFO [2.3796133995056152, 2.9137446880340576, 11.033517837524414, 21.80364227294922, 0.69889235496521, 1800, 9.987507028906759e-05] 2023-02-26 08:54:25,282 32k INFO ====> Epoch: 11 2023-02-26 08:57:07,397 32k INFO ====> Epoch: 12 2023-02-26 08:57:45,982 32k INFO Train Epoch: 13 [12%] 2023-02-26 08:57:45,982 32k INFO [2.590365409851074, 2.162517547607422, 12.567984580993652, 20.611303329467773, 0.8790818452835083, 2000, 9.98501030820433e-05] 2023-02-26 08:57:50,455 32k INFO Saving model and optimizer state at iteration 13 to ./logs\32k\G_2000.pth 2023-02-26 08:58:06,096 32k INFO Saving model and optimizer state at iteration 13 to ./logs\32k\D_2000.pth 2023-02-26 09:00:13,474 32k INFO ====> Epoch: 13 2023-02-26 09:01:22,067 32k INFO Train Epoch: 14 [33%] 2023-02-26 09:01:22,067 32k INFO [2.7683796882629395, 2.2413697242736816, 7.833517551422119, 17.857086181640625, 1.1680740118026733, 2200, 9.983762181915804e-05] 2023-02-26 09:02:56,328 32k INFO ====> Epoch: 14 2023-02-26 09:04:34,974 32k INFO Train Epoch: 15 [55%] 2023-02-26 09:04:34,974 32k INFO [2.3525094985961914, 2.3344459533691406, 12.257208824157715, 19.89115333557129, 0.9875838160514832, 2400, 9.982514211643064e-05] 2023-02-26 09:05:39,025 32k INFO ====> Epoch: 15 2023-02-26 09:07:47,693 32k INFO Train Epoch: 16 [76%] 2023-02-26 09:07:47,693 32k INFO [2.6893608570098877, 2.344364881515503, 10.605229377746582, 19.090167999267578, 0.780343234539032, 2600, 9.981266397366609e-05] 2023-02-26 09:08:21,682 32k INFO ====> Epoch: 16 2023-02-26 09:11:00,144 32k INFO Train Epoch: 17 [97%] 2023-02-26 09:11:00,144 32k INFO [2.524367570877075, 2.1253821849823, 10.096750259399414, 19.844287872314453, 1.2072694301605225, 2800, 9.980018739066937e-05] 2023-02-26 09:11:04,092 32k INFO ====> Epoch: 17 2023-02-26 09:13:46,167 32k INFO ====> Epoch: 18 2023-02-26 09:14:33,368 32k INFO Train Epoch: 19 [18%] 2023-02-26 09:14:33,368 32k INFO [2.5664799213409424, 2.009101629257202, 15.13039493560791, 22.3335018157959, 0.9496199488639832, 3000, 9.977523890319963e-05] 2023-02-26 09:14:37,794 32k INFO Saving model and optimizer state at iteration 19 to ./logs\32k\G_3000.pth 2023-02-26 09:14:55,196 32k INFO Saving model and optimizer state at iteration 19 to ./logs\32k\D_3000.pth 2023-02-26 09:16:54,310 32k INFO ====> Epoch: 19 2023-02-26 09:18:11,582 32k INFO Train Epoch: 20 [39%] 2023-02-26 09:18:11,582 32k INFO [2.4653780460357666, 2.1782279014587402, 10.213360786437988, 17.996280670166016, 1.3770432472229004, 3200, 9.976276699833672e-05] 2023-02-26 09:19:37,287 32k INFO ====> Epoch: 20 2023-02-26 09:21:24,337 32k INFO Train Epoch: 21 [61%] 2023-02-26 09:21:24,337 32k INFO [2.3505122661590576, 2.285341262817383, 
12.031506538391113, 21.24757957458496, 0.6532730460166931, 3400, 9.975029665246193e-05] 2023-02-26 09:22:19,751 32k INFO ====> Epoch: 21 2023-02-26 09:24:36,978 32k INFO Train Epoch: 22 [82%] 2023-02-26 09:24:36,979 32k INFO [2.295872688293457, 2.435206413269043, 13.174171447753906, 20.36532974243164, 1.0691040754318237, 3600, 9.973782786538036e-05] 2023-02-26 09:25:02,486 32k INFO ====> Epoch: 22 2023-02-26 09:27:44,742 32k INFO ====> Epoch: 23 2023-02-26 09:28:10,473 32k INFO Train Epoch: 24 [3%] 2023-02-26 09:28:10,474 32k INFO [2.4777655601501465, 2.438159942626953, 10.104680061340332, 20.890073776245117, 0.7508599758148193, 3800, 9.971289496681757e-05] 2023-02-26 09:30:27,585 32k INFO ====> Epoch: 24 2023-02-26 09:31:23,330 32k INFO Train Epoch: 25 [24%] 2023-02-26 09:31:23,331 32k INFO [2.547969341278076, 2.103365421295166, 8.601760864257812, 17.903499603271484, 1.1798760890960693, 4000, 9.970043085494672e-05] 2023-02-26 09:31:27,787 32k INFO Saving model and optimizer state at iteration 25 to ./logs\32k\G_4000.pth 2023-02-26 09:31:42,830 32k INFO Saving model and optimizer state at iteration 25 to ./logs\32k\D_4000.pth 2023-02-26 09:33:33,295 32k INFO ====> Epoch: 25 2023-02-26 09:34:58,902 32k INFO Train Epoch: 26 [45%] 2023-02-26 09:34:58,903 32k INFO [2.509209632873535, 2.2358999252319336, 11.01781940460205, 20.53192710876465, 1.0602046251296997, 4200, 9.968796830108985e-05] 2023-02-26 09:36:15,852 32k INFO ====> Epoch: 26 2023-02-26 09:38:11,412 32k INFO Train Epoch: 27 [67%] 2023-02-26 09:38:11,412 32k INFO [2.666337490081787, 2.08825421333313, 9.963523864746094, 18.926380157470703, 1.1029638051986694, 4400, 9.967550730505221e-05] 2023-02-26 09:38:58,150 32k INFO ====> Epoch: 27 2023-02-26 09:41:23,952 32k INFO Train Epoch: 28 [88%] 2023-02-26 09:41:23,953 32k INFO [2.410893678665161, 2.41351318359375, 11.717813491821289, 20.35811996459961, 0.8942963480949402, 4600, 9.966304786663908e-05] 2023-02-26 09:41:40,917 32k INFO ====> Epoch: 28 2023-02-26 09:44:23,546 32k INFO ====> Epoch: 29 2023-02-26 09:44:57,880 32k INFO Train Epoch: 30 [9%] 2023-02-26 09:44:57,880 32k INFO [2.4157967567443848, 2.2903125286102295, 10.10302448272705, 17.93283462524414, 0.8192426562309265, 4800, 9.963813366190753e-05] 2023-02-26 09:47:06,129 32k INFO ====> Epoch: 30 2023-02-26 09:48:10,482 32k INFO Train Epoch: 31 [30%] 2023-02-26 09:48:10,483 32k INFO [2.456360101699829, 2.292372465133667, 11.75523567199707, 22.287111282348633, 1.0126503705978394, 5000, 9.962567889519979e-05] 2023-02-26 09:48:14,931 32k INFO Saving model and optimizer state at iteration 31 to ./logs\32k\G_5000.pth 2023-02-26 09:48:33,569 32k INFO Saving model and optimizer state at iteration 31 to ./logs\32k\D_5000.pth 2023-02-26 09:50:15,366 32k INFO ====> Epoch: 31 2023-02-26 09:51:49,619 32k INFO Train Epoch: 32 [52%] 2023-02-26 09:51:49,619 32k INFO [2.4675090312957764, 2.139700412750244, 10.635783195495605, 18.904586791992188, 1.133800745010376, 5200, 9.961322568533789e-05] 2023-02-26 09:52:57,795 32k INFO ====> Epoch: 32 2023-02-26 09:55:02,298 32k INFO Train Epoch: 33 [73%] 2023-02-26 09:55:02,299 32k INFO [2.346632480621338, 2.284864902496338, 10.70773696899414, 16.828781127929688, 0.8554806113243103, 5400, 9.960077403212722e-05] 2023-02-26 09:55:40,602 32k INFO ====> Epoch: 33 2023-02-26 09:58:15,316 32k INFO Train Epoch: 34 [94%] 2023-02-26 09:58:15,316 32k INFO [2.448796510696411, 2.3974175453186035, 9.994577407836914, 20.724594116210938, 0.7800424098968506, 5600, 9.95883239353732e-05] 2023-02-26 09:58:23,670 32k INFO 
====> Epoch: 34 2023-02-26 10:01:06,450 32k INFO ====> Epoch: 35 2023-02-26 10:01:49,384 32k INFO Train Epoch: 36 [15%] 2023-02-26 10:01:49,384 32k INFO [2.4834535121917725, 2.215736150741577, 10.72994613647461, 20.1037654876709, 0.9641714096069336, 5800, 9.956342841045691e-05] 2023-02-26 10:03:49,561 32k INFO ====> Epoch: 36 2023-02-26 10:05:02,545 32k INFO Train Epoch: 37 [36%] 2023-02-26 10:05:02,546 32k INFO [2.2858619689941406, 2.297046422958374, 13.459763526916504, 20.353960037231445, 0.8896880745887756, 6000, 9.95509829819056e-05] 2023-02-26 10:05:07,000 32k INFO Saving model and optimizer state at iteration 37 to ./logs\32k\G_6000.pth 2023-02-26 10:05:21,749 32k INFO Saving model and optimizer state at iteration 37 to ./logs\32k\D_6000.pth 2023-02-26 10:06:55,120 32k INFO ====> Epoch: 37 2023-02-26 10:08:38,237 32k INFO Train Epoch: 38 [58%] 2023-02-26 10:08:38,237 32k INFO [2.468832015991211, 2.1250123977661133, 9.363271713256836, 16.582752227783203, 1.190895915031433, 6200, 9.953853910903285e-05] 2023-02-26 10:09:38,270 32k INFO ====> Epoch: 38 2023-02-26 10:11:51,557 32k INFO Train Epoch: 39 [79%] 2023-02-26 10:11:51,557 32k INFO [2.4695611000061035, 2.5815343856811523, 10.841808319091797, 20.596200942993164, 1.0436564683914185, 6400, 9.952609679164422e-05] 2023-02-26 10:12:21,372 32k INFO ====> Epoch: 39 2023-02-26 10:15:03,991 32k INFO ====> Epoch: 40 2023-02-26 10:15:25,470 32k INFO Train Epoch: 41 [0%] 2023-02-26 10:15:25,470 32k INFO [2.4647908210754395, 2.4343149662017822, 9.655537605285645, 17.267290115356445, 1.031686782836914, 6600, 9.950121682254156e-05] 2023-02-26 10:17:46,940 32k INFO ====> Epoch: 41 2023-02-26 10:18:38,447 32k INFO Train Epoch: 42 [21%] 2023-02-26 10:18:38,447 32k INFO [2.5329630374908447, 2.130812644958496, 11.61275863647461, 16.99370574951172, 0.6277658939361572, 6800, 9.948877917043875e-05] 2023-02-26 10:20:29,739 32k INFO ====> Epoch: 42 2023-02-26 10:21:51,358 32k INFO Train Epoch: 43 [42%] 2023-02-26 10:21:51,358 32k INFO [2.4816935062408447, 2.166879415512085, 8.776912689208984, 14.728988647460938, 0.7365485429763794, 7000, 9.947634307304244e-05] 2023-02-26 10:21:55,799 32k INFO Saving model and optimizer state at iteration 43 to ./logs\32k\G_7000.pth 2023-02-26 10:22:13,584 32k INFO Saving model and optimizer state at iteration 43 to ./logs\32k\D_7000.pth 2023-02-26 10:23:38,459 32k INFO ====> Epoch: 43 2023-02-26 10:25:30,320 32k INFO Train Epoch: 44 [64%] 2023-02-26 10:25:30,321 32k INFO [2.4604673385620117, 2.3776729106903076, 9.516617774963379, 18.740528106689453, 0.7489030361175537, 7200, 9.94639085301583e-05] 2023-02-26 10:26:21,626 32k INFO ====> Epoch: 44 2023-02-26 10:28:43,231 32k INFO Train Epoch: 45 [85%] 2023-02-26 10:28:43,231 32k INFO [2.274491786956787, 2.4330806732177734, 11.296477317810059, 20.537822723388672, 0.806971549987793, 7400, 9.945147554159202e-05] 2023-02-26 10:29:04,414 32k INFO ====> Epoch: 45 2023-02-26 10:31:47,148 32k INFO ====> Epoch: 46 2023-02-26 10:32:17,260 32k INFO Train Epoch: 47 [6%] 2023-02-26 10:32:17,260 32k INFO [2.7507975101470947, 2.126884698867798, 6.915186405181885, 13.514629364013672, 1.2603626251220703, 7600, 9.942661422663591e-05] 2023-02-26 10:34:30,480 32k INFO ====> Epoch: 47 2023-02-26 10:35:30,556 32k INFO Train Epoch: 48 [27%] 2023-02-26 10:35:30,556 32k INFO [2.0968923568725586, 2.708631992340088, 13.675420761108398, 20.606611251831055, 0.8771938681602478, 7800, 9.941418589985758e-05] 2023-02-26 10:37:13,522 32k INFO ====> Epoch: 48 2023-02-26 10:38:43,695 32k INFO Train Epoch: 49 
[48%] 2023-02-26 10:38:43,695 32k INFO [2.3763391971588135, 2.3627359867095947, 9.687402725219727, 16.78105354309082, 1.0562866926193237, 8000, 9.940175912662009e-05] 2023-02-26 10:38:48,175 32k INFO Saving model and optimizer state at iteration 49 to ./logs\32k\G_8000.pth 2023-02-26 10:39:01,774 32k INFO Saving model and optimizer state at iteration 49 to ./logs\32k\D_8000.pth 2023-02-26 10:40:17,615 32k INFO ====> Epoch: 49 2023-02-26 10:42:17,724 32k INFO Train Epoch: 50 [70%] 2023-02-26 10:42:17,725 32k INFO [2.344200611114502, 2.1781373023986816, 11.780096054077148, 18.40574073791504, 1.086164116859436, 8200, 9.938933390672926e-05] 2023-02-26 10:43:00,363 32k INFO ====> Epoch: 50 2023-02-26 10:45:30,408 32k INFO Train Epoch: 51 [91%] 2023-02-26 10:45:30,408 32k INFO [2.4205174446105957, 2.7213199138641357, 7.73087215423584, 15.820646286010742, 0.7065295577049255, 8400, 9.937691023999092e-05] 2023-02-26 10:45:43,007 32k INFO ====> Epoch: 51 2023-02-26 10:48:26,056 32k INFO ====> Epoch: 52 2023-02-26 10:49:04,736 32k INFO Train Epoch: 53 [12%] 2023-02-26 10:49:04,737 32k INFO [2.5237555503845215, 2.2003116607666016, 8.608362197875977, 19.112735748291016, 0.7628692984580994, 8600, 9.935206756519513e-05] 2023-02-26 10:51:09,259 32k INFO ====> Epoch: 53 2023-02-26 10:52:17,858 32k INFO Train Epoch: 54 [33%] 2023-02-26 10:52:17,858 32k INFO [2.260878562927246, 2.6713953018188477, 8.389069557189941, 18.49284553527832, 0.44008728861808777, 8800, 9.933964855674948e-05] 2023-02-26 10:53:51,921 32k INFO ====> Epoch: 54 2023-02-26 10:55:30,509 32k INFO Train Epoch: 55 [55%] 2023-02-26 10:55:30,509 32k INFO [2.3566603660583496, 2.293156385421753, 10.89488410949707, 21.621196746826172, 0.8128052949905396, 9000, 9.932723110067987e-05] 2023-02-26 10:55:34,991 32k INFO Saving model and optimizer state at iteration 55 to ./logs\32k\G_9000.pth 2023-02-26 10:55:51,219 32k INFO Saving model and optimizer state at iteration 55 to ./logs\32k\D_9000.pth 2023-02-26 10:56:58,333 32k INFO ====> Epoch: 55 2023-02-26 10:59:07,126 32k INFO Train Epoch: 56 [76%] 2023-02-26 10:59:07,126 32k INFO [2.519491195678711, 2.2075395584106445, 8.736771583557129, 16.377058029174805, 1.0564581155776978, 9200, 9.931481519679228e-05] 2023-02-26 10:59:41,285 32k INFO ====> Epoch: 56 2023-02-26 11:02:20,544 32k INFO Train Epoch: 57 [97%] 2023-02-26 11:02:20,544 32k INFO [2.3525781631469727, 2.154360771179199, 6.794205665588379, 13.52789306640625, 0.7326827645301819, 9400, 9.930240084489267e-05] 2023-02-26 11:02:24,511 32k INFO ====> Epoch: 57 2023-02-26 11:05:07,452 32k INFO ====> Epoch: 58 2023-02-26 11:05:54,673 32k INFO Train Epoch: 59 [18%] 2023-02-26 11:05:54,673 32k INFO [2.4005348682403564, 2.302424907684326, 13.667900085449219, 19.535924911499023, 1.1588077545166016, 9600, 9.927757679628145e-05] 2023-02-26 11:07:50,410 32k INFO ====> Epoch: 59 2023-02-26 11:09:07,843 32k INFO Train Epoch: 60 [39%] 2023-02-26 11:09:07,843 32k INFO [2.6450674533843994, 2.2238292694091797, 9.370787620544434, 17.595104217529297, 1.1768231391906738, 9800, 9.926516709918191e-05] 2023-02-26 11:10:33,485 32k INFO ====> Epoch: 60 2023-02-26 11:12:21,000 32k INFO Train Epoch: 61 [61%] 2023-02-26 11:12:21,000 32k INFO [2.4990792274475098, 2.169142961502075, 10.941349983215332, 20.05807876586914, 0.4903837740421295, 10000, 9.92527589532945e-05] 2023-02-26 11:12:25,586 32k INFO Saving model and optimizer state at iteration 61 to ./logs\32k\G_10000.pth 2023-02-26 11:12:44,877 32k INFO Saving model and optimizer state at iteration 61 to 
./logs\32k\D_10000.pth 2023-02-26 11:13:43,760 32k INFO ====> Epoch: 61 2023-02-26 11:16:01,215 32k INFO Train Epoch: 62 [82%] 2023-02-26 11:16:01,215 32k INFO [2.5079026222229004, 2.3164236545562744, 11.353753089904785, 17.586023330688477, 0.8176776170730591, 10200, 9.924035235842533e-05] 2023-02-26 11:16:26,741 32k INFO ====> Epoch: 62 2023-02-26 11:19:09,446 32k INFO ====> Epoch: 63 2023-02-26 11:19:35,364 32k INFO Train Epoch: 64 [3%] 2023-02-26 11:19:35,364 32k INFO [2.4500768184661865, 2.264669418334961, 10.30024528503418, 20.142595291137695, 0.8843652606010437, 10400, 9.921554382096622e-05] 2023-02-26 11:21:51,912 32k INFO ====> Epoch: 64 2023-02-26 11:22:47,692 32k INFO Train Epoch: 65 [24%] 2023-02-26 11:22:47,693 32k INFO [2.3112714290618896, 2.271970510482788, 11.571227073669434, 20.092790603637695, 1.156628966331482, 10600, 9.92031418779886e-05] 2023-02-26 11:24:34,238 32k INFO ====> Epoch: 65 2023-02-26 11:26:00,281 32k INFO Train Epoch: 66 [45%] 2023-02-26 11:26:00,281 32k INFO [2.2577104568481445, 2.4161815643310547, 13.695369720458984, 20.6165714263916, 0.7836306095123291, 10800, 9.919074148525384e-05] 2023-02-26 11:27:17,333 32k INFO ====> Epoch: 66 2023-02-26 11:29:13,378 32k INFO Train Epoch: 67 [67%] 2023-02-26 11:29:13,379 32k INFO [2.5594491958618164, 2.362276554107666, 8.985106468200684, 17.9657039642334, 0.5958060026168823, 11000, 9.917834264256819e-05] 2023-02-26 11:29:17,966 32k INFO Saving model and optimizer state at iteration 67 to ./logs\32k\G_11000.pth 2023-02-26 11:29:37,399 32k INFO Saving model and optimizer state at iteration 67 to ./logs\32k\D_11000.pth 2023-02-26 11:30:27,649 32k INFO ====> Epoch: 67 2023-02-26 11:32:53,499 32k INFO Train Epoch: 68 [88%] 2023-02-26 11:32:53,500 32k INFO [2.488551378250122, 2.086569309234619, 7.120002746582031, 15.225802421569824, 1.0212639570236206, 11200, 9.916594534973787e-05] 2023-02-26 11:33:10,300 32k INFO ====> Epoch: 68 2023-02-26 11:35:52,703 32k INFO ====> Epoch: 69 2023-02-26 11:36:27,149 32k INFO Train Epoch: 70 [9%] 2023-02-26 11:36:27,150 32k INFO [2.421487808227539, 2.171816825866699, 10.875814437866211, 18.283443450927734, 0.7764310836791992, 11400, 9.914115541286833e-05] 2023-02-26 11:38:35,072 32k INFO ====> Epoch: 70 2023-02-26 11:39:39,324 32k INFO Train Epoch: 71 [30%] 2023-02-26 11:39:39,325 32k INFO [2.53536319732666, 2.252779722213745, 8.362150192260742, 18.167621612548828, 0.8004449009895325, 11600, 9.912876276844171e-05] 2023-02-26 11:41:17,298 32k INFO ====> Epoch: 71 2023-02-26 11:42:51,836 32k INFO Train Epoch: 72 [52%] 2023-02-26 11:42:51,836 32k INFO [2.354383945465088, 2.426551342010498, 10.844928741455078, 19.34088897705078, 0.8941074013710022, 11800, 9.911637167309565e-05] 2023-02-26 11:43:59,941 32k INFO ====> Epoch: 72 2023-02-26 11:46:04,250 32k INFO Train Epoch: 73 [73%] 2023-02-26 11:46:04,250 32k INFO [2.426401376724243, 2.322526454925537, 12.05467700958252, 20.506017684936523, 0.779582142829895, 12000, 9.910398212663652e-05] 2023-02-26 11:46:08,787 32k INFO Saving model and optimizer state at iteration 73 to ./logs\32k\G_12000.pth 2023-02-26 11:46:29,376 32k INFO Saving model and optimizer state at iteration 73 to ./logs\32k\D_12000.pth 2023-02-26 11:47:11,053 32k INFO ====> Epoch: 73 2023-02-26 11:49:45,489 32k INFO Train Epoch: 74 [94%] 2023-02-26 11:49:45,490 32k INFO [2.5343637466430664, 2.27656888961792, 7.482431411743164, 15.527031898498535, 1.1111173629760742, 12200, 9.909159412887068e-05] 2023-02-26 11:49:53,749 32k INFO ====> Epoch: 74 2023-02-26 11:52:35,964 32k INFO 
====> Epoch: 75 2023-02-26 11:53:19,010 32k INFO Train Epoch: 76 [15%] 2023-02-26 11:53:19,011 32k INFO [2.406834363937378, 2.1646063327789307, 11.008843421936035, 19.259048461914062, 0.9326943159103394, 12400, 9.906682277864462e-05] 2023-02-26 11:55:19,408 32k INFO ====> Epoch: 76 2023-02-26 11:56:32,760 32k INFO Train Epoch: 77 [36%] 2023-02-26 11:56:32,761 32k INFO [2.4522836208343506, 2.1643474102020264, 8.926955223083496, 16.831026077270508, 0.900156557559967, 12600, 9.905443942579728e-05] 2023-02-26 11:58:02,659 32k INFO ====> Epoch: 77 2023-02-26 11:59:45,816 32k INFO Train Epoch: 78 [58%] 2023-02-26 11:59:45,816 32k INFO [2.5271518230438232, 1.9529402256011963, 10.654559135437012, 18.19529914855957, 0.8628219366073608, 12800, 9.904205762086905e-05] 2023-02-26 12:00:45,628 32k INFO ====> Epoch: 78 2023-02-26 12:02:58,997 32k INFO Train Epoch: 79 [79%] 2023-02-26 12:02:58,997 32k INFO [2.6350643634796143, 2.541313648223877, 8.186237335205078, 15.443192481994629, 0.9821919202804565, 13000, 9.902967736366644e-05] 2023-02-26 12:03:03,614 32k INFO Saving model and optimizer state at iteration 79 to ./logs\32k\G_13000.pth 2023-02-26 12:03:21,550 32k INFO Saving model and optimizer state at iteration 79 to ./logs\32k\D_13000.pth 2023-02-26 12:03:54,598 32k INFO ====> Epoch: 79 2023-02-26 12:06:37,363 32k INFO ====> Epoch: 80 2023-02-26 12:06:58,998 32k INFO Train Epoch: 81 [0%] 2023-02-26 12:06:58,998 32k INFO [2.4078915119171143, 2.3199801445007324, 11.23653793334961, 20.276243209838867, 1.1815028190612793, 13200, 9.900492149166423e-05] 2023-02-26 12:09:20,572 32k INFO ====> Epoch: 81 2023-02-26 12:10:12,077 32k INFO Train Epoch: 82 [21%] 2023-02-26 12:10:12,077 32k INFO [2.6164400577545166, 2.0303988456726074, 9.276334762573242, 16.739444732666016, 0.6553909778594971, 13400, 9.899254587647776e-05] 2023-02-26 12:12:03,130 32k INFO ====> Epoch: 82 2023-02-26 12:13:24,471 32k INFO Train Epoch: 83 [42%] 2023-02-26 12:13:24,471 32k INFO [2.5159566402435303, 2.14263916015625, 9.134819984436035, 15.940903663635254, 0.8988858461380005, 13600, 9.89801718082432e-05] 2023-02-26 12:14:45,738 32k INFO ====> Epoch: 83 2023-02-26 12:16:37,387 32k INFO Train Epoch: 84 [64%] 2023-02-26 12:16:37,387 32k INFO [2.477391242980957, 2.244145393371582, 10.206537246704102, 18.64247703552246, 1.085805058479309, 13800, 9.896779928676716e-05] 2023-02-26 12:17:28,668 32k INFO ====> Epoch: 84 2023-02-26 12:19:51,028 32k INFO Train Epoch: 85 [85%] 2023-02-26 12:19:51,028 32k INFO [2.391843557357788, 2.3440544605255127, 9.451476097106934, 18.754911422729492, 0.7714139819145203, 14000, 9.895542831185631e-05] 2023-02-26 12:19:55,663 32k INFO Saving model and optimizer state at iteration 85 to ./logs\32k\G_14000.pth 2023-02-26 12:20:14,439 32k INFO Saving model and optimizer state at iteration 85 to ./logs\32k\D_14000.pth 2023-02-26 12:20:38,984 32k INFO ====> Epoch: 85 2023-02-26 12:23:21,881 32k INFO ====> Epoch: 86 2023-02-26 12:23:52,036 32k INFO Train Epoch: 87 [6%] 2023-02-26 12:23:52,036 32k INFO [2.5880343914031982, 2.229883909225464, 9.676251411437988, 17.177593231201172, 1.0161173343658447, 14200, 9.89306910009569e-05] 2023-02-26 12:26:04,685 32k INFO ====> Epoch: 87 2023-02-26 12:27:04,860 32k INFO Train Epoch: 88 [27%] 2023-02-26 12:27:04,861 32k INFO [2.4286739826202393, 2.340498685836792, 8.028197288513184, 17.648345947265625, 0.772789716720581, 14400, 9.891832466458178e-05] 2023-02-26 12:28:47,879 32k INFO ====> Epoch: 88 2023-02-26 12:30:17,930 32k INFO Train Epoch: 89 [48%] 2023-02-26 12:30:17,930 32k 
INFO [2.43446683883667, 2.1952078342437744, 9.100362777709961, 15.110132217407227, 0.9286537170410156, 14600, 9.89059598739987e-05] 2023-02-26 12:31:30,914 32k INFO ====> Epoch: 89 2023-02-26 12:33:31,510 32k INFO Train Epoch: 90 [70%] 2023-02-26 12:33:31,510 32k INFO [2.373197078704834, 2.1595218181610107, 11.149375915527344, 18.03667640686035, 1.4922605752944946, 14800, 9.889359662901445e-05] 2023-02-26 12:34:14,202 32k INFO ====> Epoch: 90 2023-02-26 12:36:44,894 32k INFO Train Epoch: 91 [91%] 2023-02-26 12:36:44,894 32k INFO [2.510439872741699, 2.3874175548553467, 10.754826545715332, 17.15918731689453, 0.9694947004318237, 15000, 9.888123492943583e-05] 2023-02-26 12:36:49,496 32k INFO Saving model and optimizer state at iteration 91 to ./logs\32k\G_15000.pth 2023-02-26 12:37:05,821 32k INFO Saving model and optimizer state at iteration 91 to ./logs\32k\D_15000.pth 2023-02-26 12:37:21,861 32k INFO ====> Epoch: 91 2023-02-26 12:40:04,580 32k INFO ====> Epoch: 92 2023-02-26 12:40:43,339 32k INFO Train Epoch: 93 [12%] 2023-02-26 12:40:43,339 32k INFO [2.642388105392456, 2.06752872467041, 8.832306861877441, 16.4188232421875, 0.5744019150733948, 15200, 9.885651616572276e-05] 2023-02-26 12:42:47,745 32k INFO ====> Epoch: 93 2023-02-26 12:43:56,624 32k INFO Train Epoch: 94 [33%] 2023-02-26 12:43:56,624 32k INFO [2.478945255279541, 2.3418726921081543, 8.750991821289062, 18.5238094329834, 0.5728946924209595, 15400, 9.884415910120204e-05] 2023-02-26 12:45:31,004 32k INFO ====> Epoch: 94 2023-02-26 12:47:10,134 32k INFO Train Epoch: 95 [55%] 2023-02-26 12:47:10,134 32k INFO [2.3579206466674805, 2.31203031539917, 12.870414733886719, 18.046958923339844, 0.9995560050010681, 15600, 9.883180358131438e-05] 2023-02-26 12:48:14,547 32k INFO ====> Epoch: 95 2023-02-26 12:50:23,600 32k INFO Train Epoch: 96 [76%] 2023-02-26 12:50:23,601 32k INFO [2.4442484378814697, 2.2578535079956055, 13.333099365234375, 20.16985511779785, 0.7437376379966736, 15800, 9.881944960586671e-05] 2023-02-26 12:50:57,662 32k INFO ====> Epoch: 96 2023-02-26 12:53:36,723 32k INFO Train Epoch: 97 [97%] 2023-02-26 12:53:36,723 32k INFO [2.3909411430358887, 2.307910919189453, 10.894923210144043, 17.3150691986084, 0.68175208568573, 16000, 9.880709717466598e-05] 2023-02-26 12:53:41,364 32k INFO Saving model and optimizer state at iteration 97 to ./logs\32k\G_16000.pth 2023-02-26 12:53:59,201 32k INFO Saving model and optimizer state at iteration 97 to ./logs\32k\D_16000.pth 2023-02-26 12:54:06,830 32k INFO ====> Epoch: 97 2023-02-26 12:56:49,437 32k INFO ====> Epoch: 98 2023-02-26 12:57:36,944 32k INFO Train Epoch: 99 [18%] 2023-02-26 12:57:36,944 32k INFO [2.634997606277466, 2.167447090148926, 9.929041862487793, 16.28448486328125, 0.8011267781257629, 16200, 9.87823969442332e-05] 2023-02-26 12:59:33,007 32k INFO ====> Epoch: 99 2023-02-26 13:00:50,543 32k INFO Train Epoch: 100 [39%] 2023-02-26 13:00:50,543 32k INFO [2.3184642791748047, 2.705042839050293, 16.255216598510742, 21.745960235595703, 0.44556474685668945, 16400, 9.877004914461517e-05] 2023-02-26 13:02:16,391 32k INFO ====> Epoch: 100 2023-02-26 13:04:03,758 32k INFO Train Epoch: 101 [61%] 2023-02-26 13:04:03,759 32k INFO [2.477332830429077, 2.2263150215148926, 9.293753623962402, 14.126523971557617, 1.1971876621246338, 16600, 9.875770288847208e-05] 2023-02-26 13:04:59,212 32k INFO ====> Epoch: 101 2023-02-26 13:07:16,685 32k INFO Train Epoch: 102 [82%] 2023-02-26 13:07:16,685 32k INFO [2.2978484630584717, 2.2822859287261963, 16.321632385253906, 20.7629337310791, 1.3268964290618896, 
16800, 9.874535817561101e-05] 2023-02-26 13:07:42,165 32k INFO ====> Epoch: 102 2023-02-26 13:10:24,749 32k INFO ====> Epoch: 103 2023-02-26 13:10:50,748 32k INFO Train Epoch: 104 [3%] 2023-02-26 13:10:50,748 32k INFO [2.371419906616211, 2.4400792121887207, 12.4196138381958, 21.362497329711914, 0.7677367925643921, 17000, 9.872067337896332e-05] 2023-02-26 13:10:55,199 32k INFO Saving model and optimizer state at iteration 104 to ./logs\32k\G_17000.pth 2023-02-26 13:11:13,185 32k INFO Saving model and optimizer state at iteration 104 to ./logs\32k\D_17000.pth 2023-02-26 13:13:33,715 32k INFO ====> Epoch: 104 2023-02-26 13:14:29,696 32k INFO Train Epoch: 105 [24%] 2023-02-26 13:14:29,696 32k INFO [2.47296142578125, 2.4107069969177246, 12.652009963989258, 18.860116958618164, 0.5747465491294861, 17200, 9.870833329479095e-05] 2023-02-26 13:16:16,869 32k INFO ====> Epoch: 105 2023-02-26 13:17:42,916 32k INFO Train Epoch: 106 [45%] 2023-02-26 13:17:42,916 32k INFO [2.304741859436035, 2.252957344055176, 9.988290786743164, 17.014446258544922, 1.0552763938903809, 17400, 9.86959947531291e-05] 2023-02-26 13:18:59,953 32k INFO ====> Epoch: 106 2023-02-26 13:20:55,999 32k INFO Train Epoch: 107 [67%] 2023-02-26 13:20:56,000 32k INFO [2.284235715866089, 2.448091506958008, 12.869209289550781, 20.055776596069336, 0.9019550681114197, 17600, 9.868365775378495e-05] 2023-02-26 13:21:42,926 32k INFO ====> Epoch: 107 2023-02-26 13:24:09,043 32k INFO Train Epoch: 108 [88%] 2023-02-26 13:24:09,043 32k INFO [2.475769281387329, 2.1433467864990234, 11.07569694519043, 18.09524917602539, 0.5900028347969055, 17800, 9.867132229656573e-05] 2023-02-26 13:24:25,935 32k INFO ====> Epoch: 108 2023-02-26 13:27:08,654 32k INFO ====> Epoch: 109 2023-02-26 13:27:43,160 32k INFO Train Epoch: 110 [9%] 2023-02-26 13:27:43,160 32k INFO [2.4269490242004395, 2.379713535308838, 11.117244720458984, 17.388996124267578, 1.439730167388916, 18000, 9.864665600773098e-05] 2023-02-26 13:27:47,636 32k INFO Saving model and optimizer state at iteration 110 to ./logs\32k\G_18000.pth 2023-02-26 13:28:04,718 32k INFO Saving model and optimizer state at iteration 110 to ./logs\32k\D_18000.pth 2023-02-26 13:30:16,432 32k INFO ====> Epoch: 110 2023-02-26 13:31:20,894 32k INFO Train Epoch: 111 [30%] 2023-02-26 13:31:20,895 32k INFO [2.3756492137908936, 2.4075100421905518, 10.187261581420898, 18.855302810668945, 1.052577257156372, 18200, 9.863432517573002e-05] 2023-02-26 13:32:59,526 32k INFO ====> Epoch: 111 2023-02-26 13:34:34,102 32k INFO Train Epoch: 112 [52%] 2023-02-26 13:34:34,102 32k INFO [2.535065174102783, 2.273414134979248, 10.668708801269531, 19.734350204467773, 0.91983962059021, 18400, 9.862199588508305e-05] 2023-02-26 13:35:42,444 32k INFO ====> Epoch: 112 2023-02-26 13:37:59,802 32k INFO Train Epoch: 113 [73%] 2023-02-26 13:37:59,803 32k INFO [2.4501099586486816, 2.217410087585449, 11.971563339233398, 20.161376953125, 0.8216803073883057, 18600, 9.86096681355974e-05] 2023-02-26 13:38:38,263 32k INFO ====> Epoch: 113 2023-02-26 13:41:13,187 32k INFO Train Epoch: 114 [94%] 2023-02-26 13:41:13,188 32k INFO [2.3596482276916504, 2.3115038871765137, 11.398310661315918, 20.521438598632812, 0.8498327136039734, 18800, 9.859734192708044e-05] 2023-02-26 13:41:21,449 32k INFO ====> Epoch: 114 2023-02-26 13:44:03,918 32k INFO ====> Epoch: 115 2023-02-26 13:44:47,046 32k INFO Train Epoch: 116 [15%] 2023-02-26 13:44:47,047 32k INFO [2.0652332305908203, 3.1315736770629883, 8.01175594329834, 14.535475730895996, 0.7651816606521606, 19000, 
9.857269413218213e-05] 2023-02-26 13:44:51,565 32k INFO Saving model and optimizer state at iteration 116 to ./logs\32k\G_19000.pth 2023-02-26 13:45:10,729 32k INFO Saving model and optimizer state at iteration 116 to ./logs\32k\D_19000.pth 2023-02-26 13:47:13,579 32k INFO ====> Epoch: 116 2023-02-26 13:48:26,856 32k INFO Train Epoch: 117 [36%] 2023-02-26 13:48:26,856 32k INFO [2.108147621154785, 2.9401679039001465, 10.276931762695312, 16.358720779418945, 1.1584646701812744, 19200, 9.85603725454156e-05] 2023-02-26 13:49:56,624 32k INFO ====> Epoch: 117 2023-02-26 13:51:40,008 32k INFO Train Epoch: 118 [58%] 2023-02-26 13:51:40,009 32k INFO [2.5673112869262695, 2.034385919570923, 9.616512298583984, 16.208316802978516, 0.5776308178901672, 19400, 9.854805249884741e-05] 2023-02-26 13:52:39,631 32k INFO ====> Epoch: 118 2023-02-26 13:54:52,831 32k INFO Train Epoch: 119 [79%] 2023-02-26 13:54:52,831 32k INFO [2.3758647441864014, 2.9925012588500977, 11.676872253417969, 16.84213638305664, 0.8426787257194519, 19600, 9.853573399228505e-05] 2023-02-26 13:55:22,466 32k INFO ====> Epoch: 119 2023-02-26 13:58:04,962 32k INFO ====> Epoch: 120 2023-02-26 13:58:26,712 32k INFO Train Epoch: 121 [0%] 2023-02-26 13:58:26,712 32k INFO [2.3770370483398438, 2.334702491760254, 11.059645652770996, 18.461627960205078, 0.8800027370452881, 19800, 9.851110159840781e-05] 2023-02-26 14:00:47,783 32k INFO ====> Epoch: 121 2023-02-26 14:01:39,407 32k INFO Train Epoch: 122 [21%] 2023-02-26 14:01:39,408 32k INFO [2.527496814727783, 2.255011796951294, 11.402054786682129, 17.249414443969727, 0.4702906012535095, 20000, 9.8498787710708e-05] 2023-02-26 14:01:43,901 32k INFO Saving model and optimizer state at iteration 122 to ./logs\32k\G_20000.pth 2023-02-26 14:02:02,003 32k INFO Saving model and optimizer state at iteration 122 to ./logs\32k\D_20000.pth 2023-02-26 14:03:56,814 32k INFO ====> Epoch: 122 2023-02-26 14:05:18,678 32k INFO Train Epoch: 123 [42%] 2023-02-26 14:05:18,678 32k INFO [2.512394428253174, 2.2709600925445557, 11.919401168823242, 17.652254104614258, 0.645085871219635, 20200, 9.848647536224416e-05] 2023-02-26 14:06:40,243 32k INFO ====> Epoch: 123 2023-02-26 14:08:31,954 32k INFO Train Epoch: 124 [64%] 2023-02-26 14:08:31,955 32k INFO [2.643122434616089, 2.007068395614624, 6.659689426422119, 14.161267280578613, 0.931346595287323, 20400, 9.847416455282387e-05] 2023-02-26 14:09:23,160 32k INFO ====> Epoch: 124 2023-02-26 14:11:44,973 32k INFO Train Epoch: 125 [85%] 2023-02-26 14:11:44,974 32k INFO [2.2812883853912354, 2.3466286659240723, 10.443126678466797, 17.098127365112305, 0.8298570513725281, 20600, 9.846185528225477e-05] 2023-02-26 14:12:06,190 32k INFO ====> Epoch: 125 2023-02-26 14:14:49,057 32k INFO ====> Epoch: 126 2023-02-26 14:15:19,377 32k INFO Train Epoch: 127 [6%] 2023-02-26 14:15:19,378 32k INFO [2.2862539291381836, 2.4085965156555176, 8.173857688903809, 13.033564567565918, 1.2490079402923584, 20800, 9.84372413569007e-05] 2023-02-26 14:17:32,296 32k INFO ====> Epoch: 127 2023-02-26 14:18:32,493 32k INFO Train Epoch: 128 [27%] 2023-02-26 14:18:32,494 32k INFO [2.3605754375457764, 2.3576247692108154, 8.230725288391113, 16.209781646728516, 0.5764607191085815, 21000, 9.842493670173108e-05] 2023-02-26 14:18:36,964 32k INFO Saving model and optimizer state at iteration 128 to ./logs\32k\G_21000.pth 2023-02-26 14:18:56,081 32k INFO Saving model and optimizer state at iteration 128 to ./logs\32k\D_21000.pth 2023-02-26 14:20:42,104 32k INFO ====> Epoch: 128 2023-02-26 14:22:12,423 32k INFO Train Epoch: 
129 [48%] 2023-02-26 14:22:12,423 32k INFO [2.4556076526641846, 2.319934844970703, 10.994032859802246, 17.579025268554688, 0.9810572862625122, 21200, 9.841263358464336e-05] 2023-02-26 14:23:25,204 32k INFO ====> Epoch: 129 2023-02-26 14:25:25,430 32k INFO Train Epoch: 130 [70%] 2023-02-26 14:25:25,430 32k INFO [2.3116416931152344, 2.5935959815979004, 12.18552303314209, 19.612600326538086, 0.09066768735647202, 21400, 9.840033200544528e-05] 2023-02-26 14:26:07,943 32k INFO ====> Epoch: 130 2023-02-26 14:28:38,322 32k INFO Train Epoch: 131 [91%] 2023-02-26 14:28:38,322 32k INFO [2.417170763015747, 2.866428852081299, 10.004213333129883, 21.375431060791016, 0.7538483142852783, 21600, 9.838803196394459e-05] 2023-02-26 14:28:50,906 32k INFO ====> Epoch: 131 2023-02-26 14:31:33,676 32k INFO ====> Epoch: 132 2023-02-26 14:32:12,439 32k INFO Train Epoch: 133 [12%] 2023-02-26 14:32:12,439 32k INFO [2.4903366565704346, 2.381277322769165, 8.76326847076416, 17.031940460205078, 0.6056929230690002, 21800, 9.836343649326659e-05] 2023-02-26 14:34:16,420 32k INFO ====> Epoch: 133 2023-02-26 14:35:25,160 32k INFO Train Epoch: 134 [33%] 2023-02-26 14:35:25,161 32k INFO [2.4677324295043945, 2.5568089485168457, 9.496826171875, 19.7178955078125, 0.6253650784492493, 22000, 9.835114106370493e-05] 2023-02-26 14:35:29,650 32k INFO Saving model and optimizer state at iteration 134 to ./logs\32k\G_22000.pth 2023-02-26 14:35:46,991 32k INFO Saving model and optimizer state at iteration 134 to ./logs\32k\D_22000.pth 2023-02-26 14:37:24,621 32k INFO ====> Epoch: 134 2023-02-26 14:39:03,827 32k INFO Train Epoch: 135 [55%] 2023-02-26 14:39:03,828 32k INFO [2.3463804721832275, 2.363849639892578, 14.297788619995117, 20.66683006286621, 0.6891955137252808, 22200, 9.833884717107196e-05] 2023-02-26 14:40:08,153 32k INFO ====> Epoch: 135 2023-02-26 14:42:17,146 32k INFO Train Epoch: 136 [76%] 2023-02-26 14:42:17,146 32k INFO [2.5203256607055664, 2.283804416656494, 11.94053840637207, 18.11589241027832, 1.292447805404663, 22400, 9.832655481517557e-05] 2023-02-26 14:42:51,186 32k INFO ====> Epoch: 136 2023-02-26 14:45:30,393 32k INFO Train Epoch: 137 [97%] 2023-02-26 14:45:30,393 32k INFO [2.601602077484131, 2.1686148643493652, 9.941407203674316, 15.776628494262695, 0.6523029208183289, 22600, 9.831426399582366e-05] 2023-02-26 14:45:34,385 32k INFO ====> Epoch: 137 2023-02-26 14:48:17,181 32k INFO ====> Epoch: 138 2023-02-26 14:49:04,561 32k INFO Train Epoch: 139 [18%] 2023-02-26 14:49:04,561 32k INFO [2.4520339965820312, 2.3309485912323, 13.66077995300293, 19.651887893676758, 0.536736011505127, 22800, 9.828968696598508e-05] 2023-02-26 14:51:00,322 32k INFO ====> Epoch: 139 2023-02-26 14:52:17,661 32k INFO Train Epoch: 140 [39%] 2023-02-26 14:52:17,662 32k INFO [2.4809441566467285, 2.5217180252075195, 12.526641845703125, 21.054609298706055, 1.3811713457107544, 23000, 9.827740075511432e-05] 2023-02-26 14:52:22,135 32k INFO Saving model and optimizer state at iteration 140 to ./logs\32k\G_23000.pth 2023-02-26 14:52:39,142 32k INFO Saving model and optimizer state at iteration 140 to ./logs\32k\D_23000.pth 2023-02-26 14:54:08,027 32k INFO ====> Epoch: 140 2023-02-26 14:55:55,688 32k INFO Train Epoch: 141 [61%] 2023-02-26 14:55:55,688 32k INFO [2.57208514213562, 2.1723203659057617, 5.556727886199951, 11.347487449645996, 0.6554768085479736, 23200, 9.826511608001993e-05] 2023-02-26 14:56:51,095 32k INFO ====> Epoch: 141 2023-02-26 14:59:08,681 32k INFO Train Epoch: 142 [82%] 2023-02-26 14:59:08,681 32k INFO [2.2383546829223633, 
2.4536244869232178, 12.622249603271484, 18.40056610107422, 0.4919876456260681, 23400, 9.825283294050992e-05] 2023-02-26 14:59:34,044 32k INFO ====> Epoch: 142 2023-02-26 15:02:16,450 32k INFO ====> Epoch: 143 2023-02-26 15:02:42,500 32k INFO Train Epoch: 144 [3%] 2023-02-26 15:02:42,501 32k INFO [2.2660953998565674, 2.6160569190979004, 12.693144798278809, 20.315467834472656, 0.5867981314659119, 23600, 9.822827126747529e-05] 2023-02-26 15:04:59,831 32k INFO ====> Epoch: 144 2023-02-26 15:05:55,876 32k INFO Train Epoch: 145 [24%] 2023-02-26 15:05:55,876 32k INFO [2.4133129119873047, 2.3682570457458496, 12.988896369934082, 19.82603645324707, 0.8757506012916565, 23800, 9.821599273356685e-05] 2023-02-26 15:07:42,996 32k INFO ====> Epoch: 145 2023-02-26 15:09:09,009 32k INFO Train Epoch: 146 [45%] 2023-02-26 15:09:09,010 32k INFO [2.371466875076294, 2.411593198776245, 11.293488502502441, 16.904762268066406, 0.918782651424408, 24000, 9.820371573447515e-05] 2023-02-26 15:09:13,478 32k INFO Saving model and optimizer state at iteration 146 to ./logs\32k\G_24000.pth 2023-02-26 15:09:32,722 32k INFO Saving model and optimizer state at iteration 146 to ./logs\32k\D_24000.pth 2023-02-26 15:10:52,999 32k INFO ====> Epoch: 146 2023-02-26 15:12:49,426 32k INFO Train Epoch: 147 [67%] 2023-02-26 15:12:49,426 32k INFO [2.5269036293029785, 2.449225664138794, 11.160113334655762, 19.884809494018555, 0.6317355632781982, 24200, 9.819144027000834e-05] 2023-02-26 15:13:36,511 32k INFO ====> Epoch: 147 2023-02-26 15:16:02,727 32k INFO Train Epoch: 148 [88%] 2023-02-26 15:16:02,728 32k INFO [2.286109209060669, 2.792794942855835, 8.917141914367676, 14.302690505981445, 0.7620863318443298, 24400, 9.817916633997459e-05] 2023-02-26 15:16:19,590 32k INFO ====> Epoch: 148 2023-02-26 15:19:02,564 32k INFO ====> Epoch: 149 2023-02-26 15:19:37,203 32k INFO Train Epoch: 150 [9%] 2023-02-26 15:19:37,203 32k INFO [2.229402542114258, 2.738642692565918, 11.272886276245117, 16.66368293762207, 1.1462740898132324, 24600, 9.815462308243906e-05] 2023-02-26 15:21:45,802 32k INFO ====> Epoch: 150 2023-02-26 15:22:50,583 32k INFO Train Epoch: 151 [30%] 2023-02-26 15:22:50,584 32k INFO [2.3389620780944824, 2.6368374824523926, 10.70553207397461, 18.062721252441406, 1.1679829359054565, 24800, 9.814235375455375e-05] 2023-02-26 15:24:29,362 32k INFO ====> Epoch: 151 2023-02-26 15:26:04,092 32k INFO Train Epoch: 152 [52%] 2023-02-26 15:26:04,092 32k INFO [2.5250439643859863, 2.5258371829986572, 9.597591400146484, 18.75698471069336, 0.38917434215545654, 25000, 9.813008596033443e-05] 2023-02-26 15:26:08,573 32k INFO Saving model and optimizer state at iteration 152 to ./logs\32k\G_25000.pth 2023-02-26 15:26:25,818 32k INFO Saving model and optimizer state at iteration 152 to ./logs\32k\D_25000.pth 2023-02-26 15:27:37,202 32k INFO ====> Epoch: 152 2023-02-26 15:29:41,926 32k INFO Train Epoch: 153 [73%] 2023-02-26 15:29:41,927 32k INFO [2.3494515419006348, 2.425994873046875, 12.536174774169922, 20.15695571899414, 0.5543274283409119, 25200, 9.811781969958938e-05] 2023-02-26 15:30:20,303 32k INFO ====> Epoch: 153 2023-02-26 15:32:55,039 32k INFO Train Epoch: 154 [94%] 2023-02-26 15:32:55,039 32k INFO [2.4024767875671387, 2.125033140182495, 10.49553394317627, 16.836292266845703, 0.6470404267311096, 25400, 9.810555497212693e-05] 2023-02-26 15:33:03,303 32k INFO ====> Epoch: 154 2023-02-26 15:35:46,178 32k INFO ====> Epoch: 155 2023-02-26 15:36:29,478 32k INFO Train Epoch: 156 [15%] 2023-02-26 15:36:29,478 32k INFO [2.6614341735839844, 2.014702081680298, 
8.740072250366211, 12.970017433166504, 0.6980465650558472, 25600, 9.808103011628319e-05] 2023-02-26 15:38:29,800 32k INFO ====> Epoch: 156 2023-02-26 15:39:43,051 32k INFO Train Epoch: 157 [36%] 2023-02-26 15:39:43,051 32k INFO [2.654205560684204, 1.9615615606307983, 10.93551254272461, 16.712379455566406, 0.4804637134075165, 25800, 9.806876998751865e-05] 2023-02-26 15:41:13,122 32k INFO ====> Epoch: 157 2023-02-26 15:42:56,289 32k INFO Train Epoch: 158 [58%] 2023-02-26 15:42:56,289 32k INFO [2.6511435508728027, 2.1100852489471436, 8.584171295166016, 15.892630577087402, 0.90334153175354, 26000, 9.80565113912702e-05] 2023-02-26 15:43:00,797 32k INFO Saving model and optimizer state at iteration 158 to ./logs\32k\G_26000.pth 2023-02-26 15:43:20,004 32k INFO Saving model and optimizer state at iteration 158 to ./logs\32k\D_26000.pth 2023-02-26 15:44:23,290 32k INFO ====> Epoch: 158 2023-02-26 15:46:36,868 32k INFO Train Epoch: 159 [79%] 2023-02-26 15:46:36,869 32k INFO [2.527649402618408, 2.3790462017059326, 9.914752960205078, 17.783714294433594, 0.7191269397735596, 26200, 9.804425432734629e-05] 2023-02-26 15:47:06,754 32k INFO ====> Epoch: 159 2023-02-26 15:49:49,795 32k INFO ====> Epoch: 160 2023-02-26 15:50:11,611 32k INFO Train Epoch: 161 [0%] 2023-02-26 15:50:11,611 32k INFO [2.4440112113952637, 2.4401893615722656, 11.61741828918457, 17.544946670532227, 0.9068783521652222, 26400, 9.801974479570593e-05] 2023-02-26 15:52:33,511 32k INFO ====> Epoch: 161 2023-02-26 15:53:25,221 32k INFO Train Epoch: 162 [21%] 2023-02-26 15:53:25,221 32k INFO [2.300156593322754, 2.5606746673583984, 11.57729434967041, 19.577245712280273, 0.6100308895111084, 26600, 9.800749232760646e-05] 2023-02-26 15:55:16,561 32k INFO ====> Epoch: 162 2023-02-26 15:56:38,386 32k INFO Train Epoch: 163 [42%] 2023-02-26 15:56:38,387 32k INFO [2.491887331008911, 2.174567937850952, 8.945231437683105, 16.8068790435791, 0.6337096095085144, 26800, 9.79952413910655e-05] 2023-02-26 15:57:59,818 32k INFO ====> Epoch: 163 2023-02-26 15:59:51,615 32k INFO Train Epoch: 164 [64%] 2023-02-26 15:59:51,615 32k INFO [2.4709670543670654, 2.4039604663848877, 10.360793113708496, 15.725382804870605, 0.9451103210449219, 27000, 9.798299198589162e-05] 2023-02-26 15:59:56,072 32k INFO Saving model and optimizer state at iteration 164 to ./logs\32k\G_27000.pth 2023-02-26 16:00:12,933 32k INFO Saving model and optimizer state at iteration 164 to ./logs\32k\D_27000.pth 2023-02-26 16:01:07,609 32k INFO ====> Epoch: 164 2023-02-26 16:03:29,754 32k INFO Train Epoch: 165 [85%] 2023-02-26 16:03:29,754 32k INFO [2.4107818603515625, 2.2964704036712646, 10.976929664611816, 18.840293884277344, 0.8151223063468933, 27200, 9.797074411189339e-05] 2023-02-26 16:03:50,996 32k INFO ====> Epoch: 165 2023-02-26 16:06:33,906 32k INFO ====> Epoch: 166 2023-02-26 16:07:04,238 32k INFO Train Epoch: 167 [6%] 2023-02-26 16:07:04,239 32k INFO [2.4089515209198, 2.457214593887329, 9.272985458374023, 17.213451385498047, 0.820664644241333, 27400, 9.794625295665828e-05] 2023-02-26 16:09:17,110 32k INFO ====> Epoch: 167 2023-02-26 16:10:17,435 32k INFO Train Epoch: 168 [27%] 2023-02-26 16:10:17,436 32k INFO [2.2713615894317627, 2.1985321044921875, 11.777606010437012, 20.058086395263672, 0.7406303882598877, 27600, 9.79340096750387e-05] 2023-02-26 16:12:00,339 32k INFO ====> Epoch: 168 2023-02-26 16:13:30,855 32k INFO Train Epoch: 169 [48%] 2023-02-26 16:13:30,855 32k INFO [2.4570372104644775, 2.2436656951904297, 11.803679466247559, 17.833927154541016, 0.6307259798049927, 27800, 
9.792176792382932e-05] 2023-02-26 16:14:43,793 32k INFO ====> Epoch: 169 2023-02-26 16:16:43,976 32k INFO Train Epoch: 170 [70%] 2023-02-26 16:16:43,976 32k INFO [2.246384382247925, 2.4739303588867188, 15.200418472290039, 21.546630859375, 0.6328362226486206, 28000, 9.790952770283884e-05] 2023-02-26 16:16:48,430 32k INFO Saving model and optimizer state at iteration 170 to ./logs\32k\G_28000.pth 2023-02-26 16:17:01,674 32k INFO Saving model and optimizer state at iteration 170 to ./logs\32k\D_28000.pth 2023-02-26 16:17:47,558 32k INFO ====> Epoch: 170 2023-02-26 16:20:18,059 32k INFO Train Epoch: 171 [91%] 2023-02-26 16:20:18,059 32k INFO [2.306701898574829, 2.6621859073638916, 10.908252716064453, 17.372089385986328, 0.784980297088623, 28200, 9.789728901187598e-05] 2023-02-26 16:20:30,684 32k INFO ====> Epoch: 171 2023-02-26 16:23:13,289 32k INFO ====> Epoch: 172 2023-02-26 16:23:52,145 32k INFO Train Epoch: 173 [12%] 2023-02-26 16:23:52,145 32k INFO [2.399289131164551, 2.3199777603149414, 7.142980098724365, 13.057271003723145, 0.8620398640632629, 28400, 9.787281621926815e-05] 2023-02-26 16:25:56,218 32k INFO ====> Epoch: 173 2023-02-26 16:27:05,005 32k INFO Train Epoch: 174 [33%] 2023-02-26 16:27:05,005 32k INFO [2.470820188522339, 2.4600534439086914, 6.626948356628418, 13.11660385131836, 0.841496467590332, 28600, 9.786058211724074e-05] 2023-02-26 16:28:39,432 32k INFO ====> Epoch: 174 2023-02-26 16:30:18,516 32k INFO Train Epoch: 175 [55%] 2023-02-26 16:30:18,516 32k INFO [2.4205260276794434, 2.25736927986145, 13.192204475402832, 19.85516357421875, 0.7485554218292236, 28800, 9.784834954447608e-05] 2023-02-26 16:31:22,767 32k INFO ====> Epoch: 175 2023-02-26 16:33:31,581 32k INFO Train Epoch: 176 [76%] 2023-02-26 16:33:31,582 32k INFO [2.455970287322998, 2.2141826152801514, 13.451642036437988, 19.803707122802734, 0.9468473792076111, 29000, 9.783611850078301e-05] 2023-02-26 16:33:36,138 32k INFO Saving model and optimizer state at iteration 176 to ./logs\32k\G_29000.pth 2023-02-26 16:33:53,144 32k INFO Saving model and optimizer state at iteration 176 to ./logs\32k\D_29000.pth 2023-02-26 16:34:30,652 32k INFO ====> Epoch: 176 2023-02-26 16:37:10,010 32k INFO Train Epoch: 177 [97%] 2023-02-26 16:37:10,011 32k INFO [2.232689380645752, 2.6161293983459473, 10.906001091003418, 17.958105087280273, 0.6369590163230896, 29200, 9.782388898597041e-05] 2023-02-26 16:37:13,995 32k INFO ====> Epoch: 177 2023-02-26 16:39:57,216 32k INFO ====> Epoch: 178 2023-02-26 16:40:44,663 32k INFO Train Epoch: 179 [18%] 2023-02-26 16:40:44,664 32k INFO [2.267847776412964, 2.723783493041992, 14.152230262756348, 18.365135192871094, 1.0862221717834473, 29400, 9.779943454222217e-05] 2023-02-26 16:42:40,681 32k INFO ====> Epoch: 179 2023-02-26 16:43:58,308 32k INFO Train Epoch: 180 [39%] 2023-02-26 16:43:58,308 32k INFO [2.529902696609497, 2.3272926807403564, 9.528871536254883, 18.957096099853516, 0.9033053517341614, 29600, 9.778720961290439e-05] 2023-02-26 16:45:24,000 32k INFO ====> Epoch: 180 2023-02-26 16:47:11,592 32k INFO Train Epoch: 181 [61%] 2023-02-26 16:47:11,593 32k INFO [2.2471132278442383, 2.5182602405548096, 11.21172046661377, 16.198549270629883, 0.6432769298553467, 29800, 9.777498621170277e-05] 2023-02-26 16:48:07,243 32k INFO ====> Epoch: 181 2023-02-26 16:50:24,659 32k INFO Train Epoch: 182 [82%] 2023-02-26 16:50:24,659 32k INFO [2.3189096450805664, 2.2408602237701416, 11.619179725646973, 19.898473739624023, 0.5813133716583252, 30000, 9.776276433842631e-05] 2023-02-26 16:50:29,138 32k INFO Saving model 
and optimizer state at iteration 182 to ./logs\32k\G_30000.pth 2023-02-26 16:50:46,475 32k INFO Saving model and optimizer state at iteration 182 to ./logs\32k\D_30000.pth 2023-02-26 16:51:15,396 32k INFO ====> Epoch: 182 2023-02-26 16:53:58,501 32k INFO ====> Epoch: 183 2023-02-26 16:54:24,532 32k INFO Train Epoch: 184 [3%] 2023-02-26 16:54:24,532 32k INFO [2.3636772632598877, 2.432774782180786, 12.960881233215332, 19.998132705688477, 0.9693511128425598, 30200, 9.773832517488488e-05] 2023-02-26 16:56:41,437 32k INFO ====> Epoch: 184 2023-02-26 16:57:37,255 32k INFO Train Epoch: 185 [24%] 2023-02-26 16:57:37,255 32k INFO [2.5370290279388428, 2.19403338432312, 11.489301681518555, 18.055042266845703, 0.938399076461792, 30400, 9.772610788423802e-05] 2023-02-26 16:59:24,316 32k INFO ====> Epoch: 185 2023-02-26 17:00:50,418 32k INFO Train Epoch: 186 [45%] 2023-02-26 17:00:50,418 32k INFO [2.288398265838623, 2.4643843173980713, 13.293804168701172, 19.007169723510742, 0.821979820728302, 30600, 9.771389212075249e-05] 2023-02-26 17:02:07,421 32k INFO ====> Epoch: 186 2023-02-26 17:04:03,697 32k INFO Train Epoch: 187 [67%] 2023-02-26 17:04:03,697 32k INFO [2.5506398677825928, 2.305884838104248, 9.674799919128418, 17.26383399963379, 0.8924499154090881, 30800, 9.77016778842374e-05] 2023-02-26 17:04:50,640 32k INFO ====> Epoch: 187 2023-02-26 17:07:16,910 32k INFO Train Epoch: 188 [88%] 2023-02-26 17:07:16,911 32k INFO [2.2533488273620605, 2.5775458812713623, 9.142365455627441, 14.616179466247559, 0.9205642342567444, 31000, 9.768946517450186e-05] 2023-02-26 17:07:21,445 32k INFO Saving model and optimizer state at iteration 188 to ./logs\32k\G_31000.pth 2023-02-26 17:07:36,357 32k INFO Saving model and optimizer state at iteration 188 to ./logs\32k\D_31000.pth 2023-02-26 17:07:56,519 32k INFO ====> Epoch: 188 2023-02-26 17:10:39,474 32k INFO ====> Epoch: 189 2023-02-26 17:11:14,118 32k INFO Train Epoch: 190 [9%] 2023-02-26 17:11:14,118 32k INFO [2.4963176250457764, 2.191936492919922, 8.220184326171875, 15.837531089782715, 0.7903323173522949, 31200, 9.766504433460612e-05] 2023-02-26 17:13:22,812 32k INFO ====> Epoch: 190 2023-02-26 17:14:27,380 32k INFO Train Epoch: 191 [30%] 2023-02-26 17:14:27,381 32k INFO [2.4663021564483643, 2.284228801727295, 11.195096969604492, 19.334308624267578, 0.8231672644615173, 31400, 9.765283620406429e-05] 2023-02-26 17:16:05,903 32k INFO ====> Epoch: 191 2023-02-26 17:17:40,428 32k INFO Train Epoch: 192 [52%] 2023-02-26 17:17:40,428 32k INFO [2.4471681118011475, 2.4072704315185547, 8.412674903869629, 18.166622161865234, 0.9922705888748169, 31600, 9.764062959953878e-05] 2023-02-26 17:18:49,074 32k INFO ====> Epoch: 192 2023-02-26 17:20:54,252 32k INFO Train Epoch: 193 [73%] 2023-02-26 17:20:54,253 32k INFO [2.1911771297454834, 2.7126452922821045, 11.184356689453125, 17.750993728637695, 1.0288227796554565, 31800, 9.762842452083883e-05] 2023-02-26 17:21:32,780 32k INFO ====> Epoch: 193 2023-02-26 17:24:07,741 32k INFO Train Epoch: 194 [94%] 2023-02-26 17:24:07,742 32k INFO [2.3111774921417236, 2.240935802459717, 11.050326347351074, 20.342138290405273, 0.5650017857551575, 32000, 9.761622096777372e-05] 2023-02-26 17:24:12,415 32k INFO Saving model and optimizer state at iteration 194 to ./logs\32k\G_32000.pth 2023-02-26 17:24:29,104 32k INFO Saving model and optimizer state at iteration 194 to ./logs\32k\D_32000.pth 2023-02-26 17:24:41,097 32k INFO ====> Epoch: 194 2023-02-26 17:27:23,969 32k INFO ====> Epoch: 195 2023-02-26 17:28:07,238 32k INFO Train Epoch: 196 [15%] 
2023-02-26 17:28:07,239 32k INFO [1.9948863983154297, 3.002936363220215, 11.8908052444458, 17.03791618347168, 0.7017788290977478, 32200, 9.759181843778522e-05] 2023-02-26 17:30:07,220 32k INFO ====> Epoch: 196 2023-02-26 17:31:20,166 32k INFO Train Epoch: 197 [36%] 2023-02-26 17:31:20,167 32k INFO [2.5854690074920654, 2.1435956954956055, 7.690774440765381, 17.148244857788086, 0.5889198780059814, 32400, 9.757961946048049e-05] 2023-02-26 17:32:50,210 32k INFO ====> Epoch: 197 2023-02-26 17:34:33,444 32k INFO Train Epoch: 198 [58%] 2023-02-26 17:34:33,444 32k INFO [2.352189302444458, 2.4562466144561768, 9.4241304397583, 17.52354621887207, 0.9340884685516357, 32600, 9.756742200804793e-05] 2023-02-26 17:35:33,194 32k INFO ====> Epoch: 198 2023-02-26 17:37:46,492 32k INFO Train Epoch: 199 [79%] 2023-02-26 17:37:46,493 32k INFO [2.4146697521209717, 2.7238364219665527, 12.723920822143555, 18.988252639770508, 0.7106491923332214, 32800, 9.755522608029692e-05] 2023-02-26 17:38:16,258 32k INFO ====> Epoch: 199 2023-02-26 17:40:58,717 32k INFO ====> Epoch: 200 2023-02-26 17:41:20,503 32k INFO Train Epoch: 201 [0%] 2023-02-26 17:41:20,503 32k INFO [2.367798089981079, 2.7577104568481445, 10.98935317993164, 17.85304832458496, 0.918219268321991, 33000, 9.753083879807726e-05] 2023-02-26 17:41:24,949 32k INFO Saving model and optimizer state at iteration 201 to ./logs\32k\G_33000.pth 2023-02-26 17:41:44,987 32k INFO Saving model and optimizer state at iteration 201 to ./logs\32k\D_33000.pth 2023-02-26 17:44:09,525 32k INFO ====> Epoch: 201 2023-02-26 17:45:01,443 32k INFO Train Epoch: 202 [21%] 2023-02-26 17:45:01,444 32k INFO [2.361729145050049, 2.3721957206726074, 9.747944831848145, 14.154775619506836, 0.863827109336853, 33200, 9.75186474432275e-05] 2023-02-26 17:46:53,304 32k INFO ====> Epoch: 202 2023-02-26 17:48:15,322 32k INFO Train Epoch: 203 [42%] 2023-02-26 17:48:15,323 32k INFO [2.4549036026000977, 2.087249279022217, 9.122322082519531, 14.459492683410645, 0.8657273054122925, 33400, 9.750645761229709e-05] 2023-02-26 17:49:36,813 32k INFO ====> Epoch: 203 2023-02-26 17:51:28,792 32k INFO Train Epoch: 204 [64%] 2023-02-26 17:51:28,793 32k INFO [2.574124813079834, 2.3560216426849365, 10.238191604614258, 17.874080657958984, 0.7011803388595581, 33600, 9.749426930509556e-05] 2023-02-26 17:52:20,099 32k INFO ====> Epoch: 204 2023-02-26 17:54:41,919 32k INFO Train Epoch: 205 [85%] 2023-02-26 17:54:41,919 32k INFO [2.441328763961792, 2.4306070804595947, 11.29406452178955, 17.06692123413086, 0.48038169741630554, 33800, 9.748208252143241e-05] 2023-02-26 17:55:03,141 32k INFO ====> Epoch: 205 2023-02-26 17:57:45,959 32k INFO ====> Epoch: 206 2023-02-26 17:58:16,204 32k INFO Train Epoch: 207 [6%] 2023-02-26 17:58:16,204 32k INFO [2.772115468978882, 2.026067018508911, 7.555477142333984, 12.804871559143066, 0.48970046639442444, 34000, 9.745771352395957e-05] 2023-02-26 17:58:20,721 32k INFO Saving model and optimizer state at iteration 207 to ./logs\32k\G_34000.pth 2023-02-26 17:58:38,922 32k INFO Saving model and optimizer state at iteration 207 to ./logs\32k\D_34000.pth 2023-02-26 18:00:55,455 32k INFO ====> Epoch: 207 2023-02-26 18:01:56,077 32k INFO Train Epoch: 208 [27%] 2023-02-26 18:01:56,077 32k INFO [2.3201816082000732, 2.404618263244629, 13.214925765991211, 19.893857955932617, 0.8915579915046692, 34200, 9.744553130976908e-05] 2023-02-26 18:03:39,028 32k INFO ====> Epoch: 208 2023-02-26 18:05:09,684 32k INFO Train Epoch: 209 [48%] 2023-02-26 18:05:09,685 32k INFO [2.543070077896118, 2.1961042881011963, 
9.725593566894531, 14.015393257141113, 0.5515499711036682, 34400, 9.743335061835535e-05] 2023-02-26 18:06:22,420 32k INFO ====> Epoch: 209 2023-02-26 18:08:22,834 32k INFO Train Epoch: 210 [70%] 2023-02-26 18:08:22,835 32k INFO [2.291600465774536, 2.7387681007385254, 11.23803424835205, 18.91277313232422, 0.5562854409217834, 34600, 9.742117144952805e-05] 2023-02-26 18:09:05,526 32k INFO ====> Epoch: 210 2023-02-26 18:11:36,675 32k INFO Train Epoch: 211 [91%] 2023-02-26 18:11:36,676 32k INFO [2.4361047744750977, 2.432372808456421, 6.777080535888672, 13.665382385253906, 0.6072559356689453, 34800, 9.740899380309685e-05] 2023-02-26 18:11:49,390 32k INFO ====> Epoch: 211 2023-02-26 18:14:33,046 32k INFO ====> Epoch: 212 2023-02-26 18:15:12,040 32k INFO Train Epoch: 213 [12%] 2023-02-26 18:15:12,040 32k INFO [2.706439733505249, 1.891126036643982, 7.307193279266357, 14.386757850646973, 0.69676274061203, 35000, 9.73846430766616e-05] 2023-02-26 18:15:16,522 32k INFO Saving model and optimizer state at iteration 213 to ./logs\32k\G_35000.pth 2023-02-26 18:15:32,644 32k INFO Saving model and optimizer state at iteration 213 to ./logs\32k\D_35000.pth 2023-02-26 18:17:40,257 32k INFO ====> Epoch: 213 2023-02-26 18:18:49,631 32k INFO Train Epoch: 214 [33%] 2023-02-26 18:18:49,632 32k INFO [2.469897747039795, 2.5583174228668213, 10.134647369384766, 18.64297866821289, 0.6196038722991943, 35200, 9.7372469996277e-05] 2023-02-26 18:20:23,825 32k INFO ====> Epoch: 214
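Each `Train Epoch: N [x%]` entry above is followed by a bracketed list of seven numbers. In the so-vits-svc training script these are typically the five training losses (discriminator, generator, feature-matching, mel, and KL, the last two weighted by c_mel=45 and c_kl=1.0 from the config), then the global step and the learning rate in effect; if you are working from a different fork, treat that ordering as an assumption and check its train.py. A minimal sketch for pulling those values out of a log like this one, e.g. for plotting:

```python
# Sketch: extract the per-step loss/lr values from a so-vits-svc train.log.
# Assumes each log entry sits on its own line and that the bracketed lines have
# the shape [l1, l2, l3, l4, l5, global_step, lr]; the loss names below follow
# the usual so-vits-svc ordering and may differ in other forks.
import ast
import re

LOSS_NAMES = ["loss_disc", "loss_gen", "loss_fm", "loss_mel", "loss_kl"]  # assumed ordering
LOSS_LINE = re.compile(r"INFO (\[[^\]]+\])")

def parse_log(path="train.log"):
    rows = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            match = LOSS_LINE.search(line)
            if not match:
                continue
            values = ast.literal_eval(match.group(1))
            if len(values) != len(LOSS_NAMES) + 2:  # skip anything that is not a loss line
                continue
            *losses, step, lr = values
            rows.append({"step": int(step), "lr": lr, **dict(zip(LOSS_NAMES, losses))})
    return rows

if __name__ == "__main__":
    for row in parse_log()[:3]:
        print(row)
```

The step-0 entry shows the usual cold-start pattern: the mel term starts near 50 and the KL term near 13, and both settle into the high teens and the ~0.4-1.4 range respectively within a few epochs.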
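The last value in each bracketed line is the learning rate for that epoch. It is consistent with an exponential decay of the configured learning_rate=0.0001 by lr_decay=0.999875 applied once per epoch (e.g. an ExponentialLR-style scheduler), i.e. lr(epoch) = 1e-4 * 0.999875**(epoch - 1); the per-epoch stepping is inferred from the logged numbers rather than read from the training code. A quick check against a few values logged above:

```python
# Sketch: verify the logged learning rate against lr0 * gamma**(epoch - 1),
# with lr0 and gamma taken from the config dump at the top of this log.
lr0, gamma = 1e-4, 0.999875

def expected_lr(epoch: int) -> float:
    return lr0 * gamma ** (epoch - 1)

checks = [              # (epoch, learning rate as logged above)
    (2, 9.99875e-05),
    (7, 9.99250234335941e-05),
    (214, 9.7372469996277e-05),
]
for epoch, logged in checks:
    print(f"epoch {epoch}: expected {expected_lr(epoch):.10e}, logged {logged:.10e}")
```

At this rate the learning rate drops by only about 2.6% over the 214 epochs covered here, so the schedule is effectively flat for a run of this length.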
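The logging and checkpoint cadence follows directly from the config: loss lines appear every log_interval=200 global steps, and a G_/D_ checkpoint pair is written every eval_interval=1000 steps (G_1000.pth/D_1000.pth, G_2000.pth/D_2000.pth, ...). The epoch/step/percent triples also pin down the epoch length; the figures below are estimates read off the log, not values the trainer reports:

```python
# Sketch: estimate optimizer steps per epoch from (global_step, epoch, percent)
# triples read off the log above. Derived numbers are estimates, not trainer output.
observations = [(200, 2, 21), (1000, 7, 6), (16400, 100, 39), (35200, 214, 33)]

for step, epoch, pct in observations:
    steps_per_epoch = step / (epoch - 1 + pct / 100)
    print(f"step {step}: ~{steps_per_epoch:.0f} steps/epoch")

# ~165 steps/epoch at batch_size 6 implies on the order of 1000 training segments,
# and a G_/D_ checkpoint pair every 1000 steps lands roughly every 6 epochs,
# matching the saves at epochs 1, 7, 13, 19, 25, ... in the log.
```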