2023-02-14 13:12:41,308	32k	INFO	{'train': {'log_interval': 200, 'eval_interval': 1000, 'seed': 1234, 'epochs': 10000, 'learning_rate': 0.0001, 'betas': [0.8, 0.99], 'eps': 1e-09, 'batch_size': 6, 'fp16_run': False, 'lr_decay': 0.999875, 'segment_size': 17920, 'init_lr_ratio': 1, 'warmup_epochs': 0, 'c_mel': 45, 'c_kl': 1.0, 'use_sr': True, 'max_speclen': 384, 'port': '8001'}, 'data': {'training_files': 'filelists/train.txt', 'validation_files': 'filelists/val.txt', 'max_wav_value': 32768.0, 'sampling_rate': 32000, 'filter_length': 1280, 'hop_length': 320, 'win_length': 1280, 'n_mel_channels': 80, 'mel_fmin': 0.0, 'mel_fmax': None}, 'model': {'inter_channels': 192, 'hidden_channels': 192, 'filter_channels': 768, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'resblock': '1', 'resblock_kernel_sizes': [3, 7, 11], 'resblock_dilation_sizes': [[1, 3, 5], [1, 3, 5], [1, 3, 5]], 'upsample_rates': [10, 8, 2, 2], 'upsample_initial_channel': 512, 'upsample_kernel_sizes': [16, 16, 4, 4], 'n_layers_q': 3, 'use_spectral_norm': False, 'gin_channels': 256, 'ssl_dim': 256, 'n_speakers': 2}, 'spk': {'hiyo': 0}, 'model_dir': './logs\\32k'}
2023-02-14 13:12:58,959	32k	INFO	Loaded checkpoint './logs\32k\G_0.pth' (iteration 1)
2023-02-14 13:12:59,334	32k	INFO	Loaded checkpoint './logs\32k\D_0.pth' (iteration 1)
2023-02-14 13:13:25,423	32k	INFO	Train Epoch: 1 [0%]
2023-02-14 13:13:25,424	32k	INFO	[2.7184720039367676, 2.6034836769104004, 11.849114418029785, 47.374595642089844, 9.48661994934082, 0, 0.0001]
2023-02-14 13:13:30,918	32k	INFO	Saving model and optimizer state at iteration 1 to ./logs\32k\G_0.pth
2023-02-14 13:13:47,803	32k	INFO	Saving model and optimizer state at iteration 1 to ./logs\32k\D_0.pth
2023-02-14 13:15:02,651	32k	INFO	====> Epoch: 1
2023-02-14 13:16:34,036	32k	INFO	====> Epoch: 2
2023-02-14 13:17:26,475	32k	INFO	Train Epoch: 3 [44%]
2023-02-14 13:17:26,475	32k	INFO	[2.6585214138031006, 2.366513252258301, 10.955318450927734, 23.930110931396484, 1.0435259342193604, 200, 9.99750015625e-05]
2023-02-14 13:18:05,652	32k	INFO	====> Epoch: 3
2023-02-14 13:19:36,775	32k	INFO	====> Epoch: 4
2023-02-14 13:20:59,955	32k	INFO	Train Epoch: 5 [88%]
2023-02-14 13:20:59,955	32k	INFO	[2.501133441925049, 2.350158452987671, 9.644698143005371, 22.92068862915039, 0.7671791315078735, 400, 9.995000937421877e-05]
2023-02-14 13:21:08,211	32k	INFO	====> Epoch: 5
2023-02-14 13:22:39,553	32k	INFO	====> Epoch: 6
2023-02-14 13:24:10,878	32k	INFO	====> Epoch: 7
2023-02-14 13:24:54,687	32k	INFO	Train Epoch: 8 [32%]
2023-02-14 13:24:54,688	32k	INFO	[2.5303597450256348, 2.077411413192749, 12.605754852294922, 22.12200355529785, 1.3761812448501587, 600, 9.991253280566489e-05]
2023-02-14 13:25:42,491	32k	INFO	====> Epoch: 8
2023-02-14 13:27:13,843	32k	INFO	====> Epoch: 9
2023-02-14 13:28:28,606	32k	INFO	Train Epoch: 10 [76%]
2023-02-14 13:28:28,606	32k	INFO	[2.6250500679016113, 2.1927847862243652, 10.148731231689453, 20.529062271118164, 0.5864875316619873, 800, 9.98875562335968e-05]
2023-02-14 13:28:45,437	32k	INFO	====> Epoch: 10
2023-02-14 13:30:16,681	32k	INFO	====> Epoch: 11
2023-02-14 13:31:48,026	32k	INFO	====> Epoch: 12
2023-02-14 13:32:23,413	32k	INFO	Train Epoch: 13 [20%]
2023-02-14 13:32:23,413	32k	INFO	[2.4075894355773926, 2.307851791381836, 11.19032096862793, 22.16382598876953, 1.286623239517212, 1000, 9.98501030820433e-05]
2023-02-14 13:32:27,543	32k	INFO	Saving model and optimizer state at iteration 13 to ./logs\32k\G_1000.pth
2023-02-14 13:32:44,941	32k	INFO	Saving model and optimizer state at iteration 13 to ./logs\32k\D_1000.pth
2023-02-14 13:33:44,326	32k	INFO	====> Epoch: 13
2023-02-14 13:35:15,814	32k	INFO	====> Epoch: 14
2023-02-14 13:36:22,132	32k	INFO	Train Epoch: 15 [63%]
2023-02-14 13:36:22,132	32k	INFO	[2.5968217849731445, 2.164472818374634, 8.871170043945312, 20.146108627319336, 0.8499955534934998, 1200, 9.982514211643064e-05]
2023-02-14 13:36:47,621	32k	INFO	====> Epoch: 15
2023-02-14 13:38:19,098	32k	INFO	====> Epoch: 16
2023-02-14 13:39:50,485	32k	INFO	====> Epoch: 17
2023-02-14 13:40:17,300	32k	INFO	Train Epoch: 18 [7%]
2023-02-14 13:40:17,300	32k	INFO	[2.55147647857666, 2.1380293369293213, 11.658864974975586, 21.447202682495117, 1.0160374641418457, 1400, 9.978771236724554e-05]
2023-02-14 13:41:22,298	32k	INFO	====> Epoch: 18
2023-02-14 13:42:53,875	32k	INFO	====> Epoch: 19
2023-02-14 13:43:51,564	32k	INFO	Train Epoch: 20 [51%]
2023-02-14 13:43:51,565	32k	INFO	[2.5742850303649902, 2.244190216064453, 8.551454544067383, 19.180246353149414, 0.9120607376098633, 1600, 9.976276699833672e-05]
2023-02-14 13:44:25,617	32k	INFO	====> Epoch: 20
2023-02-14 13:45:57,188	32k	INFO	====> Epoch: 21
2023-02-14 13:47:25,925	32k	INFO	Train Epoch: 22 [95%]
2023-02-14 13:47:25,925	32k	INFO	[2.5543036460876465, 2.2138490676879883, 10.766397476196289, 21.196224212646484, 0.7341710329055786, 1800, 9.973782786538036e-05]
2023-02-14 13:47:29,068	32k	INFO	====> Epoch: 22
2023-02-14 13:49:00,647	32k	INFO	====> Epoch: 23
2023-02-14 13:50:32,139	32k	INFO	====> Epoch: 24
2023-02-14 13:51:21,238	32k	INFO	Train Epoch: 25 [39%]
2023-02-14 13:51:21,238	32k	INFO	[2.556088447570801, 2.2134010791778564, 11.56871223449707, 19.313758850097656, 1.033575177192688, 2000, 9.970043085494672e-05]
2023-02-14 13:51:25,444	32k	INFO	Saving model and optimizer state at iteration 25 to ./logs\32k\G_2000.pth
2023-02-14 13:51:42,644	32k	INFO	Saving model and optimizer state at iteration 25 to ./logs\32k\D_2000.pth
2023-02-14 13:52:28,601	32k	INFO	====> Epoch: 25
2023-02-14 13:54:00,018	32k	INFO	====> Epoch: 26
2023-02-14 13:55:20,005	32k	INFO	Train Epoch: 27 [83%]
2023-02-14 13:55:20,005	32k	INFO	[2.6195037364959717, 2.2152504920959473, 9.765068054199219, 19.97799301147461, 0.9908173680305481, 2200, 9.967550730505221e-05]
2023-02-14 13:55:31,723	32k	INFO	====> Epoch: 27
2023-02-14 13:57:03,227	32k	INFO	====> Epoch: 28
2023-02-14 13:58:38,892	32k	INFO	====> Epoch: 29
2023-02-14 13:59:19,484	32k	INFO	Train Epoch: 30 [27%]
2023-02-14 13:59:19,484	32k	INFO	[2.4715116024017334, 2.2915375232696533, 13.360817909240723, 22.131193161010742, 1.3168777227401733, 2400, 9.963813366190753e-05]
2023-02-14 14:00:10,655	32k	INFO	====> Epoch: 30
2023-02-14 14:01:42,081	32k	INFO	====> Epoch: 31
2023-02-14 14:02:53,596	32k	INFO	Train Epoch: 32 [71%]
2023-02-14 14:02:53,596	32k	INFO	[2.593963384628296, 2.0387449264526367, 9.328591346740723, 17.955137252807617, 0.30882203578948975, 2600, 9.961322568533789e-05]
2023-02-14 14:03:13,877	32k	INFO	====> Epoch: 32
2023-02-14 14:04:45,281	32k	INFO	====> Epoch: 33
2023-02-14 14:06:16,692	32k	INFO	====> Epoch: 34
2023-02-14 14:06:48,668	32k	INFO	Train Epoch: 35 [15%]
2023-02-14 14:06:48,669	32k	INFO	[2.5350048542022705, 1.9605062007904053, 7.99984884262085, 16.31661033630371, 0.5040841698646545, 2800, 9.957587539488128e-05]
2023-02-14 14:07:48,449	32k	INFO	====> Epoch: 35
2023-02-14 14:09:20,138	32k	INFO	====> Epoch: 36
2023-02-14 14:10:23,083	32k	INFO	Train Epoch: 37 [59%]
2023-02-14 14:10:23,083	32k	INFO	[2.4650959968566895, 2.2761664390563965, 9.025921821594238, 19.93177604675293, 0.5862252712249756, 3000, 9.95509829819056e-05]
2023-02-14 14:10:27,302	32k	INFO	Saving model and optimizer state at iteration 37 to ./logs\32k\G_3000.pth
2023-02-14 14:10:44,327	32k	INFO	Saving model and optimizer state at iteration 37 to ./logs\32k\D_3000.pth
2023-02-14 14:11:16,616	32k	INFO	====> Epoch: 37
2023-02-14 14:12:50,532	32k	INFO	====> Epoch: 38
2023-02-14 14:14:23,408	32k	INFO	====> Epoch: 39
2023-02-14 14:14:46,793	32k	INFO	Train Epoch: 40 [2%]
2023-02-14 14:14:46,794	32k	INFO	[2.3914408683776855, 2.3529629707336426, 12.639781951904297, 19.53929901123047, 0.5159912109375, 3200, 9.951365602954526e-05]
2023-02-14 14:15:55,371	32k	INFO	====> Epoch: 40
2023-02-14 14:17:27,123	32k	INFO	====> Epoch: 41
2023-02-14 14:18:21,616	32k	INFO	Train Epoch: 42 [46%]
2023-02-14 14:18:21,616	32k	INFO	[2.4428353309631348, 2.258873462677002, 10.040135383605957, 17.569446563720703, 0.9567443132400513, 3400, 9.948877917043875e-05]
2023-02-14 14:18:59,327	32k	INFO	====> Epoch: 42
2023-02-14 14:20:31,127	32k	INFO	====> Epoch: 43
2023-02-14 14:21:56,538	32k	INFO	Train Epoch: 44 [90%]
2023-02-14 14:21:56,539	32k	INFO	[2.3611392974853516, 2.405252456665039, 12.823965072631836, 20.826034545898438, 0.8344462513923645, 3600, 9.94639085301583e-05]
2023-02-14 14:22:03,119	32k	INFO	====> Epoch: 44
2023-02-14 14:23:34,741	32k	INFO	====> Epoch: 45
2023-02-14 14:25:06,470	32k	INFO	====> Epoch: 46
2023-02-14 14:25:52,327	32k	INFO	Train Epoch: 47 [34%]
2023-02-14 14:25:52,327	32k	INFO	[2.4644553661346436, 2.2366631031036377, 11.37562370300293, 20.556499481201172, 0.9424434304237366, 3800, 9.942661422663591e-05]
2023-02-14 14:26:38,621	32k	INFO	====> Epoch: 47
2023-02-14 14:28:10,337	32k	INFO	====> Epoch: 48
2023-02-14 14:29:27,206	32k	INFO	Train Epoch: 49 [78%]
2023-02-14 14:29:27,206	32k	INFO	[2.5212016105651855, 2.3413655757904053, 10.579129219055176, 19.582210540771484, 0.94678795337677, 4000, 9.940175912662009e-05]
2023-02-14 14:29:31,315	32k	INFO	Saving model and optimizer state at iteration 49 to ./logs\32k\G_4000.pth
2023-02-14 14:29:48,727	32k	INFO	Saving model and optimizer state at iteration 49 to ./logs\32k\D_4000.pth
2023-02-14 14:30:07,524	32k	INFO	====> Epoch: 49
2023-02-14 14:31:39,485	32k	INFO	====> Epoch: 50
2023-02-14 14:33:11,385	32k	INFO	====> Epoch: 51
2023-02-14 14:33:48,633	32k	INFO	Train Epoch: 52 [22%]
2023-02-14 14:33:48,634	32k	INFO	[2.2625677585601807, 2.5999932289123535, 16.192663192749023, 21.976665496826172, 0.7557628750801086, 4200, 9.936448812621091e-05]
2023-02-14 14:34:43,699	32k	INFO	====> Epoch: 52
2023-02-14 14:36:15,604	32k	INFO	====> Epoch: 53
2023-02-14 14:37:23,825	32k	INFO	Train Epoch: 54 [66%]
2023-02-14 14:37:23,825	32k	INFO	[2.500643730163574, 2.2777493000030518, 8.213461875915527, 17.501588821411133, 0.7716516256332397, 4400, 9.933964855674948e-05]
2023-02-14 14:37:47,647	32k	INFO	====> Epoch: 54
2023-02-14 14:39:19,631	32k	INFO	====> Epoch: 55
2023-02-14 14:40:51,967	32k	INFO	====> Epoch: 56
2023-02-14 14:41:20,508	32k	INFO	Train Epoch: 57 [10%]
2023-02-14 14:41:20,508	32k	INFO	[2.4037282466888428, 2.369466543197632, 11.0797758102417, 18.995946884155273, 0.6406805515289307, 4600, 9.930240084489267e-05]
2023-02-14 14:42:23,986	32k	INFO	====> Epoch: 57
2023-02-14 14:43:56,023	32k	INFO	====> Epoch: 58
2023-02-14 14:44:55,964	32k	INFO	Train Epoch: 59 [54%]
2023-02-14 14:44:55,965	32k	INFO	[2.243375778198242, 2.4461309909820557, 15.596379280090332, 22.729717254638672, 0.9244585037231445, 4800, 9.927757679628145e-05]
2023-02-14 14:45:28,557	32k	INFO	====> Epoch: 59
2023-02-14 14:47:00,367	32k	INFO	====> Epoch: 60
2023-02-14 14:48:31,104	32k	INFO	Train Epoch: 61 [98%]
2023-02-14 14:48:31,105	32k	INFO	[2.5503604412078857, 2.111229658126831, 8.807084083557129, 16.10047721862793, 1.107606053352356, 5000, 9.92527589532945e-05]
2023-02-14 14:48:35,266	32k	INFO	Saving model and optimizer state at iteration 61 to ./logs\32k\G_5000.pth
2023-02-14 14:48:51,439	32k	INFO	Saving model and optimizer state at iteration 61 to ./logs\32k\D_5000.pth
2023-02-14 14:48:56,220	32k	INFO	====> Epoch: 61
2023-02-14 14:50:27,858	32k	INFO	====> Epoch: 62
2023-02-14 14:51:59,496	32k	INFO	====> Epoch: 63
2023-02-14 14:52:50,498	32k	INFO	Train Epoch: 64 [41%]
2023-02-14 14:52:50,498	32k	INFO	[2.603325128555298, 2.3429884910583496, 11.47877025604248, 19.95724868774414, 0.4535157382488251, 5200, 9.921554382096622e-05]
2023-02-14 14:53:31,542	32k	INFO	====> Epoch: 64
2023-02-14 14:55:03,278	32k	INFO	====> Epoch: 65
2023-02-14 14:56:25,149	32k	INFO	Train Epoch: 66 [85%]
2023-02-14 14:56:25,149	32k	INFO	[2.548145055770874, 2.1794002056121826, 8.618596076965332, 18.201990127563477, 0.42253807187080383, 5400, 9.919074148525384e-05]
2023-02-14 14:56:35,169	32k	INFO	====> Epoch: 66
2023-02-14 14:58:06,861	32k	INFO	====> Epoch: 67
2023-02-14 14:59:38,893	32k	INFO	====> Epoch: 68
2023-02-14 15:00:21,295	32k	INFO	Train Epoch: 69 [29%]
2023-02-14 15:00:21,295	32k	INFO	[2.462594985961914, 2.514474868774414, 8.234503746032715, 16.56818962097168, 0.3946729898452759, 5600, 9.915354960656915e-05]
2023-02-14 15:01:11,023	32k	INFO	====> Epoch: 69
2023-02-14 15:02:42,743	32k	INFO	====> Epoch: 70
2023-02-14 15:03:56,160	32k	INFO	Train Epoch: 71 [73%]
2023-02-14 15:03:56,161	32k	INFO	[2.3674912452697754, 2.400963306427002, 11.68331241607666, 21.614397048950195, 0.8642704486846924, 5800, 9.912876276844171e-05]
2023-02-14 15:04:14,926	32k	INFO	====> Epoch: 71
2023-02-14 15:05:46,720	32k	INFO	====> Epoch: 72
2023-02-14 15:07:18,452	32k	INFO	====> Epoch: 73
2023-02-14 15:07:52,317	32k	INFO	Train Epoch: 74 [17%]
2023-02-14 15:07:52,318	32k	INFO	[2.3848366737365723, 2.3195393085479736, 11.794806480407715, 20.811321258544922, 0.9752936959266663, 6000, 9.909159412887068e-05]
2023-02-14 15:07:56,403	32k	INFO	Saving model and optimizer state at iteration 74 to ./logs\32k\G_6000.pth
2023-02-14 15:08:11,990	32k	INFO	Saving model and optimizer state at iteration 74 to ./logs\32k\D_6000.pth
2023-02-14 15:09:13,833	32k	INFO	====> Epoch: 74
2023-02-14 15:10:45,810	32k	INFO	====> Epoch: 75
2023-02-14 15:11:50,593	32k	INFO	Train Epoch: 76 [61%]
2023-02-14 15:11:50,594	32k	INFO	[2.413219451904297, 2.3621764183044434, 11.660679817199707, 19.284757614135742, 0.46086761355400085, 6200, 9.906682277864462e-05]
2023-02-14 15:12:17,911	32k	INFO	====> Epoch: 76
2023-02-14 15:13:49,709	32k	INFO	====> Epoch: 77
2023-02-14 15:15:21,342	32k	INFO	====> Epoch: 78
2023-02-14 15:15:46,525	32k	INFO	Train Epoch: 79 [5%]
2023-02-14 15:15:46,526	32k	INFO	[2.4900527000427246, 2.1775360107421875, 11.207487106323242, 18.59693717956543, 1.0485306978225708, 6400, 9.902967736366644e-05]
2023-02-14 15:16:53,382	32k	INFO	====> Epoch: 79
2023-02-14 15:18:25,180	32k	INFO	====> Epoch: 80
2023-02-14 15:19:21,341	32k	INFO	Train Epoch: 81 [49%]
2023-02-14 15:19:21,342	32k	INFO	[2.4466681480407715, 2.321159839630127, 10.79179573059082, 18.92087173461914, 0.9803774356842041, 6600, 9.900492149166423e-05]
2023-02-14 15:19:57,171	32k	INFO	====> Epoch: 81
2023-02-14 15:21:28,987	32k	INFO	====> Epoch: 82
2023-02-14 15:22:56,286	32k	INFO	Train Epoch: 83 [93%]
2023-02-14 15:22:56,287	32k	INFO	[2.425461530685425, 2.4654641151428223, 11.531881332397461, 20.07356071472168, 1.0066664218902588, 6800, 9.89801718082432e-05]
2023-02-14 15:23:01,149	32k	INFO	====> Epoch: 83
2023-02-14 15:24:32,889	32k	INFO	====> Epoch: 84
2023-02-14 15:26:04,666	32k	INFO	====> Epoch: 85
2023-02-14 15:26:52,217	32k	INFO	Train Epoch: 86 [37%]
2023-02-14 15:26:52,217	32k	INFO	[2.526332378387451, 2.203555107116699, 11.341078758239746, 20.046627044677734, 1.3592050075531006, 7000, 9.894305888331732e-05]
2023-02-14 15:26:56,417	32k	INFO	Saving model and optimizer state at iteration 86 to ./logs\32k\G_7000.pth
2023-02-14 15:27:14,398	32k	INFO	Saving model and optimizer state at iteration 86 to ./logs\32k\D_7000.pth
2023-02-14 15:28:02,287	32k	INFO	====> Epoch: 86
2023-02-14 15:29:33,965	32k	INFO	====> Epoch: 87
2023-02-14 15:30:52,509	32k	INFO	Train Epoch: 88 [80%]
2023-02-14 15:30:52,509	32k	INFO	[2.5021166801452637, 2.4259166717529297, 8.362788200378418, 17.521188735961914, 1.4112741947174072, 7200, 9.891832466458178e-05]
2023-02-14 15:31:05,995	32k	INFO	====> Epoch: 88
2023-02-14 15:32:37,695	32k	INFO	====> Epoch: 89
2023-02-14 15:34:09,432	32k	INFO	====> Epoch: 90
2023-02-14 15:34:48,347	32k	INFO	Train Epoch: 91 [24%]
2023-02-14 15:34:48,347	32k	INFO	[2.2584304809570312, 2.37516188621521, 11.344868659973145, 20.501863479614258, 0.8927188515663147, 7400, 9.888123492943583e-05]
2023-02-14 15:35:41,516	32k	INFO	====> Epoch: 91
2023-02-14 15:37:13,309	32k	INFO	====> Epoch: 92
2023-02-14 15:38:23,255	32k	INFO	Train Epoch: 93 [68%]
2023-02-14 15:38:23,256	32k	INFO	[2.407334327697754, 2.4313995838165283, 5.9084062576293945, 14.895120620727539, 0.34809207916259766, 7600, 9.885651616572276e-05]
2023-02-14 15:38:45,350	32k	INFO	====> Epoch: 93
2023-02-14 15:40:17,151	32k	INFO	====> Epoch: 94
2023-02-14 15:41:48,881	32k	INFO	====> Epoch: 95
2023-02-14 15:42:19,249	32k	INFO	Train Epoch: 96 [12%]
2023-02-14 15:42:19,250	32k	INFO	[2.475175142288208, 2.287485361099243, 12.157411575317383, 19.902236938476562, 0.8755276799201965, 7800, 9.881944960586671e-05]
2023-02-14 15:43:20,970	32k	INFO	====> Epoch: 96
2023-02-14 15:44:52,717	32k	INFO	====> Epoch: 97
2023-02-14 15:45:54,039	32k	INFO	Train Epoch: 98 [56%]
2023-02-14 15:45:54,040	32k	INFO	[2.4083688259124756, 2.3332595825195312, 8.003752708435059, 16.818313598632812, 0.21219749748706818, 8000, 9.879474628751914e-05]
2023-02-14 15:45:58,128	32k	INFO	Saving model and optimizer state at iteration 98 to ./logs\32k\G_8000.pth
2023-02-14 15:46:17,546	32k	INFO	Saving model and optimizer state at iteration 98 to ./logs\32k\D_8000.pth
2023-02-14 15:46:51,611	32k	INFO	====> Epoch: 98
2023-02-14 15:48:23,605	32k	INFO	====> Epoch: 99
2023-02-14 15:49:55,516	32k	INFO	====> Epoch: 100
2023-02-14 15:50:17,192	32k	INFO	Train Epoch: 101 [0%]
2023-02-14 15:50:17,193	32k	INFO	[2.4577560424804688, 2.4049196243286133, 9.910334587097168, 17.469593048095703, 1.0077900886535645, 8200, 9.875770288847208e-05]
2023-02-14 15:51:27,581	32k	INFO	====> Epoch: 101
2023-02-14 15:52:59,453	32k	INFO	====> Epoch: 102
2023-02-14 15:53:52,351	32k	INFO	Train Epoch: 103 [44%]
2023-02-14 15:53:52,352	32k	INFO	[2.3302505016326904, 2.4835445880889893, 12.40600299835205, 19.35659408569336, 1.1815543174743652, 8400, 9.873301500583906e-05]
2023-02-14 15:54:31,690	32k	INFO	====> Epoch: 103
2023-02-14 15:56:03,804	32k	INFO	====> Epoch: 104
2023-02-14 15:57:27,545	32k	INFO	Train Epoch: 105 [88%]
2023-02-14 15:57:27,546	32k	INFO	[2.442326068878174, 2.3513882160186768, 8.61225700378418, 19.59931755065918, 0.6629721522331238, 8600, 9.870833329479095e-05]
2023-02-14 15:57:35,926	32k	INFO	====> Epoch: 105
2023-02-14 15:59:07,717	32k	INFO	====> Epoch: 106
2023-02-14 16:00:39,510	32k	INFO	====> Epoch: 107
2023-02-14 16:01:23,648	32k	INFO	Train Epoch: 108 [32%]
2023-02-14 16:01:23,649	32k	INFO	[2.329326629638672, 2.2350986003875732, 12.052788734436035, 20.033588409423828, 0.4254380762577057, 8800, 9.867132229656573e-05]
2023-02-14 16:02:11,617	32k	INFO	====> Epoch: 108
2023-02-14 16:03:43,324	32k	INFO	====> Epoch: 109
2023-02-14 16:04:58,449	32k	INFO	Train Epoch: 110 [76%]
2023-02-14 16:04:58,449	32k	INFO	[2.2845022678375244, 2.4646148681640625, 13.254862785339355, 20.803882598876953, 0.39776870608329773, 9000, 9.864665600773098e-05]
2023-02-14 16:05:02,554	32k	INFO	Saving model and optimizer state at iteration 110 to ./logs\32k\G_9000.pth
2023-02-14 16:05:20,872	32k	INFO	Saving model and optimizer state at iteration 110 to ./logs\32k\D_9000.pth
2023-02-14 16:05:41,237	32k	INFO	====> Epoch: 110
2023-02-14 16:07:12,826	32k	INFO	====> Epoch: 111
2023-02-14 16:08:44,738	32k	INFO	====> Epoch: 112
2023-02-14 16:09:20,227	32k	INFO	Train Epoch: 113 [20%]
2023-02-14 16:09:20,228	32k	INFO	[2.4082977771759033, 2.255082845687866, 10.537095069885254, 17.689083099365234, 0.7687245607376099, 9200, 9.86096681355974e-05]
2023-02-14 16:10:16,749	32k	INFO	====> Epoch: 113
2023-02-14 16:11:48,707	32k	INFO	====> Epoch: 114
2023-02-14 16:12:55,268	32k	INFO	Train Epoch: 115 [63%]
2023-02-14 16:12:55,268	32k	INFO	[2.531501293182373, 2.146554946899414, 8.033232688903809, 17.08489990234375, 0.7147048115730286, 9400, 9.858501725933955e-05]
2023-02-14 16:13:20,854	32k	INFO	====> Epoch: 115
2023-02-14 16:14:52,750	32k	INFO	====> Epoch: 116
2023-02-14 16:16:24,652	32k	INFO	====> Epoch: 117
2023-02-14 16:16:51,576	32k	INFO	Train Epoch: 118 [7%]
2023-02-14 16:16:51,576	32k	INFO	[2.305955648422241, 2.5261385440826416, 11.671979904174805, 17.262502670288086, 0.9132139682769775, 9600, 9.854805249884741e-05]
2023-02-14 16:17:56,852	32k	INFO	====> Epoch: 118
2023-02-14 16:19:28,743	32k	INFO	====> Epoch: 119
2023-02-14 16:20:26,662	32k	INFO	Train Epoch: 120 [51%]
2023-02-14 16:20:26,663	32k	INFO	[2.406419038772583, 2.3848094940185547, 12.830516815185547, 19.702857971191406, 0.9984562397003174, 9800, 9.8523417025536e-05]
2023-02-14 16:21:00,932	32k	INFO	====> Epoch: 120
2023-02-14 16:22:32,586	32k	INFO	====> Epoch: 121
2023-02-14 16:24:01,646	32k	INFO	Train Epoch: 122 [95%]
2023-02-14 16:24:01,646	32k	INFO	[2.6088125705718994, 2.3717613220214844, 7.807750225067139, 17.539306640625, 0.6654758453369141, 10000, 9.8498787710708e-05]
2023-02-14 16:24:05,762	32k	INFO	Saving model and optimizer state at iteration 122 to ./logs\32k\G_10000.pth
2023-02-14 16:24:21,684	32k	INFO	Saving model and optimizer state at iteration 122 to ./logs\32k\D_10000.pth
2023-02-14 16:24:28,063	32k	INFO	====> Epoch: 122
2023-02-14 16:26:00,976	32k	INFO	====> Epoch: 123
2023-02-14 16:27:37,900	32k	INFO	====> Epoch: 124
2023-02-14 16:28:33,348	32k	INFO	Train Epoch: 125 [39%]
2023-02-14 16:28:33,348	32k	INFO	[2.4001426696777344, 2.470517158508301, 13.341828346252441, 17.714656829833984, 0.679930567741394, 10200, 9.846185528225477e-05]
2023-02-14 16:29:19,086	32k	INFO	====> Epoch: 125
2023-02-14 16:30:59,664	32k	INFO	====> Epoch: 126
2023-02-14 16:32:28,289	32k	INFO	Train Epoch: 127 [83%]
2023-02-14 16:32:28,290	32k	INFO	[2.491032600402832, 2.2914695739746094, 10.223406791687012, 18.337860107421875, 0.9281787872314453, 10400, 9.84372413569007e-05]
2023-02-14 16:32:41,455	32k	INFO	====> Epoch: 127
2023-02-14 16:34:22,334	32k	INFO	====> Epoch: 128
2023-02-14 16:36:01,890	32k	INFO	====> Epoch: 129
2023-02-14 16:36:43,266	32k	INFO	Train Epoch: 130 [27%]
2023-02-14 16:36:43,266	32k	INFO	[2.5348987579345703, 2.291205883026123, 14.648847579956055, 19.59044647216797, 0.8180601596832275, 10600, 9.840033200544528e-05]
2023-02-14 16:37:34,913	32k	INFO	====> Epoch: 130
2023-02-14 16:39:06,291	32k	INFO	====> Epoch: 131
2023-02-14 16:40:17,727	32k	INFO	Train Epoch: 132 [71%]
2023-02-14 16:40:17,727	32k	INFO	[2.404866933822632, 2.1770548820495605, 9.80937671661377, 16.411579132080078, 0.1555909514427185, 10800, 9.837573345994909e-05]
2023-02-14 16:40:37,992	32k	INFO	====> Epoch: 132
2023-02-14 16:42:09,446	32k	INFO	====> Epoch: 133
2023-02-14 16:43:42,676	32k	INFO	====> Epoch: 134
2023-02-14 16:44:16,320	32k	INFO	Train Epoch: 135 [15%]
2023-02-14 16:44:16,320	32k	INFO	[2.379476547241211, 2.3253965377807617, 12.204051971435547, 18.108963012695312, 0.6560300588607788, 11000, 9.833884717107196e-05]
2023-02-14 16:44:20,463	32k	INFO	Saving model and optimizer state at iteration 135 to ./logs\32k\G_11000.pth
2023-02-14 16:44:40,760	32k	INFO	Saving model and optimizer state at iteration 135 to ./logs\32k\D_11000.pth
2023-02-14 16:45:43,602	32k	INFO	====> Epoch: 135
2023-02-14 16:47:14,892	32k	INFO	====> Epoch: 136
2023-02-14 16:48:17,804	32k	INFO	Train Epoch: 137 [59%]
2023-02-14 16:48:17,804	32k	INFO	[2.182915210723877, 2.490650177001953, 13.673564910888672, 19.964622497558594, 0.8504718542098999, 11200, 9.831426399582366e-05]
2023-02-14 16:48:46,586	32k	INFO	====> Epoch: 137
2023-02-14 16:50:17,982	32k	INFO	====> Epoch: 138
2023-02-14 16:51:49,652	32k	INFO	====> Epoch: 139
2023-02-14 16:52:13,059	32k	INFO	Train Epoch: 140 [2%]
2023-02-14 16:52:13,059	32k	INFO	[2.4849462509155273, 2.2978439331054688, 10.085766792297363, 16.271583557128906, 0.9613367915153503, 11400, 9.827740075511432e-05]
2023-02-14 16:53:21,336	32k	INFO	====> Epoch: 140
2023-02-14 16:54:52,874	32k	INFO	====> Epoch: 141
2023-02-14 16:55:48,996	32k	INFO	Train Epoch: 142 [46%]
2023-02-14 16:55:48,996	32k	INFO	[2.419895887374878, 2.231900215148926, 9.550468444824219, 15.956775665283203, 0.6037944555282593, 11600, 9.825283294050992e-05]
2023-02-14 16:56:34,619	32k	INFO	====> Epoch: 142
2023-02-14 16:58:12,965	32k	INFO	====> Epoch: 143
2023-02-14 16:59:51,011	32k	INFO	Train Epoch: 144 [90%]
2023-02-14 16:59:51,012	32k	INFO	[2.526609182357788, 2.2438442707061768, 9.708596229553223, 18.94228744506836, 0.6018266081809998, 11800, 9.822827126747529e-05]
2023-02-14 16:59:58,459	32k	INFO	====> Epoch: 144
2023-02-14 17:01:41,567	32k	INFO	====> Epoch: 145
2023-02-14 17:03:19,851	32k	INFO	====> Epoch: 146
2023-02-14 17:04:09,172	32k	INFO	Train Epoch: 147 [34%]
2023-02-14 17:04:09,173	32k	INFO	[2.1274771690368652, 2.403050661087036, 14.53994083404541, 20.740161895751953, 0.8522300720214844, 12000, 9.819144027000834e-05]
2023-02-14 17:04:13,341	32k	INFO	Saving model and optimizer state at iteration 147 to ./logs\32k\G_12000.pth
2023-02-14 17:04:30,263	32k	INFO	Saving model and optimizer state at iteration 147 to ./logs\32k\D_12000.pth
2023-02-14 17:05:36,434	32k	INFO	====> Epoch: 147