AlexN committed
Commit 9d166e4 · 1 Parent(s): 195e6f7

Training in progress, step 11500

pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2e9323c5c9908cecd07a5ed00b5ca1b4965df9be31b6d7365368053c66d68470
+ oid sha256:227d04111a8bc52d27ab348ad591613317a7e3e38cdd1cfc6c77133b7691878a
  size 1262817457
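
The file above is a Git LFS pointer, so the commit only changes the sha256 oid and byte size of the weights. Below is a minimal sketch of verifying a locally downloaded pytorch_model.bin against this pointer; the local path and the helper name are illustrative assumptions, while the oid and size values come from the diff itself.

```python
import hashlib
import os

# Values taken from the LFS pointer in the diff above.
EXPECTED_OID = "227d04111a8bc52d27ab348ad591613317a7e3e38cdd1cfc6c77133b7691878a"
EXPECTED_SIZE = 1262817457

def verify_lfs_object(path: str) -> bool:
    """Check a downloaded file against the oid/size recorded in its LFS pointer."""
    if os.path.getsize(path) != EXPECTED_SIZE:
        return False
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks to avoid loading the 1.2 GB file into memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha.update(chunk)
    return sha.hexdigest() == EXPECTED_OID

if __name__ == "__main__":
    # Assumes the checkpoint has already been fetched (e.g. via `git lfs pull`).
    print(verify_lfs_object("pytorch_model.bin"))
```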
wandb/run-20220129_215451-1vipdbow/files/output.log CHANGED
@@ -13041,3 +13041,591 @@ Configuration saved in ./checkpoint-11000/config.json
  {'eval_loss': 0.24783751368522644, 'eval_wer': 0.3760142070448695, 'eval_runtime': 297.7741, 'eval_samples_per_second': 19.451, 'eval_steps_per_second': 0.306, 'epoch': 1.59}
  Model weights saved in ./checkpoint-11000/pytorch_model.bin
  Configuration saved in ./checkpoint-11000/preprocessor_config.json
+ Configuration saved in ./preprocessor_config.json
+ Deleting older checkpoint [checkpoint-10000] due to args.save_total_limit
+ 80%|████████████████████████████████████████▋         | 11099/13822 [12:01:24<1:50:13, 2.43s/it]
+ 81%|████████████████████████████████████████▊         | 11200/13822 [12:06:38<1:49:03, 2.50s/it]
+ 82%|█████████████████████████████████████████▉        | 11300/13822 [12:11:49<1:42:14, 2.43s/it]
+ 82%|█████████████████████████████████████████         | 11399/13822 [12:16:56<1:37:19, 2.41s/it]
+ 83%|██████████████████████████████████████████        | 11499/13822 [12:22:06<1:32:56, 2.40s/it]
+ 83%|█████████████████████████████████████████▏        | 11500/13822 [12:22:09<1:34:18, 2.44s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+   Num examples = 5792
+   Batch size = 64
+ 100%|██████████████████████████████████████████████████| 91/91 [04:05<00:00, 2.35s/it]
+ Configuration saved in ./checkpoint-11500/config.json
+ Model weights saved in ./checkpoint-11500/pytorch_model.bin
+ Configuration saved in ./checkpoint-11500/preprocessor_config.json
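
The log above shows checkpoints written every 500 steps (./checkpoint-11000, ./checkpoint-11500), evaluation run at the same cadence with a batch size of 64, and older checkpoints pruned by `args.save_total_limit`. A rough sketch of `TrainingArguments` that would reproduce this checkpointing and evaluation cadence with Hugging Face Transformers is shown below; `save_total_limit=2` and `output_dir` are assumptions inferred from the log, and the eval batch size is the total reported there, which may be split across devices.

```python
from transformers import TrainingArguments

# Sketch only: values marked "assumed" are not recorded in this commit.
training_args = TrainingArguments(
    output_dir="./",                 # checkpoints appear as ./checkpoint-<step> in the log
    evaluation_strategy="steps",
    eval_steps=500,                  # eval/checkpoint cadence seen in the log
    save_steps=500,
    save_total_limit=2,              # assumed: the log only shows older checkpoints being deleted
    per_device_eval_batch_size=64,   # "Batch size = 64" in the evaluation log
)
```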
wandb/run-20220129_215451-1vipdbow/files/wandb-summary.json CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/logs/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/run-1vipdbow.wandb CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ab8f0f3a464e49fff37149fc3e7af538529a6d8e4f0a9b368d838220353109b4
- size 81698476
+ oid sha256:dc92ec3387a344dbc2810a36d3fcbda131e4d2a3029e4f885b116ed4f40a2d91
+ size 85777260