AlexN committed on
Commit
e96946c
·
1 Parent(s): 8bb97f8

Training in progress, step 9000

pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ca00bb5c4d1a7b7121b3c1deab69fc35e14d4bf3d206cdabbf56c18628f0527a
+ oid sha256:7c65eeb6c9987c3667e716df5bd99d1f37ec399638a51bc58260494d79f8df9a
  size 1262817457
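The weight file above is stored as a Git LFS pointer, so the repository only tracks its sha256 object id and byte size. As a minimal, hypothetical sketch (not part of this repo's code), a downloaded pytorch_model.bin can be checked against the pointer values from the diff; the helper name and constants below are illustrative only:

# Hypothetical check: verify a downloaded file against the LFS pointer above.
import hashlib
from pathlib import Path

EXPECTED_OID = "7c65eeb6c9987c3667e716df5bd99d1f37ec399638a51bc58260494d79f8df9a"  # from the diff
EXPECTED_SIZE = 1262817457  # bytes, from the diff

def verify_lfs_object(path: str, oid: str, size: int) -> bool:
    """Return True if the file's size and sha256 digest match the LFS pointer."""
    p = Path(path)
    if p.stat().st_size != size:
        return False
    h = hashlib.sha256()
    with p.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest() == oid

if __name__ == "__main__":
    print(verify_lfs_object("pytorch_model.bin", EXPECTED_OID, EXPECTED_SIZE))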
wandb/run-20220129_215451-1vipdbow/files/output.log CHANGED
@@ -10078,3 +10078,597 @@ Deleting older checkpoint [checkpoint-7000] due to args.save_total_limit
  Configuration saved in ./checkpoint-8500/config.json
  Model weights saved in ./checkpoint-8500/pytorch_model.bin
  Configuration saved in ./checkpoint-8500/preprocessor_config.json
+ Configuration saved in ./preprocessor_config.json
+ Deleting older checkpoint [checkpoint-7500] due to args.save_total_limit
+ 62%|██████████████████████████████████████████████████████████████████████████████████████████████ | 8599/13822 [9:18:38<3:27:42, 2.39s/it]
+ 63%|███████████████████████████████████████████████████████████████████████████████████████████████▏| 8699/13822 [9:23:47<3:28:13, 2.44s/it]
+ 64%|████████████████████████████████████████████████████████████████████████████████████████████████▎| 8799/13822 [9:29:00<3:25:24, 2.45s/it]
+ 64%|█████████████████████████████████████████████████████████████████████████████████████████████████▍| 8899/13822 [9:34:14<3:23:26, 2.48s/it]
+ 65%|██████████████████████████████████████████████████████████████████████████████████████████████████▌| 9000/13822 [9:39:28<3:16:28, 2.44s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+   Num examples = 5792
+   Batch size = 64
+ {'loss': 0.9815, 'learning_rate': 3.922252881025807e-05, 'epoch': 1.3}
+ 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 91/91 [04:08<00:00, 2.35s/it]
+ Configuration saved in ./checkpoint-9000/config.json
+ Model weights saved in ./checkpoint-9000/pytorch_model.bin
+ Configuration saved in ./checkpoint-9000/preprocessor_config.json
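The log above is the standard transformers Trainer pattern: periodic evaluation, checkpoint saving, and rotation of old checkpoints once args.save_total_limit is reached. A minimal TrainingArguments sketch that would produce this behavior follows; the concrete values (500-step save/eval interval, a limit of 3 checkpoints, eval batch size 64, wandb reporting) are inferred from the log or assumed, not taken from the repository's actual training script:

# Sketch only: TrainingArguments approximating the save/eval/rotate pattern in output.log.
# All numeric values are assumptions inferred from the log, not the real training config.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./",
    evaluation_strategy="steps",    # evaluation runs mid-training (e.g. at step 9000)
    eval_steps=500,                 # assumed interval
    save_strategy="steps",
    save_steps=500,                 # checkpoints named checkpoint-8500, checkpoint-9000, ...
    save_total_limit=3,             # triggers "Deleting older checkpoint [...]" messages
    per_device_eval_batch_size=64,  # "Batch size = 64" in the eval log (single device assumed)
    logging_steps=500,              # produces the {'loss': ..., 'learning_rate': ..., 'epoch': ...} lines
    report_to="wandb",              # run logged under wandb/run-20220129_215451-1vipdbow
)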
wandb/run-20220129_215451-1vipdbow/files/wandb-summary.json CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/logs/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/run-1vipdbow.wandb CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:843f5466c5ee268210ffb8ecd88bf908e30734d58ac7ed215a27b61327f973ca
- size 61553459
+ oid sha256:8af167633f0ef10779890f3ae768b04aecb64ca6164efc4c6818f81f2cdb93fc
+ size 65555282