AlexN committed on
Commit 8e95003 · 1 Parent(s): 15cc7df

Training in progress, step 8000

pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:3a64731777bd746d359ffa586729d25bea4492bc28fa247fa4547bb22bc3101c
+ oid sha256:a422bed3f403d2dc5b83d987fbc333a772df96ba06cf9954d12ef5c10f4a445f
  size 1262817457
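The block above is a Git LFS pointer: the repository versions only the object's sha256 oid and byte size, while the ~1.2 GB weight file itself lives in LFS storage. As an aside (not part of this commit), a minimal Python sketch like the following could check that a locally downloaded pytorch_model.bin matches the new oid; the local file path is an assumption.

```python
# Minimal sketch: verify a downloaded LFS object against the sha256 oid
# recorded in the pointer file above. The path "pytorch_model.bin" is assumed.
import hashlib

EXPECTED_OID = "a422bed3f403d2dc5b83d987fbc333a772df96ba06cf9954d12ef5c10f4a445f"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so the ~1.2 GB checkpoint never sits in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    digest = sha256_of("pytorch_model.bin")
    print("match" if digest == EXPECTED_OID else f"mismatch: {digest}")
```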
wandb/run-20220129_215451-1vipdbow/files/output.log CHANGED
@@ -8895,3 +8895,594 @@ Deleting older checkpoint [checkpoint-6000] due to args.save_total_limit
  Configuration saved in ./checkpoint-7500/config.json
  Model weights saved in ./checkpoint-7500/pytorch_model.bin
  Configuration saved in ./checkpoint-7500/preprocessor_config.json
+ Configuration saved in ./preprocessor_config.json
+ Deleting older checkpoint [checkpoint-6500] due to args.save_total_limit
+ [blank tqdm refresh lines and full-width progress bars trimmed; numeric progress fields kept below]
+ 55% | 7599/13822 [8:13:27<4:11:44, 2.43s/it]
+ 56% | 7699/13822 [8:18:40<4:06:50, 2.42s/it]
+ 56% | 7800/13822 [8:23:55<4:13:21, 2.52s/it]
+ 57% | 7899/13822 [8:29:05<4:01:42, 2.45s/it]
+ 58% | 7999/13822 [8:34:17<3:55:02, 2.42s/it]
+ 58% | 8000/13822 [8:34:19<3:56:14, 2.43s/it]
+ The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+   Num examples = 5792
+   Batch size = 64
+ Configuration saved in ./checkpoint-8000/config.json
+ {'eval_loss': 0.26817116141319275, 'eval_wer': 0.3888526833718922, 'eval_runtime': 298.1504, 'eval_samples_per_second': 19.426, 'eval_steps_per_second': 0.305, 'epoch': 1.16}
+ Model weights saved in ./checkpoint-8000/pytorch_model.bin
+ Configuration saved in ./checkpoint-8000/preprocessor_config.json
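The log lines above show the Trainer pruning older checkpoints because of args.save_total_limit and running a 5792-example evaluation with batch size 64 at step 8000. As a hedged sketch only (the actual arguments of this run are not in the log), a TrainingArguments configuration consistent with that behaviour might look like the following; every value not directly visible in the log is an assumption.

```python
# Hedged sketch: TrainingArguments consistent with the behaviour in the log above.
# Only the eval batch size of 64 and the use of save_total_limit are visible;
# the 500-step cadence, the limit of 2, and wandb reporting are assumptions
# inferred from checkpoint-7500 -> checkpoint-8000 and the tracked wandb run directory.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./",                 # checkpoints land in ./checkpoint-<step>
    evaluation_strategy="steps",
    eval_steps=500,                  # assumption: evaluation fired at step 8000
    save_strategy="steps",
    save_steps=500,                  # assumption: checkpoint-7500, checkpoint-8000, ...
    save_total_limit=2,              # assumption: triggers "Deleting older checkpoint [...]"
    per_device_eval_batch_size=64,   # "Batch size = 64" in the evaluation block (single device assumed)
    report_to="wandb",               # a wandb run directory is part of this repository
)
```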
wandb/run-20220129_215451-1vipdbow/files/wandb-summary.json CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/logs/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/run-1vipdbow.wandb CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2e4e819638fa4d56d583d51a37a78e1415ff0e9dd3ac011f7929b6c763317f3a
- size 53856973
+ oid sha256:3920cec34508f992c87c6b894a0a1fc466949458914aec0c8f3f3e06ce7f1f8b
+ size 57646698