AlexN committed on
Commit 8bb97f8 · 1 Parent(s): 8e95003

Training in progress, step 8500

pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:a422bed3f403d2dc5b83d987fbc333a772df96ba06cf9954d12ef5c10f4a445f
+ oid sha256:ca00bb5c4d1a7b7121b3c1deab69fc35e14d4bf3d206cdabbf56c18628f0527a
  size 1262817457
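The hunk above only swaps the `oid` line: with Git LFS, the repository stores a small pointer file, not the 1.2 GB weights themselves. A minimal sketch of reading such a pointer (not an official git-lfs API; the `parse_lfs_pointer` helper is hypothetical, and the pointer text is copied from the new version in the diff):

```python
def parse_lfs_pointer(text):
    """Parse a git-lfs pointer file into a dict of its key/value fields.

    Each line of a pointer file is "<key> <value>", e.g. "size 1262817457".
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The post-commit pointer content for pytorch_model.bin, as shown above.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:ca00bb5c4d1a7b7121b3c1deab69fc35e14d4bf3d206cdabbf56c18628f0527a
size 1262817457
"""

info = parse_lfs_pointer(pointer)
algo, _, digest = info["oid"].partition(":")
print(algo, info["size"])  # prints: sha256 1262817457
```

The `oid` digest is what a client would compare against the SHA-256 of the downloaded object; the unchanged `size` line explains why it appears as context rather than a `-`/`+` pair in the diff.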
wandb/run-20220129_215451-1vipdbow/files/output.log CHANGED
@@ -9486,3 +9486,595 @@ Configuration saved in ./checkpoint-8000/config.json
  {'eval_loss': 0.26817116141319275, 'eval_wer': 0.3888526833718922, 'eval_runtime': 298.1504, 'eval_samples_per_second': 19.426, 'eval_steps_per_second': 0.305, 'epoch': 1.16}
  Model weights saved in ./checkpoint-8000/pytorch_model.bin
  Configuration saved in ./checkpoint-8000/preprocessor_config.json
+ Configuration saved in ./preprocessor_config.json
+ Deleting older checkpoint [checkpoint-7000] due to args.save_total_limit
+ [blank tqdm screen-refresh lines omitted]
+ 59%|██████████████▍          | 8099/13822 [8:45:59<3:54:22, 2.46s/it]
+ 59%|██████████████▌          | 8200/13822 [8:51:16<3:49:04, 2.44s/it]
+ 60%|██████████████▋          | 8299/13822 [8:56:28<3:51:37, 2.52s/it]
+ 61%|██████████████▊          | 8400/13822 [9:01:44<3:42:31, 2.46s/it]
+ 61%|██████████████▉          | 8499/13822 [9:06:55<3:35:32, 2.43s/it]
+ 61%|██████████████▉          | 8500/13822 [9:06:57<3:37:53, 2.46s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 5792
+ Batch size = 64
+ [blank tqdm screen-refresh lines omitted]
+ 100%|█████████████████████████| 91/91 [04:07<00:00, 2.38s/it]
+ Configuration saved in ./checkpoint-8500/config.json
+ Model weights saved in ./checkpoint-8500/pytorch_model.bin
+ Configuration saved in ./checkpoint-8500/preprocessor_config.json
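The evaluation line in the log above is a plain Python-dict repr, so it can be recovered with the standard library alone. A minimal sketch (the variable names here are illustrative, not part of the training script):

```python
import ast

# The Trainer evaluation line for checkpoint-8000, copied from the log above.
log_line = ("{'eval_loss': 0.26817116141319275, 'eval_wer': 0.3888526833718922, "
            "'eval_runtime': 298.1504, 'eval_samples_per_second': 19.426, "
            "'eval_steps_per_second': 0.305, 'epoch': 1.16}")

# ast.literal_eval safely parses the dict literal without executing code.
metrics = ast.literal_eval(log_line)

# Word error rate expressed as a percentage.
wer_pct = round(metrics["eval_wer"] * 100, 2)
print(wer_pct)  # prints: 38.89
```

The same approach works for any `{'eval_...': ...}` line the Trainer prints, which is convenient for comparing checkpoints (here, 8500 steps in with `save_total_limit` rotating out checkpoint-7000).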
wandb/run-20220129_215451-1vipdbow/files/wandb-summary.json CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/logs/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_215451-1vipdbow/run-1vipdbow.wandb CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:3920cec34508f992c87c6b894a0a1fc466949458914aec0c8f3f3e06ce7f1f8b
- size 57646698
+ oid sha256:843f5466c5ee268210ffb8ecd88bf908e30734d58ac7ed215a27b61327f973ca
+ size 61553459