---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xls-r-300m-CV-Fleurs-lg-5hrs-v6
  results: []
---

# wav2vec2-xls-r-300m-CV-Fleurs-lg-5hrs-v6

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m), apparently on a 5-hour Luganda (`lg`) split of Common Voice and FLEURS (per the model name; the auto-generated card left the dataset field empty). It achieves the following results on the evaluation set:

- Loss: 1.3699
- Wer: 0.7068
- Cer: 0.1674
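WER and CER above are the standard Levenshtein-based error rates, computed over words and characters respectively. A minimal pure-Python sketch of how these metrics are defined (illustrative only, not the evaluation code used for this model):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (lists or strings)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]

def wer(ref, hyp):
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = ref.split()
    return edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate: char-level edit distance / reference length."""
    return edit_distance(ref, hyp) / len(ref)
```

A WER of 0.7068 therefore means roughly 71 word-level edits per 100 reference words, which is why the much lower CER (0.1674) is the more informative number for an agglutinative language like Luganda.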

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
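With a linear scheduler and no warmup recorded, the learning rate decays from the 3e-4 peak to zero over the planned training run. A sketch of that schedule, assuming zero warmup steps and 515 optimizer steps per epoch for 100 epochs (51,500 total steps, per the results table below):

```python
def linear_lr(step, base_lr=3e-4, total_steps=51_500, warmup_steps=0):
    """Linearly ramp up over warmup_steps, then decay linearly to zero,
    mirroring the behavior of a linear schedule with warmup."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))
```

Note that training stopped after epoch 79 (step 40,685), so the schedule never actually reached zero.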

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 3.4673        | 1.0   | 515   | 2.9455          | 1.0    | 1.0    |
| 2.7771        | 2.0   | 1030  | 2.2858          | 1.0    | 0.8202 |
| 1.8563        | 3.0   | 1545  | 1.3887          | 0.9943 | 0.3860 |
| 1.4104        | 4.0   | 2060  | 1.1308          | 0.9612 | 0.3178 |
| 1.2231        | 5.0   | 2575  | 1.0093          | 0.9368 | 0.2874 |
| 1.0872        | 6.0   | 3090  | 0.9377          | 0.9282 | 0.2764 |
| 0.9686        | 7.0   | 3605  | 0.8713          | 0.9118 | 0.2548 |
| 0.8675        | 8.0   | 4120  | 0.8353          | 0.9062 | 0.2504 |
| 0.7817        | 9.0   | 4635  | 0.8204          | 0.8985 | 0.2440 |
| 0.7045        | 10.0  | 5150  | 0.8144          | 0.8841 | 0.2332 |
| 0.6322        | 11.0  | 5665  | 0.8112          | 0.8416 | 0.2176 |
| 0.5865        | 12.0  | 6180  | 0.8228          | 0.8404 | 0.2203 |
| 0.5264        | 13.0  | 6695  | 0.8488          | 0.8297 | 0.2149 |
| 0.4879        | 14.0  | 7210  | 0.8404          | 0.8047 | 0.2070 |
| 0.4408        | 15.0  | 7725  | 0.9070          | 0.8233 | 0.2115 |
| 0.4079        | 16.0  | 8240  | 0.9762          | 0.8107 | 0.2087 |
| 0.3777        | 17.0  | 8755  | 0.8993          | 0.8119 | 0.2063 |
| 0.356         | 18.0  | 9270  | 1.0907          | 0.8091 | 0.2071 |
| 0.3234        | 19.0  | 9785  | 1.0084          | 0.8201 | 0.2042 |
| 0.3157        | 20.0  | 10300 | 0.9811          | 0.8201 | 0.2032 |
| 0.2892        | 21.0  | 10815 | 1.0994          | 0.8067 | 0.1999 |
| 0.2793        | 22.0  | 11330 | 1.0639          | 0.7842 | 0.1986 |
| 0.2609        | 23.0  | 11845 | 1.0425          | 0.7925 | 0.1996 |
| 0.2535        | 24.0  | 12360 | 1.0799          | 0.7888 | 0.1988 |
| 0.2422        | 25.0  | 12875 | 1.0773          | 0.7795 | 0.1932 |
| 0.2336        | 26.0  | 13390 | 1.0731          | 0.7732 | 0.1930 |
| 0.2241        | 27.0  | 13905 | 1.1465          | 0.7730 | 0.1907 |
| 0.205         | 28.0  | 14420 | 1.1303          | 0.7853 | 0.1935 |
| 0.2045        | 29.0  | 14935 | 1.1377          | 0.7825 | 0.1919 |
| 0.2004        | 30.0  | 15450 | 1.1406          | 0.7701 | 0.1884 |
| 0.1874        | 31.0  | 15965 | 1.2273          | 0.7749 | 0.1869 |
| 0.1901        | 32.0  | 16480 | 1.2571          | 0.7551 | 0.1846 |
| 0.178         | 33.0  | 16995 | 1.2050          | 0.7666 | 0.1900 |
| 0.176         | 34.0  | 17510 | 1.2171          | 0.7550 | 0.1842 |
| 0.174         | 35.0  | 18025 | 1.2065          | 0.7790 | 0.1850 |
| 0.1668        | 36.0  | 18540 | 1.2275          | 0.7582 | 0.1863 |
| 0.1663        | 37.0  | 19055 | 1.2588          | 0.7574 | 0.1862 |
| 0.1673        | 38.0  | 19570 | 1.2510          | 0.7556 | 0.1830 |
| 0.1542        | 39.0  | 20085 | 1.2482          | 0.7526 | 0.1818 |
| 0.1504        | 40.0  | 20600 | 1.2521          | 0.7545 | 0.1831 |
| 0.1524        | 41.0  | 21115 | 1.3708          | 0.7838 | 0.1863 |
| 0.1425        | 42.0  | 21630 | 1.2846          | 0.7711 | 0.1838 |
| 0.1458        | 43.0  | 22145 | 1.2877          | 0.7509 | 0.1820 |
| 0.1416        | 44.0  | 22660 | 1.2903          | 0.7581 | 0.1810 |
| 0.137         | 45.0  | 23175 | 1.2775          | 0.7472 | 0.1807 |
| 0.131         | 46.0  | 23690 | 1.3168          | 0.7404 | 0.1793 |
| 0.1384        | 47.0  | 24205 | 1.2914          | 0.7545 | 0.1805 |
| 0.1281        | 48.0  | 24720 | 1.2716          | 0.7421 | 0.1799 |
| 0.1306        | 49.0  | 25235 | 1.3053          | 0.7443 | 0.1784 |
| 0.1326        | 50.0  | 25750 | 1.3336          | 0.7419 | 0.1795 |
| 0.1202        | 51.0  | 26265 | 1.3539          | 0.7342 | 0.1784 |
| 0.1182        | 52.0  | 26780 | 1.3186          | 0.7584 | 0.1812 |
| 0.117         | 53.0  | 27295 | 1.3012          | 0.7317 | 0.1757 |
| 0.1154        | 54.0  | 27810 | 1.2908          | 0.7333 | 0.1757 |
| 0.1123        | 55.0  | 28325 | 1.3116          | 0.7356 | 0.1762 |
| 0.1124        | 56.0  | 28840 | 1.3920          | 0.7315 | 0.1745 |
| 0.1185        | 57.0  | 29355 | 1.3557          | 0.7285 | 0.1737 |
| 0.1032        | 58.0  | 29870 | 1.3676          | 0.7260 | 0.1742 |
| 0.1047        | 59.0  | 30385 | 1.3938          | 0.7328 | 0.1743 |
| 0.1047        | 60.0  | 30900 | 1.3472          | 0.7355 | 0.1761 |
| 0.1047        | 61.0  | 31415 | 1.3843          | 0.7294 | 0.1739 |
| 0.1008        | 62.0  | 31930 | 1.3270          | 0.7314 | 0.1749 |
| 0.0971        | 63.0  | 32445 | 1.3778          | 0.7297 | 0.1739 |
| 0.0947        | 64.0  | 32960 | 1.3629          | 0.7253 | 0.1734 |
| 0.0955        | 65.0  | 33475 | 1.4170          | 0.7174 | 0.1716 |
| 0.0977        | 66.0  | 33990 | 1.3668          | 0.7118 | 0.1707 |
| 0.0961        | 67.0  | 34505 | 1.4107          | 0.7150 | 0.1709 |
| 0.093         | 68.0  | 35020 | 1.4030          | 0.7140 | 0.1701 |
| 0.0856        | 69.0  | 35535 | 1.3854          | 0.7068 | 0.1681 |
| 0.0879        | 70.0  | 36050 | 1.3952          | 0.7152 | 0.1706 |
| 0.0878        | 71.0  | 36565 | 1.4117          | 0.7219 | 0.1717 |
| 0.0842        | 72.0  | 37080 | 1.4185          | 0.7131 | 0.1699 |
| 0.0833        | 73.0  | 37595 | 1.3656          | 0.7099 | 0.1684 |
| 0.081         | 74.0  | 38110 | 1.3637          | 0.7091 | 0.1694 |
| 0.0798        | 75.0  | 38625 | 1.4499          | 0.7156 | 0.1701 |
| 0.0783        | 76.0  | 39140 | 1.4385          | 0.7126 | 0.1700 |
| 0.0767        | 77.0  | 39655 | 1.4507          | 0.7058 | 0.1674 |
| 0.0772        | 78.0  | 40170 | 1.4279          | 0.7058 | 0.1683 |
| 0.0785        | 79.0  | 40685 | 1.3699          | 0.7068 | 0.1674 |
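Reading the table, validation WER bottoms out at 0.7058 around epochs 77-78 while the reported final metrics come from epoch 79, so the last checkpoint is not quite the lowest-WER one. A small illustrative helper (not part of the training code) for picking the best row from such logs:

```python
# (epoch, validation_loss, wer) rows excerpted from the table above
rows = [
    (69, 1.3854, 0.7068),
    (77, 1.4507, 0.7058),
    (78, 1.4279, 0.7058),
    (79, 1.3699, 0.7068),
]

# min() keeps the first row on ties, so the earliest best epoch wins
best = min(rows, key=lambda r: r[2])
print(best)  # (77, 1.4507, 0.7058)
```

Using `load_best_model_at_end` with `metric_for_best_model="wer"` in the Trainer would automate this selection at training time.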

### Framework versions

- Transformers 4.46.1
- Pytorch 2.1.0+cu118
- Datasets 3.1.0
- Tokenizers 0.20.1