---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
  - generated_from_trainer
datasets:
  - fleurs
metrics:
  - wer
model-index:
  - name: wav2vec2-xls-r-Wolof-28-hours-alffa-plus-fleurs-dataset
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: fleurs
          type: fleurs
          config: wo_sn
          split: None
          args: wo_sn
        metrics:
          - name: Wer
            type: wer
            value: 0.4408273991183452
---

# wav2vec2-xls-r-Wolof-28-hours-alffa-plus-fleurs-dataset

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on 28 hours of Wolof speech drawn from the ALFFA and FLEURS datasets. It achieves the following results on the evaluation set:

- Loss: 1.2602
- WER: 0.4408
- CER: 0.1556
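
The quickest way to try the model is the `transformers` pipeline API. This is a minimal sketch; `audio.wav` is a placeholder path standing in for any 16 kHz mono recording.

```python
# Minimal inference sketch using the transformers ASR pipeline.
# "audio.wav" is a placeholder; XLS-R models expect 16 kHz audio.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/wav2vec2-xls-r-Wolof-28-hours-alffa-plus-fleurs-dataset",
)

result = asr("audio.wav")
print(result["text"])
```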

## Model description

This is the 300M-parameter multilingual wav2vec2-XLS-R speech encoder fine-tuned with a CTC head for Wolof automatic speech recognition. Input audio should be 16 kHz mono, matching the pretraining setup of XLS-R.
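
For finer control than the pipeline, the standard Wav2Vec2 classes can be used directly. A greedy CTC decoding sketch, assuming the checkpoint includes the usual processor (feature extractor + tokenizer) files; the silent waveform is a placeholder input.

```python
# Sketch: greedy CTC decoding with the low-level Wav2Vec2 classes.
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "asr-africa/wav2vec2-xls-r-Wolof-28-hours-alffa-plus-fleurs-dataset"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Placeholder input: one second of silence at 16 kHz; substitute a real waveform.
speech = np.zeros(16_000, dtype=np.float32)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```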

## Intended uses & limitations

The model is intended for transcribing Wolof speech. With an evaluation WER of roughly 44% and CER of roughly 16%, it is better suited to research, prototyping, and bootstrapping annotation than to production transcription, and its behaviour on audio that differs from the training data (e.g. noisy or conversational recordings) has not been evaluated here.

## Training and evaluation data

As the model name indicates, training used roughly 28 hours of Wolof speech combining the ALFFA corpus with the FLEURS Wolof (`wo_sn`) subset. The results reported in this card are computed on the FLEURS Wolof evaluation split.
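
The FLEURS Wolof split can be loaded with the `datasets` library; the `google/fleurs` repo id and `wo_sn` config come from the model-index metadata above.

```python
# Load the Wolof (wo_sn) test split of FLEURS.
# google/fleurs uses a loading script, hence trust_remote_code=True
# (needed on recent `datasets` versions).
from datasets import load_dataset

fleurs_wo = load_dataset("google/fleurs", "wo_sn", split="test", trust_remote_code=True)
print(fleurs_wo[0]["transcription"])  # ground-truth text for the first clip
```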

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
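
For reference, a sketch of how these hyperparameters map onto `transformers.TrainingArguments`; the output directory is a placeholder, and logging/saving settings are assumptions not recorded in this card.

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-wolof",  # placeholder output directory
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,      # effective train batch size 32
    optim="adamw_torch",                # Adam(W), betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```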

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | WER    | CER    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 7.4709        | 0.7286  | 400   | 3.3573          | 1.0    | 1.0    |
| 3.0751        | 1.4572  | 800   | 3.2424          | 1.0    | 1.0    |
| 2.3348        | 2.1858  | 1200  | 1.1856          | 0.8107 | 0.2961 |
| 0.7384        | 2.9144  | 1600  | 0.9239          | 0.6241 | 0.2184 |
| 0.5978        | 3.6430  | 2000  | 0.7904          | 0.5831 | 0.2078 |
| 0.5443        | 4.3716  | 2400  | 0.7605          | 0.5613 | 0.2024 |
| 0.517         | 5.1002  | 2800  | 0.7471          | 0.5778 | 0.2085 |
| 0.4672        | 5.8288  | 3200  | 0.7260          | 0.5458 | 0.1966 |
| 0.4348        | 6.5574  | 3600  | 0.7106          | 0.5102 | 0.1839 |
| 0.4117        | 7.2860  | 4000  | 0.6811          | 0.5134 | 0.1873 |
| 0.3845        | 8.0146  | 4400  | 0.6847          | 0.5192 | 0.1877 |
| 0.3617        | 8.7432  | 4800  | 0.6870          | 0.5747 | 0.2167 |
| 0.3395        | 9.4718  | 5200  | 0.6642          | 0.5366 | 0.1938 |
| 0.323         | 10.2004 | 5600  | 0.6752          | 0.5127 | 0.1880 |
| 0.309         | 10.9290 | 6000  | 0.6696          | 0.5075 | 0.1838 |
| 0.2853        | 11.6576 | 6400  | 0.7383          | 0.5071 | 0.1852 |
| 0.2652        | 12.3862 | 6800  | 0.6571          | 0.5063 | 0.1834 |
| 0.2557        | 13.1148 | 7200  | 0.6866          | 0.4921 | 0.1791 |
| 0.2443        | 13.8434 | 7600  | 0.6916          | 0.4945 | 0.1800 |
| 0.224         | 14.5719 | 8000  | 0.6833          | 0.5365 | 0.1904 |
| 0.2202        | 15.3005 | 8400  | 0.6920          | 0.5205 | 0.1886 |
| 0.2117        | 16.0291 | 8800  | 0.7279          | 0.5241 | 0.1881 |
| 0.1932        | 16.7577 | 9200  | 0.6772          | 0.5071 | 0.1837 |
| 0.1874        | 17.4863 | 9600  | 0.7134          | 0.4961 | 0.1769 |
| 0.1733        | 18.2149 | 10000 | 0.7350          | 0.5096 | 0.1849 |
| 0.1705        | 18.9435 | 10400 | 0.7188          | 0.5021 | 0.1808 |
| 0.1631        | 19.6721 | 10800 | 0.7608          | 0.5126 | 0.1861 |
| 0.1518        | 20.4007 | 11200 | 0.7117          | 0.4884 | 0.1760 |
| 0.147         | 21.1293 | 11600 | 0.7853          | 0.4754 | 0.1698 |
| 0.1421        | 21.8579 | 12000 | 0.8173          | 0.4717 | 0.1682 |
| 0.1365        | 22.5865 | 12400 | 0.7970          | 0.4797 | 0.1734 |
| 0.1328        | 23.3151 | 12800 | 0.8331          | 0.4799 | 0.1752 |
| 0.1273        | 24.0437 | 13200 | 0.7806          | 0.4729 | 0.1713 |
| 0.1202        | 24.7723 | 13600 | 0.8037          | 0.4704 | 0.1666 |
| 0.1185        | 25.5009 | 14000 | 0.8201          | 0.4650 | 0.1690 |
| 0.1119        | 26.2295 | 14400 | 0.9294          | 0.4774 | 0.1720 |
| 0.1087        | 26.9581 | 14800 | 0.8380          | 0.4718 | 0.1697 |
| 0.1043        | 27.6867 | 15200 | 0.9948          | 0.4677 | 0.1670 |
| 0.1034        | 28.4153 | 15600 | 0.9864          | 0.4707 | 0.1689 |
| 0.0979        | 29.1439 | 16000 | 1.0066          | 0.4694 | 0.1704 |
| 0.0947        | 29.8725 | 16400 | 0.8745          | 0.4802 | 0.1709 |
| 0.0876        | 30.6011 | 16800 | 0.9511          | 0.4910 | 0.1750 |
| 0.0857        | 31.3297 | 17200 | 0.9594          | 0.4525 | 0.1625 |
| 0.0842        | 32.0583 | 17600 | 1.0274          | 0.4681 | 0.1662 |
| 0.08          | 32.7869 | 18000 | 0.9747          | 0.4623 | 0.1647 |
| 0.08          | 33.5155 | 18400 | 0.9912          | 0.4676 | 0.1640 |
| 0.077         | 34.2441 | 18800 | 1.1352          | 0.4629 | 0.1634 |
| 0.0753        | 34.9727 | 19200 | 1.0100          | 0.4542 | 0.1614 |
| 0.0712        | 35.7013 | 19600 | 1.0493          | 0.4554 | 0.1605 |
| 0.0679        | 36.4299 | 20000 | 1.1336          | 0.4528 | 0.1620 |
| 0.0664        | 37.1585 | 20400 | 1.1095          | 0.4496 | 0.1599 |
| 0.0679        | 37.8871 | 20800 | 1.0197          | 0.4576 | 0.1621 |
| 0.0615        | 38.6157 | 21200 | 1.1053          | 0.4567 | 0.1606 |
| 0.0588        | 39.3443 | 21600 | 1.1809          | 0.4469 | 0.1594 |
| 0.0594        | 40.0729 | 22000 | 1.1607          | 0.4538 | 0.1619 |
| 0.0566        | 40.8015 | 22400 | 1.1570          | 0.4498 | 0.1586 |
| 0.0567        | 41.5301 | 22800 | 1.1453          | 0.4505 | 0.1590 |
| 0.0554        | 42.2587 | 23200 | 1.1740          | 0.4563 | 0.1586 |
| 0.0524        | 42.9872 | 23600 | 1.1408          | 0.4557 | 0.1598 |
| 0.0504        | 43.7158 | 24000 | 1.1360          | 0.4511 | 0.1587 |
| 0.0492        | 44.4444 | 24400 | 1.2167          | 0.4487 | 0.1576 |
| 0.0505        | 45.1730 | 24800 | 1.1709          | 0.4442 | 0.1571 |
| 0.0479        | 45.9016 | 25200 | 1.2109          | 0.4443 | 0.1569 |
| 0.047         | 46.6302 | 25600 | 1.2031          | 0.4430 | 0.1555 |
| 0.0454        | 47.3588 | 26000 | 1.2316          | 0.4401 | 0.1555 |
| 0.0442        | 48.0874 | 26400 | 1.2515          | 0.4390 | 0.1554 |
| 0.0445        | 48.8160 | 26800 | 1.2538          | 0.4409 | 0.1554 |
| 0.041         | 49.5446 | 27200 | 1.2602          | 0.4408 | 0.1556 |
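
The WER and CER columns above can be reproduced with the `evaluate` library (which delegates to `jiwer` for these metrics). The `predictions` and `references` lists below are placeholders standing in for model transcripts and FLEURS ground truth.

```python
# Sketch: computing WER and CER with the `evaluate` library.
# Requires `pip install evaluate jiwer`.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["ñu ngi def"]  # placeholder hypothesis
references = ["ñu ngi dem"]   # placeholder reference

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```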

### Framework versions

- Transformers 4.44.1
- PyTorch 2.1.0+cu118
- Datasets 2.17.0
- Tokenizers 0.19.1