w2v2_bert-Wolof-28-hours-alffa-plus-fleurs-dataset

This model is a fine-tuned version of facebook/w2v-bert-2.0 on roughly 28 hours of Wolof speech from the ALFFA and FLEURS datasets. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

  • Loss: 1.7814
  • WER: 0.4379
  • CER: 0.1524
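
Below is a minimal usage sketch. It assumes the checkpoint loads with the standard `transformers` CTC classes for w2v-BERT 2.0 and that `audio.wav` is a placeholder path to a recording; treat it as illustrative, not as the authors' exact inference code.

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

# Repository id taken from this model card.
model_id = "asr-africa/w2v2_bert-Wolof-28-hours-alffa-plus-fleurs-dataset"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# "audio.wav" is a placeholder; resample to the 16 kHz rate the encoder expects.
speech, _ = librosa.load("audio.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: pick the most likely token per frame, then let the
# tokenizer collapse repeats and strip blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```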

Model description

This checkpoint adapts the facebook/w2v-bert-2.0 speech encoder (about 606M parameters, stored as F32 safetensors) to Wolof automatic speech recognition. Further details have not been documented.

Intended uses & limitations

The model is intended for transcribing Wolof speech sampled at 16 kHz. Other uses and its limitations have not been documented.

Training and evaluation data

Per the repository name, the model was fine-tuned on roughly 28 hours of Wolof speech combining the ALFFA and FLEURS corpora; the exact splits and preprocessing have not been documented.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
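
As referenced above, here is a hedged sketch of a `TrainingArguments` configuration reproducing these values; `output_dir` is a hypothetical path, and the optimizer bullet is covered by `Trainer`'s defaults rather than an explicit argument:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; "w2v2_bert-wolof" is a hypothetical output_dir.
training_args = TrainingArguments(
    output_dir="w2v2_bert-wolof",
    learning_rate=3e-4,
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    seed=42,
    gradient_accumulation_steps=2,   # total train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_ratio=0.1,                # lr_scheduler_warmup_ratio: 0.1
    num_train_epochs=50,
    # Trainer's default AdamW already uses betas=(0.9, 0.999) and eps=1e-8,
    # matching the optimizer settings listed above.
)
```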

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 2.3117 | 0.3643 | 400 | 1.0506 | 0.6341 | 0.2221 |
| 0.6003 | 0.7286 | 800 | 0.8588 | 0.5837 | 0.1976 |
| 0.5505 | 1.0929 | 1200 | 0.7719 | 0.5297 | 0.1822 |
| 0.5277 | 1.4572 | 1600 | 0.7566 | 0.5279 | 0.1722 |
| 0.5204 | 1.8215 | 2000 | 0.7483 | 0.5400 | 0.1853 |
| 0.5097 | 2.1858 | 2400 | 0.6775 | 0.5114 | 0.1724 |
| 0.5058 | 2.5501 | 2800 | 0.7678 | 0.5563 | 0.2016 |
| 0.5238 | 2.9144 | 3200 | 0.7946 | 0.5721 | 0.2025 |
| 0.5272 | 3.2787 | 3600 | 0.8778 | 0.5666 | 0.2171 |
| 0.5451 | 3.6430 | 4000 | 0.7964 | 0.5638 | 0.2050 |
| 0.557 | 4.0073 | 4400 | 0.8373 | 0.5667 | 0.2119 |
| 0.525 | 4.3716 | 4800 | 0.7748 | 0.5472 | 0.1924 |
| 0.5527 | 4.7359 | 5200 | 0.9768 | 0.6022 | 0.2476 |
| 0.5546 | 5.1002 | 5600 | 0.9679 | 0.6175 | 0.2315 |
| 0.524 | 5.4645 | 6000 | 0.8655 | 0.6170 | 0.2297 |
| 0.517 | 5.8288 | 6400 | 0.9147 | 0.6181 | 0.2262 |
| 0.4996 | 6.1931 | 6800 | 0.9784 | 0.6436 | 0.2970 |
| 0.4707 | 6.5574 | 7200 | 0.8462 | 0.5831 | 0.2189 |
| 0.484 | 6.9217 | 7600 | 0.9278 | 0.5697 | 0.2127 |
| 0.4413 | 7.2860 | 8000 | 0.9034 | 0.6156 | 0.2416 |
| 0.4386 | 7.6503 | 8400 | 0.9047 | 0.6102 | 0.2517 |
| 0.4138 | 8.0146 | 8800 | 0.8722 | 0.5843 | 0.2124 |
| 0.383 | 8.3789 | 9200 | 0.8663 | 0.6164 | 0.2370 |
| 0.3857 | 8.7432 | 9600 | 0.9396 | 0.5816 | 0.2361 |
| 0.3813 | 9.1075 | 10000 | 0.8662 | 0.6306 | 0.2451 |
| 0.3445 | 9.4718 | 10400 | 0.8234 | 0.5586 | 0.2111 |
| 0.3635 | 9.8361 | 10800 | 0.8676 | 0.5675 | 0.2314 |
| 0.3428 | 10.2004 | 11200 | 0.8794 | 0.5669 | 0.2251 |
| 0.3196 | 10.5647 | 11600 | 0.8398 | 0.5421 | 0.2090 |
| 0.3083 | 10.9290 | 12000 | 0.8148 | 0.5517 | 0.2203 |
| 0.2969 | 11.2933 | 12400 | 0.7556 | 0.5435 | 0.1995 |
| 0.2914 | 11.6576 | 12800 | 0.8325 | 0.5544 | 0.2109 |
| 0.2893 | 12.0219 | 13200 | 0.7453 | 0.5317 | 0.2037 |
| 0.2541 | 12.3862 | 13600 | 0.8518 | 0.5542 | 0.2170 |
| 0.2705 | 12.7505 | 14000 | 0.7374 | 0.5296 | 0.1921 |
| 0.2588 | 13.1148 | 14400 | 0.7741 | 0.5114 | 0.1910 |
| 0.2316 | 13.4791 | 14800 | 0.7961 | 0.5250 | 0.1943 |
| 0.2351 | 13.8434 | 15200 | 0.7988 | 0.5542 | 0.2155 |
| 0.2256 | 14.2077 | 15600 | 0.7971 | 0.5367 | 0.2014 |
| 0.2139 | 14.5719 | 16000 | 0.7724 | 0.5036 | 0.1855 |
| 0.2095 | 14.9362 | 16400 | 0.7601 | 0.5055 | 0.1848 |
| 0.1909 | 15.3005 | 16800 | 0.7622 | 0.5144 | 0.1859 |
| 0.1945 | 15.6648 | 17200 | 0.7337 | 0.5081 | 0.1855 |
| 0.1895 | 16.0291 | 17600 | 0.8038 | 0.5332 | 0.1965 |
| 0.1644 | 16.3934 | 18000 | 0.7720 | 0.5432 | 0.2024 |
| 0.175 | 16.7577 | 18400 | 0.7946 | 0.5175 | 0.1917 |
| 0.1786 | 17.1220 | 18800 | 0.7847 | 0.5441 | 0.1992 |
| 0.1617 | 17.4863 | 19200 | 0.7441 | 0.5015 | 0.1882 |
| 0.1529 | 17.8506 | 19600 | 0.7367 | 0.5024 | 0.1832 |
| 0.1437 | 18.2149 | 20000 | 0.7440 | 0.4999 | 0.1815 |
| 0.1348 | 18.5792 | 20400 | 0.7607 | 0.5010 | 0.1840 |
| 0.1421 | 18.9435 | 20800 | 0.7563 | 0.5430 | 0.1964 |
| 0.129 | 19.3078 | 21200 | 0.7929 | 0.5015 | 0.1855 |
| 0.1246 | 19.6721 | 21600 | 0.7812 | 0.5223 | 0.1943 |
| 0.1341 | 20.0364 | 22000 | 0.8550 | 0.5188 | 0.1985 |
| 0.1122 | 20.4007 | 22400 | 0.7657 | 0.5084 | 0.1875 |
| 0.1127 | 20.7650 | 22800 | 0.7788 | 0.5211 | 0.1896 |
| 0.1131 | 21.1293 | 23200 | 0.8108 | 0.4675 | 0.1706 |
| 0.0977 | 21.4936 | 23600 | 0.7568 | 0.5405 | 0.1895 |
| 0.0993 | 21.8579 | 24000 | 0.7105 | 0.4919 | 0.1768 |
| 0.0921 | 22.2222 | 24400 | 0.8427 | 0.4973 | 0.1843 |
| 0.0905 | 22.5865 | 24800 | 0.7752 | 0.5114 | 0.1785 |
| 0.0903 | 22.9508 | 25200 | 0.7315 | 0.5051 | 0.1800 |
| 0.0786 | 23.3151 | 25600 | 0.8089 | 0.4909 | 0.1827 |
| 0.0911 | 23.6794 | 26000 | 0.8048 | 0.5161 | 0.1877 |
| 0.0875 | 24.0437 | 26400 | 0.8438 | 0.5413 | 0.1979 |
| 0.0775 | 24.4080 | 26800 | 0.8683 | 0.5032 | 0.1842 |
| 0.0798 | 24.7723 | 27200 | 0.7693 | 0.5066 | 0.1846 |
| 0.0681 | 25.1366 | 27600 | 0.7252 | 0.4901 | 0.1731 |
| 0.0621 | 25.5009 | 28000 | 0.7520 | 0.4814 | 0.1710 |
| 0.0645 | 25.8652 | 28400 | 0.7620 | 0.4706 | 0.1660 |
| 0.0636 | 26.2295 | 28800 | 0.7567 | 0.4823 | 0.1709 |
| 0.0579 | 26.5938 | 29200 | 0.7601 | 0.4824 | 0.1708 |
| 0.0626 | 26.9581 | 29600 | 0.7750 | 0.4738 | 0.1714 |
| 0.053 | 27.3224 | 30000 | 0.7709 | 0.4751 | 0.1692 |
| 0.0513 | 27.6867 | 30400 | 0.7936 | 0.4738 | 0.1692 |
| 0.0575 | 28.0510 | 30800 | 0.8438 | 0.4816 | 0.1726 |
| 0.0487 | 28.4153 | 31200 | 0.7352 | 0.4718 | 0.1656 |
| 0.0462 | 28.7796 | 31600 | 0.7660 | 0.4612 | 0.1621 |
| 0.0434 | 29.1439 | 32000 | 0.7735 | 0.4778 | 0.1684 |
| 0.0424 | 29.5082 | 32400 | 0.8004 | 0.4660 | 0.1628 |
| 0.0405 | 29.8725 | 32800 | 0.7835 | 0.4713 | 0.1637 |
| 0.0374 | 30.2368 | 33200 | 0.8197 | 0.4632 | 0.1664 |
| 0.0378 | 30.6011 | 33600 | 0.8158 | 0.4658 | 0.1620 |
| 0.0347 | 30.9654 | 34000 | 0.8216 | 0.4600 | 0.1578 |
| 0.033 | 31.3297 | 34400 | 0.7858 | 0.4769 | 0.1686 |
| 0.0325 | 31.6940 | 34800 | 0.7995 | 0.4725 | 0.1670 |
| 0.0312 | 32.0583 | 35200 | 0.8798 | 0.4961 | 0.1765 |
| 0.0297 | 32.4226 | 35600 | 0.8786 | 0.4604 | 0.1623 |
| 0.031 | 32.7869 | 36000 | 0.8855 | 0.4665 | 0.1630 |
| 0.0278 | 33.1512 | 36400 | 0.8873 | 0.4732 | 0.1702 |
| 0.024 | 33.5155 | 36800 | 0.9000 | 0.4787 | 0.1693 |
| 0.0272 | 33.8798 | 37200 | 0.8656 | 0.4759 | 0.1692 |
| 0.0199 | 34.2441 | 37600 | 0.9720 | 0.4588 | 0.1584 |
| 0.0198 | 34.6084 | 38000 | 0.9094 | 0.4652 | 0.1623 |
| 0.0216 | 34.9727 | 38400 | 0.8951 | 0.4841 | 0.1713 |
| 0.0167 | 35.3370 | 38800 | 0.9824 | 0.4806 | 0.1663 |
| 0.0174 | 35.7013 | 39200 | 0.9770 | 0.4936 | 0.1716 |
| 0.0198 | 36.0656 | 39600 | 0.9284 | 0.4749 | 0.1644 |
| 0.0153 | 36.4299 | 40000 | 1.0008 | 0.4796 | 0.1697 |
| 0.015 | 36.7942 | 40400 | 1.1019 | 0.4770 | 0.1641 |
| 0.0145 | 37.1585 | 40800 | 1.0591 | 0.4663 | 0.1605 |
| 0.0119 | 37.5228 | 41200 | 1.0535 | 0.4581 | 0.1607 |
| 0.0121 | 37.8871 | 41600 | 1.0635 | 0.4657 | 0.1634 |
| 0.0141 | 38.2514 | 42000 | 1.0896 | 0.4681 | 0.1633 |
| 0.0104 | 38.6157 | 42400 | 1.1029 | 0.4588 | 0.1613 |
| 0.0112 | 38.9800 | 42800 | 1.1009 | 0.4586 | 0.1614 |
| 0.0084 | 39.3443 | 43200 | 1.1865 | 0.4674 | 0.1642 |
| 0.009 | 39.7086 | 43600 | 1.0865 | 0.4625 | 0.1664 |
| 0.009 | 40.0729 | 44000 | 1.1308 | 0.4678 | 0.1620 |
| 0.0064 | 40.4372 | 44400 | 1.1246 | 0.4624 | 0.1645 |
| 0.008 | 40.8015 | 44800 | 1.1420 | 0.4481 | 0.1577 |
| 0.0074 | 41.1658 | 45200 | 1.1738 | 0.4543 | 0.1570 |
| 0.0065 | 41.5301 | 45600 | 1.1550 | 0.4598 | 0.1591 |
| 0.0063 | 41.8944 | 46000 | 1.1695 | 0.4582 | 0.1600 |
| 0.0049 | 42.2587 | 46400 | 1.2457 | 0.4456 | 0.1530 |
| 0.0054 | 42.6230 | 46800 | 1.2477 | 0.4554 | 0.1566 |
| 0.0054 | 42.9872 | 47200 | 1.2428 | 0.4483 | 0.1597 |
| 0.0042 | 43.3515 | 47600 | 1.2694 | 0.4598 | 0.1584 |
| 0.0041 | 43.7158 | 48000 | 1.3141 | 0.4463 | 0.1552 |
| 0.0039 | 44.0801 | 48400 | 1.3956 | 0.4463 | 0.1555 |
| 0.0026 | 44.4444 | 48800 | 1.3849 | 0.4437 | 0.1528 |
| 0.0028 | 44.8087 | 49200 | 1.4267 | 0.4565 | 0.1564 |
| 0.0027 | 45.1730 | 49600 | 1.4942 | 0.4479 | 0.1542 |
| 0.0021 | 45.5373 | 50000 | 1.4483 | 0.4451 | 0.1544 |
| 0.002 | 45.9016 | 50400 | 1.5475 | 0.4431 | 0.1535 |
| 0.0019 | 46.2659 | 50800 | 1.4928 | 0.4450 | 0.1536 |
| 0.0014 | 46.6302 | 51200 | 1.5448 | 0.4466 | 0.1566 |
| 0.0015 | 46.9945 | 51600 | 1.5942 | 0.4418 | 0.1537 |
| 0.0009 | 47.3588 | 52000 | 1.6472 | 0.4390 | 0.1520 |
| 0.0009 | 47.7231 | 52400 | 1.6661 | 0.4409 | 0.1522 |
| 0.0008 | 48.0874 | 52800 | 1.7172 | 0.4389 | 0.1517 |
| 0.0005 | 48.4517 | 53200 | 1.7628 | 0.4409 | 0.1526 |
| 0.0007 | 48.8160 | 53600 | 1.7515 | 0.4379 | 0.1513 |
| 0.0006 | 49.1803 | 54000 | 1.7836 | 0.4400 | 0.1523 |
| 0.0008 | 49.5446 | 54400 | 1.7794 | 0.4394 | 0.1525 |
| 0.0006 | 49.9089 | 54800 | 1.7814 | 0.4379 | 0.1524 |
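
WER and CER in the table are word and character error rates (lower is better). Below is a minimal sketch of how such scores are typically computed with the Hugging Face `evaluate` library; the two strings are toy examples, not drawn from this model's evaluation set:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Toy transcripts for illustration only.
predictions = ["salaam aleekum nanga def"]
references = ["salaam aleekum naka nga def"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```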

Framework versions

  • Transformers 4.44.1
  • PyTorch 2.1.0+cu118
  • Datasets 2.17.0
  • Tokenizers 0.19.1