---
language:
- nl
license: apache-2.0
tags:
- automatic-speech-recognition
- mozilla-foundation/common_voice_8_0
- generated_from_trainer
- nl
- robust-speech-event
- model_for_talk
datasets:
- mozilla-foundation/common_voice_8_0
model-index:
- name: wav2vec2-large-xls-r-300m-cv8-nl
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Common Voice 8
      type: mozilla-foundation/common_voice_8_0
      args: nl
    metrics:
    - name: Test WER
      type: wer
      value: 17.56
    - name: Test CER
      type: cer
      value: 5.49
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Robust Speech Event - Dev Data
      type: speech-recognition-community-v2/dev_data
      args: nl
    metrics:
    - name: Test WER
      type: wer
      value: 39.25
    - name: Test CER
      type: cer
      value: 16.64
---

# wav2vec2-large-xls-r-300m-cv8-nl

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the Dutch common_voice dataset. It achieves the following results on the Common Voice test set:
- Loss: not reported; see the validation (eval) loss in the training results below
- WER: 0.371 (see the model index above for detailed metrics)

## Model description

Dutch wav2vec2-xls-r-300m model.
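As a quick illustration (not part of the original card), the sketch below shows how a checkpoint like this is typically loaded for inference with the standard `Wav2Vec2Processor`/`Wav2Vec2ForCTC` classes from Transformers; the repository id and the audio path are placeholder assumptions:

```python
# Minimal inference sketch. Assumes torch, torchaudio, and transformers
# are installed; the model id and "dutch_sample.wav" are placeholders.
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-large-xls-r-300m-cv8-nl"  # placeholder repository id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load the audio and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("dutch_sample.wav")  # mono assumed
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

# Run CTC inference and greedy-decode the logits into text.
inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```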
## Intended uses & limitations

More information needed.

## Training and evaluation data

The model was trained on Dutch Common Voice 8 for 75 epochs. The training set was the Common Voice 8 train split, and the evaluation set was the Common Voice 8 validation split. The reported WER is measured on the Common Voice 8 test split, which was part of neither training nor validation (eval).

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 75
- mixed_precision_training: Native AMP
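As a hedged sketch of how these values map onto the Hugging Face `Trainer` API (the `output_dir` is a placeholder; the Adam betas and epsilon listed above are the library defaults):

```python
# Sketch: the reported hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-cv8-nl",  # placeholder
    learning_rate=7.5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # total train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=75,
    fp16=True,  # Native AMP mixed-precision training
)
```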
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.4631 | 0.45 | 400 | 3.0212 | 1.0 |
| 2.8895 | 0.9 | 800 | 2.8555 | 1.0 |
| 2.5142 | 1.36 | 1200 | 0.8149 | 0.6375 |
| 1.5593 | 1.81 | 1600 | 0.3854 | 0.3574 |
| 1.3414 | 2.26 | 2000 | 0.3059 | 0.3060 |
| 1.2564 | 2.71 | 2400 | 0.2728 | 0.2898 |
| 1.2059 | 3.17 | 2800 | 0.2637 | 0.2748 |
| 1.1632 | 3.62 | 3200 | 0.2366 | 0.2637 |
| 1.1177 | 4.07 | 3600 | 0.2285 | 0.2478 |
| 1.097 | 4.52 | 4000 | 0.2194 | 0.2408 |
| 1.086 | 4.98 | 4400 | 0.2138 | 0.2340 |
| 1.0584 | 5.43 | 4800 | 0.2100 | 0.2294 |
| 1.0539 | 5.88 | 5200 | 0.2033 | 0.2285 |
| 1.042 | 6.33 | 5600 | 0.2066 | 0.2320 |
| 1.0253 | 6.79 | 6000 | 0.2012 | 0.2260 |
| 1.0115 | 7.24 | 6400 | 0.1960 | 0.2201 |
| 1.007 | 7.69 | 6800 | 0.1983 | 0.2210 |
| 0.9987 | 8.14 | 7200 | 0.1955 | 0.2212 |
| 0.9864 | 8.6 | 7600 | 0.1879 | 0.2157 |
| 0.9831 | 9.05 | 8000 | 0.1910 | 0.2221 |
| 0.9707 | 9.5 | 8400 | 0.1944 | 0.2199 |
| 0.9717 | 9.95 | 8800 | 0.1885 | 0.2133 |
| 0.9536 | 10.41 | 9200 | 0.1981 | 0.2157 |
| 0.9551 | 10.86 | 9600 | 0.1880 | 0.2184 |
| 0.9449 | 11.31 | 10000 | 0.1938 | 0.2233 |
| 0.9393 | 11.76 | 10400 | 0.1926 | 0.2136 |
| 0.9348 | 12.22 | 10800 | 0.1893 | 0.2143 |
| 0.9379 | 12.67 | 11200 | 0.1935 | 0.2206 |
| 0.931 | 13.12 | 11600 | 0.1868 | 0.2148 |
| 0.9171 | 13.57 | 12000 | 0.1880 | 0.2085 |
| 0.92 | 14.03 | 12400 | 0.1963 | 0.2132 |
| 0.9057 | 14.48 | 12800 | 0.1881 | 0.2131 |
| 0.9074 | 14.93 | 13200 | 0.1839 | 0.2146 |
| 0.8979 | 15.38 | 13600 | 0.1834 | 0.1954 |
| 0.9079 | 15.84 | 14000 | 0.1800 | 0.1899 |
| 0.8863 | 16.29 | 14400 | 0.1845 | 0.2058 |
| 0.8931 | 16.74 | 14800 | 0.1902 | 0.1973 |
| 0.8859 | 17.19 | 15200 | 0.1956 | 0.1982 |
| 0.8898 | 17.65 | 15600 | 0.1784 | 0.1946 |
| 0.8827 | 18.1 | 16000 | 0.1883 | 0.2067 |
| 0.8882 | 18.55 | 16400 | 0.1851 | 0.2006 |
| 0.884 | 19.0 | 16800 | 0.1877 | 0.1978 |
| 0.8717 | 19.46 | 17200 | 0.1804 | 0.1994 |
| 0.8642 | 19.91 | 17600 | 0.1758 | 0.1987 |
| 0.8635 | 20.36 | 18000 | 0.1840 | 0.2003 |
| 0.8687 | 20.81 | 18400 | 0.1782 | 0.2082 |
| 0.8674 | 21.27 | 18800 | 0.1803 | 0.2046 |
| 0.8555 | 21.72 | 19200 | 0.1858 | 0.2059 |
| 0.8542 | 22.17 | 19600 | 0.1850 | 0.1958 |
| 0.8551 | 22.62 | 20000 | 0.1825 | 0.1946 |
| 0.8424 | 23.08 | 20400 | 0.1827 | 0.1726 |
| 0.8424 | 23.53 | 20800 | 0.1843 | 0.1936 |
| 0.8498 | 23.98 | 21200 | 0.1810 | 0.1985 |
| 0.8299 | 24.43 | 21600 | 0.1774 | 0.1888 |
| 0.8361 | 24.89 | 22000 | 0.1927 | 0.1942 |
| 0.841 | 25.34 | 22400 | 0.1871 | 0.1903 |
| 0.8277 | 25.79 | 22800 | 0.1786 | 0.1867 |
| 0.8272 | 26.24 | 23200 | 0.1893 | 0.1616 |
| 0.8321 | 26.7 | 23600 | 0.1856 | 0.1521 |
| 0.8321 | 27.15 | 24000 | 0.1807 | 0.1477 |
| 0.8212 | 27.6 | 24400 | 0.1777 | 0.1508 |
| 0.8238 | 28.05 | 24800 | 0.1829 | 0.1539 |
| 0.8158 | 28.51 | 25200 | 0.1888 | 0.1619 |
| 0.8042 | 28.96 | 25600 | 0.1864 | 0.1510 |
| 0.8141 | 29.41 | 26000 | 0.1909 | 0.1548 |
| 0.8119 | 29.86 | 26400 | 0.1842 | 0.1523 |
| 0.8023 | 30.32 | 26800 | 0.1852 | 0.1459 |
| 0.8043 | 30.77 | 27200 | 0.1747 | 0.1496 |
| 0.8082 | 31.22 | 27600 | 0.1827 | 0.1512 |
| 0.8011 | 31.67 | 28000 | 0.1850 | 0.1480 |
| 0.7869 | 32.13 | 28400 | 0.1816 | 0.1502 |
| 0.7975 | 32.58 | 28800 | 0.1832 | 0.1511 |
| 0.7811 | 33.03 | 29200 | 0.1810 | 0.1429 |
| 0.7982 | 33.48 | 29600 | 0.1706 | 0.1407 |
| 0.8007 | 33.94 | 30000 | 0.1844 | 0.1548 |
| 0.7907 | 34.39 | 30400 | 0.1843 | 0.1539 |
| 0.8005 | 34.84 | 30800 | 0.1798 | 0.1462 |
| 0.7769 | 35.29 | 31200 | 0.1798 | 0.1494 |
| 0.7869 | 35.75 | 31600 | 0.1868 | 0.1643 |
| 0.7789 | 36.2 | 32000 | 0.1817 | 0.1477 |
| 0.7881 | 36.65 | 32400 | 0.1801 | 0.1419 |
| 0.7832 | 37.1 | 32800 | 0.1765 | 0.1454 |
| 0.778 | 37.56 | 33200 | 0.1779 | 0.1467 |
| 0.779 | 38.01 | 33600 | 0.1829 | 0.1565 |
| 0.7693 | 38.46 | 34000 | 0.1748 | 0.1583 |
| 0.7765 | 38.91 | 34400 | 0.1842 | 0.1683 |
| 0.7786 | 39.37 | 34800 | 0.1897 | 0.1543 |
| 0.7652 | 39.82 | 35200 | 0.1861 | 0.1495 |
| 0.773 | 40.27 | 35600 | 0.1775 | 0.1419 |
| 0.7625 | 40.72 | 36000 | 0.1916 | 0.1525 |
| 0.7625 | 41.18 | 36400 | 0.1800 | 0.1429 |
| 0.7548 | 41.63 | 36800 | 0.1788 | 0.1464 |
| 0.7608 | 42.08 | 37200 | 0.1841 | 0.1457 |
| 0.7614 | 42.53 | 37600 | 0.1805 | 0.1401 |
| 0.7646 | 42.99 | 38000 | 0.1863 | 0.1455 |
| 0.7488 | 43.44 | 38400 | 0.1903 | 0.1479 |
| 0.7566 | 43.89 | 38800 | 0.1825 | 0.1414 |
| 0.7495 | 44.34 | 39200 | 0.1873 | 0.1476 |
| 0.7453 | 44.8 | 39600 | 0.1887 | 0.1473 |
| 0.7414 | 45.25 | 40000 | 0.1783 | 0.1430 |
| 0.7431 | 45.7 | 40400 | 0.1866 | 0.1459 |
| 0.7405 | 46.15 | 40800 | 0.1847 | 0.1442 |
| 0.7421 | 46.61 | 41200 | 0.1824 | 0.1626 |
| 0.7423 | 47.06 | 41600 | 0.1843 | 0.1443 |
| 0.7405 | 47.51 | 42000 | 0.1787 | 0.1444 |
| 0.7339 | 47.96 | 42400 | 0.1764 | 0.1646 |
| 0.7297 | 48.42 | 42800 | 0.1749 | 0.1430 |
| 0.7397 | 48.87 | 43200 | 0.1823 | 0.1518 |
| 0.7328 | 49.32 | 43600 | 0.1838 | 0.1565 |
| 0.7342 | 49.77 | 44000 | 0.1797 | 0.1628 |
| 0.7408 | 50.23 | 44400 | 0.1771 | 0.1641 |
| 0.7286 | 50.68 | 44800 | 0.1826 | 0.1692 |
| 0.7305 | 51.13 | 45200 | 0.1760 | 0.1673 |
| 0.721 | 51.58 | 45600 | 0.1769 | 0.1611 |
| 0.7354 | 52.04 | 46000 | 0.1836 | 0.1604 |
| 0.7181 | 52.49 | 46400 | 0.1777 | 0.1576 |
| 0.7212 | 52.94 | 46800 | 0.1809 | 0.1461 |
| 0.7177 | 53.39 | 47200 | 0.1768 | 0.1430 |
| 0.7173 | 53.85 | 47600 | 0.1759 | 0.1388 |
| 0.7135 | 54.3 | 48000 | 0.1668 | 0.1325 |
| 0.7114 | 54.75 | 48400 | 0.1793 | 0.1422 |
| 0.7104 | 55.2 | 48800 | 0.1735 | 0.1440 |
| 0.7135 | 55.66 | 49200 | 0.1795 | 0.1577 |
| 0.7096 | 56.11 | 49600 | 0.1803 | 0.1589 |
| 0.709 | 56.56 | 50000 | 0.1790 | 0.1651 |
| 0.7044 | 57.01 | 50400 | 0.1795 | 0.1632 |
| 0.7081 | 57.47 | 50800 | 0.1738 | 0.1490 |
| 0.6993 | 57.92 | 51200 | 0.1745 | 0.1406 |
| 0.6972 | 58.37 | 51600 | 0.1734 | 0.1380 |
| 0.6984 | 58.82 | 52000 | 0.1799 | 0.1402 |
| 0.7066 | 59.28 | 52400 | 0.1727 | 0.1381 |
| 0.7046 | 59.73 | 52800 | 0.1760 | 0.1360 |
| 0.7024 | 60.18 | 53200 | 0.1793 | 0.1526 |
| 0.6951 | 60.63 | 53600 | 0.1832 | 0.1598 |
| 0.6987 | 61.09 | 54000 | 0.1771 | 0.1563 |
| 0.6966 | 61.54 | 54400 | 0.1768 | 0.1388 |
| 0.6937 | 61.99 | 54800 | 0.1728 | 0.1374 |
| 0.6882 | 62.44 | 55200 | 0.1782 | 0.1385 |
| 0.6919 | 62.9 | 55600 | 0.1781 | 0.1395 |
| 0.6856 | 63.35 | 56000 | 0.1721 | 0.1351 |
| 0.6948 | 63.8 | 56400 | 0.1761 | 0.1383 |
| 0.6947 | 64.25 | 56800 | 0.1701 | 0.1352 |
| 0.6831 | 64.71 | 57200 | 0.1751 | 0.1371 |
| 0.6858 | 65.16 | 57600 | 0.1704 | 0.1383 |
| 0.6787 | 65.61 | 58000 | 0.1730 | 0.1457 |
| 0.6897 | 66.06 | 58400 | 0.1728 | 0.1412 |
| 0.6845 | 66.52 | 58800 | 0.1734 | 0.1394 |
| 0.6763 | 66.97 | 59200 | 0.1741 | 0.1408 |
| 0.6801 | 67.42 | 59600 | 0.1742 | 0.1460 |
| 0.6901 | 67.87 | 60000 | 0.1755 | 0.1449 |
| 0.6802 | 68.33 | 60400 | 0.1743 | 0.1424 |
| 0.6791 | 68.78 | 60800 | 0.1721 | 0.1359 |
| 0.6819 | 69.23 | 61200 | 0.1749 | 0.1363 |
| 0.6794 | 69.68 | 61600 | 0.1770 | 0.1369 |
| 0.6734 | 70.14 | 62000 | 0.1756 | 0.1353 |
| 0.6811 | 70.59 | 62400 | 0.1777 | 0.1371 |
| 0.6813 | 71.04 | 62800 | 0.1763 | 0.1362 |
| 0.6675 | 71.49 | 63200 | 0.1769 | 0.1372 |
| 0.668 | 71.95 | 63600 | 0.1751 | 0.1368 |
| 0.6695 | 72.4 | 64000 | 0.1757 | 0.1370 |
| 0.668 | 72.85 | 64400 | 0.1758 | 0.1363 |
| 0.667 | 73.3 | 64800 | 0.1769 | 0.1363 |
| 0.6634 | 73.76 | 65200 | 0.1763 | 0.1361 |
| 0.676 | 74.21 | 65600 | 0.1751 | 0.1358 |
| 0.667 | 74.66 | 66000 | 0.1755 | 0.1362 |

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
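For reference (an illustrative note, not from the original card), the WER and CER metrics quoted in this card can be computed with a package such as `jiwer`; the Dutch sentence pair below is a made-up placeholder:

```python
# Sketch: word and character error rates with jiwer (pip install jiwer).
import jiwer

reference = "de kat zit op de mat"   # ground-truth transcript (placeholder)
hypothesis = "de kat zat op de mat"  # model output (placeholder)

print(f"WER: {jiwer.wer(reference, hypothesis):.4f}")  # one substituted word out of six
print(f"CER: {jiwer.cer(reference, hypothesis):.4f}")  # one substituted character
```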