---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xls-r-300m-CV-Fleurs-lg-10hrs-v6
  results: []
---
# wav2vec2-xls-r-300m-CV-Fleurs-lg-10hrs-v6
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m). The dataset field was not recorded in the auto-generated card ("None"), but the model name indicates roughly 10 hours of Luganda (lg) speech drawn from Common Voice and FLEURS. It achieves the following results on the evaluation set:
- Loss: 1.3042
- Wer: 0.6058
- Cer: 0.1381
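
The card does not yet include a usage snippet; below is a minimal inference sketch using the `transformers` ASR pipeline. The repo id and audio file name are assumptions, not confirmed by this card.

```python
# Minimal inference sketch. The repo id below mirrors the model name on this
# card and is an assumption; substitute the actual Hub path if it differs.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="wav2vec2-xls-r-300m-CV-Fleurs-lg-10hrs-v6",  # assumed repo id
    device=0 if torch.cuda.is_available() else -1,
)

# Wav2Vec2 expects 16 kHz mono audio; the pipeline handles decoding/resampling.
print(asr("sample.wav")["text"])  # "sample.wav" is a placeholder file name
```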
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows this list):
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
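
For reference, the list above maps onto `TrainingArguments` roughly as follows. This is a sketch: `output_dir` and every setting not listed on this card are assumptions rather than values taken from the original run.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
# output_dir and anything not listed on this card are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-CV-Fleurs-lg-10hrs-v6",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```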
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
3.1664 | 1.0 | 1025 | 2.5342 | 1.0 | 0.8840 |
1.8493 | 2.0 | 2050 | 1.2129 | 0.9929 | 0.3862 |
1.2976 | 3.0 | 3075 | 0.9944 | 0.9482 | 0.2930 |
1.1249 | 4.0 | 4100 | 0.9405 | 0.9249 | 0.2753 |
0.9905 | 5.0 | 5125 | 0.8406 | 0.9072 | 0.2535 |
0.8961 | 6.0 | 6150 | 0.8121 | 0.8699 | 0.2368 |
0.8101 | 7.0 | 7175 | 0.7705 | 0.8607 | 0.2273 |
0.728 | 8.0 | 8200 | 0.7325 | 0.8330 | 0.2163 |
0.6653 | 9.0 | 9225 | 0.7483 | 0.8089 | 0.2060 |
0.6063 | 10.0 | 10250 | 0.7343 | 0.7981 | 0.2018 |
0.5503 | 11.0 | 11275 | 0.7499 | 0.7667 | 0.1887 |
0.5016 | 12.0 | 12300 | 0.7474 | 0.7734 | 0.1917 |
0.4574 | 13.0 | 13325 | 0.7665 | 0.7479 | 0.1858 |
0.4227 | 14.0 | 14350 | 0.8014 | 0.7605 | 0.1866 |
0.3898 | 15.0 | 15375 | 0.8140 | 0.7618 | 0.1817 |
0.3664 | 16.0 | 16400 | 0.7794 | 0.7234 | 0.1758 |
0.3389 | 17.0 | 17425 | 0.8175 | 0.7290 | 0.1762 |
0.3155 | 18.0 | 18450 | 0.8647 | 0.7284 | 0.1799 |
0.2991 | 19.0 | 19475 | 0.8268 | 0.7134 | 0.1731 |
0.2825 | 20.0 | 20500 | 0.9408 | 0.7312 | 0.1756 |
0.2665 | 21.0 | 21525 | 0.9131 | 0.7307 | 0.1715 |
0.253 | 22.0 | 22550 | 0.9645 | 0.7242 | 0.1747 |
0.2354 | 23.0 | 23575 | 0.9436 | 0.7125 | 0.1699 |
0.231 | 24.0 | 24600 | 0.9521 | 0.7239 | 0.1702 |
0.2178 | 25.0 | 25625 | 0.9751 | 0.7076 | 0.1694 |
0.2086 | 26.0 | 26650 | 0.9704 | 0.6945 | 0.1689 |
0.2002 | 27.0 | 27675 | 0.9937 | 0.7077 | 0.1682 |
0.1968 | 28.0 | 28700 | 0.9523 | 0.6959 | 0.1682 |
0.1889 | 29.0 | 29725 | 1.0351 | 0.6908 | 0.1653 |
0.182 | 30.0 | 30750 | 1.0054 | 0.6933 | 0.1644 |
0.1723 | 31.0 | 31775 | 1.0039 | 0.6930 | 0.1646 |
0.1695 | 32.0 | 32800 | 1.0005 | 0.6855 | 0.1632 |
0.1633 | 33.0 | 33825 | 1.0273 | 0.6897 | 0.1633 |
0.1571 | 34.0 | 34850 | 1.0361 | 0.6850 | 0.1615 |
0.1573 | 35.0 | 35875 | 1.0092 | 0.6767 | 0.1604 |
0.1511 | 36.0 | 36900 | 1.0353 | 0.6816 | 0.1622 |
0.1469 | 37.0 | 37925 | 1.0394 | 0.6716 | 0.1618 |
0.1495 | 38.0 | 38950 | 1.1006 | 0.6804 | 0.1621 |
0.1411 | 39.0 | 39975 | 1.1300 | 0.6742 | 0.1603 |
0.1391 | 40.0 | 41000 | 1.0378 | 0.6801 | 0.1591 |
0.138 | 41.0 | 42025 | 1.0655 | 0.6679 | 0.1581 |
0.1304 | 42.0 | 43050 | 1.1279 | 0.6777 | 0.1594 |
0.1308 | 43.0 | 44075 | 1.0743 | 0.6786 | 0.1572 |
0.128 | 44.0 | 45100 | 1.1424 | 0.6683 | 0.1569 |
0.1261 | 45.0 | 46125 | 1.0351 | 0.6787 | 0.1596 |
0.1242 | 46.0 | 47150 | 1.1587 | 0.6656 | 0.1556 |
0.1179 | 47.0 | 48175 | 1.1617 | 0.6538 | 0.1555 |
0.1164 | 48.0 | 49200 | 1.1593 | 0.6604 | 0.1567 |
0.1137 | 49.0 | 50225 | 1.1450 | 0.6622 | 0.1554 |
0.1102 | 50.0 | 51250 | 1.1221 | 0.6593 | 0.1555 |
0.1107 | 51.0 | 52275 | 1.1194 | 0.6618 | 0.1538 |
0.1088 | 52.0 | 53300 | 1.1452 | 0.6503 | 0.1527 |
0.1078 | 53.0 | 54325 | 1.1679 | 0.6529 | 0.1529 |
0.1054 | 54.0 | 55350 | 1.1926 | 0.6437 | 0.1503 |
0.1012 | 55.0 | 56375 | 1.1483 | 0.6568 | 0.1531 |
0.1047 | 56.0 | 57400 | 1.1756 | 0.6544 | 0.1528 |
0.0958 | 57.0 | 58425 | 1.2168 | 0.6531 | 0.1512 |
0.0966 | 58.0 | 59450 | 1.1973 | 0.6383 | 0.1493 |
0.0966 | 59.0 | 60475 | 1.1830 | 0.6493 | 0.1511 |
0.0948 | 60.0 | 61500 | 1.2027 | 0.6438 | 0.1509 |
0.0888 | 61.0 | 62525 | 1.1959 | 0.6413 | 0.1498 |
0.0909 | 62.0 | 63550 | 1.2046 | 0.6507 | 0.1512 |
0.0888 | 63.0 | 64575 | 1.2052 | 0.6347 | 0.1487 |
0.0869 | 64.0 | 65600 | 1.2118 | 0.6324 | 0.1482 |
0.0846 | 65.0 | 66625 | 1.1967 | 0.6365 | 0.1480 |
0.0833 | 66.0 | 67650 | 1.1957 | 0.6323 | 0.1460 |
0.0827 | 67.0 | 68675 | 1.1928 | 0.6370 | 0.1470 |
0.0824 | 68.0 | 69700 | 1.2578 | 0.6416 | 0.1472 |
0.08 | 69.0 | 70725 | 1.2427 | 0.6284 | 0.1447 |
0.0787 | 70.0 | 71750 | 1.2061 | 0.6295 | 0.1462 |
0.0777 | 71.0 | 72775 | 1.2185 | 0.6315 | 0.1454 |
0.0736 | 72.0 | 73800 | 1.2454 | 0.6237 | 0.1445 |
0.0746 | 73.0 | 74825 | 1.2629 | 0.6298 | 0.1464 |
0.0735 | 74.0 | 75850 | 1.2398 | 0.6218 | 0.1428 |
0.0724 | 75.0 | 76875 | 1.2727 | 0.6269 | 0.1440 |
0.0698 | 76.0 | 77900 | 1.2327 | 0.6259 | 0.1439 |
0.0677 | 77.0 | 78925 | 1.2338 | 0.6213 | 0.1442 |
0.0699 | 78.0 | 79950 | 1.2755 | 0.6226 | 0.1442 |
0.0656 | 79.0 | 80975 | 1.2734 | 0.6237 | 0.1431 |
0.0622 | 80.0 | 82000 | 1.2733 | 0.6211 | 0.1427 |
0.0648 | 81.0 | 83025 | 1.2345 | 0.6274 | 0.1421 |
0.0626 | 82.0 | 84050 | 1.2670 | 0.6273 | 0.1430 |
0.0632 | 83.0 | 85075 | 1.2634 | 0.6150 | 0.1422 |
0.0611 | 84.0 | 86100 | 1.3266 | 0.6192 | 0.1418 |
0.0608 | 85.0 | 87125 | 1.2889 | 0.6153 | 0.1414 |
0.0581 | 86.0 | 88150 | 1.2808 | 0.6146 | 0.1406 |
0.0586 | 87.0 | 89175 | 1.3120 | 0.6142 | 0.1406 |
0.0575 | 88.0 | 90200 | 1.2701 | 0.6171 | 0.1409 |
0.0577 | 89.0 | 91225 | 1.2916 | 0.6116 | 0.1400 |
0.0569 | 90.0 | 92250 | 1.3074 | 0.6132 | 0.1401 |
0.0552 | 91.0 | 93275 | 1.3030 | 0.6115 | 0.1388 |
0.0563 | 92.0 | 94300 | 1.2719 | 0.6082 | 0.1387 |
0.0516 | 93.0 | 95325 | 1.2853 | 0.6078 | 0.1380 |
0.0523 | 94.0 | 96350 | 1.2953 | 0.6096 | 0.1389 |
0.0489 | 95.0 | 97375 | 1.3099 | 0.6097 | 0.1387 |
0.0513 | 96.0 | 98400 | 1.3082 | 0.6095 | 0.1388 |
0.0522 | 97.0 | 99425 | 1.3076 | 0.6097 | 0.1384 |
0.0498 | 98.0 | 100450 | 1.3003 | 0.6073 | 0.1383 |
0.0506 | 99.0 | 101475 | 1.3012 | 0.6067 | 0.1382 |
0.0491 | 100.0 | 102500 | 1.3042 | 0.6058 | 0.1381 |
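
The Wer and Cer columns above are word and character error rates on the validation set (lower is better). A sketch of how such scores are typically computed with the `evaluate` library, using made-up strings:

```python
# Sketch of WER/CER computation with the `evaluate` library (requires jiwer).
# The prediction/reference strings are hypothetical, not from this model.
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

predictions = ["ekibuga kya kampala"]  # hypothetical model transcript
references = ["ekibuga kye kampala"]   # hypothetical ground truth

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```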
### Framework versions
- Transformers 4.46.1
- Pytorch 2.1.0+cu118
- Datasets 3.1.0
- Tokenizers 0.20.1