
wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-5hrs-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on a 5-hour Swahili speech dataset (the model name indicates a mix of Common Voice, FLEURS, AMMI, and ALFFA data; the auto-generated card does not list the dataset explicitly). It achieves the following results on the evaluation set:

  • Loss: 1.6073
  • WER: 0.4634
  • CER: 0.1366
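
Since the card lacks usage details, here is a minimal inference sketch using the standard Wav2Vec2 CTC interface from transformers, assuming a 16 kHz mono recording (the audio file path is hypothetical):

```python
import torch
import soundfile as sf
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "asr-africa/wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-5hrs-v1"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load a 16 kHz mono recording (hypothetical file; XLS-R expects 16 kHz input).
speech, sample_rate = sf.read("swahili_sample.wav")
inputs = processor(speech, sampling_rate=sample_rate, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: frame-wise argmax, then collapse repeats and blanks.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```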

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
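
For reference, these values map onto transformers.TrainingArguments roughly as follows. This is a sketch under the assumption of the standard Trainer-based CTC fine-tuning recipe; the output_dir is hypothetical:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-sw-5hrs",  # hypothetical
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # yields the effective train batch size of 16
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed precision
)
```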

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 7.8759 | 1.0 | 179 | 3.0491 | 1.0 | 1.0 |
| 5.9463 | 2.0 | 358 | 2.9001 | 1.0 | 1.0 |
| 5.5686 | 3.0 | 537 | 2.4338 | 1.0 | 0.9902 |
| 3.9978 | 4.0 | 716 | 1.5262 | 0.9887 | 0.4761 |
| 3.0408 | 5.0 | 895 | 1.3407 | 0.9508 | 0.3576 |
| 2.6151 | 6.0 | 1074 | 1.1558 | 0.8725 | 0.2897 |
| 2.2732 | 7.0 | 1253 | 1.0802 | 0.8431 | 0.2686 |
| 2.0209 | 8.0 | 1432 | 1.0647 | 0.7911 | 0.2462 |
| 1.8618 | 9.0 | 1611 | 1.0133 | 0.7449 | 0.2271 |
| 1.6644 | 10.0 | 1790 | 0.9757 | 0.7252 | 0.2150 |
| 1.4638 | 11.0 | 1969 | 0.9112 | 0.7038 | 0.2120 |
| 1.3418 | 12.0 | 2148 | 0.9370 | 0.6767 | 0.2005 |
| 1.2076 | 13.0 | 2327 | 0.9517 | 0.6633 | 0.1979 |
| 1.1146 | 14.0 | 2506 | 0.9519 | 0.6507 | 0.1939 |
| 1.0189 | 15.0 | 2685 | 0.9588 | 0.6419 | 0.1893 |
| 0.9423 | 16.0 | 2864 | 1.0426 | 0.6284 | 0.1865 |
| 0.8809 | 17.0 | 3043 | 0.9869 | 0.6203 | 0.1839 |
| 0.8469 | 18.0 | 3222 | 1.0109 | 0.6090 | 0.1795 |
| 0.7525 | 19.0 | 3401 | 1.0700 | 0.6030 | 0.1787 |
| 0.728 | 20.0 | 3580 | 1.0819 | 0.6012 | 0.1774 |
| 0.6834 | 21.0 | 3759 | 1.0525 | 0.5909 | 0.1773 |
| 0.6571 | 22.0 | 3938 | 1.1196 | 0.5857 | 0.1739 |
| 0.6178 | 23.0 | 4117 | 1.0958 | 0.5818 | 0.1729 |
| 0.5993 | 24.0 | 4296 | 1.1396 | 0.5916 | 0.1717 |
| 0.577 | 25.0 | 4475 | 1.1452 | 0.5719 | 0.1684 |
| 0.5449 | 26.0 | 4654 | 1.1572 | 0.5660 | 0.1655 |
| 0.5367 | 27.0 | 4833 | 1.1660 | 0.5697 | 0.1674 |
| 0.5024 | 28.0 | 5012 | 1.2139 | 0.5561 | 0.1640 |
| 0.494 | 29.0 | 5191 | 1.2187 | 0.5681 | 0.1662 |
| 0.469 | 30.0 | 5370 | 1.3079 | 0.5620 | 0.1664 |
| 0.4669 | 31.0 | 5549 | 1.3099 | 0.5548 | 0.1654 |
| 0.4561 | 32.0 | 5728 | 1.3299 | 0.5517 | 0.1625 |
| 0.4442 | 33.0 | 5907 | 1.2960 | 0.5444 | 0.1619 |
| 0.4089 | 34.0 | 6086 | 1.4029 | 0.5412 | 0.1604 |
| 0.4129 | 35.0 | 6265 | 1.3533 | 0.5407 | 0.1592 |
| 0.3945 | 36.0 | 6444 | 1.3797 | 0.5427 | 0.1594 |
| 0.3711 | 37.0 | 6623 | 1.3541 | 0.5340 | 0.1561 |
| 0.3848 | 38.0 | 6802 | 1.3520 | 0.5362 | 0.1578 |
| 0.3825 | 39.0 | 6981 | 1.3308 | 0.5403 | 0.1574 |
| 0.3616 | 40.0 | 7160 | 1.4293 | 0.5353 | 0.1570 |
| 0.3462 | 41.0 | 7339 | 1.3654 | 0.5348 | 0.1573 |
| 0.345 | 42.0 | 7518 | 1.3706 | 0.5350 | 0.1558 |
| 0.3395 | 43.0 | 7697 | 1.3590 | 0.5248 | 0.1542 |
| 0.3412 | 44.0 | 7876 | 1.4078 | 0.5320 | 0.1553 |
| 0.3191 | 45.0 | 8055 | 1.3834 | 0.5156 | 0.1520 |
| 0.3186 | 46.0 | 8234 | 1.4047 | 0.5281 | 0.1547 |
| 0.3097 | 47.0 | 8413 | 1.4442 | 0.5247 | 0.1538 |
| 0.3038 | 48.0 | 8592 | 1.4164 | 0.5259 | 0.1544 |
| 0.2987 | 49.0 | 8771 | 1.3756 | 0.5186 | 0.1528 |
| 0.2882 | 50.0 | 8950 | 1.4459 | 0.5108 | 0.1503 |
| 0.2876 | 51.0 | 9129 | 1.4389 | 0.5180 | 0.1523 |
| 0.2734 | 52.0 | 9308 | 1.4438 | 0.5157 | 0.1506 |
| 0.2757 | 53.0 | 9487 | 1.4145 | 0.5143 | 0.1504 |
| 0.2673 | 54.0 | 9666 | 1.4038 | 0.5016 | 0.1480 |
| 0.2616 | 55.0 | 9845 | 1.4359 | 0.5077 | 0.1499 |
| 0.2644 | 56.0 | 10024 | 1.3912 | 0.5063 | 0.1500 |
| 0.2518 | 57.0 | 10203 | 1.4313 | 0.5058 | 0.1487 |
| 0.2475 | 58.0 | 10382 | 1.5002 | 0.5121 | 0.1513 |
| 0.239 | 59.0 | 10561 | 1.4860 | 0.5089 | 0.1492 |
| 0.2621 | 60.0 | 10740 | 1.4857 | 0.4987 | 0.1468 |
| 0.2403 | 61.0 | 10919 | 1.4985 | 0.5006 | 0.1477 |
| 0.2379 | 62.0 | 11098 | 1.4909 | 0.4986 | 0.1466 |
| 0.239 | 63.0 | 11277 | 1.4673 | 0.4903 | 0.1460 |
| 0.2282 | 64.0 | 11456 | 1.5314 | 0.4953 | 0.1461 |
| 0.2233 | 65.0 | 11635 | 1.5335 | 0.4937 | 0.1455 |
| 0.2315 | 66.0 | 11814 | 1.5235 | 0.4998 | 0.1467 |
| 0.2103 | 67.0 | 11993 | 1.5594 | 0.4890 | 0.1440 |
| 0.2184 | 68.0 | 12172 | 1.5146 | 0.4916 | 0.1449 |
| 0.211 | 69.0 | 12351 | 1.5347 | 0.4881 | 0.1437 |
| 0.2065 | 70.0 | 12530 | 1.4990 | 0.4886 | 0.1448 |
| 0.2076 | 71.0 | 12709 | 1.5132 | 0.4873 | 0.1442 |
| 0.2198 | 72.0 | 12888 | 1.5327 | 0.4815 | 0.1432 |
| 0.1974 | 73.0 | 13067 | 1.5127 | 0.4882 | 0.1431 |
| 0.1946 | 74.0 | 13246 | 1.5406 | 0.4846 | 0.1424 |
| 0.1972 | 75.0 | 13425 | 1.5506 | 0.4842 | 0.1422 |
| 0.1966 | 76.0 | 13604 | 1.5648 | 0.4827 | 0.1421 |
| 0.1947 | 77.0 | 13783 | 1.5309 | 0.4786 | 0.1414 |
| 0.1905 | 78.0 | 13962 | 1.5258 | 0.4745 | 0.1406 |
| 0.1907 | 79.0 | 14141 | 1.5577 | 0.4766 | 0.1399 |
| 0.1932 | 80.0 | 14320 | 1.5470 | 0.4766 | 0.1399 |
| 0.1818 | 81.0 | 14499 | 1.5697 | 0.4746 | 0.1398 |
| 0.1767 | 82.0 | 14678 | 1.5587 | 0.4735 | 0.1392 |
| 0.1775 | 83.0 | 14857 | 1.5325 | 0.4699 | 0.1387 |
| 0.1725 | 84.0 | 15036 | 1.5820 | 0.4703 | 0.1386 |
| 0.1625 | 85.0 | 15215 | 1.6143 | 0.4723 | 0.1391 |
| 0.1715 | 86.0 | 15394 | 1.5605 | 0.4693 | 0.1383 |
| 0.1627 | 87.0 | 15573 | 1.6053 | 0.4702 | 0.1387 |
| 0.1657 | 88.0 | 15752 | 1.5764 | 0.4723 | 0.1388 |
| 0.166 | 89.0 | 15931 | 1.5878 | 0.4711 | 0.1383 |
| 0.1582 | 90.0 | 16110 | 1.6015 | 0.4694 | 0.1377 |
| 0.1494 | 91.0 | 16289 | 1.5885 | 0.4679 | 0.1377 |
| 0.1508 | 92.0 | 16468 | 1.5958 | 0.4678 | 0.1375 |
| 0.1571 | 93.0 | 16647 | 1.5899 | 0.4667 | 0.1373 |
| 0.1555 | 94.0 | 16826 | 1.6068 | 0.4658 | 0.1371 |
| 0.1518 | 95.0 | 17005 | 1.6047 | 0.4648 | 0.1368 |
| 0.1465 | 96.0 | 17184 | 1.6081 | 0.4644 | 0.1366 |
| 0.1463 | 97.0 | 17363 | 1.6151 | 0.4634 | 0.1365 |
| 0.1452 | 98.0 | 17542 | 1.6170 | 0.4634 | 0.1366 |
| 0.1491 | 99.0 | 17721 | 1.6100 | 0.4640 | 0.1365 |
| 0.1442 | 100.0 | 17900 | 1.6073 | 0.4634 | 0.1366 |
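
The WER and CER columns above (word and character error rate) can be computed with the Hugging Face evaluate library. A minimal sketch, with illustrative Swahili strings rather than actual model output:

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Illustrative strings only, not real predictions from this model.
predictions = ["habari ya asubuhi"]
references = ["habari za asubuhi"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```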

Framework versions

  • Transformers 4.46.1
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.1
Model size: 315M parameters (Safetensors, F32)
