
wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-50hrs-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. The training dataset is not documented in this card; the model name suggests a 50-hour Swahili mix of Common Voice, FLEURS, AMMI, and ALFFA. It achieves the following results on the evaluation set:

  • Loss: 0.8823
  • WER: 0.1833
  • CER: 0.0585
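WER (word error rate) and CER (character error rate) are both the Levenshtein edit distance between hypothesis and reference, normalized by the reference length, computed over words and characters respectively. A minimal plain-Python sketch (not the exact evaluation script used for this card):

```python
def edit_distance(ref, hyp):
    # Classic single-row dynamic-programming Levenshtein distance.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            # prev holds d[i-1][j-1]; d[j] still holds d[i-1][j]
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[-1]

def wer(ref, hyp):
    # Word error rate: edit distance over word tokens / number of reference words.
    r, h = ref.split(), hyp.split()
    return edit_distance(r, h) / len(r)

def cer(ref, hyp):
    # Character error rate: edit distance over characters / reference length.
    return edit_distance(ref, hyp) / len(ref)
```

So a WER of 0.1833 means roughly one word in five is substituted, inserted, or deleted relative to the reference transcript.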

Model description

More information needed

Intended uses & limitations

More information needed
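Pending a fuller description, a usage sketch following the standard transformers CTC inference pattern (the repo id is taken from this card's model tree; audio is assumed to be a 16 kHz mono waveform):

```python
def transcribe(audio, model_id="asr-africa/wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-50hrs-v1"):
    """Greedy CTC decoding of a 16 kHz mono waveform (1-D float array)."""
    import torch
    from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)
    inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(ids)[0]
```

Greedy argmax decoding is the simplest option; a language-model-assisted beam search would typically lower WER further.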

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
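Two of the values above are derived rather than set directly: the total train batch size is train_batch_size × gradient_accumulation_steps, and the linear scheduler decays the learning rate from its initial value toward zero over training. A sketch of both (assuming zero warmup steps, since none are listed):

```python
LEARNING_RATE = 3e-4
TRAIN_BATCH_SIZE = 8
GRAD_ACCUM_STEPS = 2

# Effective batch size seen by each optimizer update.
total_train_batch_size = TRAIN_BATCH_SIZE * GRAD_ACCUM_STEPS  # 16

def linear_lr(step, total_steps, base_lr=LEARNING_RATE):
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

Note that Hugging Face's linear scheduler supports a warmup phase; this sketch assumes it was left at zero.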

Training results

Training Loss Epoch Step Validation Loss WER CER
3.7224 1.0 1925 0.9619 0.7892 0.2446
1.7594 2.0 3850 0.6918 0.5563 0.1559
1.3188 3.0 5775 0.5537 0.4605 0.1327
1.0808 4.0 7700 0.4913 0.3844 0.1096
0.9384 5.0 9625 0.4543 0.3450 0.0994
0.8278 6.0 11550 0.4557 0.3280 0.0939
0.7492 7.0 13475 0.4450 0.3099 0.0901
0.691 8.0 15400 0.4290 0.2968 0.0880
0.637 9.0 17325 0.4323 0.2924 0.0879
0.5923 10.0 19250 0.4212 0.2762 0.0832
0.5559 11.0 21175 0.4269 0.2708 0.0803
0.5227 12.0 23100 0.4129 0.2683 0.0814
0.4909 13.0 25025 0.4288 0.2581 0.0780
0.4659 14.0 26950 0.4275 0.2666 0.0831
0.4435 15.0 28875 0.4050 0.2569 0.0758
0.4222 16.0 30800 0.4199 0.2522 0.0791
0.4072 17.0 32725 0.4439 0.2494 0.0751
0.3862 18.0 34650 0.4807 0.2466 0.0737
0.3701 19.0 36575 0.4573 0.2440 0.0729
0.3503 20.0 38500 0.4852 0.2400 0.0725
0.3388 21.0 40425 0.4395 0.2411 0.0735
0.3288 22.0 42350 0.4487 0.2350 0.0716
0.3123 23.0 44275 0.4713 0.2314 0.0702
0.3012 24.0 46200 0.4772 0.2326 0.0707
0.2896 25.0 48125 0.5196 0.2339 0.0720
0.2905 26.0 50050 0.5252 0.2335 0.0711
0.2781 27.0 51975 0.4986 0.2324 0.0726
0.2684 28.0 53900 0.5678 0.2282 0.0700
0.2597 29.0 55825 0.5315 0.2308 0.0702
0.2534 30.0 57750 0.5705 0.2293 0.0695
0.2459 31.0 59675 0.5227 0.2280 0.0699
0.2377 32.0 61600 0.5707 0.2247 0.0686
0.2339 33.0 63525 0.5204 0.2272 0.0698
0.2305 34.0 65450 0.5717 0.2239 0.0697
0.2193 35.0 67375 0.5546 0.2272 0.0701
0.2186 36.0 69300 0.5419 0.2228 0.0692
0.2116 37.0 71225 0.5994 0.2242 0.0698
0.2054 38.0 73150 0.5910 0.2191 0.0676
0.1966 39.0 75075 0.5838 0.2169 0.0676
0.1973 40.0 77000 0.5837 0.2257 0.0684
0.1964 41.0 78925 0.5976 0.2201 0.0671
0.1868 42.0 80850 0.5925 0.2130 0.0656
0.1829 43.0 82775 0.6467 0.2147 0.0665
0.1786 44.0 84700 0.5984 0.2122 0.0665
0.1738 45.0 86625 0.6250 0.2101 0.0655
0.1723 46.0 88550 0.6194 0.2138 0.0660
0.1674 47.0 90475 0.6554 0.2148 0.0668
0.1642 48.0 92400 0.6235 0.2147 0.0663
0.1611 49.0 94325 0.6161 0.2097 0.0648
0.1542 50.0 96250 0.6382 0.2077 0.0651
0.1578 51.0 98175 0.6596 0.2087 0.0653
0.1501 52.0 100100 0.6419 0.2076 0.0648
0.1484 53.0 102025 0.6912 0.2054 0.0647
0.1452 54.0 103950 0.6458 0.2073 0.0647
0.1438 55.0 105875 0.6216 0.2075 0.0647
0.1403 56.0 107800 0.6800 0.2008 0.0636
0.1356 57.0 109725 0.7435 0.2063 0.0650
0.1343 58.0 111650 0.6753 0.2015 0.0640
0.1303 59.0 113575 0.7011 0.2065 0.0646
0.1301 60.0 115500 0.7136 0.2010 0.0631
0.1263 61.0 117425 0.7077 0.2047 0.0640
0.1217 62.0 119350 0.7678 0.2020 0.0628
0.1197 63.0 121275 0.7193 0.2002 0.0631
0.1174 64.0 123200 0.7187 0.1964 0.0617
0.1157 65.0 125125 0.7753 0.2001 0.0628
0.1132 66.0 127050 0.6996 0.1996 0.0632
0.1118 67.0 128975 0.7671 0.2004 0.0629
0.1093 68.0 130900 0.7801 0.1982 0.0622
0.107 69.0 132825 0.7609 0.1975 0.0622
0.1032 70.0 134750 0.7991 0.1977 0.0620
0.1016 71.0 136675 0.7903 0.1968 0.0621
0.0994 72.0 138600 0.8030 0.1947 0.0613
0.0979 73.0 140525 0.7994 0.1945 0.0612
0.0955 74.0 142450 0.7784 0.1931 0.0612
0.0941 75.0 144375 0.7525 0.1937 0.0615
0.0898 76.0 146300 0.7595 0.1935 0.0616
0.0893 77.0 148225 0.7978 0.1906 0.0605
0.0863 78.0 150150 0.8105 0.1901 0.0600
0.0906 79.0 152075 0.7857 0.1908 0.0608
0.0851 80.0 154000 0.7962 0.1892 0.0605
0.0844 81.0 155925 0.7940 0.1887 0.0603
0.0788 82.0 157850 0.8440 0.1880 0.0600
0.0791 83.0 159775 0.8272 0.1867 0.0596
0.0774 84.0 161700 0.8184 0.1865 0.0595
0.0758 85.0 163625 0.8066 0.1849 0.0591
0.0752 86.0 165550 0.8169 0.1859 0.0598
0.0724 87.0 167475 0.8312 0.1843 0.0591
0.0701 88.0 169400 0.8580 0.1843 0.0591
0.0698 89.0 171325 0.8579 0.1849 0.0593
0.0686 90.0 173250 0.8696 0.1843 0.0588
0.0667 91.0 175175 0.8539 0.1846 0.0591
0.0664 92.0 177100 0.8340 0.1842 0.0588
0.0654 93.0 179025 0.8690 0.1841 0.0585
0.0628 94.0 180950 0.8788 0.1848 0.0588
0.0622 95.0 182875 0.8823 0.1833 0.0585

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.1
Model size: 315M params (F32 tensors, Safetensors format)
