wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-20hrs-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on 20 hours of Swahili speech drawn from the Common Voice, FLEURS, AMMI, and ALFFA corpora, as the model name indicates. It achieves the following results on the evaluation set:

  • Loss: 1.1489
  • WER: 0.2863
  • CER: 0.0861
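
Since this is a standard Wav2Vec2 CTC fine-tune, it can be loaded with the usual Transformers classes. Below is a minimal inference sketch, not code published by the model authors: the `audio.wav` path is a placeholder, and the greedy CTC decoding shown is one common choice, assuming 16 kHz mono input.

```python
# Minimal transcription sketch (assumptions: 16 kHz mono audio, greedy CTC decoding).
import torch
import librosa  # used here only to load and resample the audio
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "asr-africa/wav2vec2-xls-r-300m-CV_Fleurs_AMMI_ALFFA-sw-20hrs-v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# "audio.wav" is a placeholder path; XLS-R expects 16 kHz input.
speech, _ = librosa.load("audio.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy decoding: take the most likely token per frame, then let the
# tokenizer collapse repeats and CTC blanks into text.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```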

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
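
The sketch below is an assumption pieced together from the list above, not the authors' published training script: `output_dir` is a placeholder, `fp16=True` stands in for "Native AMP", and the per-epoch evaluation strategy is inferred from the results table.

```python
# Hypothetical reconstruction of the training configuration from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-sw-20hrs",  # placeholder output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch",            # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed-precision training
    eval_strategy="epoch",          # assumption: the results table logs one eval per epoch
)
```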

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:---:|:---:|:---:|:---:|:---:|:---:|
| 5.8088 | 0.9993 | 752 | 1.5829 | 0.9976 | 0.5075 |
| 2.8319 | 2.0 | 1505 | 1.1351 | 0.8587 | 0.2839 |
| 2.2211 | 2.9993 | 2257 | 0.9053 | 0.7716 | 0.2313 |
| 1.8858 | 4.0 | 3010 | 0.8276 | 0.6929 | 0.2025 |
| 1.6621 | 4.9993 | 3762 | 0.7863 | 0.6353 | 0.1842 |
| 1.4776 | 6.0 | 4515 | 0.7335 | 0.5902 | 0.1687 |
| 1.33 | 6.9993 | 5267 | 0.7035 | 0.5585 | 0.1592 |
| 1.1967 | 8.0 | 6020 | 0.6806 | 0.5353 | 0.1533 |
| 1.0763 | 8.9993 | 6772 | 0.6649 | 0.5126 | 0.1482 |
| 0.9834 | 10.0 | 7525 | 0.6615 | 0.4852 | 0.1400 |
| 0.9072 | 10.9993 | 8277 | 0.6760 | 0.4776 | 0.1374 |
| 0.8453 | 12.0 | 9030 | 0.6869 | 0.4568 | 0.1311 |
| 0.7749 | 12.9993 | 9782 | 0.6667 | 0.4582 | 0.1339 |
| 0.7246 | 14.0 | 10535 | 0.6778 | 0.4384 | 0.1290 |
| 0.6844 | 14.9993 | 11287 | 0.6678 | 0.4239 | 0.1225 |
| 0.6404 | 16.0 | 12040 | 0.6619 | 0.4218 | 0.1229 |
| 0.5983 | 16.9993 | 12792 | 0.7178 | 0.4176 | 0.1223 |
| 0.5783 | 18.0 | 13545 | 0.7103 | 0.4068 | 0.1172 |
| 0.5414 | 18.9993 | 14297 | 0.7358 | 0.4018 | 0.1166 |
| 0.5223 | 20.0 | 15050 | 0.7178 | 0.4028 | 0.1148 |
| 0.5012 | 20.9993 | 15802 | 0.7341 | 0.3924 | 0.1153 |
| 0.4729 | 22.0 | 16555 | 0.7488 | 0.3965 | 0.1145 |
| 0.4672 | 22.9993 | 17307 | 0.7782 | 0.3830 | 0.1115 |
| 0.4397 | 24.0 | 18060 | 0.7940 | 0.3715 | 0.1078 |
| 0.4339 | 24.9993 | 18812 | 0.7789 | 0.3830 | 0.1108 |
| 0.4164 | 26.0 | 19565 | 0.7889 | 0.3904 | 0.1130 |
| 0.4085 | 26.9993 | 20317 | 0.7793 | 0.3794 | 0.1101 |
| 0.3883 | 28.0 | 21070 | 0.7890 | 0.3680 | 0.1073 |
| 0.383 | 28.9993 | 21822 | 0.8251 | 0.3691 | 0.1076 |
| 0.3676 | 30.0 | 22575 | 0.8064 | 0.3675 | 0.1075 |
| 0.3531 | 30.9993 | 23327 | 0.8620 | 0.3669 | 0.1084 |
| 0.3391 | 32.0 | 24080 | 0.8385 | 0.3597 | 0.1060 |
| 0.3358 | 32.9993 | 24832 | 0.8355 | 0.3609 | 0.1075 |
| 0.3284 | 34.0 | 25585 | 0.8700 | 0.3704 | 0.1089 |
| 0.3189 | 34.9993 | 26337 | 0.8719 | 0.3628 | 0.1080 |
| 0.3086 | 36.0 | 27090 | 0.8334 | 0.3629 | 0.1055 |
| 0.3084 | 36.9993 | 27842 | 0.8646 | 0.3496 | 0.1036 |
| 0.2911 | 38.0 | 28595 | 0.8664 | 0.3560 | 0.1044 |
| 0.2887 | 38.9993 | 29347 | 0.9033 | 0.3467 | 0.1019 |
| 0.279 | 40.0 | 30100 | 0.8944 | 0.3500 | 0.1031 |
| 0.277 | 40.9993 | 30852 | 0.8604 | 0.3503 | 0.1029 |
| 0.2643 | 42.0 | 31605 | 0.8859 | 0.3459 | 0.1023 |
| 0.2623 | 42.9993 | 32357 | 0.9263 | 0.3394 | 0.1003 |
| 0.2558 | 44.0 | 33110 | 0.9256 | 0.3387 | 0.1000 |
| 0.2527 | 44.9993 | 33862 | 0.9429 | 0.3370 | 0.1000 |
| 0.2439 | 46.0 | 34615 | 0.9764 | 0.3383 | 0.1001 |
| 0.2377 | 46.9993 | 35367 | 0.9389 | 0.3330 | 0.0983 |
| 0.2434 | 48.0 | 36120 | 0.9903 | 0.3325 | 0.0986 |
| 0.2271 | 48.9993 | 36872 | 0.9591 | 0.3305 | 0.0986 |
| 0.2219 | 50.0 | 37625 | 0.9128 | 0.3310 | 0.0981 |
| 0.2176 | 50.9993 | 38377 | 0.9322 | 0.3310 | 0.0978 |
| 0.2174 | 52.0 | 39130 | 0.9558 | 0.3282 | 0.0967 |
| 0.2205 | 52.9993 | 39882 | 0.9752 | 0.3281 | 0.0972 |
| 0.2011 | 54.0 | 40635 | 1.0161 | 0.3296 | 0.0974 |
| 0.2027 | 54.9993 | 41387 | 1.0044 | 0.3231 | 0.0957 |
| 0.2007 | 56.0 | 42140 | 0.9645 | 0.3203 | 0.0955 |
| 0.1958 | 56.9993 | 42892 | 1.0168 | 0.3202 | 0.0953 |
| 0.1908 | 58.0 | 43645 | 1.0060 | 0.3241 | 0.0960 |
| 0.1874 | 58.9993 | 44397 | 0.9889 | 0.3209 | 0.0953 |
| 0.1832 | 60.0 | 45150 | 0.9944 | 0.3218 | 0.0951 |
| 0.1835 | 60.9993 | 45902 | 1.0524 | 0.3195 | 0.0948 |
| 0.1821 | 62.0 | 46655 | 1.0135 | 0.3189 | 0.0946 |
| 0.1746 | 62.9993 | 47407 | 0.9728 | 0.3158 | 0.0931 |
| 0.1714 | 64.0 | 48160 | 1.0200 | 0.3128 | 0.0931 |
| 0.1677 | 64.9993 | 48912 | 0.9743 | 0.3123 | 0.0928 |
| 0.1654 | 66.0 | 49665 | 1.0243 | 0.3137 | 0.0931 |
| 0.1649 | 66.9993 | 50417 | 1.0631 | 0.3149 | 0.0939 |
| 0.1574 | 68.0 | 51170 | 1.0807 | 0.3139 | 0.0935 |
| 0.155 | 68.9993 | 51922 | 1.0730 | 0.3100 | 0.0919 |
| 0.1576 | 70.0 | 52675 | 1.0481 | 0.3111 | 0.0927 |
| 0.1529 | 70.9993 | 53427 | 1.0492 | 0.3092 | 0.0920 |
| 0.1518 | 72.0 | 54180 | 1.0321 | 0.3062 | 0.0916 |
| 0.1455 | 72.9993 | 54932 | 1.0690 | 0.3032 | 0.0914 |
| 0.1428 | 74.0 | 55685 | 1.0508 | 0.3003 | 0.0903 |
| 0.1399 | 74.9993 | 56437 | 1.0759 | 0.3050 | 0.0910 |
| 0.1384 | 76.0 | 57190 | 1.0915 | 0.2993 | 0.0906 |
| 0.1348 | 76.9993 | 57942 | 1.1089 | 0.3015 | 0.0903 |
| 0.1346 | 78.0 | 58695 | 1.1002 | 0.3024 | 0.0902 |
| 0.1302 | 78.9993 | 59447 | 1.0900 | 0.3004 | 0.0897 |
| 0.1294 | 80.0 | 60200 | 1.0783 | 0.2995 | 0.0891 |
| 0.1266 | 80.9993 | 60952 | 1.0696 | 0.2966 | 0.0884 |
| 0.121 | 82.0 | 61705 | 1.1130 | 0.2957 | 0.0881 |
| 0.1212 | 82.9993 | 62457 | 1.0869 | 0.2960 | 0.0888 |
| 0.1203 | 84.0 | 63210 | 1.1091 | 0.2921 | 0.0882 |
| 0.1173 | 84.9993 | 63962 | 1.1155 | 0.2943 | 0.0885 |
| 0.1148 | 86.0 | 64715 | 1.1247 | 0.2954 | 0.0884 |
| 0.1122 | 86.9993 | 65467 | 1.1156 | 0.2942 | 0.0879 |
| 0.1215 | 88.0 | 66220 | 1.1048 | 0.2948 | 0.0880 |
| 0.1143 | 88.9993 | 66972 | 1.1065 | 0.2946 | 0.0879 |
| 0.1114 | 90.0 | 67725 | 1.1122 | 0.2919 | 0.0875 |
| 0.1067 | 90.9993 | 68477 | 1.1245 | 0.2938 | 0.0878 |
| 0.1056 | 92.0 | 69230 | 1.1245 | 0.2918 | 0.0872 |
| 0.1084 | 92.9993 | 69982 | 1.1358 | 0.2899 | 0.0870 |
| 0.104 | 94.0 | 70735 | 1.1432 | 0.2882 | 0.0867 |
| 0.1019 | 94.9993 | 71487 | 1.1435 | 0.2877 | 0.0862 |
| 0.0993 | 96.0 | 72240 | 1.1500 | 0.2870 | 0.0860 |
| 0.1012 | 96.9993 | 72992 | 1.1342 | 0.2855 | 0.0859 |
| 0.1023 | 98.0 | 73745 | 1.1397 | 0.2855 | 0.0859 |
| 0.0962 | 98.9993 | 74497 | 1.1491 | 0.2855 | 0.0859 |
| 0.0954 | 99.9336 | 75200 | 1.1489 | 0.2863 | 0.0861 |
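
The WER and CER columns can be computed with the Hugging Face `evaluate` package (which wraps jiwer). The snippet below is a generic sketch with hypothetical Swahili strings, not the authors' evaluation code:

```python
# Generic WER/CER computation sketch; requires `pip install evaluate jiwer`.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["habari ya asubuhi"]  # hypothetical model transcript
references = ["habari za asubuhi"]   # hypothetical ground-truth transcript

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```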

Framework versions

  • Transformers 4.46.1
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.1