wav2vec2-xls-r-300m-CV-Fleurs-lg-5hrs-v7

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. Per the model name, it was trained on roughly 5 hours of Luganda (lg) speech drawn from Common Voice and FLEURS. It achieves the following results on the evaluation set:

  • Loss: 0.7601
  • Wer: 0.4290
  • Cer: 0.0891
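WER and CER are edit-distance metrics computed at the word and character level, respectively: the Levenshtein distance between reference and hypothesis, divided by the reference length. A minimal sketch of how they are computed (not the exact evaluation script used for this card):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (one-row DP)."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            # deletion, insertion, substitution/match
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[len(hyp)]

def wer(ref, hyp):
    """Word error rate: word-level edit distance / reference word count."""
    return edit_distance(ref.split(), hyp.split()) / len(ref.split())

def cer(ref, hyp):
    """Character error rate: char-level edit distance / reference length."""
    return edit_distance(ref, hyp) / len(ref)
```

Production evaluations typically use a library such as `jiwer` or `evaluate`, which aggregate distances over the whole test set rather than per utterance.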

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 70
  • mixed_precision_training: Native AMP
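With a linear scheduler, the learning rate decays from 1e-4 to 0 over the 36,050 optimizer steps (70 epochs × 515 steps per epoch, matching the results table). A sketch of the schedule; the warmup step count is an assumption, since the card does not report one:

```python
LEARNING_RATE = 1e-4    # from the hyperparameters above
TOTAL_STEPS = 70 * 515  # 70 epochs x 515 steps/epoch = 36050

def linear_lr(step, warmup_steps=0):
    """Linear schedule: ramp up over warmup, then decay linearly to zero.

    warmup_steps=0 is an assumption; the card does not list a warmup value.
    """
    if step < warmup_steps:
        return LEARNING_RATE * step / max(1, warmup_steps)
    remaining = max(0, TOTAL_STEPS - step)
    return LEARNING_RATE * remaining / max(1, TOTAL_STEPS - warmup_steps)
```

This mirrors what `transformers.get_linear_schedule_with_warmup` produces when driven by the Trainer.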

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 3.767         | 1.0   | 515   | 2.9466          | 1.0    | 1.0    |
| 2.5111        | 2.0   | 1030  | 1.1124          | 0.9072 | 0.2504 |
| 1.0272        | 3.0   | 1545  | 0.6239          | 0.7472 | 0.1651 |
| 0.7225        | 4.0   | 2060  | 0.5120          | 0.6347 | 0.1391 |
| 0.5936        | 5.0   | 2575  | 0.4842          | 0.6171 | 0.1380 |
| 0.5104        | 6.0   | 3090  | 0.4519          | 0.6032 | 0.1387 |
| 0.4486        | 7.0   | 3605  | 0.4516          | 0.5645 | 0.1177 |
| 0.3944        | 8.0   | 4120  | 0.4659          | 0.5373 | 0.1146 |
| 0.3587        | 9.0   | 4635  | 0.4384          | 0.5231 | 0.1092 |
| 0.3259        | 10.0  | 5150  | 0.4401          | 0.5228 | 0.1155 |
| 0.2929        | 11.0  | 5665  | 0.4667          | 0.5164 | 0.1078 |
| 0.278         | 12.0  | 6180  | 0.4671          | 0.5099 | 0.1074 |
| 0.2597        | 13.0  | 6695  | 0.4737          | 0.5001 | 0.1059 |
| 0.2428        | 14.0  | 7210  | 0.4900          | 0.4987 | 0.1043 |
| 0.2277        | 15.0  | 7725  | 0.5479          | 0.5282 | 0.1077 |
| 0.2126        | 16.0  | 8240  | 0.5342          | 0.5019 | 0.1052 |
| 0.1955        | 17.0  | 8755  | 0.5155          | 0.4901 | 0.1014 |
| 0.1892        | 18.0  | 9270  | 0.5369          | 0.4889 | 0.1011 |
| 0.1733        | 19.0  | 9785  | 0.5470          | 0.4821 | 0.1016 |
| 0.1736        | 20.0  | 10300 | 0.5909          | 0.4833 | 0.1003 |
| 0.1594        | 21.0  | 10815 | 0.5675          | 0.4851 | 0.1017 |
| 0.1576        | 22.0  | 11330 | 0.6015          | 0.4846 | 0.1016 |
| 0.1511        | 23.0  | 11845 | 0.5394          | 0.4690 | 0.0983 |
| 0.1419        | 24.0  | 12360 | 0.6172          | 0.4848 | 0.0996 |
| 0.1359        | 25.0  | 12875 | 0.6323          | 0.4782 | 0.1005 |
| 0.1411        | 26.0  | 13390 | 0.5898          | 0.4739 | 0.0992 |
| 0.1279        | 27.0  | 13905 | 0.6021          | 0.4792 | 0.0994 |
| 0.1152        | 28.0  | 14420 | 0.6077          | 0.4713 | 0.0981 |
| 0.1187        | 29.0  | 14935 | 0.6468          | 0.4702 | 0.0987 |
| 0.109         | 30.0  | 15450 | 0.6424          | 0.4731 | 0.0981 |
| 0.1054        | 31.0  | 15965 | 0.6620          | 0.4729 | 0.0985 |
| 0.1105        | 32.0  | 16480 | 0.6593          | 0.4607 | 0.0972 |
| 0.1011        | 33.0  | 16995 | 0.6920          | 0.4724 | 0.0974 |
| 0.0972        | 34.0  | 17510 | 0.6946          | 0.4719 | 0.0975 |
| 0.1009        | 35.0  | 18025 | 0.6664          | 0.4652 | 0.0962 |
| 0.0925        | 36.0  | 18540 | 0.6852          | 0.4664 | 0.0966 |
| 0.0907        | 37.0  | 19055 | 0.6871          | 0.4619 | 0.0971 |
| 0.0931        | 38.0  | 19570 | 0.6975          | 0.4512 | 0.0944 |
| 0.0859        | 39.0  | 20085 | 0.6803          | 0.4528 | 0.0950 |
| 0.0844        | 40.0  | 20600 | 0.7284          | 0.4646 | 0.0966 |
| 0.0832        | 41.0  | 21115 | 0.7195          | 0.4502 | 0.0940 |
| 0.0752        | 42.0  | 21630 | 0.6899          | 0.4517 | 0.0944 |
| 0.0787        | 43.0  | 22145 | 0.7047          | 0.4494 | 0.0939 |
| 0.0762        | 44.0  | 22660 | 0.7115          | 0.4531 | 0.0940 |
| 0.0709        | 45.0  | 23175 | 0.7336          | 0.4522 | 0.0932 |
| 0.0697        | 46.0  | 23690 | 0.7023          | 0.4468 | 0.0921 |
| 0.0737        | 47.0  | 24205 | 0.7567          | 0.4470 | 0.0929 |
| 0.0684        | 48.0  | 24720 | 0.7335          | 0.4437 | 0.0912 |
| 0.0677        | 49.0  | 25235 | 0.7230          | 0.4442 | 0.0916 |
| 0.0704        | 50.0  | 25750 | 0.7172          | 0.4510 | 0.0928 |
| 0.0637        | 51.0  | 26265 | 0.7513          | 0.4450 | 0.0927 |
| 0.0632        | 52.0  | 26780 | 0.7554          | 0.4548 | 0.0938 |
| 0.0562        | 53.0  | 27295 | 0.7654          | 0.4508 | 0.0938 |
| 0.0576        | 54.0  | 27810 | 0.7444          | 0.4389 | 0.0914 |
| 0.055         | 55.0  | 28325 | 0.7285          | 0.4426 | 0.0915 |
| 0.0555        | 56.0  | 28840 | 0.7418          | 0.4405 | 0.0915 |
| 0.0633        | 57.0  | 29355 | 0.7509          | 0.4412 | 0.0918 |
| 0.0533        | 58.0  | 29870 | 0.7486          | 0.4392 | 0.0916 |
| 0.0554        | 59.0  | 30385 | 0.7415          | 0.4420 | 0.0918 |
| 0.0555        | 60.0  | 30900 | 0.7485          | 0.4413 | 0.0916 |
| 0.0524        | 61.0  | 31415 | 0.7463          | 0.4372 | 0.0906 |
| 0.0476        | 62.0  | 31930 | 0.7501          | 0.4337 | 0.0901 |
| 0.0503        | 63.0  | 32445 | 0.7474          | 0.4309 | 0.0898 |
| 0.0458        | 64.0  | 32960 | 0.7603          | 0.4302 | 0.0897 |
| 0.0495        | 65.0  | 33475 | 0.7558          | 0.4309 | 0.0894 |
| 0.0518        | 66.0  | 33990 | 0.7585          | 0.4282 | 0.0887 |
| 0.051         | 67.0  | 34505 | 0.7586          | 0.4297 | 0.0893 |
| 0.0483        | 68.0  | 35020 | 0.7586          | 0.4291 | 0.0893 |
| 0.0445        | 69.0  | 35535 | 0.7604          | 0.4297 | 0.0893 |
| 0.0437        | 70.0  | 36050 | 0.7601          | 0.4290 | 0.0891 |
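Note that the final checkpoint (epoch 70, WER 0.4290) is not the best one: epoch 66 reaches a lower WER of 0.4282 and CER of 0.0887. Selecting the best epoch from the validation curve is straightforward; the values below are copied from the closing rows of the table:

```python
# (epoch, validation WER, validation CER) for the last stretch of training,
# copied from the results table above
results = [
    (64, 0.4302, 0.0897),
    (65, 0.4309, 0.0894),
    (66, 0.4282, 0.0887),
    (67, 0.4297, 0.0893),
    (68, 0.4291, 0.0893),
    (69, 0.4297, 0.0893),
    (70, 0.4290, 0.0891),
]

# Pick the checkpoint with the lowest WER, breaking ties by CER
best_epoch, best_wer, best_cer = min(results, key=lambda r: (r[1], r[2]))
```

In a Trainer run this is usually handled automatically with `load_best_model_at_end=True` and `metric_for_best_model="wer"`.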

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.1
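wav2vec2-xls-r is a CTC model, so transcripts come from decoding frame-wise logits: collapse consecutive repeated predictions, then drop the blank token (in the Transformers implementation of Wav2Vec2, the pad token doubles as the CTC blank). A minimal greedy-decoding sketch with hypothetical frame predictions, not this model's actual tokenizer output:

```python
import itertools

BLANK = "<pad>"  # Wav2Vec2 uses the pad token as the CTC blank

def ctc_greedy_decode(frame_tokens):
    """Collapse repeated frame-level predictions, then remove blanks."""
    collapsed = [tok for tok, _group in itertools.groupby(frame_tokens)]
    return "".join(tok for tok in collapsed if tok != BLANK)
```

In practice, `Wav2Vec2Processor.batch_decode` performs this step after an `argmax` over the logits.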

Model repository: asr-africa/wav2vec2-xls-r-300m-CV-Fleurs-lg-5hrs-v7 (fine-tuned from facebook/wav2vec2-xls-r-300m)