wav2vec2-timit-xls-r-300m-colab

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m; the training dataset is not specified in the card metadata (the model name suggests TIMIT). It achieves the following results on the evaluation set:

  • Loss: 0.3293
  • WER (word error rate): 0.2879
  • CER (character error rate): 0.0927
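
The card does not include a usage example. The snippet below is a minimal sketch of how a wav2vec2 CTC checkpoint like this one is typically loaded for transcription with Transformers; the audio file name is a placeholder, and the 16 kHz mono input requirement is assumed from the wav2vec2-xls-r-300m base model.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Sketch only: repo id taken from this card; replace "example.wav" with your own audio.
model_id = "scarlett623/wav2vec2-timit-xls-r-300m-colab"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# wav2vec2-xls-r models expect 16 kHz mono audio (assumption based on the base model).
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```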

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments appears after the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 30
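
For reference, here is a minimal sketch of how the hyperparameters above map onto Transformers TrainingArguments. The output directory and the assumption of a single GPU (so that per-device batch size equals the listed batch size) are not taken from the original training script.

```python
from transformers import TrainingArguments

# Sketch only: values mirror the hyperparameter list above; output_dir and the
# single-GPU assumption are illustrative, not from the original Colab script.
training_args = TrainingArguments(
    output_dir="wav2vec2-timit-xls-r-300m-colab",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
)
```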

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    | CER    |
|---------------|-------|-------|-----------------|--------|--------|
| No log        | 0.69  | 400   | 3.1501          | 1.0    | 0.9865 |
| 4.3829        | 1.38  | 800   | 2.9694          | 1.0    | 0.9865 |
| 2.6389        | 2.08  | 1200  | 0.6558          | 0.5876 | 0.1878 |
| 0.9293        | 2.77  | 1600  | 0.4232          | 0.4722 | 0.1462 |
| 0.5686        | 3.46  | 2000  | 0.3513          | 0.4118 | 0.1279 |
| 0.5686        | 4.15  | 2400  | 0.3246          | 0.3994 | 0.1227 |
| 0.4388        | 4.84  | 2800  | 0.3037          | 0.3716 | 0.1158 |
| 0.3431        | 5.54  | 3200  | 0.3055          | 0.3674 | 0.1158 |
| 0.3164        | 6.23  | 3600  | 0.2973          | 0.3589 | 0.1128 |
| 0.2678        | 6.92  | 4000  | 0.3053          | 0.3421 | 0.1080 |
| 0.2678        | 7.61  | 4400  | 0.3058          | 0.3435 | 0.1083 |
| 0.2376        | 8.3   | 4800  | 0.3144          | 0.3408 | 0.1094 |
| 0.2199        | 9.0   | 5200  | 0.3177          | 0.3371 | 0.1052 |
| 0.1988        | 9.69  | 5600  | 0.3123          | 0.3299 | 0.1057 |
| 0.1816        | 10.38 | 6000  | 0.2918          | 0.3282 | 0.1049 |
| 0.1816        | 11.07 | 6400  | 0.3195          | 0.3270 | 0.1049 |
| 0.1652        | 11.76 | 6800  | 0.3080          | 0.3280 | 0.1056 |
| 0.1576        | 12.46 | 7200  | 0.2859          | 0.3218 | 0.1031 |
| 0.1558        | 13.15 | 7600  | 0.3143          | 0.3179 | 0.1018 |
| 0.1411        | 13.84 | 8000  | 0.3354          | 0.3171 | 0.1045 |
| 0.1411        | 14.53 | 8400  | 0.3285          | 0.3149 | 0.1018 |
| 0.1381        | 15.22 | 8800  | 0.3048          | 0.3138 | 0.1010 |
| 0.1178        | 15.92 | 9200  | 0.3421          | 0.3140 | 0.1012 |
| 0.1182        | 16.61 | 9600  | 0.3258          | 0.3109 | 0.1001 |
| 0.1131        | 17.3  | 10000 | 0.3220          | 0.3120 | 0.1002 |
| 0.1131        | 17.99 | 10400 | 0.3156          | 0.3098 | 0.0991 |
| 0.1031        | 18.69 | 10800 | 0.3198          | 0.3062 | 0.0980 |
| 0.1023        | 19.38 | 11200 | 0.3227          | 0.3021 | 0.0972 |
| 0.0959        | 20.07 | 11600 | 0.3187          | 0.3025 | 0.0973 |
| 0.0881        | 20.76 | 12000 | 0.3177          | 0.3004 | 0.0965 |
| 0.0881        | 21.45 | 12400 | 0.3435          | 0.2976 | 0.0960 |
| 0.0919        | 22.15 | 12800 | 0.3142          | 0.2958 | 0.0954 |
| 0.0787        | 22.84 | 13200 | 0.3010          | 0.3000 | 0.0970 |
| 0.0794        | 23.53 | 13600 | 0.3528          | 0.3008 | 0.0973 |
| 0.0751        | 24.22 | 14000 | 0.3352          | 0.2954 | 0.0961 |
| 0.0751        | 24.91 | 14400 | 0.3314          | 0.2977 | 0.0963 |
| 0.0778        | 25.61 | 14800 | 0.3214          | 0.2955 | 0.0953 |
| 0.0711        | 26.3  | 15200 | 0.3277          | 0.2936 | 0.0944 |
| 0.0681        | 26.99 | 15600 | 0.3237          | 0.2915 | 0.0940 |
| 0.0682        | 27.68 | 16000 | 0.3284          | 0.2918 | 0.0943 |
| 0.0682        | 28.37 | 16400 | 0.3304          | 0.2904 | 0.0933 |
| 0.0731        | 29.07 | 16800 | 0.3307          | 0.2881 | 0.0927 |
| 0.0619        | 29.76 | 17200 | 0.3293          | 0.2879 | 0.0927 |
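
The WER and CER columns above are standard ASR edit-distance metrics. The snippet below is an illustrative sketch of how such scores are commonly computed with the Hugging Face evaluate library; the example sentences are placeholders, not data from this model's evaluation set.

```python
import evaluate

# Illustrative only: how validation WER/CER like those in the table are computed.
# The "cer" metric additionally requires the jiwer package.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["she had your dark suit in greasy wash water"]
references = ["she had your dark suit in greasy wash water all year"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```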

Framework versions

  • Transformers 4.32.0.dev0
  • Pytorch 2.0.1+cu118
  • Datasets 1.18.3
  • Tokenizers 0.13.3
