xlsr-wav2vec2-3

This model is a fine-tuned version of facebook/wav2vec2-large-xlsr-53; the fine-tuning dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.4201
  • WER: 0.3998
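
Since the base checkpoint is a speech encoder fine-tuned with a CTC head, the model can be loaded with the standard Wav2Vec2 classes from transformers. A minimal inference sketch, assuming the checkpoint is published on the Hub under the hypothetical id `xlsr-wav2vec2-3` and that `sample.wav` is a local audio file:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical repo id; replace with the actual Hub path of this checkpoint.
model_id = "xlsr-wav2vec2-3"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file and resample to the 16 kHz rate XLSR-53 expects.
waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(),
                   sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```

Greedy argmax decoding is the simplest option; the card does not say whether the WER above was measured with greedy decoding or a language-model-assisted decoder.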

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a sketch of the equivalent Trainer configuration follows the list:

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 800
  • num_epochs: 30
  • mixed_precision_training: Native AMP
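
The values above map directly onto `transformers.TrainingArguments`. A minimal sketch of that mapping; the `output_dir` is a placeholder, and the model, dataset, and data-collator wiring are omitted because this card does not specify them:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlsr-wav2vec2-3",   # placeholder, not from the card
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=800,
    num_train_epochs=30,
    fp16=True,                      # "Native AMP" mixed-precision training
)
```

Note that the reported total_train_batch_size of 16 is not a separate setting: it is the per-device batch size of 8 multiplied by the 2 gradient-accumulation steps.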

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 5.0117        | 0.68  | 400   | 3.0284          | 0.9999 |
| 2.6502        | 1.35  | 800   | 1.0868          | 0.9374 |
| 0.9362        | 2.03  | 1200  | 0.5216          | 0.6491 |
| 0.6675        | 2.7   | 1600  | 0.4744          | 0.5837 |
| 0.5799        | 3.38  | 2000  | 0.4400          | 0.5802 |
| 0.5196        | 4.05  | 2400  | 0.4266          | 0.5314 |
| 0.4591        | 4.73  | 2800  | 0.3808          | 0.5190 |
| 0.4277        | 5.41  | 3200  | 0.3987          | 0.5036 |
| 0.4125        | 6.08  | 3600  | 0.3902          | 0.5040 |
| 0.3797        | 6.76  | 4000  | 0.4105          | 0.5025 |
| 0.3606        | 7.43  | 4400  | 0.3975          | 0.4823 |
| 0.3554        | 8.11  | 4800  | 0.3733          | 0.4747 |
| 0.3373        | 8.78  | 5200  | 0.3737          | 0.4726 |
| 0.3252        | 9.46  | 5600  | 0.3795          | 0.4736 |
| 0.3192        | 10.14 | 6000  | 0.3935          | 0.4736 |
| 0.3012        | 10.81 | 6400  | 0.3974          | 0.4648 |
| 0.2972        | 11.49 | 6800  | 0.4497          | 0.4724 |
| 0.2873        | 12.16 | 7200  | 0.4645          | 0.4843 |
| 0.2849        | 12.84 | 7600  | 0.4461          | 0.4709 |
| 0.274         | 13.51 | 8000  | 0.4002          | 0.4695 |
| 0.2709        | 14.19 | 8400  | 0.4188          | 0.4627 |
| 0.2619        | 14.86 | 8800  | 0.3987          | 0.4646 |
| 0.2545        | 15.54 | 9200  | 0.4083          | 0.4668 |
| 0.2477        | 16.22 | 9600  | 0.4525          | 0.4728 |
| 0.2455        | 16.89 | 10000 | 0.4148          | 0.4515 |
| 0.2281        | 17.57 | 10400 | 0.4304          | 0.4514 |
| 0.2267        | 18.24 | 10800 | 0.4077          | 0.4446 |
| 0.2136        | 18.92 | 11200 | 0.4209          | 0.4445 |
| 0.2032        | 19.59 | 11600 | 0.4543          | 0.4534 |
| 0.1999        | 20.27 | 12000 | 0.4184          | 0.4373 |
| 0.1898        | 20.95 | 12400 | 0.4044          | 0.4424 |
| 0.1846        | 21.62 | 12800 | 0.4098          | 0.4288 |
| 0.1796        | 22.3  | 13200 | 0.4047          | 0.4262 |
| 0.1715        | 22.97 | 13600 | 0.4077          | 0.4189 |
| 0.1641        | 23.65 | 14000 | 0.4162          | 0.4248 |
| 0.1615        | 24.32 | 14400 | 0.4392          | 0.4222 |
| 0.1575        | 25.0  | 14800 | 0.4296          | 0.4185 |
| 0.1456        | 25.68 | 15200 | 0.4363          | 0.4129 |
| 0.1461        | 26.35 | 15600 | 0.4305          | 0.4124 |
| 0.1422        | 27.03 | 16000 | 0.4237          | 0.4086 |
| 0.1378        | 27.7  | 16400 | 0.4294          | 0.4051 |
| 0.1326        | 28.38 | 16800 | 0.4311          | 0.4051 |
| 0.1286        | 29.05 | 17200 | 0.4153          | 0.3992 |
| 0.1283        | 29.73 | 17600 | 0.4201          | 0.3998 |
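
WER is the word error rate: substitutions, deletions, and insertions divided by the number of words in the reference transcript, so lower is better and values above 1.0 are possible. The card does not state which implementation produced the numbers above; a minimal sketch with the jiwer package, one common choice:

```python
from jiwer import wer

# Hypothetical reference/hypothesis pair to illustrate the metric.
reference = "the quick brown fox"
hypothesis = "the quick brown fox jumps"

# One insertion against four reference words -> 1 / 4 = 0.25
print(wer(reference, hypothesis))  # 0.25
```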

Framework versions

  • Transformers 4.19.2
  • PyTorch 1.11.0+cu113
  • Datasets 2.2.2
  • Tokenizers 0.12.1