---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2_XLSR_darija_maroc
    results: []
---

# wav2vec2_XLSR_darija_maroc

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset (presumably Moroccan Darija speech, given the model name). It achieves the following results on the evaluation set:

- Loss: 0.2860
- Wer (word error rate): 0.3290
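
The checkpoint is a CTC model, so a first-pass transcription only needs greedy decoding of the argmax token ids. The sketch below assumes the repository id `khaoulaoub/wav2vec2_XLSR_darija_maroc` (inferred from the model name, not confirmed by this card) and uses a placeholder audio file.

```python
# Minimal inference sketch (assumed repo id, placeholder audio path).
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "khaoulaoub/wav2vec2_XLSR_darija_maroc"  # assumption: actual repo id may differ
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-xls-r models expect 16 kHz mono audio
speech, _ = librosa.load("example.wav", sr=16_000)  # placeholder file

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary, then collapse via the tokenizer
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```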

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
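
The original training script is not included in the card; the sketch below is just one way to express the same configuration with `transformers.TrainingArguments`. The `output_dir`, the 400-step eval/save cadence (matching the results table below), and `fp16=True` are assumptions; the Adam betas and epsilon above are the Trainer defaults.

```python
# Sketch of a TrainingArguments setup matching the hyperparameters above.
# Not the author's original script; eval/save cadence and fp16 are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2_XLSR_darija_maroc",   # assumed output directory
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,             # 16 x 2 = effective batch size of 32
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    # Adam settings listed above are the Trainer defaults, written out for clarity:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",               # the results table logs every 400 steps
    eval_steps=400,
    logging_steps=400,
    save_steps=400,
    fp16=True,                                 # assumption: typical for wav2vec2 fine-tuning on GPU
)
```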

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.9354        | 0.83  | 400   | 2.0492          | 1.0371 |
| 0.8236        | 1.66  | 800   | 0.4434          | 0.5832 |
| 0.4821        | 2.49  | 1200  | 0.3597          | 0.5114 |
| 0.3823        | 3.32  | 1600  | 0.3265          | 0.4758 |
| 0.3231        | 4.15  | 2000  | 0.3149          | 0.4526 |
| 0.2854        | 4.97  | 2400  | 0.2797          | 0.4237 |
| 0.2529        | 5.8   | 2800  | 0.3027          | 0.4415 |
| 0.2493        | 6.63  | 3200  | 0.2926          | 0.4264 |
| 0.2138        | 7.46  | 3600  | 0.2857          | 0.4169 |
| 0.2067        | 8.29  | 4000  | 0.2743          | 0.4099 |
| 0.1898        | 9.12  | 4400  | 0.2798          | 0.3993 |
| 0.1755        | 9.95  | 4800  | 0.2800          | 0.3913 |
| 0.1603        | 10.78 | 5200  | 0.2709          | 0.3860 |
| 0.1608        | 11.61 | 5600  | 0.2716          | 0.3872 |
| 0.1462        | 12.44 | 6000  | 0.2697          | 0.3825 |
| 0.137         | 13.26 | 6400  | 0.2855          | 0.3819 |
| 0.1326        | 14.09 | 6800  | 0.2860          | 0.3733 |
| 0.123         | 14.92 | 7200  | 0.2677          | 0.3813 |
| 0.1168        | 15.75 | 7600  | 0.2780          | 0.3740 |
| 0.1113        | 16.58 | 8000  | 0.2926          | 0.3719 |
| 0.1057        | 17.41 | 8400  | 0.2927          | 0.3704 |
| 0.0996        | 18.24 | 8800  | 0.2825          | 0.3602 |
| 0.0967        | 19.07 | 9200  | 0.2983          | 0.3641 |
| 0.0925        | 19.9  | 9600  | 0.2843          | 0.3576 |
| 0.0894        | 20.73 | 10000 | 0.2726          | 0.3668 |
| 0.0836        | 21.55 | 10400 | 0.2829          | 0.3560 |
| 0.0789        | 22.38 | 10800 | 0.2806          | 0.3508 |
| 0.0778        | 23.21 | 11200 | 0.2849          | 0.3540 |
| 0.0742        | 24.04 | 11600 | 0.2770          | 0.3436 |
| 0.0679        | 24.87 | 12000 | 0.2850          | 0.3425 |
| 0.063         | 25.7  | 12400 | 0.2846          | 0.3366 |
| 0.0593        | 26.53 | 12800 | 0.2811          | 0.3351 |
| 0.0586        | 27.36 | 13200 | 0.2863          | 0.3322 |
| 0.0555        | 28.19 | 13600 | 0.2819          | 0.3311 |
| 0.053         | 29.02 | 14000 | 0.2874          | 0.3301 |
| 0.0498        | 29.84 | 14400 | 0.2860          | 0.3290 |
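
The Wer column above is the word error rate on the validation set. It can be recomputed from reference/hypothesis pairs with the `evaluate` library, as in the sketch below; `evaluate` is not listed in the framework versions, so treat it as an assumed extra dependency, and the transcripts shown are placeholders, not data from the actual evaluation set.

```python
# WER computation sketch with placeholder transcripts.
import evaluate

wer_metric = evaluate.load("wer")
references = ["a reference transcript", "another reference transcript"]
predictions = ["a predicted transcript", "another predicted transcript"]
print(wer_metric.compute(predictions=predictions, references=references))
```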

### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1