wav2vec2-large-xls-r-300m-tsovatush-demo-colab-bryn-hauk-8-6

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m. The fine-tuning dataset is not named in this card. It achieves the following results on the evaluation set:

  • Loss: 1.4337
  • Wer: 0.5188
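Word error rate (WER) is the word-level edit distance between the reference transcript and the hypothesis, divided by the number of reference words. A minimal self-contained sketch (the transcripts are hypothetical examples, not from this model's evaluation data):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + insertions + deletions) / reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[-1][-1] / len(ref)

print(wer("the cat sat on the mat", "the cat sat on mat"))  # one deletion over six words
```

So the reported 0.5188 means roughly one word in two differs from the reference after alignment.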

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 50
  • mixed_precision_training: Native AMP
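The hyperparameters above can be expressed with the standard transformers `TrainingArguments`; this is a sketch, with `output_dir` and the evaluation/save cadence (every 400 steps, inferred from the results table below) as assumptions:

```python
from transformers import TrainingArguments

# Hyperparameters from the list above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-tsovatush-demo",
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50,
    fp16=True,                       # native AMP mixed-precision training
    evaluation_strategy="steps",     # assumption: eval every 400 steps, per the table
    eval_steps=400,
    save_steps=400,
)
```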

Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 5.3164        | 4.1667  | 400  | 3.2339          | None   |
| 2.1905        | 8.3333  | 800  | 1.1741          | 0.6596 |
| 1.2308        | 12.5    | 1200 | 1.1192          | 0.6275 |
| 0.8868        | 16.6667 | 1600 | 1.1354          | 0.5776 |
| 0.6593        | 20.8333 | 2000 | 1.2042          | 0.5831 |
| 0.466         | 25.0    | 2400 | 1.3671          | 0.5654 |
| 0.3319        | 29.1667 | 2800 | 1.3642          | 0.5388 |
| 0.2511        | 33.3333 | 3200 | 1.3517          | 0.5621 |
| 0.1895        | 37.5    | 3600 | 1.3898          | 0.5366 |
| 0.1413        | 41.6667 | 4000 | 1.4396          | 0.5277 |
| 0.1086        | 45.8333 | 4400 | 1.4229          | 0.5344 |
| 0.0897        | 50.0    | 4800 | 1.4337          | 0.5188 |

Framework versions

  • Transformers 4.42.4
  • Pytorch 2.3.1+cu121
  • Datasets 1.18.3
  • Tokenizers 0.19.1
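wav2vec2 models of this kind are fine-tuned with a CTC head: the network emits one label per audio frame, and decoding collapses consecutive repeats and drops the blank token. A minimal greedy decoder over hypothetical per-frame argmax ids (treating id 0 as the blank, which is an assumption):

```python
def ctc_greedy_decode(frame_ids, blank_id=0):
    """Collapse repeated frame labels and drop blanks, as in CTC greedy decoding."""
    out = []
    prev = None
    for i in frame_ids:
        if i != blank_id and i != prev:
            out.append(i)
        prev = i
    return out

# A blank between two identical labels separates them into two tokens.
print(ctc_greedy_decode([0, 7, 7, 0, 7, 3, 3, 0]))  # [7, 7, 3]
```

In practice the model's processor performs this step when converting logits to text.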
