wav2vec2-E30_pause

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0562
  • CER: 22.1393

Model description

More information needed

Intended uses & limitations

More information needed
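The checkpoint can be loaded for CTC-based speech recognition in the usual way. Below is a minimal transcription sketch, assuming the repository ships a matching Wav2Vec2Processor and that input audio is resampled to the 16 kHz rate wav2vec2 expects; the file name sample.wav is a placeholder.

```python
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "Gummybear05/wav2vec2-E30_pause"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file and resample to 16 kHz (placeholder path).
waveform, sr = torchaudio.load("sample.wav")
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(
    waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt"
)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```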

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 3
  • mixed_precision_training: Native AMP
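These values map directly onto a Hugging Face TrainingArguments object. The sketch below is a hedged reconstruction, not the author's actual script: output_dir is a placeholder, the evaluation cadence is inferred from the results table (one eval every 200 steps), and the Adam betas/epsilon shown above are the TrainingArguments defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-E30_pause",   # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=3,
    fp16=True,                         # "Native AMP" mixed precision
    eval_strategy="steps",             # inferred from the results table
    eval_steps=200,                    # inferred from the results table
)
```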

Training results

| Training Loss | Epoch  | Step | Validation Loss | CER     |
|--------------:|-------:|-----:|----------------:|--------:|
| 30.6015       | 0.1289 |  200 | 5.2403          | 100.0   |
| 4.9516        | 0.2579 |  400 | 4.7381          | 100.0   |
| 4.8148        | 0.3868 |  600 | 4.6941          | 100.0   |
| 4.7459        | 0.5158 |  800 | 4.5804          | 100.0   |
| 4.7075        | 0.6447 | 1000 | 4.5734          | 100.0   |
| 4.6607        | 0.7737 | 1200 | 4.5106          | 100.0   |
| 4.3823        | 0.9026 | 1400 | 4.0717          | 96.7340 |
| 3.3809        | 1.0316 | 1600 | 3.0736          | 55.4746 |
| 2.7624        | 1.1605 | 1800 | 2.5641          | 45.3654 |
| 2.4           | 1.2895 | 2000 | 2.3126          | 43.4269 |
| 2.1103        | 1.4184 | 2200 | 2.0157          | 38.2343 |
| 1.9096        | 1.5474 | 2400 | 1.8777          | 35.6086 |
| 1.7178        | 1.6763 | 2600 | 1.7423          | 34.2164 |
| 1.5487        | 1.8053 | 2800 | 1.5431          | 30.9504 |
| 1.4451        | 1.9342 | 3000 | 1.4347          | 29.0002 |
| 1.3031        | 2.0632 | 3200 | 1.3301          | 26.7975 |
| 1.1895        | 2.1921 | 3400 | 1.2335          | 25.5110 |
| 1.1376        | 2.3211 | 3600 | 1.2340          | 25.0352 |
| 1.071         | 2.4500 | 3800 | 1.1303          | 23.8957 |
| 1.0488        | 2.5790 | 4000 | 1.1066          | 22.7855 |
| 1.0068        | 2.7079 | 4200 | 1.0825          | 22.4624 |
| 0.9812        | 2.8369 | 4400 | 1.0719          | 22.3449 |
| 0.9545        | 2.9658 | 4600 | 1.0562          | 22.1393 |
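The CER column starts at 100.0 before the model learns anything, so it appears to be reported as a percentage. A minimal sketch of computing CER the way Hugging Face training scripts commonly do, using the evaluate library; the reference/prediction strings are hypothetical:

```python
import evaluate

# "cer" is the character-error-rate metric in the evaluate library.
cer_metric = evaluate.load("cer")

# Hypothetical pair for illustration.
references = ["the pause between words matters"]
predictions = ["the pose between words maters"]

# compute() returns a fraction; multiply by 100 to match the
# percentage scale used in the table above.
cer = cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {100 * cer:.4f}%")
```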

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.1+cu121
  • Datasets 3.0.1
  • Tokenizers 0.19.1