Whisper Small Basque

This model is a fine-tuned version of openai/whisper-small on the Basque (eu) subset of the mozilla-foundation/common_voice_16_1 dataset. It achieves the following results on the evaluation set (a minimal inference example follows the list):

  • Loss: 0.3785
  • WER: 12.7374
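
Since the card itself ships no usage code, here is a minimal transcription sketch. It assumes the standard transformers automatic-speech-recognition pipeline; "audio.wav" is a hypothetical input path.

```python
# Minimal inference sketch (assumption: standard transformers ASR pipeline;
# "audio.wav" is a hypothetical input file).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="zuazo/whisper-small-eu-cv16_1",
)

# Whisper expects 16 kHz input; the pipeline resamples file inputs automatically.
result = asr(
    "audio.wav",
    generate_kwargs={"language": "basque", "task": "transcribe"},
)
print(result["text"])
```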

Model description

A Whisper Small checkpoint (242M parameters) fine-tuned for Basque automatic speech recognition. Apart from the fine-tuned weights, the model follows the base openai/whisper-small architecture.

Intended uses & limitations

Intended for transcribing Basque (eu) speech. As with other Whisper fine-tunes, expect lower accuracy on noisy audio and on domains or accents underrepresented in Common Voice, and note that Whisper models can hallucinate text on silence or non-speech audio.

Training and evaluation data

The model was fine-tuned and evaluated on the Basque (eu) configuration of mozilla-foundation/common_voice_16_1 (Common Voice 16.1); a loading sketch follows.
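
As a sketch of what loading that data looks like with the datasets library (an assumption, since the exact preprocessing is not documented here): Common Voice is gated, so you must accept its terms and authenticate, and its 48 kHz clips need resampling to the 16 kHz rate Whisper's feature extractor expects.

```python
# Hedged data-loading sketch (assumptions: standard datasets API, gated-dataset
# access already granted, e.g. via `huggingface-cli login`).
from datasets import Audio, load_dataset

common_voice = load_dataset("mozilla-foundation/common_voice_16_1", "eu")

# Resample to the 16 kHz rate Whisper's feature extractor expects.
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))
print(common_voice)
```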

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Seq2SeqTrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 256 (train_batch_size × gradient_accumulation_steps = 64 × 4)
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 40000
  • mixed_precision_training: Native AMP
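
For readers reproducing the run, here is how these values would map onto transformers' Seq2SeqTrainingArguments. This is a sketch under the assumption of a run_speech_recognition_seq2seq.py-style script; output_dir is illustrative, and eval_steps=1000 is inferred from the results table below, not stated in the card.

```python
# Hedged mapping of the listed hyperparameters onto Seq2SeqTrainingArguments.
# output_dir is hypothetical; eval_steps=1000 is inferred from the table below.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the transformers defaults.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-eu-cv16_1",
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # 64 x 4 = 256 total train batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=40000,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=1000,
    predict_with_generate=True,     # decode with generate() so WER can be computed
)
```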

Training results

| Training Loss | Epoch  | Step  | Validation Loss | WER     |
|:-------------:|:------:|:-----:|:---------------:|:-------:|
| 0.0153        | 10.03  | 1000  | 0.2690          | 15.3119 |
| 0.0029        | 20.05  | 2000  | 0.3132          | 15.0334 |
| 0.0018        | 30.08  | 3000  | 0.3312          | 14.6113 |
| 0.0009        | 40.1   | 4000  | 0.3375          | 14.0916 |
| 0.0037        | 50.13  | 5000  | 0.3306          | 14.3241 |
| 0.0002        | 60.15  | 6000  | 0.3628          | 13.5464 |
| 0.0001        | 70.18  | 7000  | 0.3804          | 13.4985 |
| 0.0001        | 80.2   | 8000  | 0.3961          | 13.5298 |
| 0.0           | 90.23  | 9000  | 0.4117          | 13.5650 |
| 0.0           | 100.25 | 10000 | 0.4282          | 13.6246 |
| 0.0001        | 110.28 | 11000 | 0.3542          | 13.0061 |
| 0.0001        | 120.3  | 12000 | 0.3697          | 13.1282 |
| 0.0           | 130.33 | 13000 | 0.3874          | 12.9934 |
| 0.0           | 140.35 | 14000 | 0.4002          | 12.9582 |
| 0.0           | 150.38 | 15000 | 0.4120          | 12.9455 |
| 0.0           | 160.4  | 16000 | 0.4246          | 12.9631 |
| 0.0           | 170.43 | 17000 | 0.4369          | 13.0071 |
| 0.0           | 180.45 | 18000 | 0.4501          | 13.0364 |
| 0.0           | 190.48 | 19000 | 0.4638          | 13.0374 |
| 0.0           | 200.5  | 20000 | 0.4786          | 13.0891 |
| 0.0001        | 210.53 | 21000 | 0.3785          | 12.7374 |
| 0.0           | 220.55 | 22000 | 0.4097          | 12.8166 |
| 0.0           | 230.58 | 23000 | 0.4236          | 12.8175 |
| 0.0           | 240.6  | 24000 | 0.4340          | 12.8039 |
| 0.0           | 250.63 | 25000 | 0.4431          | 12.8156 |
| 0.0           | 260.65 | 26000 | 0.4517          | 12.8058 |
| 0.0           | 270.68 | 27000 | 0.4601          | 12.7921 |
| 0.0           | 280.7  | 28000 | 0.4689          | 12.8029 |
| 0.0           | 290.73 | 29000 | 0.4774          | 12.8039 |
| 0.0           | 300.75 | 30000 | 0.4863          | 12.7960 |
| 0.0           | 310.78 | 31000 | 0.4949          | 12.7912 |
| 0.0           | 320.8  | 32000 | 0.5037          | 12.8107 |
| 0.0           | 330.83 | 33000 | 0.5115          | 12.8087 |
| 0.0           | 340.85 | 34000 | 0.5191          | 12.8293 |
| 0.0           | 350.88 | 35000 | 0.5256          | 12.8918 |
| 0.0           | 360.9  | 36000 | 0.5313          | 12.8810 |
| 0.0           | 370.93 | 37000 | 0.5361          | 12.9045 |
| 0.0           | 380.95 | 38000 | 0.5394          | 12.8996 |
| 0.0           | 390.98 | 39000 | 0.5417          | 12.9123 |
| 0.0           | 401.0  | 40000 | 0.5425          | 12.9123 |
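
The headline results at the top of this card correspond to the best checkpoint at step 21000 (validation loss 0.3785, WER 12.7374). The WER column is a percentage; as a reference point, here is a minimal sketch of the metric with the evaluate library. The two transcripts are hypothetical, and the card does not document the exact text normalization used during evaluation.

```python
# Minimal WER sketch with the evaluate library (hypothetical transcripts;
# the card's exact normalization pipeline is not documented).
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["kaixo mundua"]  # hypothetical model output
references = ["kaixo mundu"]    # hypothetical reference transcript
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}%")
```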

Framework versions

  • Transformers 4.37.2
  • PyTorch 2.2.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.1