Whisper Large V2

This model (golesheed/whisper-v2-Limburgian) is a fine-tuned version of openai/whisper-large-v2; the fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.4586
  • WER: 22.0846
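
A minimal inference sketch is shown below. It assumes the checkpoint is available under the repository id golesheed/whisper-v2-Limburgian and that transformers is installed with PyTorch (plus ffmpeg for audio decoding); the audio path is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an automatic-speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="golesheed/whisper-v2-Limburgian",
)

# "audio.wav" is a placeholder path to a local speech recording.
result = asr("audio.wav")
print(result["text"])
```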

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 12
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 20
  • num_epochs: 5
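
As a rough illustration, the settings above map onto the Trainer API as in the sketch below. The output directory is an assumption (it is not stated in the card), and train_batch_size is read as the per-device batch size reported by the auto-generated card; unlisted settings keep their defaults.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the hyperparameters listed above, expressed as training arguments.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-finetuned",  # assumed; not stated in the card
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
)
```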

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER     |
|---------------|--------|------|-----------------|---------|
| 0.7137        | 0.1852 | 30   | 0.4795          | 35.1951 |
| 0.4678        | 0.3704 | 60   | 0.4433          | 43.9372 |
| 0.4255        | 0.5556 | 90   | 0.4112          | 36.2069 |
| 0.4164        | 0.7407 | 120  | 0.3886          | 29.2672 |
| 0.4173        | 0.9259 | 150  | 0.3784          | 30.1063 |
| 0.2763        | 1.1111 | 180  | 0.3815          | 29.2618 |
| 0.2067        | 1.2963 | 210  | 0.3685          | 25.0445 |
| 0.2142        | 1.4815 | 240  | 0.3779          | 32.5320 |
| 0.2079        | 1.6667 | 270  | 0.3749          | 22.7619 |
| 0.1928        | 1.8519 | 300  | 0.3703          | 24.0840 |
| 0.1812        | 2.0370 | 330  | 0.3788          | 21.3183 |
| 0.0992        | 2.2222 | 360  | 0.3961          | 20.3659 |
| 0.1028        | 2.4074 | 390  | 0.3885          | 26.0914 |
| 0.0992        | 2.5926 | 420  | 0.3888          | 22.8401 |
| 0.0957        | 2.7778 | 450  | 0.3771          | 26.2965 |
| 0.1015        | 2.9630 | 480  | 0.3729          | 24.6641 |
| 0.0607        | 3.1481 | 510  | 0.4192          | 21.8472 |
| 0.05          | 3.3333 | 540  | 0.4098          | 20.6276 |
| 0.0463        | 3.5185 | 570  | 0.4043          | 23.7035 |
| 0.0404        | 3.7037 | 600  | 0.4159          | 21.9524 |
| 0.0359        | 3.8889 | 630  | 0.4163          | 23.7116 |
| 0.0303        | 4.0741 | 660  | 0.4252          | 22.9588 |
| 0.014         | 4.2593 | 690  | 0.4608          | 22.4030 |
| 0.0152        | 4.4444 | 720  | 0.4586          | 22.7106 |
| 0.0142        | 4.6296 | 750  | 0.4547          | 22.7430 |
| 0.0143        | 4.8148 | 780  | 0.4591          | 21.9875 |
| 0.0138        | 5.0    | 810  | 0.4586          | 22.0846 |
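
The WER column appears to be reported as a percentage (100 × word error rate). A minimal sketch of computing such a score with the evaluate library is shown below; the prediction and reference strings are placeholders, not data from this card.

```python
import evaluate

# Load the word-error-rate metric.
wer_metric = evaluate.load("wer")

predictions = ["the transcribed sentence"]  # placeholder model outputs
references = ["the reference sentence"]     # placeholder ground-truth transcripts

# evaluate returns WER as a fraction; scale by 100 to match the table above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```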

Framework versions

  • Transformers 4.44.0.dev0
  • Pytorch 2.1.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1