Whisper medium to Kaggle Albanian

This model is a fine-tuned version of openai/whisper-medium on the default configuration of the rishabhjain16/kaggle_albanian dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3393
  • Wer: 26.91
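The Wer figure above is the word error rate in percent: the number of word-level substitutions, insertions, and deletions needed to turn the hypothesis into the reference, divided by the reference word count. A minimal self-contained sketch of the computation (an illustration only, not the evaluation code used for this card):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent, via word-level Levenshtein distance."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return 100.0 * dp[-1][-1] / len(ref)
```

For example, `wer("a b c d", "a x c")` is 50.0: one substitution plus one deletion against four reference words.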

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 5500
  • mixed_precision_training: Native AMP
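Given the linear scheduler, 500 warmup steps, and 5500 total steps listed above, the learning rate at any step can be sketched as follows (a plain-Python illustration of the linear-warmup-then-linear-decay behaviour, mirroring what transformers' linear scheduler does; not the training code itself):

```python
# Hyperparameters copied from the list above.
BASE_LR = 1e-5
WARMUP_STEPS = 500
TOTAL_STEPS = 5500

def lr_at(step: int) -> float:
    """Learning rate under a linear schedule with linear warmup."""
    if step < WARMUP_STEPS:
        # Ramp linearly from 0 up to BASE_LR over the warmup steps.
        return BASE_LR * step / WARMUP_STEPS
    # Then decay linearly from BASE_LR down to 0 at TOTAL_STEPS.
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))
```

So the peak rate of 1e-05 is reached at step 500 and falls to half that value by step 3000, the midpoint of the decay phase.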

Training results

Training Loss    Step    Validation Loss    Wer
0.4716            500    0.5687             45.1093
0.3883           1000    0.4522             36.6402
0.3695           1500    0.4099             33.9459
0.3375           2000    0.3783             31.4906
0.2146           2500    0.3662             29.5184
0.2010           3000    0.3638             29.7474
0.1799           3500    0.3519             28.6020
0.1791           4000    0.3455             28.3978
0.0621           4500    0.3592             27.1452
0.0679           5000    0.3561             27.1079
0.0687           5500    0.3560             26.9186
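As a quick sanity check on the table above, the overall relative WER reduction from the first evaluation (step 500) to the last (step 5500) can be computed directly; the values below are copied from the table:

```python
# WER at each evaluation step, copied from the training-results table.
wers = [45.1093, 36.6402, 33.9459, 31.4906, 29.5184, 29.7474,
        28.6020, 28.3978, 27.1452, 27.1079, 26.9186]

# Relative reduction from the first to the last checkpoint (~40%).
relative_reduction = (wers[0] - wers[-1]) / wers[0]
```

The final checkpoint is also the best one, so training was not stopped past the optimum.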

Framework versions

  • Transformers 4.37.2
  • Pytorch 1.14.0a0+44dac51
  • Datasets 2.17.1
  • Tokenizers 0.15.2
Model size

  • 764M params (Safetensors, F32)

Model tree for rishabhjain16/whisper-medium_to_kaggle_albanian

  • Finetuned from openai/whisper-medium