
whisper-small-CV_Fleurs_AMMI_ALFFA-sw-20hrs-v1

This model is a fine-tuned version of openai/whisper-small on roughly 20 hours of Swahili speech drawn, per the model name, from Common Voice, FLEURS, AMMI, and ALFFA (the auto-generated card does not identify the dataset explicitly). It achieves the following results on the evaluation set:

  • Loss: 0.7663
  • Wer: 0.2445
  • Cer: 0.0922
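The WER and CER figures above are standard edit-distance metrics: word (or character) substitutions, insertions, and deletions divided by the reference length. A minimal stdlib sketch (function names are illustrative, not from this repository):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (single-row DP)."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            # deletion, insertion, substitution/match
            prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1, prev + (r != h))
    return d[-1]

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    return edit_distance(reference, hypothesis) / len(reference)
```

So a WER of 0.2445 means roughly one word error per four reference words; CER of 0.0922 is about one character error per eleven reference characters.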

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_hf with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
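A few of these values are derived rather than independent: the effective batch size is train_batch_size × gradient_accumulation_steps, and the linear scheduler warms up over the first 10% of planned steps before decaying to zero. A sketch of that arithmetic (pure Python, mirroring the behavior of a linear-warmup schedule; names are illustrative):

```python
# Effective batch size from the hyperparameters above.
train_batch_size = 8
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 16

def lr_at(step, total_steps, base_lr=5e-05, warmup_ratio=0.1):
    """Linear warmup for warmup_ratio of training, then linear decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

Note that the scheduler was sized for the full 100 epochs, while the results table stops at epoch 34, so the learning rate never fully decayed before training ended.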

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 3.059         | 0.9993  | 752   | 0.7311          | 0.5301 | 0.2009 |
| 1.0691        | 2.0     | 1505  | 0.5315          | 0.3752 | 0.1488 |
| 0.6341        | 2.9993  | 2257  | 0.4776          | 0.2844 | 0.0992 |
| 0.3826        | 4.0     | 3010  | 0.4720          | 0.2725 | 0.1037 |
| 0.2525        | 4.9993  | 3762  | 0.4979          | 0.2961 | 0.1204 |
| 0.1966        | 6.0     | 4515  | 0.5199          | 0.2599 | 0.0924 |
| 0.1748        | 6.9993  | 5267  | 0.5406          | 0.3148 | 0.1430 |
| 0.1642        | 8.0     | 6020  | 0.5674          | 0.2654 | 0.0958 |
| 0.1639        | 8.9993  | 6772  | 0.5802          | 0.2621 | 0.1186 |
| 0.1601        | 10.0    | 7525  | 0.6058          | 0.2652 | 0.1026 |
| 0.155         | 10.9993 | 8277  | 0.6102          | 0.2733 | 0.1036 |
| 0.1289        | 12.0    | 9030  | 0.6181          | 0.2613 | 0.0996 |
| 0.1105        | 12.9993 | 9782  | 0.6233          | 0.2615 | 0.0984 |
| 0.0981        | 14.0    | 10535 | 0.6377          | 0.2493 | 0.0931 |
| 0.089         | 14.9993 | 11287 | 0.6352          | 0.2612 | 0.1007 |
| 0.0761        | 16.0    | 12040 | 0.6494          | 0.2518 | 0.0972 |
| 0.0722        | 16.9993 | 12792 | 0.6665          | 0.2470 | 0.0917 |
| 0.063         | 18.0    | 13545 | 0.6687          | 0.2428 | 0.0898 |
| 0.0584        | 18.9993 | 14297 | 0.6715          | 0.2550 | 0.0984 |
| 0.0551        | 20.0    | 15050 | 0.6856          | 0.2506 | 0.0935 |
| 0.0503        | 20.9993 | 15802 | 0.6928          | 0.2480 | 0.0969 |
| 0.0464        | 22.0    | 16555 | 0.6887          | 0.2432 | 0.0913 |
| 0.0426        | 22.9993 | 17307 | 0.7118          | 0.2457 | 0.0925 |
| 0.0376        | 24.0    | 18060 | 0.7240          | 0.2357 | 0.0882 |
| 0.0376        | 24.9993 | 18812 | 0.7268          | 0.2458 | 0.0946 |
| 0.0339        | 26.0    | 19565 | 0.7335          | 0.2492 | 0.0931 |
| 0.0313        | 26.9993 | 20317 | 0.7185          | 0.2419 | 0.0908 |
| 0.0322        | 28.0    | 21070 | 0.7345          | 0.2396 | 0.0919 |
| 0.0313        | 28.9993 | 21822 | 0.7401          | 0.2432 | 0.0937 |
| 0.0268        | 30.0    | 22575 | 0.7576          | 0.2474 | 0.0946 |
| 0.0267        | 30.9993 | 23327 | 0.7653          | 0.2432 | 0.0938 |
| 0.025         | 32.0    | 24080 | 0.7593          | 0.2432 | 0.0940 |
| 0.0241        | 32.9993 | 24832 | 0.7670          | 0.2443 | 0.0930 |
| 0.0238        | 34.0    | 25585 | 0.7663          | 0.2445 | 0.0922 |
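Note that the reported (final) checkpoint is not the best one by WER: epoch 24 reached 0.2357 WER and 0.0882 CER with a lower validation loss than the final epoch. A quick way to pick the best row from such a log (rows below are a subset of the table above):

```python
# (epoch, validation_loss, wer, cer) — a few rows copied from the table
rows = [
    (4.0,  0.4720, 0.2725, 0.1037),
    (24.0, 0.7240, 0.2357, 0.0882),
    (34.0, 0.7663, 0.2445, 0.0922),
]

best_by_wer = min(rows, key=lambda r: r[2])   # epoch 24
best_by_loss = min(rows, key=lambda r: r[1])  # epoch 4
```

Which criterion to use depends on the downstream task; for ASR, selecting on WER rather than validation loss is common, since the loss keeps rising here while WER stays roughly flat.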

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.1