
whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-50hrs-v1

This model is a fine-tuned version of openai/whisper-small. It achieves the following results on the evaluation set:

  • Loss: 0.8648
  • WER: 0.2183
  • CER: 0.0778
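Both WER (word error rate) and CER (character error rate) are edit-distance ratios: substitutions, insertions, and deletions needed to turn the hypothesis into the reference, divided by the reference length, at the word or character level respectively. A minimal pure-Python sketch (illustrative helper names, not the evaluation code used for this card):

```python
def edit_distance(ref, hyp):
    # Classic Levenshtein distance via a single-row dynamic program.
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,      # deletion
                        dp[j - 1] + 1,  # insertion
                        prev + (ref[i - 1] != hyp[j - 1]))  # substitution
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    # Word error rate: edit distance over word sequences.
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    # Character error rate: edit distance over raw characters.
    return edit_distance(reference, hypothesis) / len(reference)
```

So a WER of 0.2183 means roughly one word-level edit for every 4.6 reference words.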

Model description

More information needed

Intended uses & limitations

More information needed
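The card gives no usage details; a minimal inference sketch, assuming the standard Transformers `pipeline` API (the audio file name is a placeholder):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-50hrs-v1",
    chunk_length_s=30,  # Whisper operates on 30-second windows
)

# "audio.wav" is a placeholder; pass any mono 16 kHz speech file.
result = asr("audio.wav")
print(result["text"])
```

Note that the repository is gated, so accessing the weights requires accepting the conditions on the Hub and authenticating (e.g. via `huggingface-cli login`).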

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_hf with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
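The effective batch size and warmup length follow from the values above; a quick sanity check (the 1433 optimizer-steps-per-epoch figure is read off the results table below):

```python
# Effective batch size: per-device batch times accumulation steps.
train_batch_size = 4
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 8, matching the card

# With ~1433 optimizer steps per epoch and num_epochs=100 planned,
# warmup_ratio=0.1 translates into a concrete warmup step count.
steps_per_epoch = 1433
num_epochs = 100
warmup_steps = int(0.1 * steps_per_epoch * num_epochs)
print(warmup_steps)  # 14330 steps, i.e. roughly the first 10 epochs
```

This explains why validation loss only starts climbing well after epoch 10: the learning rate is still warming up linearly until then.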

Training results

| Training Loss | Epoch | Step   | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
| 1.803         | 1.0   | 1433   | 0.8132          | 0.7769 | 0.3908 |
| 0.7362        | 2.0   | 2866   | 0.5782          | 0.6072 | 0.3526 |
| 0.5456        | 3.0   | 4299   | 0.4854          | 0.7275 | 0.4563 |
| 0.4305        | 4.0   | 5732   | 0.4406          | 0.5160 | 0.3061 |
| 0.334         | 5.0   | 7165   | 0.4257          | 0.6079 | 0.3707 |
| 0.2498        | 6.0   | 8598   | 0.4294          | 0.4371 | 0.2520 |
| 0.1795        | 7.0   | 10031  | 0.4439          | 0.4619 | 0.2753 |
| 0.1222        | 8.0   | 11464  | 0.4568          | 0.3858 | 0.2024 |
| 0.0833        | 9.0   | 12897  | 0.4943          | 0.3937 | 0.2128 |
| 0.0605        | 10.0  | 14330  | 0.5112          | 0.3280 | 0.1484 |
| 0.0447        | 11.0  | 15763  | 0.5211          | 0.3067 | 0.1349 |
| 0.0323        | 12.0  | 17196  | 0.5395          | 0.2942 | 0.1250 |
| 0.0259        | 13.0  | 18629  | 0.5620          | 0.2688 | 0.1060 |
| 0.0202        | 14.0  | 20062  | 0.5688          | 0.2684 | 0.1063 |
| 0.0166        | 15.0  | 21495  | 0.5958          | 0.2737 | 0.1070 |
| 0.0152        | 16.0  | 22928  | 0.6086          | 0.2635 | 0.0931 |
| 0.0127        | 17.0  | 24361  | 0.6199          | 0.2600 | 0.0942 |
| 0.0109        | 18.0  | 25794  | 0.6309          | 0.2609 | 0.0908 |
| 0.01          | 19.0  | 27227  | 0.6446          | 0.2591 | 0.0930 |
| 0.0092        | 20.0  | 28660  | 0.6449          | 0.2593 | 0.0907 |
| 0.008         | 21.0  | 30093  | 0.6688          | 0.2481 | 0.0831 |
| 0.0072        | 22.0  | 31526  | 0.6631          | 0.2436 | 0.0836 |
| 0.0066        | 23.0  | 32959  | 0.6677          | 0.2595 | 0.0925 |
| 0.0065        | 24.0  | 34392  | 0.6984          | 0.2492 | 0.0888 |
| 0.0056        | 25.0  | 35825  | 0.7061          | 0.2434 | 0.0832 |
| 0.005         | 26.0  | 37258  | 0.7019          | 0.2402 | 0.0826 |
| 0.0055        | 27.0  | 38691  | 0.7125          | 0.2402 | 0.0815 |
| 0.0046        | 28.0  | 40124  | 0.7261          | 0.2385 | 0.0816 |
| 0.0041        | 29.0  | 41557  | 0.7352          | 0.2377 | 0.0831 |
| 0.0043        | 30.0  | 42990  | 0.7332          | 0.2403 | 0.0826 |
| 0.0037        | 31.0  | 44423  | 0.7439          | 0.2412 | 0.0832 |
| 0.0035        | 32.0  | 45856  | 0.7460          | 0.2369 | 0.0824 |
| 0.0033        | 33.0  | 47289  | 0.7579          | 0.2349 | 0.0819 |
| 0.0029        | 34.0  | 48722  | 0.7684          | 0.2356 | 0.0818 |
| 0.0029        | 35.0  | 50155  | 0.7658          | 0.2356 | 0.0829 |
| 0.0032        | 36.0  | 51588  | 0.7754          | 0.2292 | 0.0823 |
| 0.0023        | 37.0  | 53021  | 0.7624          | 0.2275 | 0.0805 |
| 0.0027        | 38.0  | 54454  | 0.7865          | 0.2281 | 0.0816 |
| 0.0024        | 39.0  | 55887  | 0.7885          | 0.2290 | 0.0810 |
| 0.0024        | 40.0  | 57320  | 0.7819          | 0.2296 | 0.0802 |
| 0.0022        | 41.0  | 58753  | 0.7875          | 0.2263 | 0.0792 |
| 0.0021        | 42.0  | 60186  | 0.7929          | 0.2270 | 0.0802 |
| 0.0024        | 43.0  | 61619  | 0.7969          | 0.2324 | 0.0822 |
| 0.0018        | 44.0  | 63052  | 0.7989          | 0.2338 | 0.0858 |
| 0.0019        | 45.0  | 64485  | 0.8155          | 0.2318 | 0.0849 |
| 0.0016        | 46.0  | 65918  | 0.8111          | 0.2265 | 0.0800 |
| 0.0016        | 47.0  | 67351  | 0.7933          | 0.2223 | 0.0776 |
| 0.0016        | 48.0  | 68784  | 0.8118          | 0.2232 | 0.0785 |
| 0.0013        | 49.0  | 70217  | 0.8045          | 0.2199 | 0.0774 |
| 0.0015        | 50.0  | 71650  | 0.8136          | 0.2183 | 0.0756 |
| 0.0012        | 51.0  | 73083  | 0.8240          | 0.2188 | 0.0766 |
| 0.0014        | 52.0  | 74516  | 0.8266          | 0.2198 | 0.0781 |
| 0.001         | 53.0  | 75949  | 0.8334          | 0.2215 | 0.0780 |
| 0.0013        | 54.0  | 77382  | 0.8225          | 0.2213 | 0.0793 |
| 0.0011        | 55.0  | 78815  | 0.8296          | 0.2190 | 0.0783 |
| 0.0009        | 56.0  | 80248  | 0.8354          | 0.2174 | 0.0772 |
| 0.0008        | 57.0  | 81681  | 0.8091          | 0.2154 | 0.0781 |
| 0.0009        | 58.0  | 83114  | 0.8459          | 0.2148 | 0.0751 |
| 0.001         | 59.0  | 84547  | 0.8301          | 0.2156 | 0.0755 |
| 0.0008        | 60.0  | 85980  | 0.8385          | 0.2134 | 0.0753 |
| 0.0008        | 61.0  | 87413  | 0.8474          | 0.2180 | 0.0794 |
| 0.0008        | 62.0  | 88846  | 0.8553          | 0.2170 | 0.0788 |
| 0.0006        | 63.0  | 90279  | 0.8357          | 0.2154 | 0.0756 |
| 0.0007        | 64.0  | 91712  | 0.8495          | 0.2163 | 0.0772 |
| 0.0006        | 65.0  | 93145  | 0.8508          | 0.2182 | 0.0769 |
| 0.0006        | 66.0  | 94578  | 0.8625          | 0.2226 | 0.0779 |
| 0.0006        | 67.0  | 96011  | 0.8718          | 0.2150 | 0.0763 |
| 0.0004        | 68.0  | 97444  | 0.8600          | 0.2191 | 0.0790 |
| 0.0005        | 69.0  | 98877  | 0.8622          | 0.2181 | 0.0779 |
| 0.0005        | 70.0  | 100310 | 0.8617          | 0.2205 | 0.0771 |
| 0.0003        | 71.0  | 101743 | 0.8703          | 0.2177 | 0.0790 |
| 0.0002        | 72.0  | 103176 | 0.8648          | 0.2183 | 0.0778 |

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
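To reproduce the environment, the pinned versions above can be installed directly (using the CUDA 11.8 PyTorch wheel index is an assumption based on the `+cu118` build tag):

```shell
pip install "transformers==4.46.3" "datasets==3.1.0" "tokenizers==0.20.3"
pip install "torch==2.1.0" --index-url https://download.pytorch.org/whl/cu118
```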
Model size: 242M parameters (Safetensors, F32)
