---
library_name: transformers
language:
  - sn
license: cc-by-nc-4.0
base_model: facebook/mms-300m
tags:
  - generated_from_trainer
datasets:
  - DigitalUmuganda_Afrivoice/Shona
metrics:
  - wer
model-index:
  - name: facebook/mms-300m
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: DigitalUmuganda
          type: DigitalUmuganda_Afrivoice/Shona
        metrics:
          - name: Wer
            type: wer
            value: 0.5175178580105934
---

# facebook/mms-300m

This model is a fine-tuned version of [facebook/mms-300m](https://huggingface.co/facebook/mms-300m) on the DigitalUmuganda_Afrivoice/Shona dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

- Loss: 0.9857
- WER: 0.5175
- CER: 0.1156
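
The checkpoint can be loaded through the `transformers` ASR pipeline. The snippet below is a minimal sketch, not an official usage example: the repo id `Beijuka/mms-300m-shona` is a hypothetical placeholder for wherever this checkpoint is actually hosted, and the audio path is illustrative.

```python
# Minimal inference sketch. The repo id is a placeholder -- substitute the
# actual Hub id of this checkpoint. Audio should be 16 kHz mono.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Beijuka/mms-300m-shona",  # hypothetical repo id
)

result = asr("shona_sample.wav")  # illustrative audio path
print(result["text"])
```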

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 100
- mixed_precision_training: Native AMP
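
For reference, these hyperparameters map onto `transformers.TrainingArguments` roughly as sketched below. This is an assumption-laden reconstruction, not the original training script: the `output_dir` value is a placeholder, and model/data setup and the `Trainer` call are omitted.

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters;
# not the original training script. output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-300m-shona",    # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # 8 * 4 = total_train_batch_size of 32
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100,
    fp16=True,                      # Native AMP mixed precision
)
```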

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-:|:-:|:-:|:-:|:-:|:-:|
| 54.8231 | 0.9818 | 27 | 5.5912 | 1.0 | 1.0 |
| 16.0974 | 1.9727 | 54 | 3.2813 | 1.0 | 1.0 |
| 12.3829 | 2.9636 | 81 | 2.9582 | 1.0 | 1.0 |
| 11.6369 | 3.9909 | 109 | 2.9327 | 1.0 | 1.0 |
| 11.78 | 4.9818 | 136 | 3.0289 | 1.0 | 1.0 |
| 11.7294 | 5.9727 | 163 | 2.8970 | 1.0 | 1.0 |
| 11.5829 | 6.9636 | 190 | 2.8451 | 1.0 | 1.0 |
| 10.9847 | 7.9909 | 218 | 2.7870 | 1.0 | 1.0 |
| 11.269 | 8.9818 | 245 | 2.6834 | 1.0 | 0.9652 |
| 10.1931 | 9.9727 | 272 | 2.2750 | 1.0 | 0.8276 |
| 8.6755 | 10.9636 | 299 | 1.8566 | 1.0 | 0.5798 |
| 6.7633 | 11.9909 | 327 | 1.4861 | 1.0 | 0.4707 |
| 5.7016 | 12.9818 | 354 | 1.2004 | 0.9976 | 0.3762 |
| 4.6139 | 13.9727 | 381 | 0.9838 | 0.9509 | 0.2874 |
| 3.7333 | 14.9636 | 408 | 0.8431 | 0.9037 | 0.2456 |
| 2.9457 | 15.9909 | 436 | 0.6929 | 0.8081 | 0.1956 |
| 2.4083 | 16.9818 | 463 | 0.5662 | 0.7055 | 0.1549 |
| 1.8317 | 17.9727 | 490 | 0.5390 | 0.6683 | 0.1435 |
| 1.4594 | 18.9636 | 517 | 0.4966 | 0.6362 | 0.1308 |
| 1.1558 | 19.9909 | 545 | 0.4624 | 0.5875 | 0.1182 |
| 0.9968 | 20.9818 | 572 | 0.4782 | 0.5759 | 0.1137 |
| 0.8777 | 21.9727 | 599 | 0.4626 | 0.5409 | 0.1046 |
| 0.7342 | 22.9636 | 626 | 0.4618 | 0.5367 | 0.1057 |
| 0.6756 | 23.9909 | 654 | 0.4537 | 0.5375 | 0.1032 |
| 0.5935 | 24.9818 | 681 | 0.4574 | 0.5195 | 0.0998 |
| 0.5394 | 25.9727 | 708 | 0.4524 | 0.5122 | 0.0974 |
| 0.4708 | 26.9636 | 735 | 0.4890 | 0.5034 | 0.0951 |
| 0.4191 | 27.9909 | 763 | 0.4569 | 0.5058 | 0.0949 |
| 0.4185 | 28.9818 | 790 | 0.5009 | 0.4964 | 0.0949 |
| 0.4048 | 29.9727 | 817 | 0.5135 | 0.5163 | 0.0983 |
| 0.3577 | 30.9636 | 844 | 0.4956 | 0.4942 | 0.0929 |
| 0.3449 | 31.9909 | 872 | 0.4670 | 0.4757 | 0.0919 |
| 0.3383 | 32.9818 | 899 | 0.4793 | 0.4903 | 0.0910 |
| 0.3013 | 33.9727 | 926 | 0.5074 | 0.4920 | 0.0907 |
| 0.2725 | 34.9636 | 953 | 0.5002 | 0.4842 | 0.0890 |
| 0.2641 | 35.9909 | 981 | 0.5312 | 0.4723 | 0.0882 |
| 0.2538 | 36.9818 | 1008 | 0.4744 | 0.4737 | 0.0863 |
| 0.2392 | 37.9727 | 1035 | 0.5041 | 0.4621 | 0.0863 |
| 0.2282 | 38.9636 | 1062 | 0.5037 | 0.4511 | 0.0848 |
| 0.2088 | 39.9909 | 1090 | 0.4988 | 0.4655 | 0.0869 |
| 0.2062 | 40.9818 | 1117 | 0.4873 | 0.4579 | 0.0842 |
| 0.204 | 41.9727 | 1144 | 0.4689 | 0.4521 | 0.0836 |
| 0.1856 | 42.9636 | 1171 | 0.5070 | 0.4555 | 0.0823 |
| 0.1847 | 43.9909 | 1199 | 0.5058 | 0.4458 | 0.0826 |
| 0.1899 | 44.9818 | 1226 | 0.4997 | 0.4334 | 0.0808 |
| 0.1716 | 45.9727 | 1253 | 0.4995 | 0.4244 | 0.0796 |
| 0.1772 | 46.9636 | 1280 | 0.4993 | 0.4399 | 0.0812 |
| 0.1612 | 47.9909 | 1308 | 0.4982 | 0.4343 | 0.0800 |
| 0.1645 | 48.9818 | 1335 | 0.4861 | 0.4321 | 0.0799 |
| 0.1596 | 49.9727 | 1362 | 0.4963 | 0.4236 | 0.0788 |
| 0.1544 | 50.9636 | 1389 | 0.5150 | 0.4358 | 0.0795 |
| 0.1356 | 51.9909 | 1417 | 0.5069 | 0.4470 | 0.0808 |
| 0.1445 | 52.9818 | 1444 | 0.5112 | 0.4343 | 0.0790 |
| 0.1381 | 53.9727 | 1471 | 0.5201 | 0.4146 | 0.0760 |
| 0.1355 | 54.9636 | 1498 | 0.4991 | 0.4110 | 0.0756 |
| 0.1309 | 55.9909 | 1526 | 0.5260 | 0.4397 | 0.0798 |
| 0.1359 | 56.9818 | 1553 | 0.5096 | 0.4285 | 0.0789 |
| 0.1181 | 57.9727 | 1580 | 0.5013 | 0.4224 | 0.0767 |
| 0.1227 | 58.9636 | 1607 | 0.5219 | 0.4125 | 0.0758 |
| 0.1127 | 59.9909 | 1635 | 0.5043 | 0.4210 | 0.0759 |
| 0.1083 | 60.9818 | 1662 | 0.4853 | 0.4010 | 0.0744 |
| 0.1152 | 61.9727 | 1689 | 0.5032 | 0.4054 | 0.0740 |
| 0.1142 | 62.9636 | 1716 | 0.5048 | 0.4086 | 0.0745 |
| 0.0975 | 63.9909 | 1744 | 0.5218 | 0.3996 | 0.0723 |
| 0.105 | 64.9818 | 1771 | 0.5210 | 0.4112 | 0.0740 |
| 0.094 | 65.9727 | 1798 | 0.5418 | 0.4073 | 0.0727 |
| 0.0987 | 66.9636 | 1825 | 0.5166 | 0.4008 | 0.0721 |
| 0.0958 | 67.9909 | 1853 | 0.5008 | 0.4098 | 0.0722 |
| 0.0936 | 68.9818 | 1880 | 0.5419 | 0.3988 | 0.0719 |
| 0.0896 | 69.9727 | 1907 | 0.5570 | 0.4219 | 0.0747 |
| 0.0853 | 70.9636 | 1934 | 0.5534 | 0.4117 | 0.0740 |
| 0.0793 | 71.9909 | 1962 | 0.5557 | 0.4078 | 0.0726 |
| 0.0805 | 72.9818 | 1989 | 0.5368 | 0.4018 | 0.0717 |
| 0.0875 | 73.9727 | 2016 | 0.5476 | 0.4049 | 0.0741 |
| 0.076 | 74.9636 | 2043 | 0.5561 | 0.4066 | 0.0729 |
| 0.0703 | 75.9909 | 2071 | 0.5527 | 0.4052 | 0.0722 |
| 0.0707 | 76.9818 | 2098 | 0.5543 | 0.3959 | 0.0713 |
| 0.0665 | 77.9727 | 2125 | 0.5628 | 0.4003 | 0.0708 |
| 0.0677 | 78.9636 | 2152 | 0.5413 | 0.3957 | 0.0699 |
| 0.0638 | 79.9909 | 2180 | 0.5498 | 0.3988 | 0.0706 |
| 0.0652 | 80.9818 | 2207 | 0.5507 | 0.3930 | 0.0699 |
| 0.061 | 81.9727 | 2234 | 0.5259 | 0.3881 | 0.0682 |
| 0.06 | 82.9636 | 2261 | 0.5397 | 0.3896 | 0.0684 |
| 0.0564 | 83.9909 | 2289 | 0.5441 | 0.3842 | 0.0677 |
| 0.0635 | 84.9818 | 2316 | 0.5372 | 0.3840 | 0.0678 |
| 0.0514 | 85.9727 | 2343 | 0.5504 | 0.3816 | 0.0683 |
| 0.0467 | 86.9636 | 2370 | 0.5573 | 0.3774 | 0.0674 |
| 0.0485 | 87.9909 | 2398 | 0.5604 | 0.3811 | 0.0674 |
| 0.0519 | 88.9818 | 2425 | 0.5459 | 0.3733 | 0.0665 |
| 0.0514 | 89.9727 | 2452 | 0.5411 | 0.3799 | 0.0668 |
| 0.0475 | 90.9636 | 2479 | 0.5369 | 0.3772 | 0.0664 |
| 0.0434 | 91.9909 | 2507 | 0.5510 | 0.3850 | 0.0672 |
| 0.0488 | 92.9818 | 2534 | 0.5488 | 0.3774 | 0.0659 |
| 0.046 | 93.9727 | 2561 | 0.5443 | 0.3794 | 0.0663 |
| 0.0463 | 94.9636 | 2588 | 0.5463 | 0.3806 | 0.0666 |
| 0.0399 | 95.9909 | 2616 | 0.5500 | 0.3796 | 0.0663 |
| 0.0401 | 96.9818 | 2643 | 0.5494 | 0.3769 | 0.0657 |
| 0.0431 | 97.9727 | 2670 | 0.5516 | 0.375 | 0.0657 |
| 0.0404 | 98.9636 | 2697 | 0.5523 | 0.3765 | 0.0659 |
| 0.0409 | 99.0818 | 2700 | 0.5523 | 0.3765 | 0.0661 |
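
The WER and CER columns above are word and character error rates; they can be computed with the `evaluate` library (which wraps `jiwer` for these metrics). A minimal sketch, with illustrative transcript strings rather than data from this run:

```python
# Minimal WER/CER computation sketch; the transcript strings are illustrative.
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")

predictions = ["mhoro shamwari"]       # model output (illustrative)
references = ["mhoro shamwari yangu"]  # ground truth (illustrative)

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```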

### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.1.0+cu118
- Datasets 3.0.2
- Tokenizers 0.20.1