
w2v-bert-2.0-Fleurs_AMMI_AFRIVOICE_LRSC-ln-10hrs-v3

This model is a fine-tuned version of facebook/w2v-bert-2.0 on 10 hours of Lingala (ln) speech drawn from the FLEURS, AMMI, AFRIVOICE, and LRSC datasets, as indicated by the model name. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 0.8673
  • WER: 0.2284
  • CER: 0.0697
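
Below is a minimal inference sketch in Python. It assumes the checkpoint follows the standard w2v-bert-2.0 CTC layout on the Hub (a Wav2Vec2BertForCTC head plus an AutoProcessor-loadable processor); "sample.wav" is a placeholder path for a 16 kHz mono recording.

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "asr-africa/w2v-bert-2.0-Fleurs_AMMI_AFRIVOICE_LRSC-ln-10hrs-v3"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# w2v-bert-2.0 expects 16 kHz mono audio; "sample.wav" is a placeholder.
speech, _ = librosa.load("sample.wav", sr=16000, mono=True)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats/blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```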

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 80
  • mixed_precision_training: Native AMP
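
A minimal sketch of how these settings map onto transformers TrainingArguments; output_dir is a hypothetical path, the total batch size of 16 assumes a single GPU (8 × 2 accumulation steps), and any argument not listed above is left at its default.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./w2v-bert-2.0-ln-10hrs",  # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # 8 * 2 = effective batch size of 16 on one GPU
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=80,
    fp16=True,  # "Native AMP" mixed-precision training
)
```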

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 2.1462        | 1.0   | 192  | 0.5640          | 0.3575 | 0.1097 |
| 0.5018        | 2.0   | 384  | 0.4470          | 0.3268 | 0.0983 |
| 0.3871        | 3.0   | 576  | 0.4247          | 0.3089 | 0.0921 |
| 0.3318        | 4.0   | 768  | 0.4154          | 0.3006 | 0.0923 |
| 0.2848        | 5.0   | 960  | 0.3772          | 0.2615 | 0.0824 |
| 0.2488        | 6.0   | 1152 | 0.3803          | 0.2772 | 0.0828 |
| 0.2203        | 7.0   | 1344 | 0.4229          | 0.2391 | 0.0764 |
| 0.192         | 8.0   | 1536 | 0.4192          | 0.2345 | 0.0742 |
| 0.1671        | 9.0   | 1728 | 0.4143          | 0.2460 | 0.0758 |
| 0.151         | 10.0  | 1920 | 0.4041          | 0.2633 | 0.0840 |
| 0.1298        | 11.0  | 2112 | 0.4404          | 0.2664 | 0.0784 |
| 0.1183        | 12.0  | 2304 | 0.4589          | 0.2571 | 0.0780 |
| 0.1087        | 13.0  | 2496 | 0.5140          | 0.2353 | 0.0729 |
| 0.1059        | 14.0  | 2688 | 0.5535          | 0.2294 | 0.0722 |
| 0.1065        | 15.0  | 2880 | 0.5584          | 0.2311 | 0.0729 |
| 0.0995        | 16.0  | 3072 | 0.6294          | 0.2379 | 0.0734 |
| 0.0958        | 17.0  | 3264 | 0.5974          | 0.2194 | 0.0692 |
| 0.08          | 18.0  | 3456 | 0.5714          | 0.2317 | 0.0713 |
| 0.0697        | 19.0  | 3648 | 0.5725          | 0.2538 | 0.0758 |
| 0.0603        | 20.0  | 3840 | 0.5144          | 0.2465 | 0.0768 |
| 0.0535        | 21.0  | 4032 | 0.5700          | 0.2431 | 0.0741 |
| 0.0483        | 22.0  | 4224 | 0.6042          | 0.2313 | 0.0723 |
| 0.044         | 23.0  | 4416 | 0.6013          | 0.2434 | 0.0737 |
| 0.0395        | 24.0  | 4608 | 0.6270          | 0.2291 | 0.0705 |
| 0.0364        | 25.0  | 4800 | 0.6701          | 0.2152 | 0.0674 |
| 0.0299        | 26.0  | 4992 | 0.6459          | 0.2315 | 0.0728 |
| 0.0288        | 27.0  | 5184 | 0.6444          | 0.2323 | 0.0744 |
| 0.024         | 28.0  | 5376 | 0.6615          | 0.2409 | 0.0727 |
| 0.0232        | 29.0  | 5568 | 0.7127          | 0.2319 | 0.0692 |
| 0.0213        | 30.0  | 5760 | 0.6741          | 0.2452 | 0.0752 |
| 0.021         | 31.0  | 5952 | 0.7123          | 0.2308 | 0.0723 |
| 0.0167        | 32.0  | 6144 | 0.7742          | 0.2230 | 0.0680 |
| 0.0154        | 33.0  | 6336 | 0.7341          | 0.2276 | 0.0716 |
| 0.0143        | 34.0  | 6528 | 0.7328          | 0.2391 | 0.0735 |
| 0.0116        | 35.0  | 6720 | 0.8131          | 0.2317 | 0.0717 |
| 0.0112        | 36.0  | 6912 | 0.8430          | 0.2281 | 0.0700 |
| 0.01          | 37.0  | 7104 | 0.7926          | 0.2251 | 0.0703 |
| 0.0085        | 38.0  | 7296 | 0.8107          | 0.2297 | 0.0714 |
| 0.0073        | 39.0  | 7488 | 0.8272          | 0.2234 | 0.0687 |
| 0.0062        | 40.0  | 7680 | 0.8673          | 0.2284 | 0.0697 |
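
WER and CER in the table are the standard word- and character-level edit-distance rates. A minimal sketch of computing them with the evaluate library (requires evaluate and jiwer installed; the transcript strings are hypothetical examples):

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical reference/prediction pair, for illustration only.
references = ["mbote na bino nionso"]
predictions = ["mbote na bino banso"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```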

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
Model size: 606M parameters (Safetensors, F32)