wav2vec2-xls-r-1b-scandinavian-faroese-100h-60-epochs-20250112_v4

This model is a fine-tuned version of davidilag/wav2vec2-xls-r-1b-scandinavian-251h-30-epochs-20250111_v9 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1070
  • WER: 17.8261
  • CER: 3.7769
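For context, WER (word error rate) and CER (character error rate) are edit-distance error rates over words and characters respectively, reported here as percentages. A minimal self-contained sketch of how such rates are computed (an illustration, not the exact evaluation script used for this model):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (single-row DP)."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(
                dp[j] + 1,                           # deletion
                dp[j - 1] + 1,                       # insertion
                prev + (ref[i - 1] != hyp[j - 1]),   # substitution (0 if equal)
            )
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate as a percentage."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate as a percentage."""
    return 100.0 * edit_distance(list(reference), list(hypothesis)) / len(reference)
```

Libraries such as `jiwer` (used by the `evaluate` metrics) apply the same idea, usually with text normalization on top.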

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 6000
  • num_epochs: 60
  • mixed_precision_training: Native AMP
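The reported total_train_batch_size follows from the per-device batch size and gradient accumulation. A small sketch mirroring the list above (field names borrowed from Hugging Face `TrainingArguments`; this is not the actual training script, and a single GPU is assumed):

```python
# Hyperparameters as listed in this card, keyed by TrainingArguments field names.
hparams = {
    "learning_rate": 1e-4,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "lr_scheduler_type": "linear",
    "warmup_steps": 6000,
    "num_train_epochs": 60,
    "fp16": True,  # "Native AMP" mixed precision
}

# Effective (total) train batch size = per-device batch size
# x gradient accumulation steps (x number of devices, assumed 1 here).
total_train_batch_size = (
    hparams["per_device_train_batch_size"] * hparams["gradient_accumulation_steps"]
)
print(total_train_batch_size)  # 32, matching the value reported above
```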

Training results

| Training Loss | Epoch   | Step   | Validation Loss | WER     | CER     |
|---------------|---------|--------|-----------------|---------|---------|
| 1.0345        | 0.4877  | 1000   | 0.3260          | 39.9656 | 10.9180 |
| 0.7814        | 0.9754  | 2000   | 0.2024          | 29.6867 | 7.6587  |
| 0.6454        | 1.4628  | 3000   | 0.1783          | 27.9376 | 7.1064  |
| 0.6608        | 1.9505  | 4000   | 0.1803          | 28.9377 | 7.2713  |
| 0.5288        | 2.4379  | 5000   | 0.2009          | 28.5060 | 7.2681  |
| 0.6023        | 2.9256  | 6000   | 0.1948          | 29.0787 | 7.4425  |
| 0.5827        | 3.4131  | 7000   | 0.1799          | 28.5544 | 7.2760  |
| 0.5251        | 3.9008  | 8000   | 0.1759          | 27.4794 | 6.9036  |
| 0.489         | 4.3882  | 9000   | 0.1710          | 26.3647 | 6.7032  |
| 0.5095        | 4.8759  | 10000  | 0.1579          | 26.5013 | 6.6078  |
| 0.446         | 5.3633  | 11000  | 0.1476          | 25.0518 | 6.1541  |
| 0.4329        | 5.8510  | 12000  | 0.1504          | 25.1928 | 6.2038  |
| 0.3607        | 6.3385  | 13000  | 0.1389          | 24.7125 | 5.9892  |
| 0.3411        | 6.8261  | 14000  | 0.1480          | 24.7478 | 6.1312  |
| 0.3508        | 7.3136  | 15000  | 0.1381          | 24.3116 | 5.8511  |
| 0.3302        | 7.8013  | 16000  | 0.1430          | 24.5936 | 5.9576  |
| 0.2681        | 8.2887  | 17000  | 0.1434          | 23.8093 | 5.7012  |
| 0.2914        | 8.7764  | 18000  | 0.1312          | 23.8225 | 5.7099  |
| 0.2386        | 9.2638  | 19000  | 0.1334          | 23.4436 | 5.5766  |
| 0.2496        | 9.7515  | 20000  | 0.1363          | 23.6507 | 5.6207  |
| 0.2329        | 10.2390 | 21000  | 0.1306          | 23.0515 | 5.3754  |
| 0.2416        | 10.7267 | 22000  | 0.1300          | 22.9986 | 5.4779  |
| 0.2566        | 11.2141 | 23000  | 0.1235          | 22.6990 | 5.3335  |
| 0.2267        | 11.7018 | 24000  | 0.1301          | 22.8356 | 5.3659  |
| 0.2147        | 12.1892 | 25000  | 0.1276          | 22.4743 | 5.2657  |
| 0.2009        | 12.6769 | 26000  | 0.1246          | 22.2717 | 5.2136  |
| 0.1959        | 13.1644 | 27000  | 0.1214          | 22.0293 | 5.1434  |
| 0.2053        | 13.6520 | 28000  | 0.1211          | 22.0866 | 5.0890  |
| 0.1648        | 14.1395 | 29000  | 0.1309          | 21.8707 | 5.0361  |
| 0.1711        | 14.6272 | 30000  | 0.1265          | 22.0249 | 5.0850  |
| 0.1612        | 15.1146 | 31000  | 0.1214          | 21.7165 | 4.9848  |
| 0.162         | 15.6023 | 32000  | 0.1210          | 21.7209 | 5.0030  |
| 0.1483        | 16.0897 | 33000  | 0.1265          | 21.5888 | 4.9935  |
| 0.1539        | 16.5774 | 34000  | 0.1203          | 21.6020 | 4.9801  |
| 0.1382        | 17.0649 | 35000  | 0.1139          | 21.1966 | 4.8207  |
| 0.159         | 17.5525 | 36000  | 0.1154          | 21.2099 | 4.8128  |
| 0.1176        | 18.0400 | 37000  | 0.1213          | 21.2715 | 4.8010  |
| 0.1218        | 18.5277 | 38000  | 0.1175          | 20.9411 | 4.7860  |
| 0.1239        | 19.0151 | 39000  | 0.1195          | 20.9719 | 4.7742  |
| 0.1277        | 19.5028 | 40000  | 0.1133          | 21.0865 | 4.7410  |
| 0.1327        | 19.9905 | 41000  | 0.1117          | 20.7120 | 4.6842  |
| 0.1403        | 20.4779 | 42000  | 0.1247          | 21.0645 | 4.8128  |
| 0.1251        | 20.9656 | 43000  | 0.1083          | 20.6591 | 4.6416  |
| 0.1084        | 21.4531 | 44000  | 0.1174          | 20.8001 | 4.6795  |
| 0.1258        | 21.9407 | 45000  | 0.1185          | 20.6371 | 4.6495  |
| 0.0906        | 22.4282 | 46000  | 0.1232          | 20.6723 | 4.6740  |
| 0.1074        | 22.9159 | 47000  | 0.1187          | 20.3771 | 4.6416  |
| 0.1125        | 23.4033 | 48000  | 0.1138          | 20.4212 | 4.6140  |
| 0.1055        | 23.8910 | 49000  | 0.1220          | 20.5754 | 4.7008  |
| 0.1116        | 24.3784 | 50000  | 0.1181          | 20.3463 | 4.6100  |
| 0.0997        | 24.8661 | 51000  | 0.1261          | 20.3331 | 4.6361  |
| 0.1025        | 25.3536 | 52000  | 0.1166          | 20.2714 | 4.6085  |
| 0.1032        | 25.8413 | 53000  | 0.1118          | 20.1833 | 4.4862  |
| 0.0997        | 26.3287 | 54000  | 0.1165          | 20.2978 | 4.5603  |
| 0.0923        | 26.8164 | 55000  | 0.1121          | 20.0555 | 4.4901  |
| 0.087         | 27.3038 | 56000  | 0.1236          | 20.2229 | 4.5785  |
| 0.0961        | 27.7915 | 57000  | 0.1130          | 19.8881 | 4.4380  |
| 0.095         | 28.2790 | 58000  | 0.1168          | 19.9806 | 4.4570  |
| 0.0995        | 28.7666 | 59000  | 0.1187          | 20.1436 | 4.5501  |
| 0.1021        | 29.2541 | 60000  | 0.1195          | 20.1613 | 4.5209  |
| 0.1168        | 29.7418 | 61000  | 0.1211          | 19.8132 | 4.4412  |
| 0.1029        | 30.2292 | 62000  | 0.1161          | 19.9277 | 4.3994  |
| 0.1023        | 30.7169 | 63000  | 0.1148          | 19.7868 | 4.4089  |
| 0.1044        | 31.2043 | 64000  | 0.1089          | 19.5753 | 4.3599  |
| 0.0886        | 31.6920 | 65000  | 0.1109          | 19.4211 | 4.3165  |
| 0.0885        | 32.1795 | 66000  | 0.1192          | 19.5621 | 4.3576  |
| 0.0723        | 32.6672 | 67000  | 0.1170          | 19.4960 | 4.3505  |
| 0.078         | 33.1546 | 68000  | 0.1129          | 19.5444 | 4.3292  |
| 0.066         | 33.6423 | 69000  | 0.1197          | 19.4255 | 4.2960  |
| 0.0684        | 34.1297 | 70000  | 0.1205          | 19.7603 | 4.3402  |
| 0.0856        | 34.6174 | 71000  | 0.1131          | 19.5180 | 4.3173  |
| 0.0857        | 35.1049 | 72000  | 0.1141          | 19.6590 | 4.3473  |
| 0.0813        | 35.5925 | 73000  | 0.1168          | 19.4828 | 4.2881  |
| 0.0752        | 36.0800 | 74000  | 0.1181          | 19.3594 | 4.2684  |
| 0.0722        | 36.5677 | 75000  | 0.1112          | 19.2096 | 4.2313  |
| 0.0699        | 37.0551 | 76000  | 0.1178          | 19.1787 | 4.2203  |
| 0.0717        | 37.5428 | 77000  | 0.1112          | 19.0334 | 4.1903  |
| 0.1025        | 38.0302 | 78000  | 0.1081          | 19.1171 | 4.1911  |
| 0.0668        | 38.5179 | 79000  | 0.1176          | 19.0334 | 4.1745  |
| 0.0858        | 39.0054 | 80000  | 0.1089          | 19.0201 | 4.1580  |
| 0.078         | 39.4931 | 81000  | 0.1079          | 19.0466 | 4.1777  |
| 0.0718        | 39.9807 | 82000  | 0.1079          | 19.0950 | 4.1390  |
| 0.0766        | 40.4682 | 83000  | 0.1109          | 19.0025 | 4.1335  |
| 0.0611        | 40.9559 | 84000  | 0.1136          | 18.8747 | 4.1146  |
| 0.0846        | 41.4433 | 85000  | 0.1093          | 18.8703 | 4.1153  |
| 0.0719        | 41.9310 | 86000  | 0.1107          | 18.8924 | 4.1106  |
| 0.0551        | 42.4184 | 87000  | 0.1077          | 18.9540 | 4.0625  |
| 0.0849        | 42.9061 | 88000  | 0.1026          | 18.8747 | 4.0846  |
| 0.0715        | 43.3936 | 89000  | 0.1106          | 18.8042 | 4.0830  |
| 0.0682        | 43.8812 | 90000  | 0.1157          | 18.8659 | 4.0956  |
| 0.0754        | 44.3687 | 91000  | 0.1137          | 18.6677 | 4.0467  |
| 0.0627        | 44.8564 | 92000  | 0.1154          | 18.6809 | 4.0341  |
| 0.0821        | 45.3438 | 93000  | 0.1076          | 18.5972 | 4.0459  |
| 0.0514        | 45.8315 | 94000  | 0.1088          | 18.5619 | 4.0175  |
| 0.0505        | 46.3189 | 95000  | 0.1110          | 18.5972 | 4.0049  |
| 0.0611        | 46.8066 | 96000  | 0.1123          | 18.5179 | 4.0270  |
| 0.0568        | 47.2941 | 97000  | 0.1099          | 18.4209 | 3.9836  |
| 0.0464        | 47.7818 | 98000  | 0.1047          | 18.4606 | 3.9694  |
| 0.0543        | 48.2692 | 99000  | 0.1075          | 18.3460 | 3.9449  |
| 0.0464        | 48.7569 | 100000 | 0.1101          | 18.3020 | 3.9339  |
| 0.0408        | 49.2443 | 101000 | 0.1070          | 18.2006 | 3.9031  |
| 0.0632        | 49.7320 | 102000 | 0.1053          | 18.3196 | 3.9078  |
| 0.0648        | 50.2195 | 103000 | 0.1050          | 18.3593 | 3.9370  |
| 0.0479        | 50.7071 | 104000 | 0.1080          | 18.4253 | 3.9702  |
| 0.0534        | 51.1946 | 105000 | 0.1054          | 18.2183 | 3.9031  |
| 0.0361        | 51.6823 | 106000 | 0.1076          | 18.0685 | 3.8850  |
| 0.0402        | 52.1697 | 107000 | 0.1010          | 18.1478 | 3.8629  |
| 0.0367        | 52.6574 | 108000 | 0.1076          | 18.1125 | 3.8637  |
| 0.0465        | 53.1448 | 109000 | 0.1032          | 18.0244 | 3.8266  |
| 0.0364        | 53.6325 | 110000 | 0.1055          | 17.9936 | 3.8353  |
| 0.044         | 54.1200 | 111000 | 0.1041          | 18.0905 | 3.8637  |
| 0.0451        | 54.6077 | 112000 | 0.1060          | 18.0244 | 3.8384  |
| 0.0438        | 55.0951 | 113000 | 0.1078          | 17.9099 | 3.8155  |
| 0.045         | 55.5828 | 114000 | 0.1062          | 17.8658 | 3.8124  |
| 0.0259        | 56.0702 | 115000 | 0.1068          | 17.9143 | 3.7927  |
| 0.0393        | 56.5579 | 116000 | 0.1058          | 17.9055 | 3.7879  |
| 0.0457        | 57.0454 | 117000 | 0.1060          | 17.8746 | 3.7927  |
| 0.029         | 57.5330 | 118000 | 0.1055          | 17.8306 | 3.7761  |
| 0.0265        | 58.0205 | 119000 | 0.1067          | 17.8746 | 3.7856  |
| 0.0282        | 58.5082 | 120000 | 0.1073          | 17.8217 | 3.7737  |
| 0.0264        | 58.9959 | 121000 | 0.1076          | 17.8438 | 3.7792  |
| 0.0295        | 59.4833 | 122000 | 0.1072          | 17.8173 | 3.7721  |
| 0.0276        | 59.9710 | 123000 | 0.1070          | 17.8261 | 3.7769  |
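To try the fine-tuned checkpoint, a minimal inference sketch using the Hugging Face Transformers CTC classes is below. This is an illustrative sketch, not an official usage example from the model authors; it assumes the checkpoint loads with the standard Wav2Vec2 CTC classes and expects 16 kHz mono audio.

```python
MODEL_ID = "davidilag/wav2vec2-xls-r-1b-scandinavian-faroese-100h-60-epochs-20250112_v4"
SAMPLE_RATE = 16_000  # wav2vec2-style models expect 16 kHz mono audio

def transcribe(waveform):
    """Greedy CTC decode of a 1-D float waveform sampled at 16 kHz."""
    import torch
    from transformers import AutoProcessor, Wav2Vec2ForCTC

    processor = AutoProcessor.from_pretrained(MODEL_ID)
    model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
    inputs = processor(waveform, sampling_rate=SAMPLE_RATE, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    return processor.batch_decode(torch.argmax(logits, dim=-1))[0]

if __name__ == "__main__":
    # `waveform` would be a 1-D float array, e.g. loaded with soundfile or
    # librosa and resampled to 16 kHz; shown here only as a placeholder call.
    # print(transcribe(waveform))
    pass
```

For higher-quality output than greedy decoding, a beam-search decoder with a language model (e.g. via pyctcdecode) is a common follow-up.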

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0