---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-1b
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xls-r-1b-scandinavian-251h-30-epochs-20250111_v9
  results: []
---
# wav2vec2-xls-r-1b-scandinavian-251h-30-epochs-20250111_v9

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2085
- Wer: 22.6884
- Cer: 5.8380
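wav2vec2-xls-r-1b is a CTC acoustic model, so a transcript is obtained by collapsing the per-frame argmax predictions. A minimal sketch of greedy CTC decoding (merge consecutive repeats, then drop blanks), independent of the transformers tokenizer; the blank symbol name is an assumption based on the usual wav2vec2 convention:

```python
BLANK = "<pad>"  # wav2vec2 CTC vocabularies conventionally use the pad token as the blank

def ctc_greedy_decode(frame_labels):
    """Collapse per-frame labels into a transcript: merge consecutive
    repeats, then remove blank symbols."""
    out = []
    prev = None
    for label in frame_labels:
        if label != prev and label != BLANK:
            out.append(label)
        prev = label
    return "".join(out)
```

Note that a blank between two identical labels keeps them distinct, which is how CTC represents doubled letters.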
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 3000
- num_epochs: 30
- mixed_precision_training: Native AMP
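The learning-rate schedule above (cosine with 3,000 warmup steps, peaking at 5e-05) can be sketched in plain Python. This approximates the behaviour of transformers' cosine schedule with warmup; the total step count is taken from the final step in the results table below:

```python
import math

def lr_at_step(step, peak_lr=5e-5, warmup_steps=3000, total_steps=85000):
    """Linear warmup to peak_lr over warmup_steps, then cosine decay to 0
    (a sketch of a cosine-with-warmup schedule, not the exact library code)."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```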
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:---:|:---:|:---:|:---:|:---:|:---:|
2.7619 | 0.3502 | 1000 | 1.0505 | 79.7500 | 27.7494 |
1.5883 | 0.7004 | 2000 | 0.5542 | 52.5223 | 16.3287 |
1.1356 | 1.0504 | 3000 | 0.5714 | 52.2068 | 16.0785 |
1.0134 | 1.4006 | 4000 | 0.4909 | 48.9568 | 14.9144 |
1.049 | 1.7508 | 5000 | 0.4658 | 46.3140 | 14.0887 |
0.8085 | 2.1009 | 6000 | 0.4274 | 44.0995 | 12.9620 |
0.725 | 2.4511 | 7000 | 0.3970 | 41.9410 | 12.1666 |
0.805 | 2.8013 | 8000 | 0.3763 | 40.2846 | 11.4667 |
0.6876 | 3.1513 | 9000 | 0.3804 | 40.3847 | 11.7141 |
0.6424 | 3.5015 | 10000 | 0.3761 | 39.6587 | 11.3674 |
0.6857 | 3.8517 | 11000 | 0.3511 | 37.7114 | 11.1336 |
0.6235 | 4.2017 | 12000 | 0.3565 | 37.6741 | 10.7539 |
0.6165 | 4.5519 | 13000 | 0.3212 | 36.0533 | 10.1479 |
0.5964 | 4.9021 | 14000 | 0.3031 | 34.7786 | 9.6009 |
0.6145 | 5.2521 | 15000 | 0.3198 | 35.3324 | 9.8529 |
0.6662 | 5.6023 | 16000 | 0.3006 | 34.1068 | 9.5917 |
0.663 | 5.9525 | 17000 | 0.3007 | 33.6539 | 9.4145 |
0.6603 | 6.3026 | 18000 | 0.3044 | 33.9652 | 9.3871 |
0.6754 | 6.6528 | 19000 | 0.2873 | 33.1790 | 9.0868 |
0.6519 | 7.0028 | 20000 | 0.2672 | 31.9034 | 8.8680 |
0.6412 | 7.3530 | 21000 | 0.2759 | 32.1841 | 8.8061 |
0.6028 | 7.7032 | 22000 | 0.2853 | 31.8364 | 8.7611 |
0.5709 | 8.0532 | 23000 | 0.2838 | 32.5454 | 9.1492 |
0.571 | 8.4034 | 24000 | 0.2512 | 31.2495 | 8.6213 |
0.6812 | 8.7536 | 25000 | 0.2698 | 32.6989 | 9.2092 |
0.5007 | 9.1037 | 26000 | 0.2726 | 31.5692 | 8.6188 |
0.4518 | 9.4539 | 27000 | 0.2600 | 30.0146 | 8.3456 |
0.5155 | 9.8041 | 28000 | 0.2494 | 30.0994 | 8.2447 |
0.4055 | 10.1541 | 29000 | 0.2502 | 29.6261 | 8.0922 |
0.3963 | 10.5043 | 30000 | 0.2512 | 30.0646 | 8.2523 |
0.4581 | 10.8545 | 31000 | 0.2405 | 28.9417 | 7.8634 |
0.3554 | 11.2045 | 32000 | 0.2547 | 29.5303 | 8.0054 |
0.3613 | 11.5547 | 33000 | 0.2452 | 28.4074 | 7.7358 |
0.3798 | 11.9049 | 34000 | 0.2373 | 29.2818 | 8.0558 |
0.347 | 12.2549 | 35000 | 0.2354 | 28.1614 | 7.6554 |
0.3401 | 12.6051 | 36000 | 0.2216 | 27.2794 | 7.2458 |
0.3242 | 12.9553 | 37000 | 0.2315 | 26.9537 | 7.1938 |
0.3393 | 13.3054 | 38000 | 0.2313 | 27.3888 | 7.3823 |
0.3717 | 13.6556 | 39000 | 0.2218 | 26.5771 | 7.0449 |
0.3662 | 14.0056 | 40000 | 0.2357 | 27.6390 | 7.5118 |
0.3541 | 14.3558 | 41000 | 0.2265 | 26.1751 | 6.8972 |
0.3686 | 14.7060 | 42000 | 0.2215 | 26.1937 | 6.9842 |
0.3832 | 15.0560 | 43000 | 0.2242 | 26.0046 | 6.9059 |
0.4 | 15.4062 | 44000 | 0.2239 | 26.0326 | 6.9814 |
0.4254 | 15.7564 | 45000 | 0.2192 | 25.6408 | 6.8145 |
0.3963 | 16.1065 | 46000 | 0.2154 | 25.6933 | 6.8087 |
0.3457 | 16.4567 | 47000 | 0.2177 | 26.4965 | 7.1219 |
0.3609 | 16.8069 | 48000 | 0.2196 | 25.6645 | 7.0066 |
0.3055 | 17.1569 | 49000 | 0.2217 | 26.0267 | 6.8638 |
0.296 | 17.5071 | 50000 | 0.2094 | 25.0878 | 6.5902 |
0.2876 | 17.8573 | 51000 | 0.2244 | 25.5763 | 6.7381 |
0.2356 | 18.2073 | 52000 | 0.2205 | 24.9292 | 6.5937 |
0.2801 | 18.5575 | 53000 | 0.2203 | 24.8028 | 6.5381 |
0.2626 | 18.9077 | 54000 | 0.2076 | 24.4457 | 6.3823 |
0.224 | 19.2577 | 55000 | 0.2080 | 24.4262 | 6.4048 |
0.2196 | 19.6079 | 56000 | 0.2169 | 25.1039 | 6.6133 |
0.2341 | 19.9582 | 57000 | 0.2072 | 24.2269 | 6.3431 |
0.1899 | 20.3082 | 58000 | 0.2080 | 24.1506 | 6.3084 |
0.1848 | 20.6584 | 59000 | 0.2076 | 24.3211 | 6.3481 |
0.1816 | 21.0084 | 60000 | 0.2068 | 23.8003 | 6.1728 |
0.207 | 21.3586 | 61000 | 0.2126 | 23.6527 | 6.1666 |
0.1935 | 21.7088 | 62000 | 0.2104 | 23.3550 | 6.0682 |
0.2268 | 22.0588 | 63000 | 0.2122 | 23.5009 | 6.0946 |
0.2236 | 22.4090 | 64000 | 0.2082 | 23.2601 | 6.0101 |
0.2118 | 22.7592 | 65000 | 0.2071 | 23.4195 | 6.0859 |
0.2608 | 23.1093 | 66000 | 0.2023 | 23.3975 | 6.0759 |
0.2122 | 23.4595 | 67000 | 0.2102 | 23.3958 | 6.0669 |
0.2192 | 23.8097 | 68000 | 0.2046 | 22.9106 | 5.9151 |
0.2492 | 24.1597 | 69000 | 0.2106 | 22.9445 | 5.9293 |
0.229 | 24.5099 | 70000 | 0.2077 | 22.8877 | 5.9126 |
0.2188 | 24.8601 | 71000 | 0.2127 | 22.9759 | 5.9271 |
0.2203 | 25.2101 | 72000 | 0.2080 | 22.7563 | 5.8728 |
0.2732 | 25.5603 | 73000 | 0.2083 | 22.8555 | 5.9057 |
0.2355 | 25.9105 | 74000 | 0.2065 | 22.6748 | 5.8377 |
0.1742 | 26.2605 | 75000 | 0.2068 | 22.6723 | 5.8282 |
0.1717 | 26.6108 | 76000 | 0.2081 | 22.6307 | 5.8337 |
0.1944 | 26.9610 | 77000 | 0.2069 | 22.6341 | 5.8331 |
0.1526 | 27.3110 | 78000 | 0.2067 | 22.6460 | 5.8402 |
0.1894 | 27.6612 | 79000 | 0.2067 | 22.6367 | 5.8324 |
0.1358 | 28.0112 | 80000 | 0.2080 | 22.6638 | 5.8319 |
0.1346 | 28.3614 | 81000 | 0.2077 | 22.6486 | 5.8285 |
0.1318 | 28.7116 | 82000 | 0.2082 | 22.6647 | 5.8406 |
0.1584 | 29.0616 | 83000 | 0.2085 | 22.6876 | 5.8403 |
0.1491 | 29.4118 | 84000 | 0.2085 | 22.6808 | 5.8377 |
0.1347 | 29.7620 | 85000 | 0.2085 | 22.6884 | 5.8380 |
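The Wer and Cer columns above are word- and character-level error rates: Levenshtein edit distance divided by reference length, expressed as a percentage. A minimal pure-Python sketch, independent of whichever metric library was actually used during training:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (strings or word lists)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]

def wer(reference, hypothesis):
    """Word error rate in percent, matching the scale used in the table."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate in percent."""
    return 100.0 * edit_distance(reference, hypothesis) / len(reference)
```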
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0