# wav2vec2-large-xlsr-53-AL-1000
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3865
- Wer: 0.2321
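The Wer figure is the word error rate: the word-level edit distance between hypothesis and reference transcripts, divided by the number of reference words. As a minimal, self-contained sketch of the metric (the actual evaluation likely used a library such as `jiwer` or `evaluate`, not this code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(ref)][len(hyp)] / len(ref)


# One substitution plus one deletion over four reference words -> 0.5.
print(wer("one two three four", "one too three"))
```

A WER of 0.2321 therefore means roughly one word in four or five is substituted, inserted, or deleted relative to the reference.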
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
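The linear scheduler with 500 warmup steps can be sketched as a pure function of the optimizer step. This is only an illustration, assuming the ~18,800 total steps visible in the results table below; in training, the schedule would come from `transformers`' `get_linear_schedule_with_warmup`:

```python
def linear_lr(step: int, base_lr: float = 1e-4,
              warmup_steps: int = 500, total_steps: int = 18800) -> float:
    """Linear warmup from 0 to base_lr over warmup_steps,
    then linear decay back to 0 by total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))


# Peak learning rate is reached exactly at the end of warmup.
print(linear_lr(500))   # 0.0001
```

Note that the effective batch size of 32 is the per-device batch size (16) times the gradient accumulation steps (2).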
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
6.0156 | 0.1063 | 200 | 3.0290 | 1.0 |
2.9588 | 0.2126 | 400 | 2.6180 | 1.0 |
1.1877 | 0.3189 | 600 | 0.6898 | 0.6574 |
0.6517 | 0.4252 | 800 | 0.5704 | 0.5222 |
0.5701 | 0.5315 | 1000 | 0.5273 | 0.5166 |
0.5243 | 0.6378 | 1200 | 0.4915 | 0.4742 |
0.4977 | 0.7441 | 1400 | 0.4686 | 0.4341 |
0.4642 | 0.8504 | 1600 | 0.4500 | 0.4045 |
0.4541 | 0.9567 | 1800 | 0.4374 | 0.4073 |
0.4264 | 1.0627 | 2000 | 0.4095 | 0.3677 |
0.4004 | 1.1690 | 2200 | 0.4050 | 0.3632 |
0.3904 | 1.2753 | 2400 | 0.3954 | 0.3399 |
0.391 | 1.3816 | 2600 | 0.4028 | 0.3688 |
0.3721 | 1.4879 | 2800 | 0.3784 | 0.3173 |
0.3776 | 1.5942 | 3000 | 0.3721 | 0.3143 |
0.3614 | 1.7005 | 3200 | 0.3769 | 0.3214 |
0.3621 | 1.8068 | 3400 | 0.3712 | 0.3190 |
0.3564 | 1.9131 | 3600 | 0.3675 | 0.3122 |
0.3501 | 2.0191 | 3800 | 0.3578 | 0.2831 |
0.322 | 2.1254 | 4000 | 0.3491 | 0.2820 |
0.3087 | 2.2317 | 4200 | 0.3517 | 0.2860 |
0.3135 | 2.3380 | 4400 | 0.3478 | 0.2734 |
0.3135 | 2.4443 | 4600 | 0.3524 | 0.2747 |
0.3021 | 2.5506 | 4800 | 0.3418 | 0.2573 |
0.3015 | 2.6569 | 5000 | 0.3439 | 0.2627 |
0.3037 | 2.7632 | 5200 | 0.3428 | 0.2652 |
0.2971 | 2.8695 | 5400 | 0.3411 | 0.2576 |
0.3002 | 2.9758 | 5600 | 0.3354 | 0.2492 |
0.2659 | 3.0818 | 5800 | 0.3399 | 0.2512 |
0.2597 | 3.1881 | 6000 | 0.3355 | 0.2489 |
0.275 | 3.2944 | 6200 | 0.3332 | 0.2599 |
0.2703 | 3.4007 | 6400 | 0.3340 | 0.2489 |
0.2678 | 3.5070 | 6600 | 0.3296 | 0.2497 |
0.2646 | 3.6133 | 6800 | 0.3333 | 0.2495 |
0.2573 | 3.7196 | 7000 | 0.3266 | 0.2445 |
0.2604 | 3.8259 | 7200 | 0.3288 | 0.2400 |
0.2603 | 3.9322 | 7400 | 0.3243 | 0.2476 |
0.2582 | 4.0383 | 7600 | 0.3236 | 0.2468 |
0.2322 | 4.1446 | 7800 | 0.3292 | 0.2602 |
0.2296 | 4.2509 | 8000 | 0.3232 | 0.2397 |
0.2289 | 4.3572 | 8200 | 0.3246 | 0.2485 |
0.2267 | 4.4635 | 8400 | 0.3192 | 0.2382 |
0.2284 | 4.5698 | 8600 | 0.3193 | 0.2395 |
0.2336 | 4.6761 | 8800 | 0.3203 | 0.2385 |
0.2346 | 4.7824 | 9000 | 0.3164 | 0.2385 |
0.2279 | 4.8887 | 9200 | 0.3199 | 0.2365 |
0.2293 | 4.9950 | 9400 | 0.3223 | 0.2377 |
0.1982 | 5.1010 | 9600 | 0.3386 | 0.2415 |
0.198 | 5.2073 | 9800 | 0.3437 | 0.2444 |
0.2092 | 5.3136 | 10000 | 0.3343 | 0.2387 |
0.1964 | 5.4199 | 10200 | 0.3318 | 0.2381 |
0.1957 | 5.5262 | 10400 | 0.3307 | 0.2314 |
0.1951 | 5.6325 | 10600 | 0.3371 | 0.2343 |
0.1969 | 5.7388 | 10800 | 0.3328 | 0.2306 |
0.2021 | 5.8451 | 11000 | 0.3288 | 0.2327 |
0.2036 | 5.9514 | 11200 | 0.3278 | 0.2337 |
0.1828 | 6.0574 | 11400 | 0.3498 | 0.2297 |
0.1759 | 6.1637 | 11600 | 0.3464 | 0.2302 |
0.1702 | 6.2700 | 11800 | 0.3438 | 0.2311 |
0.1735 | 6.3763 | 12000 | 0.3469 | 0.2285 |
0.1689 | 6.4826 | 12200 | 0.3529 | 0.2288 |
0.1735 | 6.5889 | 12400 | 0.3380 | 0.2284 |
0.1687 | 6.6952 | 12600 | 0.3429 | 0.2261 |
0.1671 | 6.8015 | 12800 | 0.3426 | 0.2293 |
0.1752 | 6.9078 | 13000 | 0.3462 | 0.2263 |
0.1677 | 7.0138 | 13200 | 0.3379 | 0.2369 |
0.1486 | 7.1201 | 13400 | 0.3437 | 0.2328 |
0.1489 | 7.2264 | 13600 | 0.3440 | 0.2331 |
0.1484 | 7.3327 | 13800 | 0.3469 | 0.2375 |
0.1474 | 7.4390 | 14000 | 0.3455 | 0.2337 |
0.1463 | 7.5453 | 14200 | 0.3477 | 0.2310 |
0.1516 | 7.6516 | 14400 | 0.3450 | 0.2328 |
0.1489 | 7.7579 | 14600 | 0.3496 | 0.2318 |
0.1501 | 7.8642 | 14800 | 0.3421 | 0.2319 |
0.1478 | 7.9705 | 15000 | 0.3510 | 0.2383 |
0.1297 | 8.0765 | 15200 | 0.3693 | 0.2310 |
0.1288 | 8.1828 | 15400 | 0.3776 | 0.2321 |
0.1289 | 8.2891 | 15600 | 0.3683 | 0.2344 |
0.1384 | 8.3954 | 15800 | 0.3683 | 0.2295 |
0.1304 | 8.5017 | 16000 | 0.3712 | 0.2318 |
0.1328 | 8.6080 | 16200 | 0.3659 | 0.2306 |
0.1296 | 8.7143 | 16400 | 0.3624 | 0.2305 |
0.1292 | 8.8206 | 16600 | 0.3607 | 0.2312 |
0.1302 | 8.9269 | 16800 | 0.3659 | 0.2295 |
0.1266 | 9.0330 | 17000 | 0.3809 | 0.2304 |
0.1202 | 9.1393 | 17200 | 0.3856 | 0.2291 |
0.1191 | 9.2455 | 17400 | 0.3837 | 0.2317 |
0.1149 | 9.3518 | 17600 | 0.3847 | 0.2309 |
0.1113 | 9.4581 | 17800 | 0.3865 | 0.2330 |
0.1193 | 9.5644 | 18000 | 0.3867 | 0.2344 |
0.1162 | 9.6707 | 18200 | 0.3855 | 0.2351 |
0.1181 | 9.7770 | 18400 | 0.3853 | 0.2328 |
0.1186 | 9.8833 | 18600 | 0.3861 | 0.2321 |
0.1166 | 9.9896 | 18800 | 0.3865 | 0.2321 |
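The step and epoch columns above imply an approximate training-set size. This back-of-envelope check uses only figures from the table and the hyperparameter list; the resulting sample count is an estimate, not a documented figure:

```python
# Effective batch: 16 per device * 2 gradient accumulation steps = 32 (as listed above).
total_train_batch_size = 16 * 2

# The log ends at step 18800 at epoch ~9.9896, i.e. ~1882 optimizer steps per epoch.
steps_per_epoch = 18800 / 9.9896

# Rough number of training utterances implied by the logs: on the order of 60k.
approx_num_samples = steps_per_epoch * total_train_batch_size
```

The table also shows validation loss bottoming out around epoch 5 (~0.32) while WER plateaus near 0.23, suggesting the later epochs mostly overfit the training loss.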
### Framework versions
- Transformers 4.49.0.dev0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0