# w2v2_ablation_with_ling_head-drop0.1-not-load-best-wer-best_on_tp0.025_tl10_fp0.001_fl16
This model is a fine-tuned version of [nguyenvulebinh/wav2vec2-base-vietnamese-250h](https://huggingface.co/nguyenvulebinh/wav2vec2-base-vietnamese-250h) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4141
- Wer: 0.0914
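
For quick experimentation, the snippet below is a minimal inference sketch. It assumes the checkpoint loads with the standard `Wav2Vec2Processor`/`Wav2Vec2ForCTC` classes; the "ling head" in the model name suggests a customized architecture, so the authors' original code may be required instead. The audio path is a placeholder.

```python
# Minimal inference sketch (assumption: the checkpoint is compatible with the
# standard Wav2Vec2 CTC classes; the custom "ling head" may need extra code).
import torch
import torchaudio
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "tuanio/w2v2_ablation_with_ling_head-drop0.1-not-load-best-wer-best_on_tp0.025_tl10_fp0.001_fl16"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate wav2vec2 expects.
waveform, sample_rate = torchaudio.load("example.wav")  # hypothetical input file
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame, then collapse.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```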
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
- mixed_precision_training: Native AMP
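
These settings map directly onto `transformers.TrainingArguments`. The sketch below is one plausible reconstruction, not the authors' actual script; the output directory is a placeholder, and the per-device batch sizes of 8/16 combine with the 4 GPUs to give the listed totals of 32/64.

```python
# Sketch of TrainingArguments matching the listed hyperparameters (assumption:
# the run used the standard Trainer; output_dir is illustrative only).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v2_ablation",        # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,     # x4 GPUs = total train batch size 32
    per_device_eval_batch_size=16,     # x4 GPUs = total eval batch size 64
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,                         # "Native AMP" mixed precision
)
```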
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
119.415 | 0.94 | 100 | 91.5112 | 18.6364 |
74.7916 | 1.89 | 200 | 12.2928 | 0.9951 |
6.9068 | 2.83 | 300 | 5.2345 | 1.0 |
5.1207 | 3.77 | 400 | 5.0365 | 1.0 |
4.7306 | 4.72 | 500 | 4.9152 | 1.0 |
4.4974 | 5.66 | 600 | 4.9315 | 1.0 |
4.3923 | 6.6 | 700 | 4.7918 | 1.0 |
4.3447 | 7.55 | 800 | 4.6447 | 1.0 |
4.225 | 8.49 | 900 | 4.6061 | 1.0 |
3.9805 | 9.43 | 1000 | 3.6422 | 0.8733 |
2.8303 | 10.38 | 1100 | 1.7824 | 0.3489 |
1.5807 | 11.32 | 1200 | 1.0908 | 0.2162 |
1.1284 | 12.26 | 1300 | 0.8473 | 0.1640 |
0.8703 | 13.21 | 1400 | 0.7322 | 0.1423 |
0.7576 | 14.15 | 1500 | 0.6551 | 0.1325 |
0.6256 | 15.09 | 1600 | 0.6027 | 0.1387 |
0.594 | 16.04 | 1700 | 0.5550 | 0.1300 |
0.5492 | 16.98 | 1800 | 0.5200 | 0.1159 |
0.476 | 17.92 | 1900 | 0.5012 | 0.1091 |
0.4822 | 18.87 | 2000 | 0.5112 | 0.1074 |
0.4351 | 19.81 | 2100 | 0.4985 | 0.1179 |
0.4169 | 20.75 | 2200 | 0.4712 | 0.1061 |
0.3957 | 21.7 | 2300 | 0.4613 | 0.0988 |
0.3885 | 22.64 | 2400 | 0.4610 | 0.1025 |
0.3827 | 23.58 | 2500 | 0.4509 | 0.0978 |
0.3468 | 24.53 | 2600 | 0.4549 | 0.0951 |
0.3451 | 25.47 | 2700 | 0.4556 | 0.1019 |
0.3234 | 26.42 | 2800 | 0.4554 | 0.1104 |
0.31 | 27.36 | 2900 | 0.4568 | 0.0988 |
0.3026 | 28.3 | 3000 | 0.4211 | 0.0965 |
0.2905 | 29.25 | 3100 | 0.4305 | 0.0911 |
0.2964 | 30.19 | 3200 | 0.4379 | 0.0990 |
0.302 | 31.13 | 3300 | 0.4379 | 0.0943 |
0.2576 | 32.08 | 3400 | 0.4293 | 0.0933 |
0.2771 | 33.02 | 3500 | 0.4239 | 0.0928 |
0.268 | 33.96 | 3600 | 0.4228 | 0.0894 |
0.2458 | 34.91 | 3700 | 0.4288 | 0.0899 |
0.2553 | 35.85 | 3800 | 0.4312 | 0.0966 |
0.2424 | 36.79 | 3900 | 0.4162 | 0.0917 |
0.2501 | 37.74 | 4000 | 0.4088 | 0.0840 |
0.2498 | 38.68 | 4100 | 0.4144 | 0.0921 |
0.2273 | 39.62 | 4200 | 0.4154 | 0.0863 |
0.23 | 40.57 | 4300 | 0.4157 | 0.0868 |
0.2409 | 41.51 | 4400 | 0.4033 | 0.0826 |
0.248 | 42.45 | 4500 | 0.4122 | 0.0847 |
0.218 | 43.4 | 4600 | 0.4052 | 0.0848 |
0.1979 | 44.34 | 4700 | 0.4063 | 0.0887 |
0.2091 | 45.28 | 4800 | 0.4078 | 0.0823 |
0.2097 | 46.23 | 4900 | 0.4177 | 0.0893 |
0.2017 | 47.17 | 5000 | 0.4295 | 0.0887 |
0.1899 | 48.11 | 5100 | 0.4177 | 0.0919 |
0.195 | 49.06 | 5200 | 0.4109 | 0.0880 |
0.179 | 50.0 | 5300 | 0.4089 | 0.0879 |
0.1773 | 50.94 | 5400 | 0.4071 | 0.0843 |
0.1889 | 51.89 | 5500 | 0.4072 | 0.0885 |
0.1987 | 52.83 | 5600 | 0.4033 | 0.0873 |
0.1979 | 53.77 | 5700 | 0.4033 | 0.0928 |
0.1777 | 54.72 | 5800 | 0.4077 | 0.0898 |
0.1742 | 55.66 | 5900 | 0.3969 | 0.0838 |
0.1678 | 56.6 | 6000 | 0.3997 | 0.0806 |
0.1726 | 57.55 | 6100 | 0.3978 | 0.0885 |
0.1602 | 58.49 | 6200 | 0.3967 | 0.0860 |
0.1681 | 59.43 | 6300 | 0.4039 | 0.0901 |
0.1594 | 60.38 | 6400 | 0.3992 | 0.0856 |
0.171 | 61.32 | 6500 | 0.4058 | 0.0890 |
0.1691 | 62.26 | 6600 | 0.4078 | 0.0842 |
0.1724 | 63.21 | 6700 | 0.4161 | 0.0903 |
0.172 | 64.15 | 6800 | 0.4121 | 0.0899 |
0.1717 | 65.09 | 6900 | 0.4111 | 0.0878 |
0.1775 | 66.04 | 7000 | 0.4109 | 0.0926 |
0.1607 | 66.98 | 7100 | 0.4080 | 0.0908 |
0.1606 | 67.92 | 7200 | 0.4070 | 0.0930 |
0.1801 | 68.87 | 7300 | 0.4096 | 0.0908 |
0.16 | 69.81 | 7400 | 0.4030 | 0.0933 |
0.1433 | 70.75 | 7500 | 0.4059 | 0.0920 |
0.1473 | 71.7 | 7600 | 0.4120 | 0.0979 |
0.1396 | 72.64 | 7700 | 0.4062 | 0.0922 |
0.1429 | 73.58 | 7800 | 0.4079 | 0.0899 |
0.1332 | 74.53 | 7900 | 0.4055 | 0.0851 |
0.1429 | 75.47 | 8000 | 0.4081 | 0.0922 |
0.1528 | 76.42 | 8100 | 0.4083 | 0.0853 |
0.1547 | 77.36 | 8200 | 0.4139 | 0.0945 |
0.1384 | 78.3 | 8300 | 0.4111 | 0.0933 |
0.1696 | 79.25 | 8400 | 0.4132 | 0.0943 |
0.1483 | 80.19 | 8500 | 0.4139 | 0.0906 |
0.1547 | 81.13 | 8600 | 0.4156 | 0.0959 |
0.149 | 82.08 | 8700 | 0.4119 | 0.0905 |
0.1294 | 83.02 | 8800 | 0.4145 | 0.0945 |
0.1383 | 83.96 | 8900 | 0.4151 | 0.0917 |
0.1356 | 84.91 | 9000 | 0.4165 | 0.0952 |
0.1491 | 85.85 | 9100 | 0.4188 | 0.0950 |
0.1395 | 86.79 | 9200 | 0.4174 | 0.0950 |
0.1439 | 87.74 | 9300 | 0.4151 | 0.0919 |
0.1421 | 88.68 | 9400 | 0.4152 | 0.0931 |
0.1443 | 89.62 | 9500 | 0.4160 | 0.0944 |
0.1429 | 90.57 | 9600 | 0.4138 | 0.0928 |
0.1397 | 91.51 | 9700 | 0.4149 | 0.0918 |
0.155 | 92.45 | 9800 | 0.4144 | 0.0915 |
0.1406 | 93.4 | 9900 | 0.4139 | 0.0921 |
0.1328 | 94.34 | 10000 | 0.4140 | 0.0929 |
0.1461 | 95.28 | 10100 | 0.4142 | 0.0914 |
0.1455 | 96.23 | 10200 | 0.4142 | 0.0913 |
0.155 | 97.17 | 10300 | 0.4139 | 0.0914 |
0.147 | 98.11 | 10400 | 0.4140 | 0.0918 |
0.1298 | 99.06 | 10500 | 0.4140 | 0.0917 |
0.1508 | 100.0 | 10600 | 0.4141 | 0.0914 |
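
WER is the word error rate (substitutions + deletions + insertions divided by the number of reference words), so lower is better; values above 1.0 in the earliest epochs likely reflect heavy insertion errors before the CTC head converges. A minimal sketch of reproducing the metric with the `evaluate` library, with placeholder transcripts:

```python
# Minimal WER sketch using the `evaluate` library (pip install evaluate jiwer).
# The transcripts below are placeholders, not data from this model's eval set.
import evaluate

wer_metric = evaluate.load("wer")
references = ["xin chào việt nam"]   # hypothetical ground-truth transcript
predictions = ["xin chào nam"]       # hypothetical model output (one deletion)
print(wer_metric.compute(references=references, predictions=predictions))  # 0.25
```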
### Framework versions
- Transformers 4.35.2
- Pytorch 1.13.1+cu117
- Datasets 2.12.0
- Tokenizers 0.14.1