sulaimank committed on
Commit 5c31613 · verified · 1 Parent(s): c77578c

End of training

Files changed (2):
  1. README.md +97 -104
  2. model.safetensors +1 -1
README.md CHANGED
@@ -18,9 +18,9 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [facebook/hubert-large-ls960-ft](https://huggingface.co/facebook/hubert-large-ls960-ft) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.4346
- - Wer: 0.1351
- - Cer: 0.0392
+ - Loss: 0.1921
+ - Wer: 0.0389
+ - Cer: 0.0143
 
  ## Model description
 
@@ -39,7 +39,7 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - learning_rate: 0.0003
+ - learning_rate: 0.0001
  - train_batch_size: 8
  - eval_batch_size: 4
  - seed: 42
@@ -54,106 +54,99 @@ The following hyperparameters were used during training:
 
  | Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
  |:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
- | 0.5539 | 1.0 | 5827 | 0.3337 | 0.3247 | 0.0811 |
- | 0.3272 | 2.0 | 11654 | 0.2824 | 0.2704 | 0.0710 |
- | 0.2783 | 3.0 | 17481 | 0.2829 | 0.2394 | 0.0635 |
- | 0.2486 | 4.0 | 23308 | 0.2650 | 0.2309 | 0.0614 |
- | 0.2275 | 5.0 | 29135 | 0.2858 | 0.2182 | 0.0577 |
- | 0.2101 | 6.0 | 34962 | 0.2475 | 0.2045 | 0.0557 |
- | 0.1975 | 7.0 | 40789 | 0.2692 | 0.2053 | 0.0558 |
- | 0.1843 | 8.0 | 46616 | 0.2529 | 0.1966 | 0.0545 |
- | 0.1742 | 9.0 | 52443 | 0.2611 | 0.1952 | 0.0532 |
- | 0.1666 | 10.0 | 58270 | 0.2995 | 0.1910 | 0.0526 |
- | 0.1575 | 11.0 | 64097 | 0.2493 | 0.1822 | 0.0508 |
- | 0.1518 | 12.0 | 69924 | 0.2473 | 0.1840 | 0.0507 |
- | 0.1447 | 13.0 | 75751 | 0.2587 | 0.1826 | 0.0510 |
- | 0.1394 | 14.0 | 81578 | 0.2553 | 0.1833 | 0.0502 |
- | 0.1336 | 15.0 | 87405 | 0.2556 | 0.1803 | 0.0498 |
- | 0.1283 | 16.0 | 93232 | 0.2699 | 0.1775 | 0.0504 |
- | 0.1228 | 17.0 | 99059 | 0.2552 | 0.1786 | 0.0494 |
- | 0.1181 | 18.0 | 104886 | 0.2715 | 0.1754 | 0.0485 |
- | 0.1132 | 19.0 | 110713 | 0.2689 | 0.1738 | 0.0484 |
- | 0.1102 | 20.0 | 116540 | 0.2677 | 0.1768 | 0.0490 |
- | 0.1058 | 21.0 | 122367 | 0.3001 | 0.1708 | 0.0489 |
- | 0.101 | 22.0 | 128194 | 0.2749 | 0.1708 | 0.0481 |
- | 0.0981 | 23.0 | 134021 | 0.2524 | 0.1715 | 0.0480 |
- | 0.0942 | 24.0 | 139848 | 0.2832 | 0.1739 | 0.0483 |
- | 0.0919 | 25.0 | 145675 | 0.2899 | 0.1708 | 0.0475 |
- | 0.0885 | 26.0 | 151502 | 0.2952 | 0.1654 | 0.0462 |
- | 0.0859 | 27.0 | 157329 | 0.2989 | 0.1651 | 0.0474 |
- | 0.0826 | 28.0 | 163156 | 0.3051 | 0.1693 | 0.0471 |
- | 0.0802 | 29.0 | 168983 | 0.2831 | 0.1652 | 0.0467 |
- | 0.0772 | 30.0 | 174810 | 0.3043 | 0.1664 | 0.0476 |
- | 0.0756 | 31.0 | 180637 | 0.2851 | 0.1674 | 0.0466 |
- | 0.0727 | 32.0 | 186464 | 0.2960 | 0.1664 | 0.0455 |
- | 0.0709 | 33.0 | 192291 | 0.3147 | 0.1637 | 0.0455 |
- | 0.069 | 34.0 | 198118 | 0.2955 | 0.1646 | 0.0465 |
- | 0.0673 | 35.0 | 203945 | 0.2981 | 0.1631 | 0.0457 |
- | 0.0649 | 36.0 | 209772 | 0.3149 | 0.1632 | 0.0463 |
- | 0.0633 | 37.0 | 215599 | 0.3064 | 0.1605 | 0.0459 |
- | 0.0614 | 38.0 | 221426 | 0.3066 | 0.1621 | 0.0461 |
- | 0.0599 | 39.0 | 227253 | 0.3181 | 0.1585 | 0.0443 |
- | 0.0583 | 40.0 | 233080 | 0.3462 | 0.1605 | 0.0458 |
- | 0.056 | 41.0 | 238907 | 0.3290 | 0.1560 | 0.0448 |
- | 0.0546 | 42.0 | 244734 | 0.3308 | 0.1607 | 0.0454 |
- | 0.054 | 43.0 | 250561 | 0.3161 | 0.1588 | 0.0449 |
- | 0.0516 | 44.0 | 256388 | 0.3373 | 0.1596 | 0.0449 |
- | 0.0511 | 45.0 | 262215 | 0.3273 | 0.1539 | 0.0437 |
- | 0.0501 | 46.0 | 268042 | 0.3441 | 0.1579 | 0.0443 |
- | 0.049 | 47.0 | 273869 | 0.3175 | 0.1588 | 0.0440 |
- | 0.0473 | 48.0 | 279696 | 0.3164 | 0.1541 | 0.0434 |
- | 0.0464 | 49.0 | 285523 | 0.3446 | 0.1575 | 0.0449 |
- | 0.0455 | 50.0 | 291350 | 0.3130 | 0.1559 | 0.0440 |
- | 0.0442 | 51.0 | 297177 | 0.3148 | 0.1543 | 0.0435 |
- | 0.0439 | 52.0 | 303004 | 0.3314 | 0.1570 | 0.0441 |
- | 0.0425 | 53.0 | 308831 | 0.3262 | 0.1519 | 0.0428 |
- | 0.0418 | 54.0 | 314658 | 0.3151 | 0.1492 | 0.0422 |
- | 0.0408 | 55.0 | 320485 | 0.3371 | 0.1538 | 0.0423 |
- | 0.0395 | 56.0 | 326312 | 0.3290 | 0.1502 | 0.0419 |
- | 0.0389 | 57.0 | 332139 | 0.3278 | 0.1542 | 0.0432 |
- | 0.0377 | 58.0 | 337966 | 0.3255 | 0.1500 | 0.0421 |
- | 0.0367 | 59.0 | 343793 | 0.3304 | 0.1505 | 0.0423 |
- | 0.0366 | 60.0 | 349620 | 0.3693 | 0.1496 | 0.0430 |
- | 0.0353 | 61.0 | 355447 | 0.3630 | 0.1483 | 0.0424 |
- | 0.034 | 62.0 | 361274 | 0.3398 | 0.1463 | 0.0415 |
- | 0.0341 | 63.0 | 367101 | 0.3421 | 0.1478 | 0.0419 |
- | 0.0328 | 64.0 | 372928 | 0.3500 | 0.1447 | 0.0413 |
- | 0.032 | 65.0 | 378755 | 0.3587 | 0.1455 | 0.0416 |
- | 0.032 | 66.0 | 384582 | 0.3685 | 0.1436 | 0.0417 |
- | 0.0307 | 67.0 | 390409 | 0.3768 | 0.1481 | 0.0427 |
- | 0.0298 | 68.0 | 396236 | 0.3772 | 0.1467 | 0.0420 |
- | 0.0292 | 69.0 | 402063 | 0.3695 | 0.1464 | 0.0413 |
- | 0.0293 | 70.0 | 407890 | 0.4028 | 0.1460 | 0.0426 |
- | 0.0283 | 71.0 | 413717 | 0.3997 | 0.1462 | 0.0422 |
- | 0.0277 | 72.0 | 419544 | 0.4140 | 0.1466 | 0.0423 |
- | 0.0274 | 73.0 | 425371 | 0.3845 | 0.1434 | 0.0409 |
- | 0.0269 | 74.0 | 431198 | 0.3745 | 0.1427 | 0.0413 |
- | 0.0259 | 75.0 | 437025 | 0.4056 | 0.1481 | 0.0415 |
- | 0.025 | 76.0 | 442852 | 0.4140 | 0.1459 | 0.0414 |
- | 0.025 | 77.0 | 448679 | 0.4117 | 0.1454 | 0.0413 |
- | 0.0238 | 78.0 | 454506 | 0.3879 | 0.1436 | 0.0411 |
- | 0.0233 | 79.0 | 460333 | 0.3717 | 0.1410 | 0.0401 |
- | 0.0231 | 80.0 | 466160 | 0.4141 | 0.1421 | 0.0405 |
- | 0.0228 | 81.0 | 471987 | 0.4184 | 0.1414 | 0.0409 |
- | 0.0223 | 82.0 | 477814 | 0.3844 | 0.1414 | 0.0404 |
- | 0.0216 | 83.0 | 483641 | 0.4252 | 0.1403 | 0.0405 |
- | 0.0209 | 84.0 | 489468 | 0.3956 | 0.1429 | 0.0407 |
- | 0.0207 | 85.0 | 495295 | 0.4396 | 0.1421 | 0.0408 |
- | 0.0205 | 86.0 | 501122 | 0.4187 | 0.1395 | 0.0403 |
- | 0.0194 | 87.0 | 506949 | 0.4247 | 0.1411 | 0.0404 |
- | 0.0196 | 88.0 | 512776 | 0.4075 | 0.1389 | 0.0401 |
- | 0.0188 | 89.0 | 518603 | 0.4189 | 0.1375 | 0.0397 |
- | 0.0184 | 90.0 | 524430 | 0.4179 | 0.1357 | 0.0393 |
- | 0.0181 | 91.0 | 530257 | 0.4271 | 0.1360 | 0.0397 |
- | 0.0178 | 92.0 | 536084 | 0.4084 | 0.1370 | 0.0392 |
- | 0.0173 | 93.0 | 541911 | 0.4127 | 0.1354 | 0.0390 |
- | 0.017 | 94.0 | 547738 | 0.4151 | 0.1340 | 0.0388 |
- | 0.0166 | 95.0 | 553565 | 0.4307 | 0.1379 | 0.0394 |
- | 0.0164 | 96.0 | 559392 | 0.4154 | 0.1353 | 0.0392 |
- | 0.0163 | 97.0 | 565219 | 0.4244 | 0.1351 | 0.0390 |
- | 0.0163 | 98.0 | 571046 | 0.4336 | 0.1358 | 0.0393 |
- | 0.0158 | 99.0 | 576873 | 0.4322 | 0.1357 | 0.0393 |
- | 0.0158 | 100.0 | 582700 | 0.4346 | 0.1351 | 0.0392 |
+ | 1.1234 | 1.0 | 1385 | 0.3733 | 0.4333 | 0.0910 |
+ | 0.5164 | 2.0 | 2770 | 0.2676 | 0.2680 | 0.0634 |
+ | 0.4223 | 3.0 | 4155 | 0.2327 | 0.2027 | 0.0508 |
+ | 0.3671 | 4.0 | 5540 | 0.2044 | 0.1743 | 0.0446 |
+ | 0.3242 | 5.0 | 6925 | 0.1881 | 0.1466 | 0.0393 |
+ | 0.292 | 6.0 | 8310 | 0.1792 | 0.1307 | 0.0357 |
+ | 0.2669 | 7.0 | 9695 | 0.1740 | 0.1225 | 0.0341 |
+ | 0.244 | 8.0 | 11080 | 0.1647 | 0.1120 | 0.0321 |
+ | 0.2248 | 9.0 | 12465 | 0.1678 | 0.1033 | 0.0305 |
+ | 0.2111 | 10.0 | 13850 | 0.1653 | 0.0974 | 0.0291 |
+ | 0.1958 | 11.0 | 15235 | 0.1624 | 0.0910 | 0.0275 |
+ | 0.1852 | 12.0 | 16620 | 0.1482 | 0.0884 | 0.0266 |
+ | 0.1718 | 13.0 | 18005 | 0.1580 | 0.0859 | 0.0261 |
+ | 0.164 | 14.0 | 19390 | 0.1537 | 0.0802 | 0.0246 |
+ | 0.1531 | 15.0 | 20775 | 0.1525 | 0.0789 | 0.0246 |
+ | 0.1456 | 16.0 | 22160 | 0.1476 | 0.0761 | 0.0236 |
+ | 0.1376 | 17.0 | 23545 | 0.1513 | 0.0730 | 0.0232 |
+ | 0.1329 | 18.0 | 24930 | 0.1508 | 0.0732 | 0.0231 |
+ | 0.1267 | 19.0 | 26315 | 0.1580 | 0.0719 | 0.0222 |
+ | 0.123 | 20.0 | 27700 | 0.1538 | 0.0670 | 0.0214 |
+ | 0.1158 | 21.0 | 29085 | 0.1625 | 0.0677 | 0.0218 |
+ | 0.1111 | 22.0 | 30470 | 0.1451 | 0.0626 | 0.0205 |
+ | 0.1049 | 23.0 | 31855 | 0.1652 | 0.0635 | 0.0210 |
+ | 0.1023 | 24.0 | 33240 | 0.1562 | 0.0650 | 0.0209 |
+ | 0.0982 | 25.0 | 34625 | 0.1541 | 0.0626 | 0.0203 |
+ | 0.0954 | 26.0 | 36010 | 0.1545 | 0.0618 | 0.0202 |
+ | 0.0898 | 27.0 | 37395 | 0.1666 | 0.0598 | 0.0199 |
+ | 0.0881 | 28.0 | 38780 | 0.1656 | 0.0575 | 0.0196 |
+ | 0.0857 | 29.0 | 40165 | 0.1611 | 0.0590 | 0.0195 |
+ | 0.0815 | 30.0 | 41550 | 0.1595 | 0.0584 | 0.0193 |
+ | 0.0798 | 31.0 | 42935 | 0.1592 | 0.0576 | 0.0193 |
+ | 0.0784 | 32.0 | 44320 | 0.1586 | 0.0568 | 0.0187 |
+ | 0.0742 | 33.0 | 45705 | 0.1622 | 0.0568 | 0.0187 |
+ | 0.0736 | 34.0 | 47090 | 0.1705 | 0.0554 | 0.0187 |
+ | 0.0721 | 35.0 | 48475 | 0.1570 | 0.0530 | 0.0178 |
+ | 0.0686 | 36.0 | 49860 | 0.1658 | 0.0543 | 0.0179 |
+ | 0.0657 | 37.0 | 51245 | 0.1615 | 0.0526 | 0.0179 |
+ | 0.0647 | 38.0 | 52630 | 0.1646 | 0.0519 | 0.0178 |
+ | 0.0637 | 39.0 | 54015 | 0.1635 | 0.0515 | 0.0179 |
+ | 0.0614 | 40.0 | 55400 | 0.1716 | 0.0521 | 0.0175 |
+ | 0.0601 | 41.0 | 56785 | 0.1701 | 0.0504 | 0.0173 |
+ | 0.0596 | 42.0 | 58170 | 0.1598 | 0.0514 | 0.0174 |
+ | 0.0574 | 43.0 | 59555 | 0.1678 | 0.0506 | 0.0176 |
+ | 0.0564 | 44.0 | 60940 | 0.1679 | 0.0486 | 0.0170 |
+ | 0.0534 | 45.0 | 62325 | 0.1760 | 0.0490 | 0.0170 |
+ | 0.0536 | 46.0 | 63710 | 0.1722 | 0.0494 | 0.0170 |
+ | 0.0516 | 47.0 | 65095 | 0.1635 | 0.0486 | 0.0166 |
+ | 0.0504 | 48.0 | 66480 | 0.1652 | 0.0489 | 0.0169 |
+ | 0.0493 | 49.0 | 67865 | 0.1757 | 0.0480 | 0.0169 |
+ | 0.0491 | 50.0 | 69250 | 0.1734 | 0.0481 | 0.0167 |
+ | 0.0482 | 51.0 | 70635 | 0.1750 | 0.0479 | 0.0166 |
+ | 0.0465 | 52.0 | 72020 | 0.1762 | 0.0481 | 0.0166 |
+ | 0.0452 | 53.0 | 73405 | 0.1695 | 0.0461 | 0.0160 |
+ | 0.0456 | 54.0 | 74790 | 0.1732 | 0.0464 | 0.0160 |
+ | 0.0441 | 55.0 | 76175 | 0.1738 | 0.0455 | 0.0161 |
+ | 0.0438 | 56.0 | 77560 | 0.1771 | 0.0457 | 0.0161 |
+ | 0.0421 | 57.0 | 78945 | 0.1794 | 0.0452 | 0.0160 |
+ | 0.0416 | 58.0 | 80330 | 0.1673 | 0.0440 | 0.0157 |
+ | 0.0401 | 59.0 | 81715 | 0.1871 | 0.0448 | 0.0160 |
+ | 0.0407 | 60.0 | 83100 | 0.1705 | 0.0448 | 0.0156 |
+ | 0.0404 | 61.0 | 84485 | 0.1786 | 0.0446 | 0.0157 |
+ | 0.0379 | 62.0 | 85870 | 0.1760 | 0.0435 | 0.0155 |
+ | 0.0376 | 63.0 | 87255 | 0.1815 | 0.0445 | 0.0156 |
+ | 0.0358 | 64.0 | 88640 | 0.1808 | 0.0444 | 0.0158 |
+ | 0.0361 | 65.0 | 90025 | 0.1775 | 0.0433 | 0.0154 |
+ | 0.0347 | 66.0 | 91410 | 0.1740 | 0.0438 | 0.0155 |
+ | 0.0346 | 67.0 | 92795 | 0.1808 | 0.0437 | 0.0155 |
+ | 0.0343 | 68.0 | 94180 | 0.1774 | 0.0418 | 0.0153 |
+ | 0.0332 | 69.0 | 95565 | 0.1786 | 0.0408 | 0.0152 |
+ | 0.0324 | 70.0 | 96950 | 0.1846 | 0.0428 | 0.0155 |
+ | 0.0322 | 71.0 | 98335 | 0.1801 | 0.0422 | 0.0154 |
+ | 0.0331 | 72.0 | 99720 | 0.1740 | 0.0408 | 0.0147 |
+ | 0.0311 | 73.0 | 101105 | 0.1830 | 0.0418 | 0.0152 |
+ | 0.0299 | 74.0 | 102490 | 0.1874 | 0.0417 | 0.0153 |
+ | 0.0305 | 75.0 | 103875 | 0.1816 | 0.0411 | 0.0150 |
+ | 0.0301 | 76.0 | 105260 | 0.1799 | 0.0398 | 0.0146 |
+ | 0.029 | 77.0 | 106645 | 0.1890 | 0.0408 | 0.0149 |
+ | 0.0285 | 78.0 | 108030 | 0.1810 | 0.0385 | 0.0146 |
+ | 0.0286 | 79.0 | 109415 | 0.1874 | 0.0395 | 0.0147 |
+ | 0.0279 | 80.0 | 110800 | 0.1868 | 0.0399 | 0.0148 |
+ | 0.0274 | 81.0 | 112185 | 0.1852 | 0.0398 | 0.0147 |
+ | 0.0265 | 82.0 | 113570 | 0.1890 | 0.0408 | 0.0148 |
+ | 0.0267 | 83.0 | 114955 | 0.1908 | 0.0402 | 0.0148 |
+ | 0.0258 | 84.0 | 116340 | 0.1834 | 0.0396 | 0.0146 |
+ | 0.0268 | 85.0 | 117725 | 0.1945 | 0.0395 | 0.0146 |
+ | 0.0247 | 86.0 | 119110 | 0.1893 | 0.0397 | 0.0145 |
+ | 0.0249 | 87.0 | 120495 | 0.1904 | 0.0397 | 0.0145 |
+ | 0.0254 | 88.0 | 121880 | 0.1880 | 0.0403 | 0.0147 |
+ | 0.0248 | 89.0 | 123265 | 0.1860 | 0.0393 | 0.0146 |
+ | 0.0241 | 90.0 | 124650 | 0.1936 | 0.0389 | 0.0146 |
+ | 0.0232 | 91.0 | 126035 | 0.1922 | 0.0393 | 0.0144 |
+ | 0.0235 | 92.0 | 127420 | 0.1854 | 0.0390 | 0.0143 |
+ | 0.0227 | 93.0 | 128805 | 0.1921 | 0.0389 | 0.0143 |
 
 
  ### Framework versions
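The Wer and Cer columns in the table above are word error rate and character error rate: edit distance between the model transcript and the reference, normalized by reference length at the word and character level respectively. A minimal pure-Python sketch of that computation, not the exact implementation the training script used (that was most likely the `evaluate`/`jiwer` packages):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (dynamic programming)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    return edit_distance(ref, hyp) / len(ref)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edit distance / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)

print(wer("the cat sat", "the cat sat"))  # 0.0
```

So the final eval Wer of 0.0389 means roughly 3.9 word edits per 100 reference words.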
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d5f28de01a1edd452b8fa292b825262ed0f556f6118a40247f0e160ca64b652e
+ oid sha256:3a3eed6945f1b37f457d4673e4d6e9ed3bbd7cdd9f0b7edeed94eed205b0f52d
  size 1261966548
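Fine-tunes of hubert-large-ls960-ft are CTC models, so the transcripts scored above come from greedy CTC decoding: take the argmax token per audio frame, collapse consecutive repeats, and drop the blank token. A toy sketch of that step; the vocabulary and blank id here are illustrative assumptions, as the real ones come from the model's processor:

```python
BLANK = 0  # CTC blank id (assumption; the usual wav2vec2/HuBERT convention)

def ctc_greedy_decode(frame_ids, id_to_char):
    """Collapse repeated ids, then remove blanks, yielding a transcript."""
    out = []
    prev = None
    for i in frame_ids:
        if i != prev and i != BLANK:
            out.append(id_to_char[i])
        prev = i
    return "".join(out)

# Toy vocabulary; frame-level argmax ids h h <blank> i decode to "hi".
vocab = {1: "h", 2: "i", 3: " "}
print(ctc_greedy_decode([1, 1, 0, 2], vocab))  # hi
```

Note that a blank between two identical ids keeps them as separate characters, which is how CTC represents doubled letters.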