End of training

README.md
---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xls-r-300m-CV-Fleurs-lg-10hrs-v6
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# wav2vec2-xls-r-300m-CV-Fleurs-lg-10hrs-v6

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3042
- Wer: 0.6058
- Cer: 0.1381

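For readers unfamiliar with the reported metrics: WER (word error rate) and CER (character error rate) are both Levenshtein edit distances normalized by the reference length, at the word and character level respectively. A minimal self-contained sketch of the computation (an illustration, not the evaluation code actually used for this model):

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over sequences.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (r != h)))   # substitution
        prev = cur
    return prev[-1]

def wer(reference, hypothesis):
    # Word error rate: word-level edits divided by reference word count.
    ref_words, hyp_words = reference.split(), hypothesis.split()
    return edit_distance(ref_words, hyp_words) / len(ref_words)

def cer(reference, hypothesis):
    # Character error rate: character-level edits divided by reference length.
    return edit_distance(reference, hypothesis) / len(reference)
```

So a WER of 0.6058 means roughly 61 word-level edits per 100 reference words; the much lower CER (0.1381) indicates most word errors differ from the reference by only a few characters.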
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP

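The linear scheduler decays the learning rate from its peak to zero over the total number of training steps (here 100 epochs × 1025 steps/epoch = 102500 steps). A small sketch of the schedule; the zero-warmup default is an assumption, since no warmup steps are listed above:

```python
def linear_lr(step, peak_lr=3e-4, total_steps=102_500, warmup_steps=0):
    """Linear schedule: optional linear warmup to peak_lr, then linear decay to 0."""
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return peak_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, halfway through training (step 51250) the learning rate has decayed to half its peak value.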
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
| 3.1664 | 1.0 | 1025 | 2.5342 | 1.0 | 0.8840 |
| 1.8493 | 2.0 | 2050 | 1.2129 | 0.9929 | 0.3862 |
| 1.2976 | 3.0 | 3075 | 0.9944 | 0.9482 | 0.2930 |
| 1.1249 | 4.0 | 4100 | 0.9405 | 0.9249 | 0.2753 |
| 0.9905 | 5.0 | 5125 | 0.8406 | 0.9072 | 0.2535 |
| 0.8961 | 6.0 | 6150 | 0.8121 | 0.8699 | 0.2368 |
| 0.8101 | 7.0 | 7175 | 0.7705 | 0.8607 | 0.2273 |
| 0.728 | 8.0 | 8200 | 0.7325 | 0.8330 | 0.2163 |
| 0.6653 | 9.0 | 9225 | 0.7483 | 0.8089 | 0.2060 |
| 0.6063 | 10.0 | 10250 | 0.7343 | 0.7981 | 0.2018 |
| 0.5503 | 11.0 | 11275 | 0.7499 | 0.7667 | 0.1887 |
| 0.5016 | 12.0 | 12300 | 0.7474 | 0.7734 | 0.1917 |
| 0.4574 | 13.0 | 13325 | 0.7665 | 0.7479 | 0.1858 |
| 0.4227 | 14.0 | 14350 | 0.8014 | 0.7605 | 0.1866 |
| 0.3898 | 15.0 | 15375 | 0.8140 | 0.7618 | 0.1817 |
| 0.3664 | 16.0 | 16400 | 0.7794 | 0.7234 | 0.1758 |
| 0.3389 | 17.0 | 17425 | 0.8175 | 0.7290 | 0.1762 |
| 0.3155 | 18.0 | 18450 | 0.8647 | 0.7284 | 0.1799 |
| 0.2991 | 19.0 | 19475 | 0.8268 | 0.7134 | 0.1731 |
| 0.2825 | 20.0 | 20500 | 0.9408 | 0.7312 | 0.1756 |
| 0.2665 | 21.0 | 21525 | 0.9131 | 0.7307 | 0.1715 |
| 0.253 | 22.0 | 22550 | 0.9645 | 0.7242 | 0.1747 |
| 0.2354 | 23.0 | 23575 | 0.9436 | 0.7125 | 0.1699 |
| 0.231 | 24.0 | 24600 | 0.9521 | 0.7239 | 0.1702 |
| 0.2178 | 25.0 | 25625 | 0.9751 | 0.7076 | 0.1694 |
| 0.2086 | 26.0 | 26650 | 0.9704 | 0.6945 | 0.1689 |
| 0.2002 | 27.0 | 27675 | 0.9937 | 0.7077 | 0.1682 |
| 0.1968 | 28.0 | 28700 | 0.9523 | 0.6959 | 0.1682 |
| 0.1889 | 29.0 | 29725 | 1.0351 | 0.6908 | 0.1653 |
| 0.182 | 30.0 | 30750 | 1.0054 | 0.6933 | 0.1644 |
| 0.1723 | 31.0 | 31775 | 1.0039 | 0.6930 | 0.1646 |
| 0.1695 | 32.0 | 32800 | 1.0005 | 0.6855 | 0.1632 |
| 0.1633 | 33.0 | 33825 | 1.0273 | 0.6897 | 0.1633 |
| 0.1571 | 34.0 | 34850 | 1.0361 | 0.6850 | 0.1615 |
| 0.1573 | 35.0 | 35875 | 1.0092 | 0.6767 | 0.1604 |
| 0.1511 | 36.0 | 36900 | 1.0353 | 0.6816 | 0.1622 |
| 0.1469 | 37.0 | 37925 | 1.0394 | 0.6716 | 0.1618 |
| 0.1495 | 38.0 | 38950 | 1.1006 | 0.6804 | 0.1621 |
| 0.1411 | 39.0 | 39975 | 1.1300 | 0.6742 | 0.1603 |
| 0.1391 | 40.0 | 41000 | 1.0378 | 0.6801 | 0.1591 |
| 0.138 | 41.0 | 42025 | 1.0655 | 0.6679 | 0.1581 |
| 0.1304 | 42.0 | 43050 | 1.1279 | 0.6777 | 0.1594 |
| 0.1308 | 43.0 | 44075 | 1.0743 | 0.6786 | 0.1572 |
| 0.128 | 44.0 | 45100 | 1.1424 | 0.6683 | 0.1569 |
| 0.1261 | 45.0 | 46125 | 1.0351 | 0.6787 | 0.1596 |
| 0.1242 | 46.0 | 47150 | 1.1587 | 0.6656 | 0.1556 |
| 0.1179 | 47.0 | 48175 | 1.1617 | 0.6538 | 0.1555 |
| 0.1164 | 48.0 | 49200 | 1.1593 | 0.6604 | 0.1567 |
| 0.1137 | 49.0 | 50225 | 1.1450 | 0.6622 | 0.1554 |
| 0.1102 | 50.0 | 51250 | 1.1221 | 0.6593 | 0.1555 |
| 0.1107 | 51.0 | 52275 | 1.1194 | 0.6618 | 0.1538 |
| 0.1088 | 52.0 | 53300 | 1.1452 | 0.6503 | 0.1527 |
| 0.1078 | 53.0 | 54325 | 1.1679 | 0.6529 | 0.1529 |
| 0.1054 | 54.0 | 55350 | 1.1926 | 0.6437 | 0.1503 |
| 0.1012 | 55.0 | 56375 | 1.1483 | 0.6568 | 0.1531 |
| 0.1047 | 56.0 | 57400 | 1.1756 | 0.6544 | 0.1528 |
| 0.0958 | 57.0 | 58425 | 1.2168 | 0.6531 | 0.1512 |
| 0.0966 | 58.0 | 59450 | 1.1973 | 0.6383 | 0.1493 |
| 0.0966 | 59.0 | 60475 | 1.1830 | 0.6493 | 0.1511 |
| 0.0948 | 60.0 | 61500 | 1.2027 | 0.6438 | 0.1509 |
| 0.0888 | 61.0 | 62525 | 1.1959 | 0.6413 | 0.1498 |
| 0.0909 | 62.0 | 63550 | 1.2046 | 0.6507 | 0.1512 |
| 0.0888 | 63.0 | 64575 | 1.2052 | 0.6347 | 0.1487 |
| 0.0869 | 64.0 | 65600 | 1.2118 | 0.6324 | 0.1482 |
| 0.0846 | 65.0 | 66625 | 1.1967 | 0.6365 | 0.1480 |
| 0.0833 | 66.0 | 67650 | 1.1957 | 0.6323 | 0.1460 |
| 0.0827 | 67.0 | 68675 | 1.1928 | 0.6370 | 0.1470 |
| 0.0824 | 68.0 | 69700 | 1.2578 | 0.6416 | 0.1472 |
| 0.08 | 69.0 | 70725 | 1.2427 | 0.6284 | 0.1447 |
| 0.0787 | 70.0 | 71750 | 1.2061 | 0.6295 | 0.1462 |
| 0.0777 | 71.0 | 72775 | 1.2185 | 0.6315 | 0.1454 |
| 0.0736 | 72.0 | 73800 | 1.2454 | 0.6237 | 0.1445 |
| 0.0746 | 73.0 | 74825 | 1.2629 | 0.6298 | 0.1464 |
| 0.0735 | 74.0 | 75850 | 1.2398 | 0.6218 | 0.1428 |
| 0.0724 | 75.0 | 76875 | 1.2727 | 0.6269 | 0.1440 |
| 0.0698 | 76.0 | 77900 | 1.2327 | 0.6259 | 0.1439 |
| 0.0677 | 77.0 | 78925 | 1.2338 | 0.6213 | 0.1442 |
| 0.0699 | 78.0 | 79950 | 1.2755 | 0.6226 | 0.1442 |
| 0.0656 | 79.0 | 80975 | 1.2734 | 0.6237 | 0.1431 |
| 0.0622 | 80.0 | 82000 | 1.2733 | 0.6211 | 0.1427 |
| 0.0648 | 81.0 | 83025 | 1.2345 | 0.6274 | 0.1421 |
| 0.0626 | 82.0 | 84050 | 1.2670 | 0.6273 | 0.1430 |
| 0.0632 | 83.0 | 85075 | 1.2634 | 0.6150 | 0.1422 |
| 0.0611 | 84.0 | 86100 | 1.3266 | 0.6192 | 0.1418 |
| 0.0608 | 85.0 | 87125 | 1.2889 | 0.6153 | 0.1414 |
| 0.0581 | 86.0 | 88150 | 1.2808 | 0.6146 | 0.1406 |
| 0.0586 | 87.0 | 89175 | 1.3120 | 0.6142 | 0.1406 |
| 0.0575 | 88.0 | 90200 | 1.2701 | 0.6171 | 0.1409 |
| 0.0577 | 89.0 | 91225 | 1.2916 | 0.6116 | 0.1400 |
| 0.0569 | 90.0 | 92250 | 1.3074 | 0.6132 | 0.1401 |
| 0.0552 | 91.0 | 93275 | 1.3030 | 0.6115 | 0.1388 |
| 0.0563 | 92.0 | 94300 | 1.2719 | 0.6082 | 0.1387 |
| 0.0516 | 93.0 | 95325 | 1.2853 | 0.6078 | 0.1380 |
| 0.0523 | 94.0 | 96350 | 1.2953 | 0.6096 | 0.1389 |
| 0.0489 | 95.0 | 97375 | 1.3099 | 0.6097 | 0.1387 |
| 0.0513 | 96.0 | 98400 | 1.3082 | 0.6095 | 0.1388 |
| 0.0522 | 97.0 | 99425 | 1.3076 | 0.6097 | 0.1384 |
| 0.0498 | 98.0 | 100450 | 1.3003 | 0.6073 | 0.1383 |
| 0.0506 | 99.0 | 101475 | 1.3012 | 0.6067 | 0.1382 |
| 0.0491 | 100.0 | 102500 | 1.3042 | 0.6058 | 0.1381 |

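The final epoch happens to give the lowest validation WER, which is why the headline metrics match the last row. Picking the best checkpoint from (epoch, WER) pairs is a one-liner; the rows below are sampled from the end of the table (running `min` over all 100 rows works the same way):

```python
# (epoch, validation WER) pairs copied from the last rows of the results table.
rows = [(96, 0.6095), (97, 0.6097), (98, 0.6073), (99, 0.6067), (100, 0.6058)]

# Select the epoch with the lowest word error rate.
best_epoch, best_wer = min(rows, key=lambda r: r[1])
```

Note that validation loss bottoms out much earlier (around 0.73 near epoch 8) while WER and CER keep improving, so loss alone would pick a worse checkpoint for transcription quality.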
### Framework versions

- Transformers 4.46.1
- Pytorch 2.1.0+cu118
- Datasets 3.1.0
- Tokenizers 0.20.1