End of training
Files changed:
- README.md (+175 -195)
- model.safetensors (+1 -1)
README.md
CHANGED
@@ -1,199 +1,179 @@
Removed: the previous README was the untouched auto-generated model-card template: front matter containing only `library_name: transformers`, followed by unfilled template sections ending in "[More Information Needed]" placeholders for Compute Infrastructure, Hardware, Software, Citation, Glossary, More Information, Model Card Authors, and Model Card Contact.

Added:
---
library_name: transformers
license: apache-2.0
base_model: facebook/hubert-large-ls960-ft
tags:
- generated_from_trainer
datasets:
- common_voice_17_0
metrics:
- wer
model-index:
- name: hubert-large-ls960-ft-lg-CV-v1
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: common_voice_17_0
      type: common_voice_17_0
      config: lg
      split: None
      args: lg
    metrics:
    - name: Wer
      type: wer
      value: 0.20413735167489236
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# hubert-large-ls960-ft-lg-CV-v1

This model is a fine-tuned version of [facebook/hubert-large-ls960-ft](https://huggingface.co/facebook/hubert-large-ls960-ft) on the common_voice_17_0 dataset (Luganda, `lg` configuration).
It achieves the following results on the evaluation set:
- Loss: 0.6251
- Wer: 0.2041
- Cer: 0.0609
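
The card does not yet include a usage example. As a minimal inference sketch, assuming a placeholder repository id and 16 kHz mono audio input (HuBERT's pretraining sampling rate):

```python
# Minimal inference sketch; the repo id below is a placeholder, not the actual Hub path.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="username/hubert-large-ls960-ft-lg-CV-v1",  # placeholder repo id
)

# The ASR pipeline decodes the CTC output of the fine-tuned HuBERT model;
# audio files are resampled to the model's 16 kHz sampling rate.
result = asr("path/to/luganda_clip.wav")
print(result["text"])
```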

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
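
The list above corresponds roughly to a `transformers.TrainingArguments` configuration like the sketch below. The `output_dir`, the per-epoch evaluation strategy, and using `fp16` for Native AMP are assumptions; the actual training script is not part of this card.

```python
# Illustrative sketch of TrainingArguments mirroring the hyperparameters listed above.
# output_dir, eval_strategy, and fp16 are assumptions; the real training script is not published here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-large-ls960-ft-lg-CV-v1",
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,  # 8 x 2 = total train batch size of 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed-precision training
    eval_strategy="epoch",          # assumed, matching the per-epoch results table
)
```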

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
| 0.5741 | 1.0 | 4442 | 0.4573 | 0.4271 | 0.1144 |
| 0.3189 | 2.0 | 8884 | 0.3821 | 0.3444 | 0.0932 |
| 0.2692 | 3.0 | 13326 | 0.3881 | 0.3310 | 0.0912 |
| 0.2405 | 4.0 | 17768 | 0.3453 | 0.3136 | 0.0854 |
| 0.2191 | 5.0 | 22210 | 0.3476 | 0.2931 | 0.0823 |
| 0.2039 | 6.0 | 26652 | 0.3841 | 0.2880 | 0.0825 |
| 0.1913 | 7.0 | 31094 | 0.3532 | 0.2869 | 0.0798 |
| 0.18 | 8.0 | 35536 | 0.3727 | 0.2849 | 0.0823 |
| 0.1708 | 9.0 | 39978 | 0.3410 | 0.2773 | 0.0785 |
| 0.1624 | 10.0 | 44420 | 0.3604 | 0.2705 | 0.0794 |
| 0.1552 | 11.0 | 48862 | 0.3589 | 0.2661 | 0.0765 |
| 0.1485 | 12.0 | 53304 | 0.3614 | 0.2687 | 0.0770 |
| 0.1418 | 13.0 | 57746 | 0.3500 | 0.2637 | 0.0762 |
| 0.1358 | 14.0 | 62188 | 0.3713 | 0.2628 | 0.0766 |
| 0.131 | 15.0 | 66630 | 0.3908 | 0.2603 | 0.0758 |
| 0.1255 | 16.0 | 71072 | 0.4089 | 0.2608 | 0.0758 |
| 0.1205 | 17.0 | 75514 | 0.3848 | 0.2595 | 0.0742 |
| 0.1162 | 18.0 | 79956 | 0.3554 | 0.2594 | 0.0739 |
| 0.1125 | 19.0 | 84398 | 0.3461 | 0.2593 | 0.0742 |
| 0.1073 | 20.0 | 88840 | 0.3663 | 0.2545 | 0.0729 |
| 0.1039 | 21.0 | 93282 | 0.4556 | 0.2578 | 0.0743 |
| 0.1 | 22.0 | 97724 | 0.4258 | 0.2504 | 0.0724 |
| 0.0965 | 23.0 | 102166 | 0.4246 | 0.2545 | 0.0754 |
| 0.0931 | 24.0 | 106608 | 0.4570 | 0.2603 | 0.0757 |
| 0.0894 | 25.0 | 111050 | 0.4039 | 0.2488 | 0.0732 |
| 0.0865 | 26.0 | 115492 | 0.4119 | 0.2510 | 0.0720 |
| 0.083 | 27.0 | 119934 | 0.4227 | 0.2454 | 0.0716 |
| 0.0805 | 28.0 | 124376 | 0.4424 | 0.2541 | 0.0728 |
| 0.0777 | 29.0 | 128818 | 0.4061 | 0.2457 | 0.0709 |
| 0.0762 | 30.0 | 133260 | 0.4114 | 0.2450 | 0.0704 |
| 0.0724 | 31.0 | 137702 | 0.4599 | 0.2516 | 0.0719 |
| 0.0711 | 32.0 | 142144 | 0.4311 | 0.2466 | 0.0714 |
| 0.069 | 33.0 | 146586 | 0.4517 | 0.2482 | 0.0717 |
| 0.0673 | 34.0 | 151028 | 0.4728 | 0.2467 | 0.0712 |
| 0.0655 | 35.0 | 155470 | 0.4542 | 0.2437 | 0.0713 |
| 0.0634 | 36.0 | 159912 | 0.4546 | 0.2480 | 0.0713 |
| 0.0612 | 37.0 | 164354 | 0.4852 | 0.2479 | 0.0718 |
| 0.0607 | 38.0 | 168796 | 0.4892 | 0.2433 | 0.0705 |
| 0.0585 | 39.0 | 173238 | 0.4686 | 0.2416 | 0.0702 |
| 0.0573 | 40.0 | 177680 | 0.4725 | 0.2412 | 0.0710 |
| 0.0556 | 41.0 | 182122 | 0.4737 | 0.2385 | 0.0696 |
| 0.0548 | 42.0 | 186564 | 0.4964 | 0.2448 | 0.0704 |
| 0.0527 | 43.0 | 191006 | 0.5236 | 0.2429 | 0.0706 |
| 0.052 | 44.0 | 195448 | 0.5130 | 0.2415 | 0.0714 |
| 0.0503 | 45.0 | 199890 | 0.4936 | 0.2375 | 0.0688 |
| 0.0496 | 46.0 | 204332 | 0.5120 | 0.2336 | 0.0680 |
| 0.048 | 47.0 | 208774 | 0.4964 | 0.2362 | 0.0694 |
| 0.0473 | 48.0 | 213216 | 0.5200 | 0.2372 | 0.0687 |
| 0.0465 | 49.0 | 217658 | 0.5433 | 0.2424 | 0.0708 |
| 0.0447 | 50.0 | 222100 | 0.5008 | 0.2335 | 0.0680 |
| 0.0444 | 51.0 | 226542 | 0.5024 | 0.2247 | 0.0668 |
| 0.0431 | 52.0 | 230984 | 0.5003 | 0.2307 | 0.0669 |
| 0.0423 | 53.0 | 235426 | 0.4892 | 0.2331 | 0.0676 |
| 0.0403 | 54.0 | 239868 | 0.5495 | 0.2316 | 0.0679 |
| 0.0406 | 55.0 | 244310 | 0.5193 | 0.2278 | 0.0661 |
| 0.0391 | 56.0 | 248752 | 0.5961 | 0.2331 | 0.0687 |
| 0.0389 | 57.0 | 253194 | 0.5227 | 0.2297 | 0.0667 |
| 0.0379 | 58.0 | 257636 | 0.5506 | 0.2295 | 0.0672 |
| 0.0366 | 59.0 | 262078 | 0.5725 | 0.2231 | 0.0673 |
| 0.0357 | 60.0 | 266520 | 0.5493 | 0.2280 | 0.0662 |
| 0.0357 | 61.0 | 270962 | 0.5355 | 0.2269 | 0.0656 |
| 0.035 | 62.0 | 275404 | 0.5430 | 0.2226 | 0.0653 |
| 0.0343 | 63.0 | 279846 | 0.5375 | 0.2211 | 0.0644 |
| 0.0334 | 64.0 | 284288 | 0.5769 | 0.2248 | 0.0668 |
| 0.0333 | 65.0 | 288730 | 0.5763 | 0.2183 | 0.0642 |
| 0.0322 | 66.0 | 293172 | 0.5787 | 0.2190 | 0.0653 |
| 0.0314 | 67.0 | 297614 | 0.5564 | 0.2207 | 0.0642 |
| 0.0305 | 68.0 | 302056 | 0.5813 | 0.2208 | 0.0666 |
| 0.03 | 69.0 | 306498 | 0.5837 | 0.2217 | 0.0647 |
| 0.0292 | 70.0 | 310940 | 0.5723 | 0.2238 | 0.0649 |
| 0.0284 | 71.0 | 315382 | 0.5503 | 0.2218 | 0.0645 |
| 0.0285 | 72.0 | 319824 | 0.5615 | 0.2187 | 0.0636 |
| 0.0276 | 73.0 | 324266 | 0.5725 | 0.2178 | 0.0650 |
| 0.0273 | 74.0 | 328708 | 0.5483 | 0.2187 | 0.0634 |
| 0.027 | 75.0 | 333150 | 0.5627 | 0.2148 | 0.0632 |
| 0.026 | 76.0 | 337592 | 0.5610 | 0.2203 | 0.0655 |
| 0.0253 | 77.0 | 342034 | 0.5776 | 0.2153 | 0.0635 |
| 0.0248 | 78.0 | 346476 | 0.5823 | 0.2173 | 0.0643 |
| 0.0242 | 79.0 | 350918 | 0.5968 | 0.2172 | 0.0639 |
| 0.0241 | 80.0 | 355360 | 0.6121 | 0.2185 | 0.0647 |
| 0.0232 | 81.0 | 359802 | 0.5909 | 0.2140 | 0.0648 |
| 0.0227 | 82.0 | 364244 | 0.6262 | 0.2209 | 0.0663 |
| 0.0224 | 83.0 | 368686 | 0.5913 | 0.2137 | 0.0645 |
| 0.0215 | 84.0 | 373128 | 0.6057 | 0.2141 | 0.0642 |
| 0.0212 | 85.0 | 377570 | 0.6079 | 0.2135 | 0.0635 |
| 0.0209 | 86.0 | 382012 | 0.6067 | 0.2117 | 0.0639 |
| 0.0201 | 87.0 | 386454 | 0.6119 | 0.2108 | 0.0638 |
| 0.0199 | 88.0 | 390896 | 0.6298 | 0.2112 | 0.0638 |
| 0.0194 | 89.0 | 395338 | 0.6054 | 0.2083 | 0.0620 |
| 0.0192 | 90.0 | 399780 | 0.6238 | 0.2083 | 0.0634 |
| 0.0184 | 91.0 | 404222 | 0.6293 | 0.2099 | 0.0630 |
| 0.0184 | 92.0 | 408664 | 0.6166 | 0.2058 | 0.0611 |
| 0.0182 | 93.0 | 413106 | 0.6175 | 0.2072 | 0.0618 |
| 0.0179 | 94.0 | 417548 | 0.6196 | 0.2061 | 0.0610 |
| 0.0176 | 95.0 | 421990 | 0.6181 | 0.2059 | 0.0614 |
| 0.0174 | 96.0 | 426432 | 0.6187 | 0.2039 | 0.0606 |
| 0.0167 | 97.0 | 430874 | 0.6381 | 0.2064 | 0.0615 |
| 0.017 | 98.0 | 435316 | 0.6268 | 0.2049 | 0.0611 |
| 0.0165 | 99.0 | 439758 | 0.6262 | 0.2041 | 0.0610 |
| 0.0166 | 100.0 | 444200 | 0.6251 | 0.2041 | 0.0609 |

### Framework versions

- Transformers 4.46.3
- Pytorch 2.1.0+cu118
- Datasets 3.1.0
- Tokenizers 0.20.3
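
The Wer and Cer columns above are word and character error rates. A minimal sketch of how such scores can be computed with the `evaluate` library is shown below; the transcripts are invented placeholders, not outputs of this model.

```python
# Sketch: computing WER/CER with the `evaluate` library (the "cer" metric also requires `jiwer`).
# The strings below are invented placeholders, not data from this model.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["webale nyo ssebo"]   # hypothetical model transcription
references = ["webale nnyo ssebo"]   # hypothetical reference transcript

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```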
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:2c9c1a36201b6b0dddda8af15d9d11754987eab9ffe095e45bfe5cab60348e9d
 size 1261966548
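
`model.safetensors` is stored via Git LFS, so the pointer above only records the file's SHA-256 digest and byte size. A small sketch for checking a locally downloaded copy against those values (the local file path is an assumption):

```python
# Sketch: verify a downloaded model.safetensors against the LFS pointer above.
# The local file path is an assumption; adjust it to where the file was downloaded.
import hashlib
import os

path = "model.safetensors"
expected_sha256 = "2c9c1a36201b6b0dddda8af15d9d11754987eab9ffe095e45bfe5cab60348e9d"
expected_size = 1261966548

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b""):
        digest.update(chunk)

print("size ok:  ", os.path.getsize(path) == expected_size)
print("sha256 ok:", digest.hexdigest() == expected_sha256)
```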