---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-xlsr-ln-50hr-v1
results: []
---
# wav2vec2-xlsr-ln-50hr-v1
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5500
- Model Preparation Time: 0.0092 s
- WER: 0.2237
- CER: 0.0739
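WER (word error rate) and CER (character error rate) are the standard ASR metrics. As a minimal, hedged sketch, both can be reproduced with the 🤗 `evaluate` library (the transcript strings below are purely illustrative, not from the evaluation set):

```python
# Minimal sketch: computing WER/CER with the `evaluate` library
# (requires `pip install evaluate jiwer`).
# The transcripts below are illustrative only.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["mbote na yo"]   # ground-truth transcript (illustrative)
predictions = ["mbote na ye"]  # model hypothesis (illustrative)

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```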
## Model description
No description was provided by the authors. Architecturally, this is the 300M-parameter XLS-R encoder fine-tuned end-to-end for automatic speech recognition, presumably with a CTC head (`Wav2Vec2ForCTC`).
## Intended uses & limitations
No usage guidance was provided. The model is presumably intended for transcribing 16 kHz monaural speech in the target language; a hedged inference sketch follows. Robustness to other domains, accents, or sampling rates is not documented.
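A minimal inference sketch, assuming the standard `Wav2Vec2ForCTC` + `Wav2Vec2Processor` pairing and 16 kHz input; the repo id is taken from the card name and may need its owner namespace prefixed:

```python
# Hedged inference sketch; `model_id` assumes the card's name (without an
# owner namespace) and "sample.wav" is a placeholder path.
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "wav2vec2-xlsr-ln-50hr-v1"  # hypothetical; prepend the real namespace

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# XLS-R models expect 16 kHz mono audio
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```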
## Training and evaluation data
Not documented. The model name suggests roughly 50 hours of Lingala (ISO 639-1 code `ln`) speech, but the actual training and evaluation sets are not recorded on this card.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` reconstruction sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 120
- mixed_precision_training: Native AMP
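As a reconstruction sketch (not the authors' script), the list above maps onto 🤗 `TrainingArguments` as shown below; `output_dir` is an assumption, and the stated Adam betas/epsilon match the library defaults, so they need no explicit setting:

```python
# Hedged reconstruction of the hyperparameters above as TrainingArguments
# (Transformers 4.43). output_dir is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-ln-50hr-v1",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 x 2 = effective batch size 32
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=120,
    fp16=True,  # "Native AMP" mixed precision
)
```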
### Training results

Note that although `num_epochs` was set to 120, the log below ends near epoch 63, so training appears to have been stopped early (for example by early stopping or manual interruption).
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time (s) | WER | CER |
|:-------------:|:-------:|:-----:|:---------------:|:----------------------:|:------:|:------:|
| 4.1139 | 0.9986 | 362 | 0.5868 | 0.0092 | 0.4335 | 0.1216 |
| 0.3001 | 2.0 | 725 | 0.3474 | 0.0092 | 0.2614 | 0.0792 |
| 0.2033 | 2.9986 | 1087 | 0.3256 | 0.0092 | 0.2023 | 0.0670 |
| 0.1629 | 4.0 | 1450 | 0.3155 | 0.0092 | 0.2089 | 0.0641 |
| 0.1366 | 4.9986 | 1812 | 0.2904 | 0.0092 | 0.1899 | 0.0577 |
| 0.1182 | 6.0 | 2175 | 0.2895 | 0.0092 | 0.1864 | 0.0572 |
| 0.1064 | 6.9986 | 2537 | 0.2815 | 0.0092 | 0.1671 | 0.0535 |
| 0.0945 | 8.0 | 2900 | 0.3037 | 0.0092 | 0.1706 | 0.0559 |
| 0.0845 | 8.9986 | 3262 | 0.3142 | 0.0092 | 0.1743 | 0.0581 |
| 0.0779 | 10.0 | 3625 | 0.3031 | 0.0092 | 0.1758 | 0.0572 |
| 0.0754 | 10.9986 | 3987 | 0.3111 | 0.0092 | 0.1704 | 0.0568 |
| 0.0687 | 12.0 | 4350 | 0.3130 | 0.0092 | 0.1664 | 0.0539 |
| 0.0582 | 12.9986 | 4712 | 0.3364 | 0.0092 | 0.1619 | 0.0526 |
| 0.0552 | 14.0 | 5075 | 0.3039 | 0.0092 | 0.1568 | 0.0527 |
| 0.054 | 14.9986 | 5437 | 0.3176 | 0.0092 | 0.1561 | 0.0507 |
| 0.0453 | 16.0 | 5800 | 0.3283 | 0.0092 | 0.1550 | 0.0519 |
| 0.046 | 16.9986 | 6162 | 0.3320 | 0.0092 | 0.1556 | 0.0504 |
| 0.0443 | 18.0 | 6525 | 0.3443 | 0.0092 | 0.1560 | 0.0510 |
| 0.0441 | 18.9986 | 6887 | 0.3392 | 0.0092 | 0.1549 | 0.0518 |
| 0.0375 | 20.0 | 7250 | 0.3526 | 0.0092 | 0.1565 | 0.0529 |
| 0.0371 | 20.9986 | 7612 | 0.3552 | 0.0092 | 0.1574 | 0.0541 |
| 0.0412 | 22.0 | 7975 | 0.3313 | 0.0092 | 0.1762 | 0.0565 |
| 0.041 | 22.9986 | 8337 | 0.3649 | 0.0092 | 0.1695 | 0.0572 |
| 0.0377 | 24.0 | 8700 | 0.3603 | 0.0092 | 0.1578 | 0.0532 |
| 0.0332 | 24.9986 | 9062 | 0.3496 | 0.0092 | 0.1513 | 0.0509 |
| 0.032 | 26.0 | 9425 | 0.3436 | 0.0092 | 0.1504 | 0.0517 |
| 0.0314 | 26.9986 | 9787 | 0.3573 | 0.0092 | 0.1545 | 0.0523 |
| 0.0281 | 28.0 | 10150 | 0.3644 | 0.0092 | 0.1504 | 0.0504 |
| 0.0268 | 28.9986 | 10512 | 0.3628 | 0.0092 | 0.1521 | 0.0506 |
| 0.0304 | 30.0 | 10875 | 0.3692 | 0.0092 | 0.1512 | 0.0510 |
| 0.0296 | 30.9986 | 11237 | 0.3573 | 0.0092 | 0.1493 | 0.0505 |
| 0.023 | 32.0 | 11600 | 0.3767 | 0.0092 | 0.1562 | 0.0516 |
| 0.0292 | 32.9986 | 11962 | 0.3462 | 0.0092 | 0.1496 | 0.0492 |
| 0.0261 | 34.0 | 12325 | 0.3927 | 0.0092 | 0.1500 | 0.0490 |
| 0.0248 | 34.9986 | 12687 | 0.3771 | 0.0092 | 0.1438 | 0.0492 |
| 0.0238 | 36.0 | 13050 | 0.3763 | 0.0092 | 0.1457 | 0.0474 |
| 0.0223 | 36.9986 | 13412 | 0.3627 | 0.0092 | 0.1523 | 0.0510 |
| 0.0225 | 38.0 | 13775 | 0.3825 | 0.0092 | 0.1468 | 0.0494 |
| 0.022 | 38.9986 | 14137 | 0.3830 | 0.0092 | 0.1614 | 0.0547 |
| 0.0226 | 40.0 | 14500 | 0.3851 | 0.0092 | 0.1488 | 0.0509 |
| 0.0225 | 40.9986 | 14862 | 0.4072 | 0.0092 | 0.1592 | 0.0530 |
| 0.0197 | 42.0 | 15225 | 0.4024 | 0.0092 | 0.1460 | 0.0502 |
| 0.0205 | 42.9986 | 15587 | 0.4099 | 0.0092 | 0.1491 | 0.0510 |
| 0.0195 | 44.0 | 15950 | 0.3746 | 0.0092 | 0.1449 | 0.0501 |
| 0.0187 | 44.9986 | 16312 | 0.3902 | 0.0092 | 0.1417 | 0.0487 |
| 0.0196 | 46.0 | 16675 | 0.3923 | 0.0092 | 0.1453 | 0.0497 |
| 0.0177 | 46.9986 | 17037 | 0.4107 | 0.0092 | 0.1458 | 0.0490 |
| 0.0175 | 48.0 | 17400 | 0.4043 | 0.0092 | 0.1478 | 0.0503 |
| 0.0178 | 48.9986 | 17762 | 0.4009 | 0.0092 | 0.1450 | 0.0514 |
| 0.0161 | 50.0 | 18125 | 0.4172 | 0.0092 | 0.1374 | 0.0472 |
| 0.015 | 50.9986 | 18487 | 0.4006 | 0.0092 | 0.1342 | 0.0463 |
| 0.015 | 52.0 | 18850 | 0.3975 | 0.0092 | 0.1399 | 0.0492 |
| 0.0173 | 52.9986 | 19212 | 0.3690 | 0.0092 | 0.1399 | 0.0493 |
| 0.0156 | 54.0 | 19575 | 0.4321 | 0.0092 | 0.1439 | 0.0504 |
| 0.0151 | 54.9986 | 19937 | 0.4353 | 0.0092 | 0.1443 | 0.0508 |
| 0.0151 | 56.0 | 20300 | 0.3784 | 0.0092 | 0.1394 | 0.0488 |
| 0.015 | 56.9986 | 20662 | 0.4225 | 0.0092 | 0.1415 | 0.0499 |
| 0.0128 | 58.0 | 21025 | 0.4172 | 0.0092 | 0.1421 | 0.0486 |
| 0.0124 | 58.9986 | 21387 | 0.3899 | 0.0092 | 0.1400 | 0.0479 |
| 0.0109 | 60.0 | 21750 | 0.4265 | 0.0092 | 0.1364 | 0.0468 |
| 0.0109 | 60.9986 | 22112 | 0.4143 | 0.0092 | 0.1400 | 0.0486 |
| 0.0118 | 62.0 | 22475 | 0.4204 | 0.0092 | 0.1446 | 0.0495 |
| 0.0125 | 62.9986 | 22837 | 0.4020 | 0.0092 | 0.1367 | 0.0472 |
### Framework versions
- Transformers 4.43.3
- PyTorch 2.1.0+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1