---
license: apache-2.0
base_model: facebook/wav2vec2-large-lv60
tags:
- generated_from_trainer
model-index:
- name: k2e-20s_asr-scr_w2v2-large-lv60_001
  results: []
---

# k2e-20s_asr-scr_w2v2-large-lv60_001

This model is a fine-tuned version of [facebook/wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60) on an unspecified dataset.
It achieves the following results on the evaluation set (a sketch of the joint objective implied by these metrics follows the list):
- Loss: 3.9960
- PER: 0.9491
- PCC: 0.4581
- CTC loss: 2.9538
- MSE loss: 1.3478
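
The metric names suggest a multi-task setup: a CTC objective for phoneme recognition (PER, presumably phoneme error rate) and an MSE regression objective for utterance scores (PCC, Pearson correlation coefficient), matching the `asr-scr` part of the model name. Note that the reported loss (3.9960) is not the plain sum of the CTC (2.9538) and MSE (1.3478) terms, so the true weighting is unknown. Below is a minimal sketch of such a two-headed model, assuming a plain `Wav2Vec2Model` encoder; the class name, vocabulary size, pooling, and equal loss weighting are illustrative assumptions, not this model's actual implementation.

```python
import torch
import torch.nn as nn
from transformers import Wav2Vec2Model


class Wav2Vec2CtcAndScore(nn.Module):
    """Hypothetical two-headed model: frame-level CTC logits plus a scalar score."""

    def __init__(self, encoder_name="facebook/wav2vec2-large-lv60", vocab_size=70):
        super().__init__()
        self.encoder = Wav2Vec2Model.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size        # 1024 for the large checkpoint
        self.ctc_head = nn.Linear(hidden, vocab_size)   # vocab_size is a guess
        self.score_head = nn.Linear(hidden, 1)          # utterance-level score
        self.ctc_loss = nn.CTCLoss(blank=0, zero_infinity=True)

    def forward(self, input_values, labels, label_lengths, scores):
        states = self.encoder(input_values).last_hidden_state              # (B, T, H)
        log_probs = self.ctc_head(states).log_softmax(-1).transpose(0, 1)  # (T, B, V)
        input_lengths = torch.full((states.size(0),), states.size(1), dtype=torch.long)
        ctc = self.ctc_loss(log_probs, labels, input_lengths, label_lengths)
        pred = self.score_head(states.mean(dim=1)).squeeze(-1)  # mean-pool over time
        mse = nn.functional.mse_loss(pred, scores)
        return ctc + mse  # equal weighting is an assumption; the card does not state one
```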

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (reconstructed as `TrainingArguments` in the sketch after the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 1
- seed: 2222
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2235
- training_steps: 22350
- mixed_precision_training: Native AMP
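
The list above maps directly onto Hugging Face `TrainingArguments`; a hedged reconstruction follows, with `output_dir` as a placeholder and everything else mirroring the listed values.

```python
from transformers import TrainingArguments

# Mirrors the listed hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="k2e-20s_asr-scr_w2v2-large-lv60_001",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=1,
    seed=2222,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2235,
    max_steps=22350,  # "training_steps" above
    fp16=True,        # "Native AMP" mixed-precision training
)
```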

### Training results

| Training Loss | Epoch | Step  | Validation Loss | PER    | PCC    | CTC Loss | MSE Loss |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:--------:|:--------:|
| 30.007        | 3.0   | 2235  | 4.9713          | 0.9890 | 0.4691 | 3.8731   | 1.1616   |
| 4.5436        | 6.01  | 4470  | 4.8907          | 0.9890 | 0.4458 | 3.7192   | 1.3424   |
| 4.1973        | 9.01  | 6705  | 4.7504          | 0.9890 | 0.4791 | 3.6454   | 1.3436   |
| 3.9659        | 12.02 | 8940  | 4.9631          | 0.9627 | 0.3953 | 3.5684   | 1.6631   |
| 3.7945        | 15.02 | 11175 | 4.6238          | 0.9627 | 0.3228 | 3.5522   | 1.3937   |
| 3.657         | 18.02 | 13410 | 4.8315          | 0.9627 | 0.3795 | 3.4947   | 1.6564   |
| 3.4943        | 21.03 | 15645 | 4.5083          | 0.9626 | 0.4295 | 3.3623   | 1.4824   |
| 3.3082        | 24.03 | 17880 | 4.1212          | 0.9625 | 0.4469 | 3.1651   | 1.2958   |
| 3.1432        | 27.04 | 20115 | 4.1271          | 0.9586 | 0.4566 | 3.0120   | 1.4192   |
| 3.0438        | 30.04 | 22350 | 3.9960          | 0.9491 | 0.4581 | 2.9538   | 1.3478   |
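
For reference, PER is an edit-distance rate over phoneme sequences (the same machinery as word error rate, applied to phoneme tokens), and PCC is the Pearson correlation between predicted and reference scores. A minimal sketch with illustrative inputs, using `jiwer` and `scipy`; neither library is named by this card.

```python
from jiwer import wer
from scipy.stats import pearsonr

# PER: word-error-rate machinery over space-separated phoneme strings.
reference = "HH AH L OW"          # illustrative phoneme sequences
hypothesis = "HH AH L OW W"       # one inserted phoneme
per = wer(reference, hypothesis)  # 1 insertion / 4 reference tokens = 0.25

# PCC: Pearson correlation between predicted and reference scores (toy values).
predicted = [3.1, 4.0, 2.2, 4.8]
reference_scores = [3.0, 4.5, 2.0, 4.6]
pcc, _ = pearsonr(predicted, reference_scores)

print(f"PER: {per:.4f}  PCC: {pcc:.4f}")
```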

### Framework versions

- Transformers 4.38.1
- PyTorch 2.0.1
- Datasets 2.16.1
- Tokenizers 0.15.2