---
library_name: transformers
license: mit
base_model: facebook/w2v-bert-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: w2v-bert-2.0_BIG-C_corpus_Bemba_1hr_v1
  results: []
---
# w2v-bert-2.0_BIG-C_corpus_Bemba_1hr_v1
This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0), most likely on a 1-hour Bemba subset of the BIG-C corpus (inferred from the model name; the trainer did not record the dataset). It achieves the following results on the evaluation set:
- Loss: 0.5722
- Wer: 0.5985
- Cer: 0.1196
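For reference, WER and CER scores like the ones above are typically computed with the Hugging Face `evaluate` library. A minimal sketch (not the card's original evaluation code; the example strings are hypothetical):

```python
# Sketch of how WER/CER metrics are commonly computed with `evaluate`.
# The prediction/reference strings below are hypothetical placeholders.
import evaluate

wer = evaluate.load("wer")  # word error rate
cer = evaluate.load("cer")  # character error rate

predictions = ["umwana alelila"]   # hypothetical model transcripts
references = ["umwana aleelila"]   # hypothetical ground-truth transcripts

print("WER:", wer.compute(predictions=predictions, references=references))
print("CER:", cer.compute(predictions=predictions, references=references))
```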
## Model description

The base model, w2v-bert-2.0, is a Conformer-based speech encoder pre-trained on roughly 4.5 million hours of unlabeled multilingual audio. Judging by the WER/CER metrics, this checkpoint adds a CTC head fine-tuned for Bemba speech recognition; further details were not recorded.
## Intended uses & limitations

Intended for automatic transcription of Bemba speech sampled at 16 kHz. With only about one hour of fine-tuning data and an evaluation WER near 0.60, transcripts should be treated as rough drafts rather than reliable output.
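A minimal transcription sketch, assuming the checkpoint is a Wav2Vec2-BERT CTC model; the `model_id` below is a hypothetical identifier, so replace it with the actual repo id or local path:

```python
# Hedged usage sketch: assumes a Wav2Vec2-BERT CTC checkpoint saved under the
# hypothetical identifier below; adjust the path/repo id to the real one.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

model_id = "w2v-bert-2.0_BIG-C_corpus_Bemba_1hr_v1"  # hypothetical repo id
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)

# w2v-bert-2.0 expects 16 kHz mono audio
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```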
## Training and evaluation data

Not recorded by the trainer. The model name suggests a 1-hour Bemba split of the BIG-C corpus was used for training, but this is unconfirmed.
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` reconstruction follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 500
- num_epochs: 100
- mixed_precision_training: Native AMP
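As a sketch only, the hyperparameters above roughly correspond to the following `TrainingArguments`; this is a reconstruction, not the original training script:

```python
# Reconstruction of the listed hyperparameters as transformers TrainingArguments
# (argument names per transformers 4.44). The output_dir is taken from the
# model name; everything else mirrors the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0_BIG-C_corpus_Bemba_1hr_v1",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # total train batch size: 16 * 2 = 32
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=100,
    fp16=True,                       # native AMP mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
)
```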
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 7.0053        | 1.0   | 15   | 6.3978          | 1.0085 | 1.1686 |
| 5.7304        | 2.0   | 30   | 4.1429          | 1.0    | 0.9477 |
| 3.6107        | 3.0   | 45   | 3.1577          | 1.0    | 0.9999 |
| 3.087         | 4.0   | 60   | 2.8811          | 1.0    | 0.9799 |
| 2.8587        | 5.0   | 75   | 2.7459          | 1.0    | 0.9090 |
| 2.7163        | 6.0   | 90   | 2.4646          | 1.0    | 0.8254 |
| 2.3267        | 7.0   | 105  | 1.9810          | 0.9998 | 0.6766 |
| 1.7199        | 8.0   | 120  | 1.3007          | 0.9820 | 0.3001 |
| 1.1524        | 9.0   | 135  | 0.9403          | 0.8640 | 0.2081 |
| 0.9037        | 10.0  | 150  | 0.8880          | 0.7630 | 0.1692 |
| 0.7707        | 11.0  | 165  | 0.7744          | 0.7416 | 0.1788 |
| 0.6817        | 12.0  | 180  | 0.7403          | 0.6391 | 0.1462 |
| 0.6124        | 13.0  | 195  | 0.7595          | 0.6170 | 0.1406 |
| 0.5606        | 14.0  | 210  | 0.7323          | 0.6665 | 0.1565 |
| 0.5283        | 15.0  | 225  | 0.7329          | 0.7097 | 0.1781 |
| 0.4703        | 16.0  | 240  | 0.7322          | 0.6011 | 0.1402 |
| 0.5413        | 17.0  | 255  | 0.7942          | 0.7116 | 0.1545 |
| 0.531         | 18.0  | 270  | 0.8518          | 0.6595 | 0.1535 |
| 0.5132        | 19.0  | 285  | 0.8821          | 0.6633 | 0.1442 |
| 0.4961        | 20.0  | 300  | 0.7836          | 0.6450 | 0.1478 |
| 0.5584        | 21.0  | 315  | 0.9809          | 0.6544 | 0.1546 |
| 0.7199        | 22.0  | 330  | 0.9238          | 0.7732 | 0.2111 |
| 0.8428        | 23.0  | 345  | 0.8865          | 0.7223 | 0.1874 |
| 0.9216        | 24.0  | 360  | 1.3912          | 0.9975 | 0.6157 |
| 1.1638        | 25.0  | 375  | 1.1943          | 0.7590 | 0.1808 |
| 1.0508        | 26.0  | 390  | 1.1233          | 0.9919 | 0.4404 |
### Framework versions
- Transformers 4.44.1
- Pytorch 2.2.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1