---
library_name: transformers
base_model: Samuael/ethiopic-asr-characters
tags:
- generated_from_trainer
datasets:
- alffa_amharic
metrics:
- wer
model-index:
- name: ethiopic-asr-characters
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: alffa_amharic
      type: alffa_amharic
      config: clean
      split: None
      args: clean
    metrics:
    - name: Wer
      type: wer
      value: 0.3037292519939642
---
# ethiopic-asr-characters
This model is a fine-tuned version of [Samuael/ethiopic-asr-characters](https://huggingface.co/Samuael/ethiopic-asr-characters) on the alffa_amharic dataset. It achieves the following results on the evaluation set:
- Loss: 0.4233
- Wer: 0.3037
- Phoneme Cer: 0.1439
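For context, the WER reported above is the standard word-level edit distance divided by the number of reference words. A minimal pure-Python sketch of that computation (illustrative only, not the exact evaluation script used for this model):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for edit distance over word sequences.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat", "the cat sat"))  # 0.0
print(wer("the cat sat", "the bat sat"))  # one substitution in three words
```

The Phoneme CER is computed the same way, but over phoneme sequences rather than words.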
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 5
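The `linear` scheduler with 100 warmup steps ramps the learning rate up to 3e-05 over the first 100 steps and then decays it linearly to zero by the final step. A small sketch of that shape (the total step count here is illustrative, not taken from the training logs):

```python
def linear_schedule_lr(step: int,
                       base_lr: float = 3e-5,
                       warmup_steps: int = 100,
                       total_steps: int = 4200) -> float:
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        # Warmup phase: scale up proportionally to the step count.
        return base_lr * step / warmup_steps
    # Decay phase: from base_lr at the end of warmup to 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup_steps)

print(linear_schedule_lr(50))    # halfway through warmup: 1.5e-05
print(linear_schedule_lr(100))   # peak learning rate: 3e-05
print(linear_schedule_lr(4200))  # end of training: 0.0
```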
### Training results
| Training Loss | Epoch  | Step | Validation Loss | Wer    | Phoneme Cer |
|:-------------:|:------:|:----:|:---------------:|:------:|:-----------:|
| 0.1902        | 0.2312 | 200  | 0.5678          | 0.3431 | 0.1555      |
| 0.1622        | 0.4624 | 400  | 0.5520          | 0.3321 | 0.1532      |
| 0.0933        | 0.6936 | 600  | 0.5467          | 0.3292 | 0.1527      |
| 0.1337        | 0.9249 | 800  | 0.5601          | 0.3315 | 0.1531      |
| 0.091         | 1.1561 | 1000 | 0.5508          | 0.3307 | 0.1522      |
| 0.2286        | 1.3873 | 1200 | 0.5274          | 0.3267 | 0.1510      |
| 0.1596        | 1.6185 | 1400 | 0.5085          | 0.3273 | 0.1506      |
| 0.2091        | 1.8497 | 1600 | 0.4826          | 0.3198 | 0.1498      |
| 0.2798        | 2.0809 | 1800 | 0.4731          | 0.3187 | 0.1489      |
| 0.2678        | 2.3121 | 2000 | 0.4700          | 0.3165 | 0.1482      |
| 0.2777        | 2.5434 | 2200 | 0.4521          | 0.3137 | 0.1468      |
| 0.5036        | 2.7746 | 2400 | 0.4592          | 0.3133 | 0.1463      |
| 0.3513        | 3.0058 | 2600 | 0.4535          | 0.3119 | 0.1469      |
| 0.6127        | 3.2370 | 2800 | 0.4434          | 0.3080 | 0.1457      |
| 0.3057        | 3.4682 | 3000 | 0.4377          | 0.3090 | 0.1454      |
| 0.346         | 3.6994 | 3200 | 0.4348          | 0.3077 | 0.1449      |
| 0.3782        | 3.9306 | 3400 | 0.4273          | 0.3035 | 0.1442      |
| 0.2468        | 4.1618 | 3600 | 0.4270          | 0.3051 | 0.1442      |
| 0.1932        | 4.3931 | 3800 | 0.4232          | 0.3029 | 0.1436      |
| 0.3175        | 4.6243 | 4000 | 0.4269          | 0.3040 | 0.1435      |
| 0.1834        | 4.8555 | 4200 | 0.4233          | 0.3037 | 0.1439      |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3