---
language:
- el
license: apache-2.0
tags:
- whisper-event
- generated_from_trainer
- hf-asr-leaderboard
- automatic-speech-recognition
- greek
datasets:
- mozilla-foundation/common_voice_11_0
- google/fleurs
metrics:
- wer
model-index:
- name: whisper-md-el-intlv-xs
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: mozilla-foundation/common_voice_11_0
type: mozilla-foundation/common_voice_11_0
config: el
split: test
metrics:
- name: Wer
type: wer
value: 11.3670
---
# whisper-md-el-intlv-xs
This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the interleaved mozilla-foundation/common_voice_11_0 (el) and google/fleurs (el_gr) datasets. It achieves the following results on the mozilla-foundation/common_voice_11_0 test set:
- Loss: 0.4168
- Wer: 11.3670
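WER here is the word error rate, reported as a percentage (lower is better). A minimal sketch of how it can be computed with the Hugging Face `evaluate` library; the example sentences are illustrative only, not taken from the evaluation set:

```python
import evaluate

# Standard word error rate metric from the evaluate library.
wer_metric = evaluate.load("wer")

# Toy example: one substituted word in a four-word Greek reference -> 25% WER.
references = ["καλημέρα σας πώς είστε"]
predictions = ["καλημέρα σου πώς είστε"]
print(100 * wer_metric.compute(references=references, predictions=predictions))  # 25.0
```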
## Model description
The model was trained on the two Greek datasets above, interleaved into a single training stream. Evaluation used only the common_voice_11_0 (el) test split.
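The exact interleaving script is not part of this card. The following is a rough sketch of how the two corpora could be combined with `datasets.interleave_datasets`; the splits, column handling, and stopping strategy are assumptions, not the documented recipe:

```python
from datasets import load_dataset, interleave_datasets, Audio

# Assumed splits; the actual training split composition is not documented here.
cv = load_dataset("mozilla-foundation/common_voice_11_0", "el", split="train")
fleurs = load_dataset("google/fleurs", "el_gr", split="train")

# Resample both corpora to the 16 kHz rate Whisper expects.
cv = cv.cast_column("audio", Audio(sampling_rate=16_000))
fleurs = fleurs.cast_column("audio", Audio(sampling_rate=16_000))

# Align transcript column names (Common Voice uses "sentence", FLEURS uses "transcription")
# and drop everything except audio + text so the features of both sets match.
cv = cv.rename_column("sentence", "transcription")
cv = cv.remove_columns([c for c in cv.column_names if c not in ("audio", "transcription")])
fleurs = fleurs.remove_columns([c for c in fleurs.column_names if c not in ("audio", "transcription")])

# Alternate examples from the two sets until both are exhausted.
train = interleave_datasets([cv, fleurs], stopping_strategy="all_exhausted")
```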
## Intended uses & limitations
The model is intended for automatic speech transcription in Greek.
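A minimal transcription sketch using the `transformers` pipeline; the repo id below is a placeholder, and the audio path is illustrative:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual Hub path of this model.
asr = pipeline(
    "automatic-speech-recognition",
    model="<hub-user>/whisper-md-el-intlv-xs",
    chunk_length_s=30,
)

# Pin the decoder to Greek transcription so the model skips language detection.
asr.model.config.forced_decoder_ids = asr.tokenizer.get_decoder_prompt_ids(
    language="greek", task="transcribe"
)

print(asr("sample_greek_audio.wav")["text"])  # path to any local Greek audio file
```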
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 8e-06
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 10000
- mixed_precision_training: Native AMP
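A sketch of how these settings map onto `Seq2SeqTrainingArguments`; the output path, evaluation cadence, and generation length are assumptions, not part of the documented recipe:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-md-el-intlv-xs",  # assumed output path
    learning_rate=8e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=10000,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=1000,              # assumed; matches the 1000-step cadence in the results table
    predict_with_generate=True,
    generation_max_length=225,    # assumed; typical for Whisper fine-tuning
)
# The Adam settings listed above (betas=(0.9, 0.999), epsilon=1e-08) are the Trainer defaults.
```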
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.0251 | 2.49 | 1000 | 0.2216 | 12.5836 |
| 0.0051 | 4.98 | 2000 | 0.2874 | 12.2957 |
| 0.0015 | 7.46 | 3000 | 0.3281 | 11.9056 |
| 0.0017 | 9.95 | 4000 | 0.3178 | 12.5929 |
| 0.0008 | 12.44 | 5000 | 0.3449 | 11.9799 |
| 0.0001 | 14.93 | 6000 | 0.3638 | 11.7106 |
| 0.0001 | 17.41 | 7000 | 0.3910 | 11.4970 |
| 0.0 | 19.9 | 8000 | 0.4042 | 11.3949 |
| 0.0 | 22.39 | 9000 | 0.4129 | 11.4134 |
| 0.0 | 24.88 | 10000 | 0.4168 | 11.3670 |
### Framework versions
- Transformers 4.26.0.dev0
- Pytorch 1.13.0+cu117
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2