---
library_name: transformers
language:
- ach
license: apache-2.0
base_model: openai/whisper-small
tags:
- whisper-event
- generated_from_trainer
datasets:
- tericlabs
metrics:
- wer
model-index:
- name: Whisper base acholi
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Sunbird_salt
      type: tericlabs
    metrics:
    - name: Wer
      type: wer
      value: 125.91206735266604
---


# Whisper base acholi

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Sunbird_salt dataset.
It achieves the following results on the evaluation set:
- Loss: 6.3204
- Wer: 125.9121

Note that WER counts substitutions, deletions, and insertions against the number of reference words, so it can exceed 100% when the model produces many spurious insertions, as is the case here.
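
For reference, the metric above is the standard word error rate and can be reproduced with the `evaluate` library; the transcripts below are placeholder strings, not data from the evaluation set:

```python
import evaluate

# Load the standard word error rate metric (the "Wer" figure reported above).
wer_metric = evaluate.load("wer")

# Placeholder transcripts; in practice, predictions come from model decoding
# and references from the evaluation split's ground-truth text.
predictions = ["predicted transcript goes here"]
references = ["reference transcript goes here"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {100 * wer:.2f}%")  # evaluate returns a ratio; scale to percent
```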

## Model description

This model adapts [openai/whisper-small](https://huggingface.co/openai/whisper-small) to Acholi automatic speech recognition by fine-tuning on the Sunbird_salt dataset. No details about the fine-tuning recipe beyond the hyperparameters listed below have been provided.

## Intended uses & limitations

The model is intended for transcribing Acholi speech. Its final evaluation WER of 125.91% indicates that transcripts are not yet reliable, and the training results below show validation loss rising from 2.83 to 6.32 while training loss approaches zero, a sign of overfitting. Treat this checkpoint as a research artifact rather than a production model. A minimal usage sketch follows.
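
The sketch below loads the model through the `transformers` ASR pipeline; the Hub repository id is a placeholder, since the card does not state the actual id:

```python
from transformers import pipeline

# Placeholder Hub repository id; replace with this model's actual id.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-base-acholi",
)

# Transcribe a local audio file; the pipeline resamples the input to the
# 16 kHz rate Whisper's feature extractor expects.
result = asr("sample.wav")
print(result["text"])
```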

## Training and evaluation data

The model was trained and evaluated on the Sunbird_salt dataset, listed on the Hub under tericlabs. Details about splits, sizes, and preprocessing have not been provided; a loading sketch is given below.
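
As a hedged sketch, the dataset could be loaded with the `datasets` library. The repository id here is assembled from the card's metadata and may not match the actual Hub id; the `audio` column name is likewise an assumption about the schema:

```python
from datasets import load_dataset, Audio

# Hypothetical Hub id built from the card's metadata ("tericlabs" /
# "Sunbird_salt"); the actual id is not stated in the card.
ds = load_dataset("tericlabs/Sunbird_salt")

# Whisper's feature extractor expects 16 kHz audio; resample if needed.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
```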

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 20000
- mixed_precision_training: Native AMP
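
As a rough reconstruction, these hyperparameters correspond to a `transformers` `Seq2SeqTrainingArguments` configuration along these lines; `output_dir` and anything not listed above (logging, evaluation cadence) are placeholders, not confirmed values:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-base-acholi",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=20000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```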

### Training results

| Training Loss | Epoch    | Step  | Validation Loss | Wer      |
|:-------------:|:--------:|:-----:|:---------------:|:--------:|
| 3.9219        | 6.6225   | 1000  | 2.8840          | 196.1646 |
| 2.2828        | 13.2450  | 2000  | 2.8298          | 129.9345 |
| 1.399         | 19.8675  | 3000  | 3.3370          | 135.6408 |
| 0.5689        | 26.4901  | 4000  | 3.9490          | 141.4406 |
| 0.1519        | 33.1126  | 5000  | 4.4924          | 117.0253 |
| 0.0408        | 39.7351  | 6000  | 4.8503          | 130.4958 |
| 0.0176        | 46.3576  | 7000  | 5.1254          | 123.5734 |
| 0.0101        | 52.9801  | 8000  | 5.2911          | 128.7184 |
| 0.0049        | 59.6026  | 9000  | 5.5606          | 145.7437 |
| 0.004         | 66.2252  | 10000 | 5.6918          | 131.7119 |
| 0.003         | 72.8477  | 11000 | 5.8036          | 130.5893 |
| 0.0021        | 79.4702  | 12000 | 5.9199          | 127.5023 |
| 0.0008        | 86.0927  | 13000 | 6.0288          | 134.5182 |
| 0.0021        | 92.7152  | 14000 | 6.0003          | 133.4892 |
| 0.0006        | 99.3377  | 15000 | 6.1112          | 123.0122 |
| 0.0003        | 105.9603 | 16000 | 6.1775          | 122.1703 |
| 0.0002        | 112.5828 | 17000 | 6.2225          | 125.6314 |
| 0.0002        | 119.2053 | 18000 | 6.2691          | 126.3798 |
| 0.0002        | 125.8278 | 19000 | 6.3077          | 125.7250 |
| 0.0002        | 132.4503 | 20000 | 6.3204          | 125.9121 |


### Framework versions

- Transformers 4.47.0.dev0
- Pytorch 2.4.0+cu121
- Datasets 3.0.2
- Tokenizers 0.20.1