---

license: apache-2.0
base_model: openai/whisper-base.en
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: abbenedekwhisper-base.en-finetuning2-D3K
  results: []
---



# abbenedekwhisper-base.en-finetuning2-D3K

This model is a fine-tuned version of [openai/whisper-base.en](https://huggingface.co/openai/whisper-base.en) on an unspecified dataset.
It achieves the following results on the evaluation set (Cer, Wer, and Ser denote character, word, and sentence error rates, reported as percentages; the "clean" variants are presumably computed on normalized transcripts):
- Loss: 4.7781
- Cer: 64.7190
- Wer: 119.5364
- Ser: 100.0
- Cer Clean: 3.5058
- Wer Clean: 6.2914
- Ser Clean: 7.0175
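
The error rates above follow the usual ASR definitions. A minimal sketch of computing WER and CER with the `evaluate` library (an assumption; the card does not say which implementation was used), with placeholder transcripts:

```python
import evaluate

# Standard word- and character-error-rate metrics from the evaluate hub.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder transcripts; substitute model outputs and reference texts.
predictions = ["the quick brown fox"]
references = ["the quick brown fox jumps"]

# Both metrics return a fraction; the card reports percentages.
print("WER:", 100 * wer_metric.compute(predictions=predictions, references=references))
print("CER:", 100 * cer_metric.compute(predictions=predictions, references=references))
```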

## Model description

More information needed

## Intended uses & limitations

More information needed
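
Pending proper documentation, here is a minimal inference sketch, assuming the checkpoint is hosted under the hypothetical repo id `abbenedek/whisper-base.en-finetuning2-D3K` and behaves like any other fine-tuned `whisper-base.en` checkpoint:

```python
from transformers import pipeline

# Hypothetical repo id inferred from the model name above; adjust to the
# actual repository path if it differs.
asr = pipeline(
    "automatic-speech-recognition",
    model="abbenedek/whisper-base.en-finetuning2-D3K",
)

# "audio.wav" is a placeholder for a 16 kHz mono English recording.
print(asr("audio.wav")["text"])
```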

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-08
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- training_steps: 1000
- mixed_precision_training: Native AMP
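
In `transformers`, these settings correspond roughly to the `Seq2SeqTrainingArguments` sketched below. This is not the original training script, and `output_dir` is a placeholder; the Adam betas and epsilon listed above are the library defaults.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-base.en-finetuning2-D3K",  # placeholder
    learning_rate=5e-08,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=10,
    max_steps=1000,
    fp16=True,  # "Native AMP" mixed-precision training
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08 are the defaults.
)
```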



### Training results

| Training Loss | Epoch | Step | Validation Loss | Cer     | Wer      | Ser   | Cer Clean | Wer Clean | Ser Clean |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:--------:|:-----:|:---------:|:---------:|:---------:|
| 7.5369        | 0.53  | 100  | 6.7220          | 63.7730 | 128.1457 | 100.0 | 4.1180    | 6.9536    | 8.7719    |
| 7.0363        | 1.06  | 200  | 6.1829          | 65.0529 | 123.8411 | 100.0 | 3.2832    | 5.6291    | 7.0175    |
| 6.417         | 1.6   | 300  | 5.7959          | 64.1625 | 121.1921 | 100.0 | 3.2832    | 5.6291    | 7.0175    |
| 6.0146        | 2.13  | 400  | 5.4587          | 64.7746 | 121.8543 | 100.0 | 3.6728    | 6.6225    | 7.8947    |
| 5.6687        | 2.66  | 500  | 5.2287          | 65.3311 | 120.5298 | 100.0 | 3.7284    | 6.6225    | 7.8947    |
| 5.3902        | 3.19  | 600  | 5.0691          | 65.1085 | 121.1921 | 100.0 | 3.5615    | 6.2914    | 7.0175    |
| 5.2512        | 3.72  | 700  | 4.9358          | 64.7190 | 120.1987 | 100.0 | 3.2832    | 5.9603    | 6.1404    |
| 5.1258        | 4.26  | 800  | 4.8451          | 64.7190 | 119.5364 | 100.0 | 3.5058    | 6.2914    | 7.0175    |
| 5.0472        | 4.79  | 900  | 4.7950          | 64.7190 | 119.5364 | 100.0 | 3.5058    | 6.2914    | 7.0175    |
| 4.9871        | 5.32  | 1000 | 4.7781          | 64.7190 | 119.5364 | 100.0 | 3.5058    | 6.2914    | 7.0175    |





### Framework versions

- Transformers 4.39.3
- Pytorch 2.2.2+cu121
- Datasets 2.14.5
- Tokenizers 0.15.2