---
language:
- nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Large V2
  results: []
---

# Whisper Large V2

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unspecified dataset (Dutch, per the card's `nl` language tag).
It achieves the following results on the evaluation set:
- Loss: 0.4076
- Wer: 12.3813 (word error rate, in percent; see the scoring sketch below)
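
For reference, a minimal sketch of how a percentage WER like the one above can be computed with the `evaluate` library. The actual evaluation code is not included in this card, and the strings below are invented placeholders:

```python
# Minimal sketch: computing a percentage WER with the `evaluate` library.
# The prediction/reference strings are invented placeholders, not drawn
# from the actual evaluation set.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["dit is een voorbeeld"]    # hypothetical model output
references = ["dit is een voorbeeldzin"]  # hypothetical ground truth

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # word error rate in percent
```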

## Model description

This checkpoint shares the architecture of Whisper large-v2, an encoder-decoder Transformer for automatic speech recognition; per the card metadata it was adapted for Dutch (`nl`). No further details were provided.

## Intended uses & limitations

Per the card metadata, the model targets Dutch automatic speech recognition. Its limitations are undocumented; since the training data is unspecified, behavior on other languages or out-of-domain audio should be validated before use.
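
A minimal inference sketch with the 🤗 Transformers ASR pipeline. The repo id and the audio path are hypothetical placeholders, since the published checkpoint name is not stated in this card:

```python
# Minimal inference sketch for this fine-tune. The model id and audio file
# are hypothetical placeholders; substitute the actual repo id.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-nl",  # hypothetical repo id
)

# Force Dutch transcription, matching the card's `nl` language tag.
result = asr(
    "sample.wav",  # placeholder audio path
    generate_kwargs={"language": "dutch", "task": "transcribe"},
)
print(result["text"])
```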

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `Seq2SeqTrainingArguments` reconstruction follows the list):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
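
Since the card carries the `generated_from_trainer` tag, the settings above map directly onto `Seq2SeqTrainingArguments` from 🤗 Transformers. A hedged reconstruction; the output directory and any options not listed above are assumptions:

```python
# Hedged reconstruction of the listed hyperparameters as
# Seq2SeqTrainingArguments. output_dir is a placeholder, and options
# not named in the card are left at their defaults.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-nl",  # hypothetical path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
)
```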

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.6749        | 0.71  | 30   | 0.3798          | 17.3625 |
| 0.26          | 1.43  | 60   | 0.3843          | 14.0477 |
| 0.163         | 2.14  | 90   | 0.3617          | 12.5963 |
| 0.0743        | 2.86  | 120  | 0.3539          | 13.2234 |
| 0.0429        | 3.57  | 150  | 0.3883          | 14.4598 |
| 0.024         | 4.29  | 180  | 0.4002          | 14.1014 |
| 0.011         | 5.0   | 210  | 0.4076          | 12.3813 |


### Framework versions

- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.14.6
- Tokenizers 0.15.0