---
library_name: transformers
---
# Fine-tuned whisper-tiny model on Serbian Common Voice 13

This model is a fine-tuned version of `openai/whisper-tiny` on the Serbian split of Mozilla Common Voice 13 (`mozilla-foundation/common_voice_13_0`). It achieves the following results on the evaluation set:

- **Loss**: 0.1628
- **Wer Ortho**: 0.1635
- **Wer**: 0.0556
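
For quick use, here is a minimal inference sketch with the `transformers` pipeline API. The model id below is a placeholder, since this card does not state the repository name; replace it with this model's actual Hub id.

```python
# Minimal inference sketch (assumes the model is published on the Hub;
# "your-username/whisper-tiny-sr" is a placeholder id, not the real one).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-tiny-sr",  # placeholder model id
)

# Whisper expects 16 kHz audio; given a file path, the pipeline
# decodes and resamples the audio automatically.
print(asr("sample.wav")["text"])
```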

## Training Procedure

### Training Hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `Seq2SeqTrainingArguments` follows the list):

- **learning_rate**: 3e-5
- **train_batch_size**: 32
- **eval_batch_size**: 32
- **gradient_accumulation_steps**: 2
- **total_train_batch_size**: 64
- **optimizer**: Adam with betas=(0.9,0.999) and epsilon=1e-08
- **lr_scheduler_type**: linear
- **lr_scheduler_warmup_steps**: 100
- **training_steps**: 2000
- **mixed_precision_training**: Native AMP
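
A minimal sketch of how these values might map onto `transformers.Seq2SeqTrainingArguments`. The actual training script is not included with this card, so everything not in the list above (such as `output_dir` and the evaluation schedule) is an assumption:

```python
# Hedged sketch of the likely training configuration; values are taken
# from the hyperparameter list above, everything else is an assumption.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-sr",   # assumed output path
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,    # total train batch size: 32 * 2 = 64
    lr_scheduler_type="linear",
    warmup_steps=100,
    max_steps=2000,
    fp16=True,                        # "Native AMP" mixed precision
    # AdamW with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer.
    evaluation_strategy="steps",      # assumed; results are reported every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```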

### Training Results

| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer    |
|---------------|-------|------|-----------------|-----------|--------|
| 0.0600        | 1.34  | 500  | 0.1852          | 0.1800    | 0.0745 |
| 0.0285        | 2.67  | 1000 | 0.1715          | 0.1710    | 0.0640 |
| 0.0140        | 4.01  | 1500 | 0.1658          | 0.1685    | 0.0582 |
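
`Wer Ortho` is the orthographic word error rate computed on raw text, while `Wer` is computed after text normalization (lowercasing and punctuation removal), which is why it is consistently lower. The card does not include the actual `compute_metrics` function; the sketch below shows how the two figures are typically computed, assuming the common convention of using the `evaluate` library with Whisper's `BasicTextNormalizer`:

```python
# Hedged sketch of the metric computation; this assumes the usual
# wer_ortho / wer convention, not the card author's exact code.
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

wer_metric = evaluate.load("wer")
normalizer = BasicTextNormalizer()

predictions = ["Ovo je test."]  # example decoded hypothesis
references = ["ovo je test"]    # example reference transcript

# Orthographic WER: case- and punctuation-sensitive.
wer_ortho = wer_metric.compute(predictions=predictions, references=references)

# Normalized WER: computed after lowercasing and stripping punctuation.
wer = wer_metric.compute(
    predictions=[normalizer(p) for p in predictions],
    references=[normalizer(r) for r in references],
)
print(f"wer_ortho={wer_ortho:.4f}, wer={wer:.4f}")
```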


## Framework Versions

- **Transformers**: 4.41.2
- **PyTorch**: 2.3.0+cu121
- **Datasets**: 2.18.0
- **Tokenizers**: 0.19.1