LeVy4 committed · Commit 0aa2b2a · verified · 1 Parent(s): 7ce7a68

LeVy4/whisper-small-v3

README.md CHANGED
@@ -4,6 +4,8 @@ license: apache-2.0
 base_model: openai/whisper-small
 tags:
 - generated_from_trainer
+metrics:
+- wer
 model-index:
 - name: whisper-small-vi
   results: []
@@ -16,13 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- eval_loss: 0.2205
-- eval_wer: 8.8652
-- eval_runtime: 20.9686
-- eval_samples_per_second: 2.48
-- eval_steps_per_second: 0.334
-- epoch: 38.1053
-- step: 724
+- Loss: 0.0855
+- Wer: 14.0000
 
 ## Model description
 
@@ -45,15 +42,31 @@ The following hyperparameters were used during training:
 - train_batch_size: 16
 - eval_batch_size: 8
 - seed: 42
-- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 200
 - training_steps: 1000
 - mixed_precision_training: Native AMP
 
+### Training results
+
+| Training Loss | Epoch   | Step | Validation Loss | Wer     |
+|:-------------:|:-------:|:----:|:---------------:|:-------:|
+| 1.6255        | 5.5556  | 100  | 1.6181          | 67.0    |
+| 0.0811        | 11.1111 | 200  | 0.1741          | 24.0    |
+| 0.0031        | 16.6667 | 300  | 0.0962          | 17.0    |
+| 0.0007        | 22.2222 | 400  | 0.0863          | 15.0    |
+| 0.0004        | 27.7778 | 500  | 0.0860          | 15.0    |
+| 0.0002        | 33.3333 | 600  | 0.0855          | 14.0000 |
+| 0.0002        | 38.8889 | 700  | 0.0851          | 14.0000 |
+| 0.0002        | 44.4444 | 800  | 0.0854          | 14.0000 |
+| 0.0002        | 50.0    | 900  | 0.0855          | 14.0000 |
+| 0.0002        | 55.5556 | 1000 | 0.0855          | 14.0000 |
+
+
 ### Framework versions
 
-- Transformers 4.45.2
-- Pytorch 2.4.1+cu121
-- Datasets 3.0.1
+- Transformers 4.46.1
+- Pytorch 2.5.0+cu121
+- Datasets 3.1.0
 - Tokenizers 0.20.1
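
The hyperparameters listed in the updated card map fairly directly onto transformers' `Seq2SeqTrainingArguments`. The sketch below only illustrates that mapping and is not the script used for this commit; the learning rate is not visible in the diff, so the value shown is a placeholder.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the card's hyperparameters; learning_rate is a
# placeholder because it does not appear in the hunk above.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-vi",
    per_device_train_batch_size=16,   # train_batch_size: 16
    per_device_eval_batch_size=8,     # eval_batch_size: 8
    seed=42,
    optim="adamw_torch",              # optimizer: adamw_torch
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=1000,                   # training_steps: 1000
    fp16=True,                        # mixed_precision_training: Native AMP
    learning_rate=1e-5,               # placeholder, not shown in the diff
)
```

For inference, a minimal sketch assuming the checkpoint published in this repo (LeVy4/whisper-small-v3) and a hypothetical local audio file `sample.wav`:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; decoding a file path relies on ffmpeg.
asr = pipeline("automatic-speech-recognition", model="LeVy4/whisper-small-v3")

# Hint Whisper toward Vietnamese transcription via generate_kwargs.
result = asr("sample.wav", generate_kwargs={"language": "vi", "task": "transcribe"})
print(result["text"])
```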
generation_config.json CHANGED
@@ -250,5 +250,5 @@
     "transcribe": 50359,
     "translate": 50358
   },
-  "transformers_version": "4.45.2"
+  "transformers_version": "4.46.1"
 }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:719c5e9593555466fa21ba0b2375b6c2cc657ebcfa58767e59ebd431294db984
+oid sha256:3a6b62e3bb6d5512028519051149f6f750864f8d24de838bdda7db8c3b88851b
 size 966995080
runs/Nov03_04-30-43_f914acea8591/events.out.tfevents.1730608300.f914acea8591.270.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8d5546cc2f23fb009678159de00651d00db56cf3ef96df2a8ffee097b7c7ad66
-size 31001
+oid sha256:f8ebc79d8c0b95bef1997e6a2aba661c5e53ca70d69bd5d8106abb53c02b7a48
+size 31355
runs/Nov03_04-30-43_f914acea8591/events.out.tfevents.1730612898.f914acea8591.270.1 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:dc2fae5e7845348baae10140ec27c85ecc9823d72b273345ed75b26cdd9c70b6
+size 406