TifinLab committed
Commit 9341165 · verified · 1 Parent(s): e2f11e6

End of training

Files changed (2)
  1. README.md +3 -8
  2. model.safetensors +1 -1
README.md CHANGED
@@ -1,7 +1,6 @@
 ---
-base_model: TifinLab/wav2vec2-kab
 library_name: transformers
-license: apache-2.0
+base_model: TifinLab/wav2vec2-kabyle
 tags:
 - generated_from_trainer
 model-index:
@@ -14,7 +13,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # wav2vec2-berber
 
-This model is a fine-tuned version of [TifinLab/wav2vec2-kab](https://huggingface.co/TifinLab/wav2vec2-kab) on the None dataset.
+This model is a fine-tuned version of [TifinLab/wav2vec2-kabyle](https://huggingface.co/TifinLab/wav2vec2-kabyle) on the None dataset.
 
 ## Model description
 
@@ -40,13 +39,9 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine_with_restarts
 - lr_scheduler_warmup_steps: 500
-- num_epochs: 15
+- num_epochs: 6
 - mixed_precision_training: Native AMP
 
-### Training results
-
-
-
 ### Framework versions
 
 - Transformers 4.45.0.dev0
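
The hyperparameters shown in the README hunk above map directly onto `transformers` `TrainingArguments`. Below is a minimal sketch of that configuration, assuming a standard `Trainer`-based fine-tuning script; `output_dir` is a placeholder, and the learning rate and batch sizes are outside the visible hunk, so they are left at their defaults:

```python
from transformers import TrainingArguments

# Sketch of the configuration visible in the README diff, not the authors' actual script.
training_args = TrainingArguments(
    output_dir="wav2vec2-berber",              # placeholder output path
    num_train_epochs=6,                        # changed from 15 to 6 in this commit
    lr_scheduler_type="cosine_with_restarts",  # cosine schedule with restarts
    warmup_steps=500,                          # lr_scheduler_warmup_steps: 500
    adam_beta1=0.9,                            # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                         # epsilon=1e-08
    fp16=True,                                 # "Native AMP" mixed-precision training
)
```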
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:479d2baf1f4f2aa825f5dea3a28e0a68fb96fb68a789f96177386d59d7e0edf4
+oid sha256:ae03c000cf9c43b5f8045c6d382b4a24d72c30543c30b29f8877b06f34715f7f
 size 1262032980
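
The `model.safetensors` change is only an updated Git LFS pointer (new sha256, same size); the weights themselves are stored in LFS. A minimal usage sketch for loading the fine-tuned checkpoint, assuming the repo id `TifinLab/wav2vec2-berber` and a CTC head (typical for wav2vec2 ASR fine-tunes); both assumptions should be checked against the actual repository:

```python
import torch
from transformers import AutoModelForCTC, AutoProcessor

# Assumed repo id; adjust to the actual Hub path of this model.
repo_id = "TifinLab/wav2vec2-berber"

processor = AutoProcessor.from_pretrained(repo_id)
model = AutoModelForCTC.from_pretrained(repo_id)

def transcribe(audio):
    # `audio` is a 1-D float array sampled at 16 kHz, wav2vec2's expected rate.
    inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(ids)[0]
```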