End of training
README.md CHANGED
@@ -44,7 +44,7 @@ The following hyperparameters were used during training:
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 500
-- num_epochs:
+- num_epochs: 1
 - mixed_precision_training: Native AMP
 
 ### Training results
@@ -57,11 +57,3 @@ The following hyperparameters were used during training:
 - Pytorch 2.5.1+cu118
 - Datasets 3.1.0
 - Tokenizers 0.20.3
-
-
-```python
-from transformers import AutoModelForCTC, Wav2Vec2BertProcessor
-
-model = AutoModelForCTC.from_pretrained("HERIUN/w2v-bert-2.0-korean-colab-CV16.0")
-processor = Wav2Vec2BertProcessor.from_pretrained("HERIUN/w2v-bert-2.0-korean-colab-CV16.0")
-```
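
For context, the hyperparameters listed in the first hunk map onto the standard `transformers` `TrainingArguments` roughly as sketched below. This is a minimal sketch assuming the usual `Trainer` API; only the values visible in the diff are grounded, and `output_dir` (and anything else not listed in the card) is a placeholder added for illustration.

```python
from transformers import TrainingArguments

# Sketch of the configuration implied by the card's hyperparameter list.
# Values not shown in the diff are placeholders, not taken from the card.
training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-korean-colab-CV16.0",  # placeholder
    optim="adamw_torch",         # optimizer: adamw_torch
    adam_beta1=0.9,              # betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,           # epsilon=1e-08
    lr_scheduler_type="linear",  # lr_scheduler_type: linear
    warmup_steps=500,            # lr_scheduler_warmup_steps: 500
    num_train_epochs=1,          # num_epochs: 1 (the value added in this commit)
    fp16=True,                   # mixed_precision_training: Native AMP
)
```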