The model was fine-tuned on a *custom dataset*, curated for enhancing TTS outputs. This dataset consists of various types of text that help the model generate more natural speech, making it suitable for TTS applications.

## Training Procedure

### *Hyperparameters*

The model was trained with the following hyperparameters:

- *Learning Rate*: 1e-05
- *Train Batch Size*: 16
- *Eval Batch Size*: 8
- *Seed*: 42
- *Gradient Accumulation Steps*: 2
- *Total Train Batch Size*: 32
- *Optimizer*: AdamW (betas=(0.9, 0.999), epsilon=1e-08)
- *LR Scheduler Type*: Linear
- *Warmup Steps*: 50
- *Training Steps*: 500
- *Mixed Precision Training*: Native AMP
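For anyone reproducing this setup with the 🤗 `Trainer`, the hyperparameters above map onto standard `Seq2SeqTrainingArguments` fields roughly as sketched below. This is a hedged sketch, not the exact training script: the field names follow the usual Transformers API, and anything not listed above (e.g. the output directory) would be your own choice.

```python
# Sketch: the hyperparameters above expressed as keyword arguments that
# could be passed to transformers.Seq2SeqTrainingArguments(**training_args).
training_args = {
    "learning_rate": 1e-05,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "warmup_steps": 50,
    "max_steps": 500,
    "fp16": True,  # "Native AMP" mixed-precision training
}

# The "Total Train Batch Size" of 32 is not set directly; it is the
# per-device batch size times the gradient accumulation steps
# (assuming a single GPU).
effective_batch = (
    training_args["per_device_train_batch_size"]
    * training_args["gradient_accumulation_steps"]
)
# → 32
```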
### *Training Results*

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.1921        | 100.0 | 100  | 0.4136          |
| 0.8435        | 200.0 | 200  | 0.3791          |
| 0.8294        | 300.0 | 300  | 0.3766          |
| 0.7959        | 400.0 | 400  | 0.3744          |
| 0.7918        | 500.0 | 500  | 0.3763          |

### Framework Versions

- *Transformers*: 4.46.0.dev0
- *PyTorch*: 2.4.1+cu121
- *Datasets*: 3.0.2
- *Tokenizers*: 0.20.1
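The linear LR scheduler with 50 warmup steps over 500 training steps can be sketched in plain Python. This mirrors the shape of `transformers.get_linear_schedule_with_warmup` (ramp from 0 to the peak learning rate during warmup, then linear decay to 0 by the final step); it is an illustration, not the library implementation.

```python
def lr_at_step(step, base_lr=1e-05, warmup_steps=50, total_steps=500):
    """Learning rate under a linear schedule with warmup (sketch)."""
    if step < warmup_steps:
        # Linear warmup: 0 -> base_lr over the first `warmup_steps` steps.
        return base_lr * step / warmup_steps
    # Linear decay: base_lr at the end of warmup -> 0 at `total_steps`.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# lr_at_step(50)  -> 1e-05 (peak, end of warmup)
# lr_at_step(500) -> 0.0   (end of training)
```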