zlm_b64_le4_s8000

This model is a fine-tuned version of microsoft/speecht5_tts on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows below):

  • Loss: 0.3177
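
Since the card leaves usage undocumented, the snippet below is a minimal inference sketch rather than the author's documented pipeline. It assumes the SpeechT5 processor was saved alongside this checkpoint (mikhail-panzo/zlm_b64_le4_s8000), pairs it with the stock microsoft/speecht5_hifigan vocoder, and borrows a placeholder speaker x-vector from the Matthijs/cmu-arctic-xvectors dataset, since the training speakers are unknown:

```python
import torch
import soundfile as sf
from datasets import load_dataset
from transformers import SpeechT5ForTextToSpeech, SpeechT5HifiGan, SpeechT5Processor

# Assumption: the processor was pushed together with the fine-tuned weights.
processor = SpeechT5Processor.from_pretrained("mikhail-panzo/zlm_b64_le4_s8000")
model = SpeechT5ForTextToSpeech.from_pretrained("mikhail-panzo/zlm_b64_le4_s8000")
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

# SpeechT5 conditions on a 512-dim speaker embedding. The CMU Arctic
# x-vectors are a common stand-in; the card does not say which speakers
# this model was trained on.
xvectors = load_dataset("Matthijs/cmu-arctic-xvectors", split="validation")
speaker_embedding = torch.tensor(xvectors[7306]["xvector"]).unsqueeze(0)

inputs = processor(text="Hello from a fine-tuned SpeechT5 model.", return_tensors="pt")
speech = model.generate_speech(inputs["input_ids"], speaker_embedding, vocoder=vocoder)

# SpeechT5 generates 16 kHz audio.
sf.write("speech.wav", speech.numpy(), samplerate=16000)
```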

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 2000
  • training_steps: 8000
  • mixed_precision_training: Native AMP
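
As a rough illustration only, these settings map onto Hugging Face Seq2SeqTrainingArguments as sketched below. The output directory, eval/logging cadence (500 steps, judging by the results table), and fp16 flag (for "Native AMP") are assumptions; the card does not publish the actual training script:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the hyperparameters above as Seq2SeqTrainingArguments.
# adam_beta1/adam_beta2/adam_epsilon keep the defaults (0.9, 0.999, 1e-8),
# matching the optimizer line in the list.
training_args = Seq2SeqTrainingArguments(
    output_dir="zlm_b64_le4_s8000",  # assumed name
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,   # 8 * 8 = total train batch size of 64
    lr_scheduler_type="linear",
    warmup_steps=2000,
    max_steps=8000,
    seed=42,
    fp16=True,                       # native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=500,                  # matches the 500-step rows in the results table
    logging_steps=500,
)
```

A Seq2SeqTrainer would then pair these arguments with the SpeechT5 model, the fine-tuning dataset, and a TTS data collator, none of which the card specifies.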

Training results

Training Loss   Epoch     Step   Validation Loss
0.5277          0.4188     500   0.4806
0.4582          0.8377    1000   0.4116
0.4312          1.2565    1500   0.3951
0.4122          1.6754    2000   0.3768
0.4002          2.0942    2500   0.3599
0.3905          2.5131    3000   0.3521
0.3806          2.9319    3500   0.3445
0.3700          3.3508    4000   0.3474
0.3736          3.7696    4500   0.3362
0.3608          8.3872    5000   0.3342
0.3602          9.2249    5500   0.3258
0.3561         10.0626    6000   0.3230
0.3505         10.9003    6500   0.3199
0.3473         11.7380    7000   0.3193
0.3523         12.5757    7500   0.3177
0.3462         13.4134    8000   0.3177

Framework versions

  • Transformers 4.41.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.0
  • Tokenizers 0.19.1