---
library_name: transformers
license: apache-2.0
base_model: google/mt5-base
tags:
- generated_from_trainer
metrics:
- bleu
model-index:
- name: mt5-base-spanish-yoremnokki
  results: []
---

# mt5-base-spanish-yoremnokki

This model is a fine-tuned version of [google/mt5-base](https://huggingface.co/google/mt5-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8910
- Bleu: 20.9873
- Gen Len: 12.2218

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 7

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Bleu    | Gen Len |
|:-------------:|:------:|:-----:|:---------------:|:-------:|:-------:|
| 2.9865        | 0.9997 | 1728  | 2.4599          | 19.12   | 12.2553 |
| 2.5084        | 2.0    | 3457  | 2.1867          | 20.9775 | 12.815  |
| 2.2879        | 2.9997 | 5185  | 2.0527          | 20.9572 | 12.537  |
| 2.1599        | 4.0    | 6914  | 1.9781          | 21.1008 | 12.4693 |
| 2.1032        | 4.9997 | 8642  | 1.9272          | 21.1189 | 12.4292 |
| 2.0649        | 6.0    | 10371 | 1.8999          | 20.9375 | 12.2177 |
| 1.9988        | 6.9980 | 12096 | 1.8910          | 20.9873 | 12.2218 |

### Framework versions

- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
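
### Example training configuration

The original training script is not published here. As a rough illustration only, the hyperparameters listed above could be expressed with `Seq2SeqTrainingArguments` as sketched below; values not listed in this card (for example `output_dir` and `predict_with_generate`) are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the hyperparameters above; not the exact script used.
training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-base-spanish-yoremnokki",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size of 16
    num_train_epochs=7,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    predict_with_generate=True,      # assumed, needed to compute BLEU during evaluation
)
```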
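
### Example inference (illustrative)

A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub; the repository id below is a placeholder, and it is not documented whether a task prefix was used during fine-tuning.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder repository id; replace with the actual path of this checkpoint.
model_id = "your-username/mt5-base-spanish-yoremnokki"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate a Spanish sentence into Yoremnokki.
inputs = tokenizer("Buenos días", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```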