---
library_name: transformers
license: apache-2.0
base_model: google/mt5-small
tags:
- generated_from_trainer
model-index:
- name: mt5-small-gigatrue-slovak
results: []
---
# mt5-small-gigatrue-slovak
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small). The training dataset is not documented in this card; the model name suggests a Slovak "gigatrue" corpus.
It achieves the following results on the evaluation set:
- Loss: 2.1758
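
A minimal inference sketch, assuming the model is published under the repo id `Plasmoxy/mt5-small-gigatrue-slovak` and is used as a standard text-to-text model; the expected task and prompt format are not documented, so the input below is only a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repo id, inferred from this card's title.
model_id = "Plasmoxy/mt5-small-gigatrue-slovak"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder Slovak input ("Example input text."); the card does not
# document the prompt format the model was trained with.
inputs = tokenizer("Príklad vstupného textu.", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```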
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
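
A sketch of equivalent `Seq2SeqTrainingArguments` matching the list above; the original training script is not included with this card, so dataset preparation and `Trainer` wiring are omitted, and `output_dir` is an assumption:

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameters listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-gigatrue-slovak",  # assumed output path
    learning_rate=3e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```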
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 3.5063 | 0.1015 | 3000 | 2.3075 |
| 2.998 | 0.2030 | 6000 | 2.2417 |
| 2.9368 | 0.3044 | 9000 | 2.2237 |
| 2.9102 | 0.4059 | 12000 | 2.2064 |
| 2.8894 | 0.5074 | 15000 | 2.2052 |
| 2.8837 | 0.6089 | 18000 | 2.1945 |
| 2.8756 | 0.7104 | 21000 | 2.1984 |
| 2.8718 | 0.8119 | 24000 | 2.1881 |
| 2.868 | 0.9133 | 27000 | 2.1868 |
| 2.8644 | 1.0148 | 30000 | 2.1816 |
| 2.8644 | 1.1163 | 33000 | 2.1815 |
| 2.8566 | 1.2178 | 36000 | 2.1785 |
| 2.858 | 1.3193 | 39000 | 2.1745 |
| 2.8558 | 1.4207 | 42000 | 2.1784 |
| 2.8559 | 1.5222 | 45000 | 2.1775 |
| 2.85 | 1.6237 | 48000 | 2.1783 |
| 2.8521 | 1.7252 | 51000 | 2.1777 |
| 2.8488 | 1.8267 | 54000 | 2.1782 |
| 2.8501 | 1.9282 | 57000 | 2.1760 |
| 2.8521 | 2.0296 | 60000 | 2.1773 |
| 2.8526 | 2.1311 | 63000 | 2.1764 |
| 2.8494 | 2.2326 | 66000 | 2.1774 |
| 2.8501 | 2.3341 | 69000 | 2.1765 |
| 2.8489 | 2.4356 | 72000 | 2.1771 |
| 2.8501 | 2.5370 | 75000 | 2.1763 |
| 2.8506 | 2.6385 | 78000 | 2.1762 |
| 2.8472 | 2.7400 | 81000 | 2.1762 |
| 2.8512 | 2.8415 | 84000 | 2.1758 |
| 2.8494 | 2.9430 | 87000 | 2.1758 |
### Framework versions
- Transformers 4.48.0.dev0
- Pytorch 2.5.1
- Datasets 3.1.0
- Tokenizers 0.21.0
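
A hedged environment pin matching the versions above; note that `4.48.0.dev0` is a development build of Transformers, so installing from source or substituting a nearby stable release may be required:

```text
transformers==4.48.0.dev0  # dev build; install from source or use a nearby stable release
torch==2.5.1
datasets==3.1.0
tokenizers==0.21.0
```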