# mt5-small-finetuned-emails-gpt-summaries-batchs8-epochs20
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the results list):
- Loss: 3.0969
- Rouge1: 0.1523
- Rouge2: 0.0732
- RougeL: 0.1381
- RougeLsum: 0.1512
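
Below is a minimal inference sketch using the `transformers` library. The repository id comes from this card; the example email, truncation length, and generation settings are assumptions, not values taken from the card.

```python
# Minimal inference sketch (settings below are illustrative assumptions).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "ak2603/mt5-small-finetuned-emails-gpt-summaries-batchs8-epochs20"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

email = "Hi team, the quarterly report is due Friday. Please send your sections by Wednesday."
inputs = tokenizer(email, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```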
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
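
For reference, the listed hyperparameters map onto `Seq2SeqTrainingArguments` roughly as sketched below. The output directory and the evaluation/generation settings are assumptions not stated in this card.

```python
# Hedged configuration sketch: dataset loading, tokenization, and the
# output directory are assumptions, only the listed hyperparameters are from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-emails",  # assumed name
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="epoch",        # assumed: the results table reports one evaluation per epoch
    predict_with_generate=True,   # assumed: needed to compute ROUGE during evaluation
)
```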
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum |
|---|---|---|---|---|---|---|---|
| 20.8315 | 1.0 | 18 | 12.1572 | 0.0110 | 0.0 | 0.0110 | 0.0110 |
| 16.2522 | 2.0 | 36 | 9.4668 | 0.0172 | 0.0025 | 0.0172 | 0.0168 |
| 13.8556 | 3.0 | 54 | 8.5239 | 0.0149 | 0.0025 | 0.0152 | 0.0146 |
| 11.5721 | 4.0 | 72 | 7.3135 | 0.0128 | 0.0013 | 0.0128 | 0.0125 |
| 10.2536 | 5.0 | 90 | 7.3664 | 0.0118 | 0.0013 | 0.0120 | 0.0115 |
| 8.8152 | 6.0 | 108 | 5.2895 | 0.0288 | 0.0070 | 0.0290 | 0.0286 |
| 7.909 | 7.0 | 126 | 3.9341 | 0.0285 | 0.0013 | 0.0284 | 0.0281 |
| 6.7655 | 8.0 | 144 | 3.6549 | 0.0427 | 0.0069 | 0.0406 | 0.0394 |
| 6.243 | 9.0 | 162 | 3.5561 | 0.0474 | 0.0133 | 0.0437 | 0.0462 |
| 5.5584 | 10.0 | 180 | 3.4047 | 0.0823 | 0.0253 | 0.0786 | 0.0763 |
| 5.1966 | 11.0 | 198 | 3.3160 | 0.1221 | 0.0484 | 0.1186 | 0.1209 |
| 4.9761 | 12.0 | 216 | 3.2536 | 0.1469 | 0.0751 | 0.1364 | 0.1481 |
| 4.7006 | 13.0 | 234 | 3.2081 | 0.1353 | 0.0637 | 0.1162 | 0.1318 |
| 4.4686 | 14.0 | 252 | 3.1687 | 0.1453 | 0.0707 | 0.1347 | 0.1399 |
| 4.3912 | 15.0 | 270 | 3.1480 | 0.1574 | 0.0782 | 0.1449 | 0.1498 |
| 4.2511 | 16.0 | 288 | 3.1414 | 0.1541 | 0.0734 | 0.1412 | 0.1509 |
| 4.1233 | 17.0 | 306 | 3.1239 | 0.1572 | 0.0734 | 0.1439 | 0.1536 |
| 4.0948 | 18.0 | 324 | 3.1061 | 0.1451 | 0.0689 | 0.1316 | 0.1438 |
| 4.0434 | 19.0 | 342 | 3.0981 | 0.1517 | 0.0732 | 0.1377 | 0.1507 |
| 4.1557 | 20.0 | 360 | 3.0969 | 0.1523 | 0.0732 | 0.1381 | 0.1512 |
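
ROUGE scores of the kind reported above are commonly computed with the separate `evaluate` library (not listed under framework versions, so its use here is an assumption). A minimal sketch with placeholder texts:

```python
# Minimal ROUGE sketch: the prediction/reference strings are placeholders,
# not data from this model's evaluation set.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["meeting moved to friday"]
references = ["the meeting was moved to friday"]
print(rouge.compute(predictions=predictions, references=references))
# -> dict with keys rouge1, rouge2, rougeL, rougeLsum (F-measures by default)
```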
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0