t5-small-finetuned-DEPlain

This model is a fine-tuned version of google-t5/t5-small on the DEPlain German text-simplification dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4040
  • Rouge1: 56.1449
  • Rouge2: 33.5451
  • Rougel: 49.3652
  • Rougelsum: 50.4116
  • Gen Len: 16.8619
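
To try the model, it can be loaded with the Transformers Auto classes. A minimal inference sketch, assuming the checkpoint is published on the Hub under jonathandechert/t5-small-finetuned-DEPlain (the id shown in the model tree below) and that inputs are German sentences to simplify; the generation settings (beam search, max_new_tokens) are illustrative choices, not taken from the card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hub id taken from the model tree of this card.
model_id = "jonathandechert/t5-small-finetuned-DEPlain"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example German input (DEPlain is a German simplification corpus).
text = "Das Bundesverfassungsgericht hat die Beschwerde zurückgewiesen."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# The average Gen Len above is ~17 tokens, so a modest cap suffices.
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```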

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
  • mixed_precision_training: Native AMP
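
The hyperparameters above map onto a Seq2SeqTrainingArguments configuration roughly as follows. This is a sketch, not the original training script: output_dir is a placeholder, and predict_with_generate is assumed because ROUGE was computed during evaluation; Adam with betas=(0.9, 0.999) and epsilon=1e-08 and the linear scheduler are the Trainer defaults:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the arguments implied by the hyperparameter list above.
# Values not stated in the card (e.g. output_dir) are placeholders.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-DEPlain",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,                   # "Native AMP" mixed precision
    predict_with_generate=True,  # assumed: required for ROUGE during eval
)
```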

Training results

Training Loss  Epoch  Step   Validation Loss  Rouge1   Rouge2   Rougel   Rougelsum  Gen Len
1.7816         1.0    667    1.5659           56.0636  33.4605  49.2184  50.1982    16.8749
1.7247         2.0    1334   1.5268           55.8529  33.273   49.0989  50.0532    16.8457
1.646          3.0    2001   1.5005           55.9672  33.491   49.2462  50.1807    16.8903
1.6284         4.0    2668   1.4829           55.7959  33.2889  49.115   50.0945    16.8497
1.6125         5.0    3335   1.4690           55.9584  33.4199  49.197   50.1955    16.8595
1.5722         6.0    4002   1.4583           56.002   33.3992  49.2363  50.2844    16.8652
1.5578         7.0    4669   1.4461           55.9959  33.4014  49.2695  50.3575    16.8205
1.5483         8.0    5336   1.4401           56.1002  33.4891  49.3499  50.4312    16.8465
1.5376         9.0    6003   1.4319           56.0337  33.4694  49.2847  50.392     16.8367
1.5174         10.0   6670   1.4261           56.1104  33.5113  49.3145  50.4133    16.853
1.5031         11.0   7337   1.4215           56.0716  33.5463  49.3603  50.4459    16.8359
1.488          12.0   8004   1.4165           56.0433  33.5083  49.3177  50.3731    16.8424
1.4931         13.0   8671   1.4154           56.2073  33.6711  49.4172  50.4928    16.8481
1.4613         14.0   9338   1.4103           56.0724  33.5666  49.3104  50.3582    16.8497
1.4695         15.0   10005  1.4080           56.142   33.6211  49.4136  50.4679    16.8619
1.4695         16.0   10672  1.4070           56.173   33.6205  49.4061  50.474     16.87
1.4625         17.0   11339  1.4053           56.0842  33.5358  49.3451  50.4014    16.866
1.4616         18.0   12006  1.4042           56.1138  33.5467  49.359   50.4131    16.866
1.4622         19.0   12673  1.4037           56.1368  33.5442  49.3712  50.4346    16.8627
1.455          20.0   13340  1.4040           56.1449  33.5451  49.3652  50.4116    16.8619
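
The Rouge1 column is the unigram-overlap F1 score between generated and reference simplifications. The reported numbers come from a full ROUGE implementation (with tokenization and stemming), but the core computation can be sketched in a few self-contained lines:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Simplified ROUGE-1: unigram-overlap F1 on whitespace tokens.
    (Real implementations such as rouge_score add tokenization and stemming.)"""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# 5 of 6 unigrams match in each direction, so F1 = 5/6 ≈ 0.8333.
print(round(rouge1_f1("the cat sat on the mat", "the cat lay on the mat"), 4))
```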

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.0
  • Tokenizers 0.15.1

Model details

  • Model size: 60.5M params (Safetensors)
  • Tensor type: F32

Model tree for jonathandechert/t5-small-finetuned-DEPlain

  • Base model: google-t5/t5-small