# shadow
This model is a fine-tuned version of [google-t5/t5-small](https://huggingface.co/google-t5/t5-small) on an unknown dataset. It achieves the following results on the evaluation set (see the usage sketch after the list):
- Loss: 1.2249
- Rouge1: 0.2638
- Rouge2: 0.1186
- Rougel: 0.2171
- Rougelsum: 0.2169
- Gen Len: 146.8555
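This card does not state the downstream task, but the ROUGE metrics and the ~147-token average generation length point to a summarization-style sequence-to-sequence setup. The sketch below is a minimal, hypothetical usage example; the Hub repo id (`davidgaofc/SFT_shadow`), the input format, and the generation settings are assumptions rather than documented behaviour.

```python
# Hypothetical inference sketch for this checkpoint (repo id and generation
# settings are assumptions; the expected input/prompt format is undocumented).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "davidgaofc/SFT_shadow"  # assumed Hub id for this fine-tuned model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Input text for the model goes here."  # placeholder source document
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

# The ~147-token Gen Len on the eval set suggests fairly long outputs,
# so leave a generous generation budget.
output_ids = model.generate(**inputs, max_new_tokens=200, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```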
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
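As a point of reference, the sketch below maps these values onto `Seq2SeqTrainingArguments` from Transformers. Only the listed hyperparameters are grounded in this card; the output directory, the per-epoch evaluation strategy, and the `predict_with_generate` flag are assumptions, and the data pipeline is omitted because it is undocumented.

```python
# Sketch of Seq2SeqTrainingArguments matching the hyperparameters listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults, so they
# do not need to be set explicitly.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="SFT_shadow",          # assumed output directory name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                        # Native AMP mixed-precision training
    evaluation_strategy="epoch",      # assumed; the results table is per epoch
    predict_with_generate=True,       # assumed; needed for ROUGE / Gen Len
)
```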
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
| 2.6504 | 1.0 | 718 | 1.5109 | 0.0112 | 0.0026 | 0.0098 | 0.0098 | 13.3183 |
| 1.7193 | 2.0 | 1436 | 1.3836 | 0.1697 | 0.0487 | 0.1355 | 0.1354 | 140.8933 |
| 1.5492 | 3.0 | 2154 | 1.3174 | 0.202 | 0.0564 | 0.1591 | 0.1592 | 153.2811 |
| 1.51 | 4.0 | 2872 | 1.2829 | 0.2152 | 0.0691 | 0.1708 | 0.1709 | 156.3921 |
| 1.4341 | 5.0 | 3590 | 1.2613 | 0.2336 | 0.0895 | 0.1866 | 0.1866 | 156.2713 |
| 1.4335 | 6.0 | 4308 | 1.2466 | 0.2534 | 0.1118 | 0.2073 | 0.2075 | 146.7207 |
| 1.4033 | 7.0 | 5026 | 1.2363 | 0.2591 | 0.1168 | 0.213 | 0.213 | 144.5354 |
| 1.4045 | 8.0 | 5744 | 1.2298 | 0.261 | 0.1171 | 0.2149 | 0.2149 | 147.7884 |
| 1.3839 | 9.0 | 6462 | 1.2260 | 0.263 | 0.1178 | 0.216 | 0.2159 | 147.3518 |
| 1.3863 | 10.0 | 7180 | 1.2249 | 0.2638 | 0.1186 | 0.2171 | 0.2169 | 146.8555 |
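The per-epoch ROUGE and Gen Len columns above are typically produced by a `compute_metrics` hook built on the `evaluate` library's ROUGE implementation. The actual function used for this model is not included in the card, so the following is a hedged reconstruction that assumes the base `google-t5/t5-small` tokenizer and the usual `-100` label padding.

```python
# Hypothetical compute_metrics sketch for the metrics reported above; the exact
# function used to train this model is not documented in the card.
import evaluate
import numpy as np
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")  # base tokenizer
rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)

    # Replace the -100 padding used for ignored label positions before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = rouge.compute(
        predictions=decoded_preds, references=decoded_labels, use_stemmer=True
    )
    # "Gen Len": mean number of non-padding tokens in the generated outputs.
    prediction_lens = [
        np.count_nonzero(p != tokenizer.pad_token_id) for p in predictions
    ]
    result["gen_len"] = np.mean(prediction_lens)
    return {k: round(float(v), 4) for k, v in result.items()}
```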
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0