# ft-GPT2-with-lyrics
This model is a fine-tuned version of openai-community/gpt2 on a lyrics dataset that is not further documented. It achieves the following results on the evaluation set:
- Loss: 1.1112
## Model description
More information needed
## Intended uses & limitations
More information needed
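Intended uses are not documented; based on the model name, it is presumably meant for generating song-lyric-style text. A minimal generation sketch using the 🤗 Transformers `pipeline` API could look like the following (the prompt and sampling settings are illustrative, not prescribed by this card):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
generator = pipeline("text-generation", model="surajkarki/ft-GPT2-with-lyrics")

# Illustrative prompt and sampling settings; adjust to taste.
result = generator(
    "Under the city lights",
    max_new_tokens=60,
    do_sample=True,
    top_p=0.95,
    temperature=0.9,
)
print(result[0]["generated_text"])
```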
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
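For reference, here is a minimal sketch of how this configuration might be expressed as `TrainingArguments` in 🤗 Transformers 4.35; `output_dir` and the evaluation strategy are assumptions, as they are not documented in this card:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration listed above.
# output_dir and evaluation_strategy are assumptions, not documented in this card.
training_args = TrainingArguments(
    output_dir="ft-GPT2-with-lyrics",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed: validation loss is reported once per epoch
)
```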
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.0058 | 1.0 | 71 | 1.5293 |
| 1.5797 | 2.0 | 142 | 1.3814 |
| 1.4549 | 3.0 | 213 | 1.3093 |
| 1.3837 | 4.0 | 284 | 1.2661 |
| 1.3354 | 5.0 | 355 | 1.2328 |
| 1.2976 | 6.0 | 426 | 1.2142 |
| 1.2678 | 7.0 | 497 | 1.1973 |
| 1.2401 | 8.0 | 568 | 1.1825 |
| 1.2168 | 9.0 | 639 | 1.1737 |
| 1.1971 | 10.0 | 710 | 1.1604 |
| 1.1797 | 11.0 | 781 | 1.1529 |
| 1.1671 | 12.0 | 852 | 1.1451 |
| 1.1517 | 13.0 | 923 | 1.1395 |
| 1.1389 | 14.0 | 994 | 1.1357 |
| 1.1267 | 15.0 | 1065 | 1.1325 |
| 1.1189 | 16.0 | 1136 | 1.1278 |
| 1.1076 | 17.0 | 1207 | 1.1246 |
| 1.1019 | 18.0 | 1278 | 1.1201 |
| 1.0925 | 19.0 | 1349 | 1.1201 |
| 1.0885 | 20.0 | 1420 | 1.1159 |
| 1.0806 | 21.0 | 1491 | 1.1165 |
| 1.0785 | 22.0 | 1562 | 1.1178 |
| 1.0703 | 23.0 | 1633 | 1.1171 |
| 1.0674 | 24.0 | 1704 | 1.1130 |
| 1.0604 | 25.0 | 1775 | 1.1132 |
| 1.062 | 26.0 | 1846 | 1.1120 |
| 1.0545 | 27.0 | 1917 | 1.1123 |
| 1.0534 | 28.0 | 1988 | 1.1106 |
| 1.0528 | 29.0 | 2059 | 1.1115 |
| 1.0516 | 30.0 | 2130 | 1.1112 |
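Assuming the reported validation loss is the mean per-token cross-entropy in nats (the standard causal-language-modeling loss in 🤗 Transformers), the final loss of 1.1112 corresponds to a perplexity of roughly 3.04:

```python
import math

# Perplexity is exp(cross-entropy); assumes the loss above is the mean
# per-token cross-entropy in nats.
final_val_loss = 1.1112
print(f"perplexity ≈ {math.exp(final_val_loss):.2f}")  # ≈ 3.04
```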
### Framework versions
- Transformers 4.35.2
- PyTorch 2.1.2
- Datasets 2.15.0
- Tokenizers 0.15.1