# rel-cl-lstm-0
This model is a fine-tuned language model; the base model and training dataset are not specified in this card. It achieves the following results on the evaluation set:
- Loss: 3.9774
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 3052726
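The training script itself is not included in this card. As a rough illustration only, the hyperparameters listed above would map onto Hugging Face `transformers.TrainingArguments` roughly as in the sketch below; the output directory is a placeholder, and the model/dataset setup is omitted because it is not specified here.

```python
# Illustrative sketch only: expressing the hyperparameters listed above with
# transformers.TrainingArguments. The output_dir is a placeholder and the
# model/dataset wiring is not part of this card.
from transformers import TrainingArguments, set_seed

set_seed(0)  # matches the reported seed

training_args = TrainingArguments(
    output_dir="rel-cl-lstm-0",      # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=0,
    lr_scheduler_type="linear",
    max_steps=3_052_726,             # training_steps above
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

A `Trainer` constructed with these arguments (plus the unspecified model and data) would reproduce the optimizer and schedule settings listed above, but this sketch is not the actual training code used for this model.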
### Training results
| Training Loss | Epoch | Step    | Validation Loss |
|:-------------:|:-----:|:-------:|:---------------:|
| 4.8082        | 0.03  | 76319   | 4.7686          |
| 4.5186        | 1.03  | 152638  | 4.4836          |
| 4.3751        | 0.03  | 228957  | 4.3481          |
| 4.2868        | 1.03  | 305276  | 4.2655          |
| 4.2227        | 0.03  | 381595  | 4.2085          |
| 4.1713        | 0.03  | 457914  | 4.1671          |
| 4.1342        | 0.03  | 534233  | 4.1361          |
| 4.1036        | 0.03  | 610552  | 4.1114          |
| 4.0715        | 1.03  | 686871  | 4.0915          |
| 4.0524        | 0.03  | 763190  | 4.0756          |
| 4.0316        | 1.03  | 839509  | 4.0636          |
| 4.0127        | 0.03  | 915828  | 4.0522          |
| 3.9912        | 0.03  | 992147  | 4.0424          |
| 3.9787        | 1.03  | 1068466 | 4.0349          |
| 3.9572        | 0.03  | 1144786 | 4.0269          |
| 3.9465        | 1.03  | 1221106 | 4.0206          |
| 3.9442        | 0.03  | 1297426 | 4.0153          |
| 3.9335        | 1.03  | 1373746 | 4.0111          |
| 3.9232        | 0.03  | 1450066 | 4.0068          |
| 3.9185        | 1.03  | 1526386 | 4.0029          |
| 3.9139        | 0.03  | 1602706 | 3.9997          |
| 3.9108        | 1.03  | 1679026 | 3.9973          |
| 3.9081        | 0.03  | 1755346 | 3.9954          |
| 3.8976        | 1.03  | 1831666 | 3.9930          |
| 3.8919        | 0.03  | 1907986 | 3.9912          |
| 3.8824        | 1.03  | 1984306 | 3.9896          |
| 3.8759        | 0.03  | 2060626 | 3.9880          |
| 3.8735        | 1.03  | 2136946 | 3.9865          |
| 3.8676        | 0.03  | 2213266 | 3.9854          |
| 3.8588        | 1.03  | 2289586 | 3.9842          |
| 3.8596        | 0.03  | 2365906 | 3.9830          |
| 3.8594        | 0.03  | 2442226 | 3.9820          |
| 3.8535        | 1.03  | 2518546 | 3.9811          |
| 3.8489        | 0.03  | 2594866 | 3.9804          |
| 3.8453        | 1.03  | 2671186 | 3.9795          |
| 3.8472        | 0.03  | 2747506 | 3.9791          |
| 3.8447        | 1.03  | 2823826 | 3.9786          |
| 3.8473        | 0.03  | 2900146 | 3.9781          |
| 3.8489        | 0.03  | 2976466 | 3.9777          |
| 3.8439        | 1.02  | 3052726 | 3.9774          |
### Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3