# superlative-quantifier-lstm-1
This model is a fine-tuned version of an unspecified base model on the None dataset. It achieves the following results on the evaluation set:
- Loss: 3.9853
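Since this is a language-modeling loss (assuming the usual convention of mean cross-entropy in nats per token), the final evaluation loss corresponds to a perplexity of roughly exp(3.9853) ≈ 53.8. A minimal check:

```python
import math

# Assumption: the reported loss is mean token-level cross-entropy in nats,
# so perplexity is simply its exponential.
eval_loss = 3.9853
print(f"perplexity ≈ {math.exp(eval_loss):.1f}")  # ≈ 53.8
```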
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged sketch of the corresponding configuration follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 1
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 3052726
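Assuming the model was trained with the Hugging Face `Trainer`, the hyperparameters above would map onto a configuration roughly like the sketch below. The `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults, spelled out only for completeness.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the configuration above; output_dir is a
# placeholder, not the path actually used during training.
training_args = TrainingArguments(
    output_dir="superlative-quantifier-lstm-1",
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=1,
    lr_scheduler_type="linear",
    max_steps=3052726,   # training_steps above
    adam_beta1=0.9,      # Adam defaults, shown for completeness
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```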
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
4.7763 | 0.03 | 76320 | 4.7620 |
4.4989 | 1.03 | 152640 | 4.4853 |
4.3556 | 0.03 | 228960 | 4.3511 |
4.2721 | 1.03 | 305280 | 4.2690 |
4.2111 | 0.03 | 381600 | 4.2130 |
4.1619 | 1.03 | 457920 | 4.1721 |
4.1268 | 0.03 | 534240 | 4.1413 |
4.0969 | 1.03 | 610560 | 4.1164 |
4.0684 | 0.03 | 686880 | 4.0977 |
4.0424 | 1.03 | 763200 | 4.0819 |
4.0213 | 0.03 | 839520 | 4.0687 |
4.0022 | 1.03 | 915840 | 4.0577 |
3.9921 | 2.03 | 992160 | 4.0491 |
3.9758 | 0.03 | 1068480 | 4.0414 |
3.9621 | 1.03 | 1144800 | 4.0342 |
3.9449 | 2.03 | 1221120 | 4.0284 |
3.931 | 0.03 | 1297440 | 4.0237 |
3.9215 | 1.03 | 1373760 | 4.0190 |
3.9109 | 0.03 | 1450080 | 4.0155 |
3.9095 | 0.03 | 1526400 | 4.0124 |
3.9049 | 1.03 | 1602720 | 4.0085 |
3.9 | 0.03 | 1679040 | 4.0056 |
3.8962 | 1.03 | 1755360 | 4.0035 |
3.8938 | 2.03 | 1831680 | 4.0011 |
3.8866 | 0.03 | 1908000 | 3.9991 |
3.8816 | 0.03 | 1984320 | 3.9976 |
3.8738 | 0.03 | 2060640 | 3.9959 |
3.8678 | 1.03 | 2136960 | 3.9945 |
3.8677 | 0.03 | 2213280 | 3.9936 |
3.8607 | 1.03 | 2289600 | 3.9922 |
3.8561 | 2.03 | 2365920 | 3.9911 |
3.8476 | 0.03 | 2442240 | 3.9901 |
3.841 | 1.03 | 2518560 | 3.9895 |
3.8375 | 2.03 | 2594880 | 3.9884 |
3.834 | 0.03 | 2671200 | 3.9873 |
3.837 | 0.03 | 2747520 | 3.9869 |
3.839 | 1.03 | 2823840 | 3.9863 |
3.8355 | 0.03 | 2900160 | 3.9858 |
3.8396 | 1.03 | 2976480 | 3.9855 |
3.8402 | 2.02 | 3052726 | 3.9853 |
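Validation loss falls quickly over the first few hundred thousand steps and then flattens toward the final value. A quick way to visualize the trend, using a handful of (step, validation loss) checkpoints transcribed from the table above (matplotlib assumed to be installed):

```python
import matplotlib.pyplot as plt

# A subset of (step, validation loss) checkpoints copied from the table above.
steps  = [76320, 305280, 610560, 915840, 1526400, 2136960, 2747520, 3052726]
losses = [4.7620, 4.2690, 4.1164, 4.0577, 4.0124, 3.9945, 3.9869, 3.9853]

plt.plot(steps, losses, marker="o")
plt.xlabel("Training step")
plt.ylabel("Validation loss")
plt.title("superlative-quantifier-lstm-1 validation loss")
plt.show()
```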
### Framework versions
- Transformers 4.33.3
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3