|
---
license: apache-2.0
base_model: t5-large
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: t5-large_cola_sp0_ar0
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: glue
      type: glue
      config: cola
      split: validation
      args: cola
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.880859375
---
|
|
|
|
|
|
# t5-large_cola_sp0_ar0 |
|
|
|
This model is a fine-tuned version of [t5-large](https://huggingface.co/t5-large) on the CoLA (Corpus of Linguistic Acceptability) task of the GLUE benchmark.
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.4179 |
|
- Accuracy: 0.8809 |
|
|
|
## Model description |
|
|
|
t5-large_cola_sp0_ar0 is a [t5-large](https://huggingface.co/t5-large) checkpoint fine-tuned as a binary classifier for linguistic acceptability: given an English sentence, it predicts whether the sentence is grammatically acceptable (label 1) or unacceptable (label 0), following the CoLA task definition from GLUE. A hedged usage sketch follows.
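
The snippet below is a minimal inference sketch, assuming the checkpoint was saved with a sequence-classification head (so that `AutoModelForSequenceClassification` can load it) and that `t5-large_cola_sp0_ar0` is a placeholder repository id; adjust both to match the actual checkpoint.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "t5-large_cola_sp0_ar0"  # placeholder repo id, replace with the actual Hub path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

sentence = "The book was written by the author."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# CoLA labels: 0 = unacceptable, 1 = acceptable
print(logits.argmax(dim=-1).item())
```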
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for binary acceptability judgments on single English sentences, as framed by the CoLA task. It has only been evaluated on the GLUE CoLA validation split (accuracy 0.8809), so performance on other domains, longer texts, or non-English input has not been measured and should be validated before use.
|
|
|
## Training and evaluation data |
|
|
|
Training and evaluation use the CoLA configuration of the GLUE benchmark: English sentences annotated as grammatically acceptable or not. The reported metrics are computed on the CoLA validation split, as declared in the model-index metadata above; a sketch of loading the data is shown below.
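
A minimal sketch of loading the same data with the `datasets` library (the standard `glue`/`cola` loader; no training-specific preprocessing from this run is implied):

```python
from datasets import load_dataset

# GLUE / CoLA: 'sentence' (str), 'label' (0 = unacceptable, 1 = acceptable), 'idx'
cola = load_dataset("glue", "cola")

print(cola)                    # train / validation / test splits
print(cola["validation"][0])   # a single validation example
```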
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a hedged `TrainingArguments` equivalent is sketched after the list):
|
- learning_rate: 5e-05 |
|
- train_batch_size: 16 |
|
- eval_batch_size: 32 |
|
- seed: 1 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- lr_scheduler_warmup_steps: 20 |
|
- num_epochs: 6 |
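
A hedged reconstruction of these settings as `TrainingArguments` is sketched below; the `output_dir` and any options not listed above (logging, evaluation cadence, checkpointing) are assumptions, not the exact configuration used.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; unlisted options are left at their defaults.
training_args = TrainingArguments(
    output_dir="t5-large_cola_sp0_ar0",  # assumed output directory name
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=6,
)
```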
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5885 | 0.05 | 25 | 0.6751 | 0.6913 |
| 0.5475 | 0.11 | 50 | 0.5338 | 0.6913 |
| 0.5122 | 0.16 | 75 | 0.4847 | 0.7919 |
| 0.4486 | 0.21 | 100 | 0.5089 | 0.7996 |
| 0.4087 | 0.27 | 125 | 0.5139 | 0.8063 |
| 0.4022 | 0.32 | 150 | 0.5188 | 0.8035 |
| 0.4245 | 0.37 | 175 | 0.5196 | 0.7987 |
| 0.4298 | 0.42 | 200 | 0.6226 | 0.8006 |
| 0.4326 | 0.48 | 225 | 0.6169 | 0.8015 |
| 0.4321 | 0.53 | 250 | 0.6173 | 0.7987 |
| 0.4288 | 0.58 | 275 | 0.4786 | 0.8102 |
| 0.3914 | 0.64 | 300 | 0.5147 | 0.8054 |
| 0.3519 | 0.69 | 325 | 0.5691 | 0.8150 |
| 0.4036 | 0.74 | 350 | 0.4560 | 0.8236 |
| 0.3706 | 0.8 | 375 | 0.4640 | 0.8245 |
| 0.3584 | 0.85 | 400 | 0.4605 | 0.8207 |
| 0.3539 | 0.9 | 425 | 0.4932 | 0.8217 |
| 0.3982 | 0.96 | 450 | 0.5397 | 0.8073 |
| 0.3352 | 1.01 | 475 | 0.5490 | 0.8150 |
| 0.2631 | 1.06 | 500 | 0.9244 | 0.8121 |
| 0.2992 | 1.11 | 525 | 0.5666 | 0.8169 |
| 0.2308 | 1.17 | 550 | 0.7285 | 0.8178 |
| 0.2893 | 1.22 | 575 | 0.6907 | 0.8198 |
| 0.2809 | 1.27 | 600 | 0.4998 | 0.8140 |
| 0.2469 | 1.33 | 625 | 0.7260 | 0.8236 |
| 0.331 | 1.38 | 650 | 0.5812 | 0.8293 |
| 0.286 | 1.43 | 675 | 0.5102 | 0.8360 |
| 0.347 | 1.49 | 700 | 0.5696 | 0.8255 |
| 0.2971 | 1.54 | 725 | 0.4114 | 0.8380 |
| 0.3048 | 1.59 | 750 | 0.5466 | 0.8169 |
| 0.3168 | 1.65 | 775 | 0.4787 | 0.8274 |
| 0.2247 | 1.7 | 800 | 0.7926 | 0.8063 |
| 0.2666 | 1.75 | 825 | 0.5763 | 0.8274 |
| 0.2856 | 1.8 | 850 | 0.5131 | 0.8303 |
| 0.2967 | 1.86 | 875 | 0.4970 | 0.8293 |
| 0.296 | 1.91 | 900 | 0.5532 | 0.8293 |
| 0.2828 | 1.96 | 925 | 0.4777 | 0.8274 |
| 0.2708 | 2.02 | 950 | 0.5433 | 0.8351 |
| 0.1406 | 2.07 | 975 | 0.6351 | 0.8351 |
| 0.2046 | 2.12 | 1000 | 0.6058 | 0.8332 |
| 0.2227 | 2.18 | 1025 | 0.5616 | 0.8408 |
| 0.1551 | 2.23 | 1050 | 1.0299 | 0.8360 |
| 0.1465 | 2.28 | 1075 | 0.7842 | 0.8380 |
| 0.2171 | 2.34 | 1100 | 0.6329 | 0.8437 |
| 0.1588 | 2.39 | 1125 | 0.7575 | 0.8418 |
| 0.4245 | 2.44 | 1150 | 0.7603 | 0.8351 |
| 0.2124 | 2.49 | 1175 | 0.5838 | 0.8447 |
| 0.2333 | 2.55 | 1200 | 0.4896 | 0.8418 |
| 0.1943 | 2.6 | 1225 | 0.6343 | 0.8332 |
| 0.1961 | 2.65 | 1250 | 0.6343 | 0.8284 |
| 0.1981 | 2.71 | 1275 | 0.6145 | 0.8332 |
| 0.2151 | 2.76 | 1300 | 0.6335 | 0.8360 |
| 0.1634 | 2.81 | 1325 | 1.1357 | 0.8399 |
| 0.1526 | 2.87 | 1350 | 1.0044 | 0.8293 |
| 0.2096 | 2.92 | 1375 | 0.7761 | 0.8360 |
| 0.2135 | 2.97 | 1400 | 0.9338 | 0.8351 |
| 0.155 | 3.03 | 1425 | 3.3297 | 0.8360 |
| 0.3667 | 3.08 | 1450 | 4.0564 | 0.8370 |
| 0.5925 | 3.13 | 1475 | 6.7411 | 0.8408 |
| 0.5866 | 3.18 | 1500 | 7.1940 | 0.8399 |
| 0.3812 | 3.24 | 1525 | 7.0097 | 0.8351 |
| 0.1041 | 3.29 | 1550 | 7.0157 | 0.8351 |
| 0.3451 | 3.34 | 1575 | 6.2653 | 0.8418 |
| 0.1121 | 3.4 | 1600 | 4.2608 | 0.8485 |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.33.2 |
|
- Pytorch 2.0.1+cu117 |
|
- Datasets 2.14.5 |
|
- Tokenizers 0.11.6 |
|
|