# conservation_gc_codon_t5_small
This model is a fine-tuned version of google-t5/t5-small on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3890
- Accuracy: 0.8465
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- num_epochs: 18
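The polynomial scheduler decays the learning rate from 2e-05 toward an end value over the full run (18 epochs × 1344 steps = 24192 optimizer steps, per the table below). A minimal sketch of that decay curve; the power (1.0, i.e. linear) and the decay-to-zero end value are assumptions, not read from the training config:

```python
def polynomial_lr(step, total_steps, base_lr=2e-05, end_lr=0.0, power=1.0):
    """Polynomial decay from base_lr to end_lr over total_steps (no warmup)."""
    if step >= total_steps:
        return end_lr
    remaining = 1.0 - step / total_steps
    return (base_lr - end_lr) * remaining ** power + end_lr

total = 18 * 1344  # epochs x steps-per-epoch = 24192
print(polynomial_lr(0, total))           # base rate at step 0
print(polynomial_lr(total // 2, total))  # halfway through training
print(polynomial_lr(total, total))       # end of training
```

With power=1.0 this reduces to a plain linear ramp down, which is also what a `linear` scheduler would produce; a higher power front-loads the decay.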
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.6949 | 1.0 | 1344 | 0.6931 | 0.4458 |
| 0.624 | 2.0 | 2688 | 0.4578 | 0.8307 |
| 0.5455 | 3.0 | 4032 | 0.3847 | 0.8624 |
| 0.5393 | 4.0 | 5376 | 0.3912 | 0.8648 |
| 0.5208 | 5.0 | 6720 | 0.3811 | 0.8624 |
| 0.5277 | 6.0 | 8064 | 0.4022 | 0.8648 |
| 0.5241 | 7.0 | 9408 | 0.3903 | 0.8575 |
| 0.5191 | 8.0 | 10752 | 0.3941 | 0.8465 |
| 0.5031 | 9.0 | 12096 | 0.3941 | 0.8538 |
| 0.5046 | 10.0 | 13440 | 0.3892 | 0.8441 |
| 0.5183 | 11.0 | 14784 | 0.3800 | 0.8465 |
| 0.5088 | 12.0 | 16128 | 0.3783 | 0.8551 |
| 0.5247 | 13.0 | 17472 | 0.3861 | 0.8502 |
| 0.5162 | 14.0 | 18816 | 0.3775 | 0.8514 |
| 0.5125 | 15.0 | 20160 | 0.3819 | 0.8538 |
| 0.4959 | 16.0 | 21504 | 0.3851 | 0.8502 |
| 0.5047 | 17.0 | 22848 | 0.3876 | 0.8477 |
| 0.5014 | 18.0 | 24192 | 0.3890 | 0.8465 |
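Note that the reported final checkpoint (epoch 18) is not the best one in this log: validation loss bottoms out at epoch 14 (0.3775) and accuracy peaks at epochs 4 and 6 (0.8648). A quick way to pick the best epoch from logs like these, with the table transcribed as `(epoch, validation_loss, accuracy)` triples:

```python
# (epoch, validation_loss, accuracy) transcribed from the table above
results = [
    (1, 0.6931, 0.4458), (2, 0.4578, 0.8307), (3, 0.3847, 0.8624),
    (4, 0.3912, 0.8648), (5, 0.3811, 0.8624), (6, 0.4022, 0.8648),
    (7, 0.3903, 0.8575), (8, 0.3941, 0.8465), (9, 0.3941, 0.8538),
    (10, 0.3892, 0.8441), (11, 0.3800, 0.8465), (12, 0.3783, 0.8551),
    (13, 0.3861, 0.8502), (14, 0.3775, 0.8514), (15, 0.3819, 0.8538),
    (16, 0.3851, 0.8502), (17, 0.3876, 0.8477), (18, 0.3890, 0.8465),
]

best_loss = min(results, key=lambda r: r[1])  # lowest validation loss
best_acc = max(results, key=lambda r: r[2])   # highest accuracy (first of any tie)
print(best_loss)  # epoch 14
print(best_acc)   # epoch 4
```

If early stopping or `load_best_model_at_end` were used with these logs, the served checkpoint would come from one of those epochs rather than epoch 18.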
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
## Model tree for bif02/conservation_gc_codon_t5_small

Base model: google-t5/t5-small