DNADebertaK6_Arabidopsis

This model is a fine-tuned version of an unspecified base checkpoint on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.7194
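
The loss value alone is hard to interpret. Assuming it is the mean masked-LM cross-entropy that the `Trainer` reports by default (an assumption, since the card does not say), it can be converted to perplexity as a quick sanity check:

```python
import math

# Assumption: the reported eval loss is mean cross-entropy in nats,
# as produced by the Hugging Face Trainer for masked language modeling.
eval_loss = 1.7194
perplexity = math.exp(eval_loss)
print(f"perplexity ~ {perplexity:.2f}")  # ~ 5.58
```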

Model description

More information needed

Intended uses & limitations

More information needed
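
Since the card leaves usage unspecified, the following is only a minimal loading sketch. It assumes (from the model name, not the card) that the checkpoint is a DeBERTa-style masked language model over 6-mer DNA tokens; the repo id and the spaced 6-mer input format are hypothetical placeholders.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Hypothetical repo id -- replace with the actual checkpoint path.
checkpoint = "path/to/DNADebertaK6_Arabidopsis"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Assumption: the tokenizer expects overlapping 6-mers separated by spaces.
sequence = "ATGGCA TGGCAT GGCATC GCATCG CATCGT"
inputs = tokenizer(sequence, return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, sequence_length, vocab_size)
```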

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 600001
  • mixed_precision_training: Native AMP
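
For reference, a sketch of how these settings map onto `transformers.TrainingArguments`; the output directory is a hypothetical placeholder, and the dataset and model setup are not specified by the card.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="dnadebertak6-arabidopsis",  # hypothetical path
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=600001,
    fp16=True,  # native AMP mixed-precision training
)
```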

Training results

| Training Loss | Epoch  | Step   | Validation Loss |
|:-------------:|:------:|:------:|:---------------:|
| 4.6174        | 6.12   | 20000  | 1.9257          |
| 1.8873        | 12.24  | 40000  | 1.8098          |
| 1.8213        | 18.36  | 60000  | 1.7952          |
| 1.8042        | 24.48  | 80000  | 1.7888          |
| 1.7945        | 30.6   | 100000 | 1.7861          |
| 1.7873        | 36.72  | 120000 | 1.7772          |
| 1.782         | 42.84  | 140000 | 1.7757          |
| 1.7761        | 48.96  | 160000 | 1.7632          |
| 1.7714        | 55.08  | 180000 | 1.7685          |
| 1.7677        | 61.2   | 200000 | 1.7568          |
| 1.7637        | 67.32  | 220000 | 1.7570          |
| 1.7585        | 73.44  | 240000 | 1.7442          |
| 1.7554        | 79.56  | 260000 | 1.7556          |
| 1.7515        | 85.68  | 280000 | 1.7505          |
| 1.7483        | 91.8   | 300000 | 1.7463          |
| 1.745         | 97.92  | 320000 | 1.7425          |
| 1.7427        | 104.04 | 340000 | 1.7425          |
| 1.7398        | 110.16 | 360000 | 1.7359          |
| 1.7377        | 116.28 | 380000 | 1.7369          |
| 1.7349        | 122.4  | 400000 | 1.7340          |
| 1.7325        | 128.52 | 420000 | 1.7313          |
| 1.731         | 134.64 | 440000 | 1.7256          |
| 1.7286        | 140.76 | 460000 | 1.7238          |
| 1.7267        | 146.88 | 480000 | 1.7324          |
| 1.7247        | 153.0  | 500000 | 1.7247          |
| 1.7228        | 159.12 | 520000 | 1.7185          |
| 1.7209        | 165.24 | 540000 | 1.7166          |
| 1.7189        | 171.36 | 560000 | 1.7206          |
| 1.7181        | 177.48 | 580000 | 1.7190          |
| 1.7159        | 183.6  | 600000 | 1.7194          |

Framework versions

  • Transformers 4.19.2
  • Pytorch 1.11.0
  • Datasets 2.2.2
  • Tokenizers 0.12.1