esm2_t6_8M_UR50D_ft_peft-IA3_Aerin_Yang_et_al_2023_prepared-dataset

This model is a fine-tuned version of facebook/esm2_t6_8M_UR50D on an unknown dataset (the repository name suggests a dataset prepared from Aerin Yang et al., 2023). It achieves the following results on the evaluation set:

  • Loss: 70.9898
  • Rmse: 21.7326
  • Mae: 17.9645
  • Spearmanr Corr: 0.8508
  • Spearmanr Corr P Value: 0.0000
  • Pearsonr Corr: 0.9197
  • Pearsonr Corr P Value: 0.0000
  • Spearmanr Corr Of Deltas: 0.8774
  • Spearmanr Corr Of Deltas P Value: 0.0
  • Pearsonr Corr Of Deltas: 0.9193
  • Pearsonr Corr Of Deltas P Value: 0.0
  • Ranking F1 Score: 0.7690
  • Ranking Mcc: 0.6009
  • Rmse Enriched: 7.7540
  • Mae Enriched: 6.8827
  • Spearmanr Corr Enriched: 0.5429
  • Spearmanr Corr Enriched P Value: 0.0000
  • Pearsonr Corr Enriched: 0.0133
  • Pearsonr Corr Enriched P Value: 0.8018
  • Spearmanr Corr Of Deltas Enriched: 0.4375
  • Spearmanr Corr Of Deltas Enriched P Value: 0.0
  • Pearsonr Corr Of Deltas Enriched: 0.0155
  • Pearsonr Corr Of Deltas Enriched P Value: 0.0001
  • Ranking F1 Score Enriched: 0.6552
  • Ranking Mcc Enriched: 0.3704
  • Classification Thresh: 0.2
  • Mcc: 0.8953
  • F1 Score: 0.9508
  • Acc: 0.9478
  • Auc: 0.9772
  • Precision: 0.9471
  • Recall: 0.9482
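The correlation and threshold-based classification numbers above are standard metrics; the following self-contained sketch shows how they are typically computed from regression predictions. The formulas are hand-rolled for illustration (the Spearman helper assumes no rank ties) and are not the evaluation code actually used for this model:

```python
import numpy as np

def pearson(x, y):
    # Pearson correlation: covariance normalized by both standard deviations
    return float(np.corrcoef(x, y)[0, 1])

def spearman(x, y):
    # Spearman correlation = Pearson correlation of the ranks
    # (double argsort yields ranks; assumes no tied values)
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return pearson(rx, ry)

def classification_metrics(pred, true, thresh):
    # Binarize regression outputs at `thresh` (e.g. the reported
    # Classification Thresh of 0.2), then compute accuracy, F1, and MCC
    p = pred >= thresh
    t = true >= thresh
    tp = np.sum(p & t)
    tn = np.sum(~p & ~t)
    fp = np.sum(p & ~t)
    fn = np.sum(~p & t)
    acc = (tp + tn) / len(p)
    f1 = 2 * tp / (2 * tp + fp + fn)
    mcc = (tp * tn - fp * fn) / np.sqrt(
        float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    return acc, f1, mcc
```

The "Of Deltas" variants apply the same correlations to pairwise differences of predictions and labels, and the "Enriched" variants restrict evaluation to an enriched subset of the data.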

Model description

More information needed
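No description was provided, but the repository name indicates IA³ parameter-efficient fine-tuning (Liu et al., 2022) of the 8M-parameter ESM-2 protein language model: the pretrained weights stay frozen, and only small learned rescaling vectors are trained. A minimal numeric sketch of the IA³ idea (illustrative only, not this repo's code):

```python
import numpy as np

# IA^3 sketch: instead of updating a frozen weight matrix W, learn a
# per-feature scaling vector l and rescale the activation W @ x elementwise.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))   # frozen pretrained weight
l = np.ones(8)                # the only trainable parameters (initialized to 1)

def ia3_forward(x):
    # elementwise rescaling of the frozen layer's activation
    return l * (W @ x)

x = rng.normal(size=4)
# With l = 1, IA^3 starts as an identity wrapper around the frozen layer
assert np.allclose(ia3_forward(x), W @ x)
```

In the full method these vectors rescale the keys, values, and intermediate feed-forward activations of each transformer block, so the number of trainable parameters is a tiny fraction of the base model.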

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.00033
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • num_epochs: 100
  • mixed_precision_training: Native AMP
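Note that with train_batch_size 32 and 2 gradient-accumulation steps, the effective batch size is 64, matching total_train_batch_size. The linear schedule with 200 warmup steps can be sketched as follows; the total step count of 4200 is an assumption extrapolated from the training results (42 optimizer steps per epoch × 100 epochs):

```python
def linear_schedule_lr(step, base_lr=0.00033, warmup_steps=200, total_steps=4200):
    # Linear warmup from 0 to base_lr over warmup_steps, then linear
    # decay from base_lr down to 0 at total_steps.
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```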

Training results

| Training Loss | Epoch | Step | Validation Loss | Rmse | Mae | Spearmanr Corr | Spearmanr Corr P Value | Pearsonr Corr | Pearsonr Corr P Value | Spearmanr Corr Of Deltas | Spearmanr Corr Of Deltas P Value | Pearsonr Corr Of Deltas | Pearsonr Corr Of Deltas P Value | Ranking F1 Score | Ranking Mcc | Rmse Enriched | Mae Enriched | Spearmanr Corr Enriched | Spearmanr Corr Enriched P Value | Pearsonr Corr Enriched | Pearsonr Corr Enriched P Value | Spearmanr Corr Of Deltas Enriched | Spearmanr Corr Of Deltas Enriched P Value | Pearsonr Corr Of Deltas Enriched | Pearsonr Corr Of Deltas Enriched P Value | Ranking F1 Score Enriched | Ranking Mcc Enriched | Classification Thresh | Mcc | F1 Score | Acc | Auc | Precision | Recall |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 460.7964 | 1.0 | 42 | 434.4817 | 18.6715 | 14.1108 | 0.7207 | 0.0000 | 0.7197 | 0.0000 | 0.7023 | 0.0 | 0.7183 | 0.0 | 0.7023 | 0.4735 | 2.6853 | 2.6581 | 0.2823 | 0.0000 | -0.0060 | 0.9101 | 0.2369 | 0.0 | -0.0062 | 0.1180 | 0.5573 | 0.1874 | 0.2 | 0.6429 | 0.8116 | 0.8149 | 0.9130 | 0.8225 | 0.8204 |
| 401.283 | 2.0 | 84 | 356.9070 | 19.0315 | 14.1069 | 0.7920 | 0.0000 | 0.8101 | 0.0000 | 0.8001 | 0.0 | 0.8092 | 0.0 | 0.7410 | 0.5476 | 1.5232 | 0.9534 | 0.4040 | 0.0000 | 0.0115 | 0.8284 | 0.3485 | 0.0 | 0.0136 | 0.0006 | 0.6080 | 0.2772 | 0.5 | 0.7784 | 0.8997 | 0.8896 | 0.9490 | 0.8916 | 0.8868 |
| 296.2103 | 3.0 | 126 | 266.6812 | 19.6865 | 15.2607 | 0.7767 | 0.0000 | 0.8307 | 0.0000 | 0.7993 | 0.0 | 0.8300 | 0.0 | 0.7236 | 0.5142 | 2.7693 | 2.4028 | 0.2798 | 0.0000 | 0.0089 | 0.8660 | 0.2542 | 0.0 | 0.0111 | 0.0048 | 0.5624 | 0.1953 | 0.5 | 0.7872 | 0.9034 | 0.8940 | 0.9518 | 0.8956 | 0.8917 |
| 276.8175 | 4.0 | 168 | 264.8241 | 21.4862 | 17.0255 | 0.6996 | 0.0000 | 0.7229 | 0.0000 | 0.7083 | 0.0 | 0.7216 | 0.0 | 0.6880 | 0.4460 | 4.7746 | 4.6865 | 0.0538 | 0.3096 | -0.0040 | 0.9404 | 0.0691 | 0.0000 | -0.0033 | 0.4020 | 0.4815 | 0.0470 | 0.5 | 0.6588 | 0.8512 | 0.8179 | 0.9274 | 0.8546 | 0.8060 |
| 186.8575 | 5.0 | 210 | 132.7174 | 21.4535 | 18.0706 | 0.8269 | 0.0000 | 0.8739 | 0.0000 | 0.8452 | 0.0 | 0.8734 | 0.0 | 0.7529 | 0.5703 | 7.6455 | 7.4059 | 0.4745 | 0.0000 | 0.0035 | 0.9477 | 0.3586 | 0.0 | 0.0054 | 0.1720 | 0.6306 | 0.3234 | 0.2 | 0.8445 | 0.9268 | 0.9224 | 0.9664 | 0.9216 | 0.9228 |
| 143.6457 | 6.0 | 252 | 129.9129 | 22.7771 | 19.2314 | 0.8202 | 0.0000 | 0.8590 | 0.0000 | 0.8268 | 0.0 | 0.8584 | 0.0 | 0.7534 | 0.5712 | 9.4125 | 9.2343 | 0.4869 | 0.0000 | 0.0092 | 0.8619 | 0.3867 | 0.0 | 0.0097 | 0.0143 | 0.6324 | 0.3272 | 0.4 | 0.8111 | 0.9138 | 0.9060 | 0.9608 | 0.9070 | 0.9041 |
| 121.6148 | 7.0 | 294 | 98.5414 | 23.5646 | 20.1629 | 0.8314 | 0.0000 | 0.8928 | 0.0000 | 0.8527 | 0.0 | 0.8924 | 0.0 | 0.7544 | 0.5733 | 11.1044 | 10.8920 | 0.4789 | 0.0000 | 0.0106 | 0.8414 | 0.3919 | 0.0 | 0.0098 | 0.0133 | 0.6331 | 0.3247 | 0.2 | 0.8619 | 0.9361 | 0.9313 | 0.9693 | 0.9312 | 0.9308 |
| 130.6186 | 8.0 | 336 | 83.4085 | 22.8860 | 19.5197 | 0.8440 | 0.0000 | 0.9093 | 0.0000 | 0.8670 | 0.0 | 0.9089 | 0.0 | 0.7628 | 0.5891 | 10.2468 | 9.8949 | 0.5147 | 0.0000 | 0.0144 | 0.7863 | 0.3918 | 0.0 | 0.0172 | 0.0000 | 0.6469 | 0.3523 | 0.4 | 0.8803 | 0.9438 | 0.9403 | 0.9747 | 0.9396 | 0.9406 |
| 111.8285 | 9.0 | 378 | 126.4507 | 22.5877 | 18.9855 | 0.8336 | 0.0000 | 0.8619 | 0.0000 | 0.8273 | 0.0 | 0.8613 | 0.0 | 0.7563 | 0.5769 | 8.6582 | 8.4736 | 0.4969 | 0.0000 | 0.0143 | 0.7865 | 0.4101 | 0.0 | 0.0162 | 0.0000 | 0.6403 | 0.3378 | 0.5 | 0.8169 | 0.9176 | 0.9075 | 0.9691 | 0.9137 | 0.9033 |
| 113.5487 | 10.0 | 420 | 96.3994 | 21.4343 | 18.1736 | 0.8491 | 0.0000 | 0.8923 | 0.0000 | 0.8688 | 0.0 | 0.8919 | 0.0 | 0.7643 | 0.5922 | 8.7046 | 8.0078 | 0.5176 | 0.0000 | 0.0177 | 0.7377 | 0.3835 | 0.0 | 0.0144 | 0.0003 | 0.6488 | 0.3507 | 0.4 | 0.8381 | 0.9179 | 0.9164 | 0.9787 | 0.9182 | 0.9199 |
| 123.4506 | 11.0 | 462 | 87.6575 | 23.5949 | 20.1224 | 0.8338 | 0.0000 | 0.9054 | 0.0000 | 0.8528 | 0.0 | 0.9051 | 0.0 | 0.7559 | 0.5760 | 10.7717 | 10.5746 | 0.4567 | 0.0000 | 0.0130 | 0.8061 | 0.3812 | 0.0 | 0.0147 | 0.0002 | 0.6241 | 0.3075 | 0.2 | 0.8800 | 0.9449 | 0.9403 | 0.9745 | 0.9409 | 0.9391 |
| 96.0486 | 12.0 | 504 | 128.6021 | 20.4876 | 17.3638 | 0.8378 | 0.0000 | 0.8470 | 0.0000 | 0.8524 | 0.0 | 0.8463 | 0.0 | 0.7622 | 0.5881 | 8.8083 | 7.5890 | 0.4958 | 0.0000 | 0.0487 | 0.3579 | 0.3968 | 0.0 | 0.0462 | 0.0000 | 0.6410 | 0.3376 | 0.01 | 0.7802 | 0.8740 | 0.8791 | 0.9739 | 0.8937 | 0.8865 |
| 109.1046 | 13.0 | 546 | 84.4819 | 23.2523 | 19.7547 | 0.8349 | 0.0000 | 0.9081 | 0.0000 | 0.8602 | 0.0 | 0.9078 | 0.0 | 0.7564 | 0.5769 | 10.6298 | 10.2565 | 0.4339 | 0.0000 | 0.0109 | 0.8364 | 0.3583 | 0.0 | 0.0140 | 0.0004 | 0.6131 | 0.2904 | 0.2 | 0.8892 | 0.9481 | 0.9448 | 0.9776 | 0.9442 | 0.9450 |
| 94.8759 | 14.0 | 588 | 78.4801 | 22.0612 | 18.5745 | 0.8409 | 0.0000 | 0.9109 | 0.0000 | 0.8682 | 0.0 | 0.9106 | 0.0 | 0.7568 | 0.5778 | 9.1375 | 8.3497 | 0.4721 | 0.0000 | 0.0105 | 0.8433 | 0.3769 | 0.0 | 0.0114 | 0.0040 | 0.6252 | 0.3134 | 0.2 | 0.8881 | 0.9456 | 0.9433 | 0.9774 | 0.9429 | 0.9451 |
| 84.9398 | 15.0 | 630 | 81.5671 | 21.6981 | 18.0842 | 0.8384 | 0.0000 | 0.9089 | 0.0000 | 0.8614 | 0.0 | 0.9086 | 0.0 | 0.7581 | 0.5802 | 7.6759 | 7.0524 | 0.4686 | 0.0000 | 0.0030 | 0.9549 | 0.3807 | 0.0 | 0.0042 | 0.2859 | 0.6245 | 0.3111 | 0.4 | 0.8799 | 0.9444 | 0.9403 | 0.9765 | 0.9402 | 0.9398 |
| 89.9718 | 16.0 | 672 | 96.1698 | 21.6908 | 17.9853 | 0.8413 | 0.0000 | 0.8914 | 0.0000 | 0.8532 | 0.0 | 0.8909 | 0.0 | 0.7591 | 0.5823 | 7.3056 | 6.7279 | 0.4899 | 0.0000 | 0.0175 | 0.7407 | 0.4178 | 0.0 | 0.0199 | 0.0000 | 0.6384 | 0.3335 | 0.4 | 0.8623 | 0.9372 | 0.9313 | 0.9750 | 0.9328 | 0.9295 |
| 93.6391 | 17.0 | 714 | 75.8101 | 22.3371 | 18.6930 | 0.8431 | 0.0000 | 0.9153 | 0.0000 | 0.8726 | 0.0 | 0.9149 | 0.0 | 0.7627 | 0.5889 | 9.3630 | 8.5478 | 0.4792 | 0.0000 | 0.0124 | 0.8146 | 0.3869 | 0.0 | 0.0133 | 0.0007 | 0.6338 | 0.3261 | 0.1 | 0.8881 | 0.9456 | 0.9433 | 0.9773 | 0.9429 | 0.9451 |
| 74.3313 | 18.0 | 756 | 70.9898 | 21.7326 | 17.9645 | 0.8508 | 0.0000 | 0.9197 | 0.0000 | 0.8774 | 0.0 | 0.9193 | 0.0 | 0.7690 | 0.6009 | 7.7540 | 6.8827 | 0.5429 | 0.0000 | 0.0133 | 0.8018 | 0.4375 | 0.0 | 0.0155 | 0.0001 | 0.6552 | 0.3704 | 0.2 | 0.8953 | 0.9508 | 0.9478 | 0.9772 | 0.9471 | 0.9482 |

Framework versions

  • Transformers 4.45.1
  • Pytorch 2.0.0+cu117
  • Datasets 2.21.0
  • Tokenizers 0.20.3