---
license: mit
base_model: facebook/esm2_t6_8M_UR50D
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
model-index:
  - name: esm_ft_Aerin_Yang_et_al_2023
    results: []
---

# esm_ft_Aerin_Yang_et_al_2023

This model is a fine-tuned version of [facebook/esm2_t6_8M_UR50D](https://huggingface.co/facebook/esm2_t6_8M_UR50D) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0111
- Rmse: 0.3306
- Mae: 0.2493
- Spearmanr Corr: 0.8522
- Spearmanr Corr P Value: 0.0000
- Pearsonr Corr: 0.9108
- Pearsonr Corr P Value: 0.0000
- Spearmanr Corr Of Deltas: 0.8703
- Spearmanr Corr Of Deltas P Value: 0.0
- Pearsonr Corr Of Deltas: 0.9104
- Pearsonr Corr Of Deltas P Value: 0.0
- Ranking F1 Score: 0.7710
- Ranking Mcc: 0.6081
- Rmse Enriched: 0.1210
- Mae Enriched: 0.0675
- Spearmanr Corr Enriched: 0.5778
- Spearmanr Corr Enriched P Value: 0.0000
- Pearsonr Corr Enriched: 0.1383
- Pearsonr Corr Enriched P Value: 0.0082
- Spearmanr Corr Of Deltas Enriched: 0.5093
- Spearmanr Corr Of Deltas Enriched P Value: 0.0
- Pearsonr Corr Of Deltas Enriched: 0.1388
- Pearsonr Corr Of Deltas Enriched P Value: 0.0000
- Ranking F1 Score Enriched: 0.6722
- Ranking Mcc Enriched: 0.4113
- Classification Thresh: 0.3
- Mcc: 0.8926
- F1 Score: 0.9499
- Acc: 0.9463
- Auc: 0.9778
- Precision: 0.9451
- Recall: 0.9475

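The regression metrics above (RMSE, MAE, Spearman/Pearson correlations) and the thresholded classification metrics (MCC, F1, accuracy) can be reproduced with `scipy` and `scikit-learn`. A minimal sketch on illustrative toy arrays (not the actual evaluation data; the 0.3 threshold mirrors the reported `Classification Thresh`):

```python
import numpy as np
from scipy.stats import spearmanr, pearsonr
from sklearn.metrics import matthews_corrcoef, f1_score, accuracy_score

# Illustrative predictions and targets -- NOT the model's evaluation set.
y_true = np.array([0.10, 0.40, 0.35, 0.80, 0.05])
y_pred = np.array([0.15, 0.38, 0.30, 0.75, 0.10])

# Regression metrics.
rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))
mae = np.mean(np.abs(y_pred - y_true))
rho, rho_p = spearmanr(y_pred, y_true)   # Spearmanr Corr / P Value
r, r_p = pearsonr(y_pred, y_true)        # Pearsonr Corr / P Value

# Classification metrics come from binarizing at the reported threshold.
thresh = 0.3
true_cls = (y_true >= thresh).astype(int)
pred_cls = (y_pred >= thresh).astype(int)
mcc = matthews_corrcoef(true_cls, pred_cls)
f1 = f1_score(true_cls, pred_cls)
acc = accuracy_score(true_cls, pred_cls)
```

The "Of Deltas" variants presumably apply the same correlations to pairwise differences between predictions (and between targets); that detail is not documented in this card.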
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: not_parallel
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 100
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rmse | Mae | Spearmanr Corr | Spearmanr Corr P Value | Pearsonr Corr | Pearsonr Corr P Value | Spearmanr Corr Of Deltas | Spearmanr Corr Of Deltas P Value | Pearsonr Corr Of Deltas | Pearsonr Corr Of Deltas P Value | Ranking F1 Score | Ranking Mcc | Rmse Enriched | Mae Enriched | Spearmanr Corr Enriched | Spearmanr Corr Enriched P Value | Pearsonr Corr Enriched | Pearsonr Corr Enriched P Value | Spearmanr Corr Of Deltas Enriched | Spearmanr Corr Of Deltas Enriched P Value | Pearsonr Corr Of Deltas Enriched | Pearsonr Corr Of Deltas Enriched P Value | Ranking F1 Score Enriched | Ranking Mcc Enriched | Classification Thresh | Mcc | F1 Score | Acc | Auc | Precision | Recall |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.038 | 1.0 | 335 | 0.0209 | 0.3315 | 0.2524 | 0.8158 | 0.0000 | 0.8282 | 0.0000 | 0.7984 | 0.0 | 0.8274 | 0.0 | 0.7496 | 0.5671 | 0.1044 | 0.0512 | 0.5212 | 0.0000 | 0.1479 | 0.0046 | 0.4274 | 0.0 | 0.1481 | 0.0000 | 0.6431 | 0.3596 | 0.3 | 0.8185 | 0.9198 | 0.9090 | 0.9573 | 0.9145 | 0.9040 |
| 0.0245 | 2.0 | 670 | 0.0145 | 0.3226 | 0.2470 | 0.8075 | 0.0000 | 0.8774 | 0.0000 | 0.8309 | 0.0 | 0.8769 | 0.0 | 0.7425 | 0.5536 | 0.1099 | 0.0566 | 0.3775 | 0.0000 | 0.1397 | 0.0075 | 0.3448 | 0.0 | 0.1396 | 0.0000 | 0.5868 | 0.2564 | 0.3 | 0.8585 | 0.9361 | 0.9299 | 0.9686 | 0.9299 | 0.9286 |
| 0.0182 | 3.0 | 1005 | 0.0150 | 0.3305 | 0.2520 | 0.8266 | 0.0000 | 0.8710 | 0.0000 | 0.8456 | 0.0 | 0.8705 | 0.0 | 0.7519 | 0.5715 | 0.1311 | 0.0681 | 0.4981 | 0.0000 | 0.1616 | 0.0019 | 0.3962 | 0.0 | 0.1615 | 0.0 | 0.6348 | 0.3449 | 0.3 | 0.8592 | 0.9348 | 0.9299 | 0.9679 | 0.9287 | 0.9305 |
| 0.0182 | 4.0 | 1340 | 0.0168 | 0.3479 | 0.2560 | 0.8355 | 0.0000 | 0.8663 | 0.0000 | 0.8395 | 0.0 | 0.8657 | 0.0 | 0.7545 | 0.5765 | 0.0995 | 0.0415 | 0.5134 | 0.0000 | 0.1067 | 0.0417 | 0.4565 | 0.0 | 0.1074 | 0.0000 | 0.6363 | 0.3490 | 0.3 | 0.8418 | 0.9298 | 0.9209 | 0.9732 | 0.925 | 0.9169 |
| 0.016 | 5.0 | 1675 | 0.0161 | 0.3140 | 0.2513 | 0.8407 | 0.0000 | 0.8933 | 0.0000 | 0.8636 | 0.0 | 0.8929 | 0.0 | 0.7598 | 0.5867 | 0.1561 | 0.1224 | 0.5345 | 0.0000 | 0.2082 | 0.0001 | 0.4651 | 0.0 | 0.2087 | 0.0 | 0.6463 | 0.3658 | 0.2 | 0.8828 | 0.9464 | 0.9418 | 0.9746 | 0.9410 | 0.9417 |
| 0.0165 | 6.0 | 2010 | 0.0130 | 0.3258 | 0.2485 | 0.8377 | 0.0000 | 0.8901 | 0.0000 | 0.8557 | 0.0 | 0.8896 | 0.0 | 0.7571 | 0.5816 | 0.1145 | 0.0628 | 0.4912 | 0.0000 | 0.1737 | 0.0009 | 0.4237 | 0.0 | 0.1743 | 0.0 | 0.6328 | 0.3385 | 0.2 | 0.8617 | 0.9380 | 0.9313 | 0.9774 | 0.9325 | 0.9292 |
| 0.015 | 7.0 | 2345 | 0.0115 | 0.3478 | 0.2582 | 0.8443 | 0.0000 | 0.9082 | 0.0000 | 0.8633 | 0.0 | 0.9078 | 0.0 | 0.7631 | 0.5928 | 0.1256 | 0.0575 | 0.5489 | 0.0000 | 0.1400 | 0.0074 | 0.4376 | 0.0 | 0.1408 | 0.0000 | 0.6547 | 0.3820 | 0.2 | 0.8979 | 0.9532 | 0.9493 | 0.9755 | 0.9485 | 0.9494 |
| 0.0161 | 8.0 | 2680 | 0.0131 | 0.3262 | 0.2470 | 0.8447 | 0.0000 | 0.8908 | 0.0000 | 0.8638 | 0.0 | 0.8903 | 0.0 | 0.7637 | 0.5940 | 0.1046 | 0.0477 | 0.5252 | 0.0000 | 0.1337 | 0.0106 | 0.4596 | 0.0 | 0.1344 | 0.0000 | 0.6426 | 0.3571 | 0.4 | 0.8862 | 0.9474 | 0.9433 | 0.9788 | 0.9423 | 0.9439 |
| 0.0143 | 9.0 | 3015 | 0.0124 | 0.3131 | 0.2477 | 0.8564 | 0.0000 | 0.9061 | 0.0000 | 0.8713 | 0.0 | 0.9057 | 0.0 | 0.7723 | 0.6103 | 0.1166 | 0.0766 | 0.6197 | 0.0000 | 0.1191 | 0.0229 | 0.5005 | 0.0 | 0.1197 | 0.0000 | 0.6849 | 0.4376 | 0.2 | 0.8856 | 0.9482 | 0.9433 | 0.9762 | 0.9433 | 0.9423 |
| 0.0147 | 10.0 | 3350 | 0.0120 | 0.3388 | 0.2528 | 0.8504 | 0.0000 | 0.9032 | 0.0000 | 0.8649 | 0.0 | 0.9028 | 0.0 | 0.7674 | 0.6009 | 0.1253 | 0.0623 | 0.5835 | 0.0000 | 0.1332 | 0.0108 | 0.4718 | 0.0 | 0.1338 | 0.0000 | 0.6686 | 0.4078 | 0.2 | 0.8827 | 0.9465 | 0.9418 | 0.9754 | 0.9412 | 0.9415 |
| 0.014 | 11.0 | 3685 | 0.0151 | 0.3394 | 0.2509 | 0.8267 | 0.0000 | 0.8718 | 0.0000 | 0.8449 | 0.0 | 0.8713 | 0.0 | 0.7517 | 0.5711 | 0.1322 | 0.0565 | 0.4668 | 0.0000 | 0.1225 | 0.0192 | 0.3979 | 0.0 | 0.1223 | 0.0000 | 0.6187 | 0.3171 | 0.3 | 0.8509 | 0.9302 | 0.9254 | 0.9720 | 0.9242 | 0.9267 |
| 0.0162 | 12.0 | 4020 | 0.0125 | 0.3347 | 0.2483 | 0.8461 | 0.0000 | 0.8945 | 0.0000 | 0.8537 | 0.0 | 0.8940 | 0.0 | 0.7631 | 0.5929 | 0.1150 | 0.0537 | 0.5391 | 0.0000 | 0.1000 | 0.0562 | 0.4447 | 0.0 | 0.1007 | 0.0000 | 0.6500 | 0.3694 | 0.3 | 0.8829 | 0.9462 | 0.9418 | 0.9779 | 0.9409 | 0.9420 |
| 0.0162 | 13.0 | 4355 | 0.0165 | 0.3158 | 0.2495 | 0.8230 | 0.0000 | 0.8684 | 0.0000 | 0.8456 | 0.0 | 0.8680 | 0.0 | 0.7492 | 0.5664 | 0.1513 | 0.1012 | 0.4649 | 0.0000 | 0.1911 | 0.0002 | 0.3978 | 0.0 | 0.1908 | 0.0 | 0.6176 | 0.3108 | 0.1 | 0.8620 | 0.9383 | 0.9313 | 0.9694 | 0.9334 | 0.9286 |
| 0.0149 | 14.0 | 4690 | 0.0118 | 0.3240 | 0.2506 | 0.8240 | 0.0000 | 0.9063 | 0.0000 | 0.8565 | 0.0 | 0.9058 | 0.0 | 0.7520 | 0.5717 | 0.1088 | 0.0547 | 0.4151 | 0.0000 | 0.1528 | 0.0034 | 0.3628 | 0.0 | 0.1530 | 0.0 | 0.6027 | 0.2809 | 0.2 | 0.8917 | 0.9507 | 0.9463 | 0.9773 | 0.9458 | 0.9458 |
| 0.0131 | 15.0 | 5025 | 0.0118 | 0.3238 | 0.2501 | 0.8574 | 0.0000 | 0.9031 | 0.0000 | 0.8767 | 0.0 | 0.9027 | 0.0 | 0.7724 | 0.6110 | 0.1250 | 0.0702 | 0.6054 | 0.0000 | 0.1976 | 0.0001 | 0.4866 | 0.0 | 0.1974 | 0.0 | 0.6779 | 0.4211 | 0.2 | 0.8949 | 0.9517 | 0.9478 | 0.9783 | 0.9469 | 0.9480 |
| 0.0131 | 16.0 | 5360 | 0.0130 | 0.3150 | 0.2469 | 0.8498 | 0.0000 | 0.8979 | 0.0000 | 0.8552 | 0.0 | 0.8974 | 0.0 | 0.7671 | 0.6003 | 0.1060 | 0.0659 | 0.5602 | 0.0000 | 0.1186 | 0.0235 | 0.4725 | 0.0 | 0.1190 | 0.0000 | 0.6586 | 0.3893 | 0.2 | 0.8561 | 0.9358 | 0.9284 | 0.9785 | 0.9308 | 0.9254 |
| 0.0137 | 17.0 | 5695 | 0.0129 | 0.3380 | 0.2502 | 0.8438 | 0.0000 | 0.8905 | 0.0000 | 0.8597 | 0.0 | 0.8900 | 0.0 | 0.7604 | 0.5879 | 0.1175 | 0.0539 | 0.5465 | 0.0000 | 0.1454 | 0.0054 | 0.4599 | 0.0 | 0.1458 | 0.0000 | 0.6494 | 0.3709 | 0.2 | 0.8736 | 0.9426 | 0.9373 | 0.9754 | 0.9370 | 0.9365 |
| 0.0137 | 18.0 | 6030 | 0.0132 | 0.3250 | 0.2464 | 0.8467 | 0.0000 | 0.8891 | 0.0000 | 0.8493 | 0.0 | 0.8886 | 0.0 | 0.7603 | 0.5875 | 0.1005 | 0.0494 | 0.5406 | 0.0000 | 0.0979 | 0.0616 | 0.4716 | 0.0 | 0.0986 | 0.0000 | 0.6487 | 0.3743 | 0.3 | 0.8767 | 0.9447 | 0.9388 | 0.9786 | 0.9399 | 0.9368 |
| 0.0146 | 19.0 | 6365 | 0.0102 | 0.3318 | 0.2495 | 0.8438 | 0.0000 | 0.9145 | 0.0000 | 0.8632 | 0.0 | 0.9141 | 0.0 | 0.7637 | 0.5940 | 0.1105 | 0.0537 | 0.5224 | 0.0000 | 0.1566 | 0.0027 | 0.4352 | 0.0 | 0.1568 | 0.0 | 0.6453 | 0.3639 | 0.2 | 0.9158 | 0.9615 | 0.9582 | 0.9783 | 0.9577 | 0.9581 |
| 0.014 | 20.0 | 6700 | 0.0123 | 0.3263 | 0.2479 | 0.8477 | 0.0000 | 0.8972 | 0.0000 | 0.8622 | 0.0 | 0.8967 | 0.0 | 0.7682 | 0.6022 | 0.1180 | 0.0612 | 0.5451 | 0.0000 | 0.1459 | 0.0052 | 0.4474 | 0.0 | 0.1468 | 0.0000 | 0.6527 | 0.3791 | 0.2 | 0.8766 | 0.9439 | 0.9388 | 0.9785 | 0.9384 | 0.9382 |
| 0.0132 | 21.0 | 7035 | 0.0110 | 0.3425 | 0.2555 | 0.8511 | 0.0000 | 0.9090 | 0.0000 | 0.8671 | 0.0 | 0.9086 | 0.0 | 0.7689 | 0.6039 | 0.1058 | 0.0504 | 0.5635 | 0.0000 | 0.1368 | 0.0089 | 0.4661 | 0.0 | 0.1372 | 0.0000 | 0.6556 | 0.3808 | 0.2 | 0.8916 | 0.9510 | 0.9463 | 0.9791 | 0.9463 | 0.9453 |
| 0.0121 | 22.0 | 7370 | 0.0121 | 0.3345 | 0.2509 | 0.8548 | 0.0000 | 0.9004 | 0.0000 | 0.8649 | 0.0 | 0.8999 | 0.0 | 0.7736 | 0.6126 | 0.1201 | 0.0630 | 0.5958 | 0.0000 | 0.1415 | 0.0068 | 0.5002 | 0.0 | 0.1421 | 0.0000 | 0.6788 | 0.4265 | 0.2 | 0.8827 | 0.9465 | 0.9418 | 0.9777 | 0.9412 | 0.9415 |
| 0.0127 | 23.0 | 7705 | 0.0121 | 0.3290 | 0.2501 | 0.8416 | 0.0000 | 0.9051 | 0.0000 | 0.8652 | 0.0 | 0.9047 | 0.0 | 0.7622 | 0.5912 | 0.1298 | 0.0773 | 0.5242 | 0.0000 | 0.1694 | 0.0012 | 0.4502 | 0.0 | 0.1701 | 0.0 | 0.6456 | 0.3641 | 0.3 | 0.8960 | 0.9510 | 0.9478 | 0.9768 | 0.9467 | 0.9494 |
| 0.0124 | 24.0 | 8040 | 0.0114 | 0.3410 | 0.2521 | 0.8606 | 0.0000 | 0.9044 | 0.0000 | 0.8724 | 0.0 | 0.9040 | 0.0 | 0.7773 | 0.6195 | 0.1134 | 0.0500 | 0.6115 | 0.0000 | 0.1403 | 0.0073 | 0.5075 | 0.0 | 0.1409 | 0.0000 | 0.6787 | 0.4268 | 0.2 | 0.8887 | 0.9492 | 0.9448 | 0.9799 | 0.9442 | 0.9445 |
| 0.0124 | 25.0 | 8375 | 0.0125 | 0.3433 | 0.2569 | 0.8541 | 0.0000 | 0.9005 | 0.0000 | 0.8656 | 0.0 | 0.9001 | 0.0 | 0.7684 | 0.6032 | 0.1213 | 0.0604 | 0.5910 | 0.0000 | 0.1409 | 0.0070 | 0.4780 | 0.0 | 0.1412 | 0.0000 | 0.6702 | 0.4086 | 0.1 | 0.8737 | 0.9432 | 0.9373 | 0.9775 | 0.9382 | 0.9355 |
| 0.0132 | 26.0 | 8710 | 0.0140 | 0.3280 | 0.2488 | 0.8406 | 0.0000 | 0.8811 | 0.0000 | 0.8502 | 0.0 | 0.8805 | 0.0 | 0.7646 | 0.5959 | 0.1100 | 0.0637 | 0.5411 | 0.0000 | 0.1550 | 0.0030 | 0.4777 | 0.0 | 0.1553 | 0.0 | 0.6539 | 0.3757 | 0.2 | 0.8503 | 0.9333 | 0.9254 | 0.9738 | 0.9282 | 0.9221 |
| 0.0145 | 27.0 | 9045 | 0.0135 | 0.3307 | 0.2494 | 0.8445 | 0.0000 | 0.8862 | 0.0000 | 0.8564 | 0.0 | 0.8857 | 0.0 | 0.7652 | 0.5969 | 0.1328 | 0.0676 | 0.5786 | 0.0000 | 0.1629 | 0.0018 | 0.4697 | 0.0 | 0.1628 | 0.0 | 0.6666 | 0.4034 | 0.1 | 0.8706 | 0.9418 | 0.9358 | 0.9718 | 0.9365 | 0.9341 |
| 0.0125 | 28.0 | 9380 | 0.0212 | 0.3222 | 0.2533 | 0.8398 | 0.0000 | 0.8542 | 0.0000 | 0.8541 | 0.0 | 0.8535 | 0.0 | 0.7620 | 0.5907 | 0.1959 | 0.1420 | 0.5024 | 0.0000 | 0.2541 | 0.0000 | 0.4328 | 0.0 | 0.2539 | 0.0 | 0.6354 | 0.3445 | 0.1 | 0.8777 | 0.9428 | 0.9388 | 0.9783 | 0.9377 | 0.9401 |
| 0.0152 | 29.0 | 9715 | 0.0128 | 0.3229 | 0.2477 | 0.8454 | 0.0000 | 0.8933 | 0.0000 | 0.8586 | 0.0 | 0.8928 | 0.0 | 0.7656 | 0.5975 | 0.1134 | 0.0639 | 0.5512 | 0.0000 | 0.1683 | 0.0012 | 0.4725 | 0.0 | 0.1687 | 0.0 | 0.6538 | 0.3816 | 0.2 | 0.8738 | 0.9434 | 0.9373 | 0.9757 | 0.9386 | 0.9352 |
| 0.0117 | 30.0 | 10050 | 0.0136 | 0.3341 | 0.2490 | 0.8443 | 0.0000 | 0.8882 | 0.0000 | 0.8529 | 0.0 | 0.8877 | 0.0 | 0.7634 | 0.5933 | 0.0941 | 0.0371 | 0.5213 | 0.0000 | 0.1168 | 0.0256 | 0.4651 | 0.0 | 0.1173 | 0.0000 | 0.6432 | 0.3612 | 0.3 | 0.8676 | 0.9405 | 0.9343 | 0.9789 | 0.9352 | 0.9325 |
| 0.011 | 31.0 | 10385 | 0.0158 | 0.3293 | 0.2543 | 0.8330 | 0.0000 | 0.8818 | 0.0000 | 0.8491 | 0.0 | 0.8812 | 0.0 | 0.7580 | 0.5833 | 0.1442 | 0.1010 | 0.5286 | 0.0000 | 0.1291 | 0.0136 | 0.4536 | 0.0 | 0.1297 | 0.0000 | 0.6454 | 0.3616 | 0.2 | 0.8675 | 0.9399 | 0.9343 | 0.9701 | 0.9340 | 0.9335 |
| 0.0145 | 32.0 | 10720 | 0.0126 | 0.3373 | 0.2508 | 0.8394 | 0.0000 | 0.8927 | 0.0000 | 0.8549 | 0.0 | 0.8923 | 0.0 | 0.7615 | 0.5898 | 0.1046 | 0.0478 | 0.5271 | 0.0000 | 0.1537 | 0.0032 | 0.4602 | 0.0 | 0.1539 | 0.0 | 0.6463 | 0.3637 | 0.3 | 0.8766 | 0.9441 | 0.9388 | 0.9749 | 0.9387 | 0.9379 |
| 0.012 | 33.0 | 11055 | 0.0132 | 0.3342 | 0.2503 | 0.8562 | 0.0000 | 0.8886 | 0.0000 | 0.8727 | 0.0 | 0.8881 | 0.0 | 0.7737 | 0.6129 | 0.1347 | 0.0668 | 0.6093 | 0.0000 | 0.1682 | 0.0013 | 0.4913 | 0.0 | 0.1686 | 0.0 | 0.6799 | 0.4291 | 0.1 | 0.8858 | 0.9477 | 0.9433 | 0.9774 | 0.9425 | 0.9434 |
| 0.0113 | 34.0 | 11390 | 0.0111 | 0.3364 | 0.2489 | 0.8561 | 0.0000 | 0.9062 | 0.0000 | 0.8695 | 0.0 | 0.9058 | 0.0 | 0.7718 | 0.6095 | 0.1158 | 0.0502 | 0.5813 | 0.0000 | 0.1314 | 0.0120 | 0.4868 | 0.0 | 0.1320 | 0.0000 | 0.6684 | 0.4049 | 0.2 | 0.8918 | 0.9504 | 0.9463 | 0.9804 | 0.9455 | 0.9464 |
| 0.0113 | 35.0 | 11725 | 0.0118 | 0.3413 | 0.2542 | 0.8545 | 0.0000 | 0.9031 | 0.0000 | 0.8657 | 0.0 | 0.9026 | 0.0 | 0.7723 | 0.6105 | 0.1042 | 0.0460 | 0.5860 | 0.0000 | 0.1317 | 0.0118 | 0.4950 | 0.0 | 0.1321 | 0.0000 | 0.6731 | 0.4123 | 0.2 | 0.8856 | 0.9482 | 0.9433 | 0.9784 | 0.9433 | 0.9423 |
| 0.0099 | 36.0 | 12060 | 0.0122 | 0.3514 | 0.2605 | 0.8461 | 0.0000 | 0.9016 | 0.0000 | 0.8687 | 0.0 | 0.9012 | 0.0 | 0.7696 | 0.6051 | 0.1288 | 0.0645 | 0.5650 | 0.0000 | 0.1560 | 0.0028 | 0.4806 | 0.0 | 0.1566 | 0.0 | 0.6647 | 0.3975 | 0.2 | 0.8986 | 0.9526 | 0.9493 | 0.9758 | 0.9481 | 0.9505 |
| 0.0113 | 37.0 | 12395 | 0.0127 | 0.3301 | 0.2495 | 0.8422 | 0.0000 | 0.8980 | 0.0000 | 0.8641 | 0.0 | 0.8975 | 0.0 | 0.7670 | 0.6000 | 0.1362 | 0.0761 | 0.5706 | 0.0000 | 0.1543 | 0.0031 | 0.4834 | 0.0 | 0.1546 | 0.0 | 0.6673 | 0.4070 | 0.2 | 0.8869 | 0.9469 | 0.9433 | 0.9716 | 0.9422 | 0.9447 |
| 0.01 | 38.0 | 12730 | 0.0115 | 0.3294 | 0.2480 | 0.8506 | 0.0000 | 0.9036 | 0.0000 | 0.8639 | 0.0 | 0.9032 | 0.0 | 0.7680 | 0.6028 | 0.1084 | 0.0467 | 0.5947 | 0.0000 | 0.1379 | 0.0083 | 0.5140 | 0.0 | 0.1382 | 0.0000 | 0.6786 | 0.4224 | 0.3 | 0.8953 | 0.9515 | 0.9478 | 0.9746 | 0.9467 | 0.9486 |
| 0.0107 | 39.0 | 13065 | 0.0111 | 0.3306 | 0.2493 | 0.8522 | 0.0000 | 0.9108 | 0.0000 | 0.8703 | 0.0 | 0.9104 | 0.0 | 0.7710 | 0.6081 | 0.1210 | 0.0675 | 0.5778 | 0.0000 | 0.1383 | 0.0082 | 0.5093 | 0.0 | 0.1388 | 0.0000 | 0.6722 | 0.4113 | 0.3 | 0.8926 | 0.9499 | 0.9463 | 0.9778 | 0.9451 | 0.9475 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.5.1+cu118
- Datasets 3.1.0
- Tokenizers 0.19.1