---
license: mit
base_model: roberta-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - precision
  - recall
  - f1
model-index:
  - name: results
    results: []
---

# results

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.7998
- Accuracy: 0.7023
- Precision: 0.7144
- Recall: 0.7023
- F1: 0.7065
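The card does not document the task labels, but the repository name suggests a climate-claim verification classifier. Below is a minimal inference sketch, assuming the repo id `Jasontth/climate_verfication_model`; read the actual label names from `model.config.id2label` on the checkpoint:

```python
# Minimal inference sketch. Assumption: the hub repo id below; the card does
# not document the label set, so inspect clf.model.config.id2label.
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="Jasontth/climate_verfication_model",  # assumed repo id
)

print(clf("Global sea levels have risen by roughly 20 cm since 1900."))
```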

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 5
- mixed_precision_training: Native AMP
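
For reference, here is a sketch of the equivalent `transformers.TrainingArguments`. The output directory is an assumption; the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments:

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir is assumed; every other value comes from the card. "Native AMP"
# corresponds to fp16=True on a CUDA device.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="results",               # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
    fp16=True,                          # Native AMP mixed precision
)
```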

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 1.0859 | 0.04 | 10 | 1.0722 | 0.6493 | 0.6735 | 0.6493 | 0.5118 |
| 1.0731 | 0.07 | 20 | 1.0562 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 1.0456 | 0.11 | 30 | 1.0316 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 1.0199 | 0.15 | 40 | 0.9979 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.9613 | 0.19 | 50 | 0.9362 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.8949 | 0.22 | 60 | 0.8645 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.9151 | 0.26 | 70 | 0.8606 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.8583 | 0.3 | 80 | 0.8593 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.9604 | 0.33 | 90 | 0.8539 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.7919 | 0.37 | 100 | 0.8504 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.9365 | 0.41 | 110 | 0.8520 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.9285 | 0.45 | 120 | 0.8521 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.8564 | 0.48 | 130 | 0.8615 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.8132 | 0.52 | 140 | 0.8583 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.8911 | 0.56 | 150 | 0.8467 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.8383 | 0.59 | 160 | 0.8373 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.8387 | 0.63 | 170 | 0.8372 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.979 | 0.67 | 180 | 0.8595 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.7621 | 0.71 | 190 | 0.8642 | 0.6488 | 0.4209 | 0.6488 | 0.5105 |
| 0.8367 | 0.74 | 200 | 0.8276 | 0.6553 | 0.6447 | 0.6553 | 0.5271 |
| 0.9116 | 0.78 | 210 | 0.8466 | 0.6493 | 0.6735 | 0.6493 | 0.5118 |
| 0.8444 | 0.82 | 220 | 0.8171 | 0.6504 | 0.6740 | 0.6504 | 0.5143 |
| 0.7815 | 0.86 | 230 | 0.7919 | 0.6667 | 0.6008 | 0.6667 | 0.5615 |
| 0.8592 | 0.89 | 240 | 0.7907 | 0.6732 | 0.5878 | 0.6732 | 0.5962 |
| 0.8933 | 0.93 | 250 | 0.7963 | 0.6813 | 0.6004 | 0.6813 | 0.6102 |
| 0.8409 | 0.97 | 260 | 0.7812 | 0.6797 | 0.6021 | 0.6797 | 0.6066 |
| 0.8285 | 1.0 | 270 | 0.7794 | 0.6737 | 0.5987 | 0.6737 | 0.5940 |
| 0.7895 | 1.04 | 280 | 0.7893 | 0.6846 | 0.6044 | 0.6846 | 0.6168 |
| 0.8012 | 1.08 | 290 | 0.7617 | 0.6813 | 0.6129 | 0.6813 | 0.6002 |
| 0.7215 | 1.12 | 300 | 0.8029 | 0.6748 | 0.6248 | 0.6748 | 0.5764 |
| 0.8134 | 1.15 | 310 | 0.8294 | 0.6781 | 0.5949 | 0.6781 | 0.6294 |
| 0.7247 | 1.19 | 320 | 0.7944 | 0.6732 | 0.5941 | 0.6732 | 0.6290 |
| 0.8043 | 1.23 | 330 | 0.7978 | 0.6656 | 0.5931 | 0.6656 | 0.6268 |
| 0.7647 | 1.26 | 340 | 0.7571 | 0.6884 | 0.6344 | 0.6884 | 0.6063 |
| 0.7807 | 1.3 | 350 | 0.7958 | 0.6412 | 0.6041 | 0.6412 | 0.6167 |
| 0.8031 | 1.34 | 360 | 0.7261 | 0.6906 | 0.6820 | 0.6906 | 0.6680 |
| 0.6965 | 1.38 | 370 | 0.7287 | 0.7003 | 0.6796 | 0.7003 | 0.6813 |
| 0.69 | 1.41 | 380 | 0.7115 | 0.7074 | 0.6981 | 0.7074 | 0.6581 |
| 0.7015 | 1.45 | 390 | 0.7391 | 0.7063 | 0.6932 | 0.7063 | 0.6813 |
| 0.7461 | 1.49 | 400 | 0.7624 | 0.6987 | 0.6791 | 0.6987 | 0.6787 |
| 0.758 | 1.52 | 410 | 0.7778 | 0.6819 | 0.6893 | 0.6819 | 0.6695 |
| 0.7617 | 1.56 | 420 | 0.7913 | 0.6878 | 0.6906 | 0.6878 | 0.6339 |
| 0.7848 | 1.6 | 430 | 0.7785 | 0.6629 | 0.6806 | 0.6629 | 0.6643 |
| 0.8138 | 1.64 | 440 | 0.7191 | 0.6954 | 0.6763 | 0.6954 | 0.6474 |
| 0.7451 | 1.67 | 450 | 0.7086 | 0.7030 | 0.7061 | 0.7030 | 0.6434 |
| 0.788 | 1.71 | 460 | 0.7202 | 0.6840 | 0.6956 | 0.6840 | 0.6497 |
| 0.7107 | 1.75 | 470 | 0.7543 | 0.6835 | 0.6067 | 0.6835 | 0.6379 |
| 0.7047 | 1.78 | 480 | 0.7940 | 0.6862 | 0.6697 | 0.6862 | 0.6258 |
| 0.8561 | 1.82 | 490 | 0.7497 | 0.6802 | 0.6860 | 0.6802 | 0.6666 |
| 0.804 | 1.86 | 500 | 0.7247 | 0.6938 | 0.6757 | 0.6938 | 0.6555 |
| 0.7796 | 1.9 | 510 | 0.7239 | 0.7063 | 0.6988 | 0.7063 | 0.6702 |
| 0.8124 | 1.93 | 520 | 0.7693 | 0.6976 | 0.7003 | 0.6976 | 0.6621 |
| 0.7306 | 1.97 | 530 | 0.8395 | 0.6363 | 0.6788 | 0.6363 | 0.6329 |
| 0.7079 | 2.01 | 540 | 0.7051 | 0.7041 | 0.6828 | 0.7041 | 0.6811 |
| 0.6018 | 2.04 | 550 | 0.7327 | 0.7058 | 0.6873 | 0.7058 | 0.6849 |
| 0.5824 | 2.08 | 560 | 0.7819 | 0.6743 | 0.6811 | 0.6743 | 0.6774 |
| 0.6001 | 2.12 | 570 | 0.7547 | 0.7139 | 0.6980 | 0.7139 | 0.7023 |
| 0.6471 | 2.16 | 580 | 0.7617 | 0.7172 | 0.7040 | 0.7172 | 0.6848 |
| 0.6226 | 2.19 | 590 | 0.7421 | 0.6927 | 0.6974 | 0.6927 | 0.6909 |
| 0.5203 | 2.23 | 600 | 0.7935 | 0.6694 | 0.6871 | 0.6694 | 0.6748 |
| 0.6445 | 2.27 | 610 | 0.7722 | 0.7182 | 0.7038 | 0.7182 | 0.6947 |
| 0.7027 | 2.3 | 620 | 0.7517 | 0.6754 | 0.7039 | 0.6754 | 0.6814 |
| 0.5662 | 2.34 | 630 | 0.6804 | 0.7182 | 0.7069 | 0.7182 | 0.7090 |
| 0.6304 | 2.38 | 640 | 0.6965 | 0.7128 | 0.6958 | 0.7128 | 0.6904 |
| 0.6258 | 2.42 | 650 | 0.7053 | 0.7041 | 0.7041 | 0.7041 | 0.7041 |
| 0.4966 | 2.45 | 660 | 0.7300 | 0.7177 | 0.7030 | 0.7177 | 0.7033 |
| 0.5721 | 2.49 | 670 | 0.8330 | 0.6683 | 0.6910 | 0.6683 | 0.6737 |
| 0.5507 | 2.53 | 680 | 0.8154 | 0.6857 | 0.7020 | 0.6857 | 0.6923 |
| 0.6392 | 2.57 | 690 | 0.8048 | 0.7166 | 0.7079 | 0.7166 | 0.6814 |
| 0.6128 | 2.6 | 700 | 0.7445 | 0.6786 | 0.6890 | 0.6786 | 0.6827 |
| 0.622 | 2.64 | 710 | 0.7029 | 0.7047 | 0.6895 | 0.7047 | 0.6870 |
| 0.5847 | 2.68 | 720 | 0.7911 | 0.6569 | 0.6889 | 0.6569 | 0.6677 |
| 0.6454 | 2.71 | 730 | 0.7062 | 0.7112 | 0.7017 | 0.7112 | 0.6797 |
| 0.5264 | 2.75 | 740 | 0.7419 | 0.6992 | 0.6893 | 0.6992 | 0.6870 |
| 0.649 | 2.79 | 750 | 0.7243 | 0.7063 | 0.7009 | 0.7063 | 0.7030 |
| 0.5343 | 2.83 | 760 | 0.7478 | 0.6889 | 0.7030 | 0.6889 | 0.6946 |
| 0.5335 | 2.86 | 770 | 0.7222 | 0.7237 | 0.7115 | 0.7237 | 0.7052 |
| 0.5228 | 2.9 | 780 | 0.7182 | 0.7226 | 0.7152 | 0.7226 | 0.7063 |
| 0.5605 | 2.94 | 790 | 0.7195 | 0.7210 | 0.7128 | 0.7210 | 0.7106 |
| 0.627 | 2.97 | 800 | 0.7559 | 0.6878 | 0.7135 | 0.6878 | 0.6933 |
| 0.6536 | 3.01 | 810 | 0.6616 | 0.7275 | 0.7141 | 0.7275 | 0.7105 |
| 0.4106 | 3.05 | 820 | 0.7176 | 0.7307 | 0.7209 | 0.7307 | 0.7230 |
| 0.3588 | 3.09 | 830 | 0.8387 | 0.7226 | 0.7230 | 0.7226 | 0.7183 |
| 0.404 | 3.12 | 840 | 0.8459 | 0.7117 | 0.7138 | 0.7117 | 0.7124 |
| 0.4313 | 3.16 | 850 | 0.8406 | 0.6992 | 0.7108 | 0.6992 | 0.7036 |
| 0.3407 | 3.2 | 860 | 0.8317 | 0.6916 | 0.7133 | 0.6916 | 0.6997 |
| 0.365 | 3.23 | 870 | 0.8310 | 0.6992 | 0.7110 | 0.6992 | 0.7035 |
| 0.3776 | 3.27 | 880 | 0.8376 | 0.6927 | 0.7107 | 0.6927 | 0.6986 |
| 0.3442 | 3.31 | 890 | 0.8554 | 0.7079 | 0.7082 | 0.7079 | 0.7079 |
| 0.41 | 3.35 | 900 | 0.9473 | 0.6401 | 0.7039 | 0.6401 | 0.6550 |
| 0.4649 | 3.38 | 910 | 0.8139 | 0.7134 | 0.7063 | 0.7134 | 0.7090 |
| 0.4359 | 3.42 | 920 | 0.8275 | 0.6992 | 0.7095 | 0.6992 | 0.7022 |
| 0.2906 | 3.46 | 930 | 0.8398 | 0.7096 | 0.7013 | 0.7096 | 0.7025 |
| 0.413 | 3.49 | 940 | 0.8558 | 0.6982 | 0.7049 | 0.6982 | 0.7009 |
| 0.3936 | 3.53 | 950 | 0.8457 | 0.7025 | 0.7058 | 0.7025 | 0.7039 |
| 0.3691 | 3.57 | 960 | 0.8312 | 0.7014 | 0.7102 | 0.7014 | 0.7050 |
| 0.3747 | 3.61 | 970 | 0.8146 | 0.7210 | 0.7074 | 0.7210 | 0.7086 |
| 0.4037 | 3.64 | 980 | 0.7906 | 0.7199 | 0.7132 | 0.7199 | 0.7150 |
| 0.4112 | 3.68 | 990 | 0.8135 | 0.7139 | 0.7145 | 0.7139 | 0.7137 |
| 0.3685 | 3.72 | 1000 | 0.8024 | 0.7106 | 0.7144 | 0.7106 | 0.7123 |
| 0.3881 | 3.75 | 1010 | 0.8339 | 0.7063 | 0.7109 | 0.7063 | 0.7063 |
| 0.4168 | 3.79 | 1020 | 0.8261 | 0.7231 | 0.7191 | 0.7231 | 0.7206 |
| 0.3591 | 3.83 | 1030 | 0.8014 | 0.7340 | 0.7258 | 0.7340 | 0.7281 |
| 0.3632 | 3.87 | 1040 | 0.8568 | 0.6878 | 0.7206 | 0.6878 | 0.6974 |
| 0.259 | 3.9 | 1050 | 0.8182 | 0.7324 | 0.7226 | 0.7324 | 0.7225 |
| 0.3741 | 3.94 | 1060 | 0.8511 | 0.7009 | 0.7200 | 0.7009 | 0.7078 |
| 0.3551 | 3.98 | 1070 | 0.8283 | 0.7150 | 0.7186 | 0.7150 | 0.7159 |
| 0.4105 | 4.01 | 1080 | 0.7817 | 0.7204 | 0.7209 | 0.7204 | 0.7205 |
| 0.2411 | 4.05 | 1090 | 0.8384 | 0.7372 | 0.7272 | 0.7372 | 0.7274 |
| 0.2166 | 4.09 | 1100 | 0.9466 | 0.7003 | 0.7240 | 0.7003 | 0.7066 |
| 0.4075 | 4.13 | 1110 | 0.9255 | 0.6976 | 0.7157 | 0.6976 | 0.7042 |
| 0.3328 | 4.16 | 1120 | 0.9120 | 0.6922 | 0.7153 | 0.6922 | 0.7003 |
| 0.1584 | 4.2 | 1130 | 0.9688 | 0.6857 | 0.7100 | 0.6857 | 0.6942 |
| 0.1737 | 4.24 | 1140 | 1.0205 | 0.7356 | 0.7267 | 0.7356 | 0.7289 |
| 0.2335 | 4.28 | 1150 | 1.0734 | 0.7068 | 0.7194 | 0.7068 | 0.7116 |
| 0.2179 | 4.31 | 1160 | 1.0748 | 0.7085 | 0.7190 | 0.7085 | 0.7127 |
| 0.244 | 4.35 | 1170 | 1.0801 | 0.7030 | 0.7220 | 0.7030 | 0.7097 |
| 0.2151 | 4.39 | 1180 | 1.0332 | 0.7112 | 0.7176 | 0.7112 | 0.7140 |
| 0.2602 | 4.42 | 1190 | 1.0343 | 0.7134 | 0.7181 | 0.7134 | 0.7154 |
| 0.131 | 4.46 | 1200 | 1.0453 | 0.7128 | 0.7175 | 0.7128 | 0.7149 |
| 0.1966 | 4.5 | 1210 | 1.0673 | 0.7096 | 0.7160 | 0.7096 | 0.7121 |
| 0.2136 | 4.54 | 1220 | 1.0550 | 0.7166 | 0.7157 | 0.7166 | 0.7158 |
| 0.1625 | 4.57 | 1230 | 1.0690 | 0.7172 | 0.7148 | 0.7172 | 0.7156 |
| 0.2199 | 4.61 | 1240 | 1.0908 | 0.7112 | 0.7182 | 0.7112 | 0.7141 |
| 0.2028 | 4.65 | 1250 | 1.0991 | 0.7085 | 0.7200 | 0.7085 | 0.7130 |
| 0.2669 | 4.68 | 1260 | 1.0944 | 0.7134 | 0.7205 | 0.7134 | 0.7163 |
| 0.1408 | 4.72 | 1270 | 1.0827 | 0.7248 | 0.7198 | 0.7248 | 0.7215 |
| 0.2649 | 4.76 | 1280 | 1.0974 | 0.7199 | 0.7182 | 0.7199 | 0.7187 |
| 0.1512 | 4.8 | 1290 | 1.1159 | 0.7220 | 0.7212 | 0.7220 | 0.7214 |
| 0.1962 | 4.83 | 1300 | 1.1374 | 0.7161 | 0.7206 | 0.7161 | 0.7180 |
| 0.2322 | 4.87 | 1310 | 1.1435 | 0.7144 | 0.7226 | 0.7144 | 0.7178 |
| 0.2095 | 4.91 | 1320 | 1.1408 | 0.7106 | 0.7220 | 0.7106 | 0.7151 |
| 0.1534 | 4.94 | 1330 | 1.1466 | 0.7123 | 0.7248 | 0.7123 | 0.7170 |
| 0.2505 | 4.98 | 1340 | 1.1481 | 0.7123 | 0.7248 | 0.7123 | 0.7170 |
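
Recall equals accuracy in every row above, which is characteristic of weighted-average recall in single-label classification, so the precision/recall/F1 columns were most likely computed with weighted averaging. A plausible `compute_metrics` under that assumption (not confirmed by the card):

```python
# Assumed metric computation: weighted-average precision/recall/F1,
# consistent with recall == accuracy in every row of the table above.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```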

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0