resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.5

This model is a fine-tuned version of microsoft/resnet-50 on the Tobacco3482 dataset. It achieves the following results on the evaluation set (the calibration metrics are sketched after the list):

  • Loss: 0.6500
  • Accuracy: 0.69
  • Brier Loss: 0.5003
  • NLL: 2.5629
  • F1 Micro: 0.69
  • F1 Macro: 0.6350
  • ECE: 0.3098
  • AURC: 0.1329
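
The Brier loss, NLL, ECE, and AURC above are uncertainty/calibration metrics computed from the model's softmax probabilities on the evaluation set. The sketch below shows one standard way to compute the Brier score and ECE with NumPy; it is not taken from this repository's evaluation code, and the function names and the 10-bin setting are illustrative assumptions.

```python
import numpy as np

def brier_score(probs, labels):
    """Multiclass Brier score: mean squared error between softmax
    probabilities and one-hot labels."""
    one_hot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - one_hot) ** 2, axis=1)))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: confidence/accuracy gap averaged over equal-width confidence
    bins, weighted by the fraction of samples falling in each bin."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return float(ece)
```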

Model description

More information needed
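
The model name (resnet101-base_tobacco-cnn_tobacco3482_kd_CEKD_t1.5_a0.5) suggests a knowledge-distillation setup: a CNN student trained on Tobacco3482 under a resnet101-base_tobacco teacher with a combined cross-entropy + KD objective, temperature 1.5 and alpha 0.5. The training code is not included in this card, so the snippet below is only a minimal sketch of the standard form of such a loss; the function and argument names are assumptions.

```python
import torch.nn.functional as F

def ce_kd_loss(student_logits, teacher_logits, labels, temperature=1.5, alpha=0.5):
    """Hypothetical CE+KD objective: alpha * cross-entropy on hard labels plus
    (1 - alpha) * T^2-scaled KL divergence to the teacher's softened outputs."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * ce + (1.0 - alpha) * kd
```

How exactly the two terms are weighted and whether the KD term uses KL divergence or soft cross-entropy is not documented for this run.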

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
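
These settings map one-to-one onto transformers.TrainingArguments fields. Below is a minimal, hypothetical reconstruction; the output directory is a placeholder, and only the values listed above are taken from the actual run.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cnn_tobacco3482_kd_CEKD_t1.5_a0.5",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default
# optimizer, so it needs no explicit configuration here.
```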

Training results

Training Loss Epoch Step Validation Loss Accuracy Brier Loss NLL F1 Micro F1 Macro ECE AURC
No log 1.0 13 1.4712 0.165 0.8966 8.4652 0.165 0.1101 0.2129 0.8342
No log 2.0 26 1.4590 0.165 0.8951 8.1097 0.165 0.1059 0.2034 0.8021
No log 3.0 39 1.4178 0.175 0.8873 6.8095 0.175 0.0813 0.2150 0.7994
No log 4.0 52 1.3342 0.18 0.8702 6.4137 0.18 0.0475 0.2314 0.7558
No log 5.0 65 1.2828 0.2 0.8587 6.1547 0.2000 0.0642 0.2429 0.7009
No log 6.0 78 1.2675 0.205 0.8548 6.1395 0.205 0.0612 0.2348 0.7022
No log 7.0 91 1.0716 0.31 0.7962 6.4589 0.31 0.1241 0.2787 0.4433
No log 8.0 104 1.1184 0.29 0.8126 6.2585 0.29 0.1394 0.2863 0.5819
No log 9.0 117 1.1021 0.31 0.8075 6.0370 0.31 0.1697 0.2834 0.5458
No log 10.0 130 1.0268 0.33 0.7815 6.1370 0.33 0.1921 0.2856 0.5395
No log 11.0 143 1.0290 0.355 0.7759 5.3640 0.3550 0.2143 0.2795 0.4697
No log 12.0 156 0.9169 0.36 0.7262 5.2997 0.36 0.1995 0.2761 0.4070
No log 13.0 169 0.9903 0.36 0.7586 4.9404 0.36 0.2200 0.2832 0.5343
No log 14.0 182 0.9128 0.425 0.7082 4.5862 0.425 0.2706 0.2834 0.3542
No log 15.0 195 1.0046 0.405 0.7441 3.9763 0.405 0.2759 0.3142 0.4602
No log 16.0 208 0.9277 0.41 0.7146 4.3670 0.41 0.2763 0.2695 0.4409
No log 17.0 221 0.9726 0.505 0.7208 3.5350 0.505 0.3736 0.3332 0.3469
No log 18.0 234 0.7717 0.505 0.6280 3.4386 0.505 0.3412 0.2564 0.2567
No log 19.0 247 0.7723 0.58 0.6143 3.6207 0.58 0.4125 0.3178 0.1847
No log 20.0 260 0.8182 0.57 0.6419 3.1633 0.57 0.4855 0.3517 0.2530
No log 21.0 273 0.7333 0.58 0.5891 3.3014 0.58 0.4512 0.2718 0.2137
No log 22.0 286 0.7374 0.665 0.5856 3.0299 0.665 0.5432 0.3459 0.1657
No log 23.0 299 0.7083 0.645 0.5564 3.0874 0.645 0.5180 0.3112 0.1608
No log 24.0 312 0.7480 0.64 0.5901 3.0218 0.64 0.5410 0.3701 0.1976
No log 25.0 325 0.7547 0.68 0.5894 2.9002 0.68 0.5801 0.3817 0.1559
No log 26.0 338 0.6998 0.65 0.5474 2.9402 0.65 0.5468 0.2875 0.1707
No log 27.0 351 0.6967 0.66 0.5506 2.8344 0.66 0.5578 0.3105 0.1707
No log 28.0 364 0.6733 0.655 0.5332 2.6492 0.655 0.5719 0.2935 0.1554
No log 29.0 377 0.7162 0.67 0.5596 2.7250 0.67 0.5721 0.3388 0.1423
No log 30.0 390 0.6826 0.665 0.5291 2.7460 0.665 0.5797 0.3353 0.1469
No log 31.0 403 0.6761 0.665 0.5195 2.7938 0.665 0.5647 0.3096 0.1485
No log 32.0 416 0.6745 0.695 0.5295 2.6172 0.695 0.6160 0.3171 0.1636
No log 33.0 429 0.6785 0.695 0.5242 2.5816 0.695 0.6115 0.3475 0.1349
No log 34.0 442 0.6688 0.665 0.5174 2.6401 0.665 0.5833 0.2988 0.1427
No log 35.0 455 0.6767 0.675 0.5275 2.6364 0.675 0.6027 0.3285 0.1483
No log 36.0 468 0.6605 0.695 0.5076 2.6483 0.695 0.6252 0.3127 0.1372
No log 37.0 481 0.6538 0.705 0.5029 2.6284 0.705 0.6340 0.3173 0.1220
No log 38.0 494 0.6610 0.695 0.5102 2.5052 0.695 0.6375 0.3128 0.1298
0.7532 39.0 507 0.6618 0.695 0.5110 2.5663 0.695 0.6268 0.3297 0.1367
0.7532 40.0 520 0.6749 0.69 0.5235 2.5343 0.69 0.6341 0.3256 0.1332
0.7532 41.0 533 0.6574 0.695 0.5062 2.4223 0.695 0.6338 0.3292 0.1469
0.7532 42.0 546 0.6530 0.695 0.5026 2.6189 0.695 0.6390 0.2950 0.1391
0.7532 43.0 559 0.6509 0.685 0.5003 2.5417 0.685 0.6299 0.3150 0.1368
0.7532 44.0 572 0.6520 0.71 0.5030 2.4796 0.7100 0.6453 0.3251 0.1286
0.7532 45.0 585 0.6494 0.69 0.4994 2.5431 0.69 0.6327 0.3138 0.1279
0.7532 46.0 598 0.6515 0.71 0.5007 2.5295 0.7100 0.6541 0.3307 0.1208
0.7532 47.0 611 0.6477 0.69 0.4979 2.5971 0.69 0.6323 0.3263 0.1281
0.7532 48.0 624 0.6495 0.7 0.5007 2.6162 0.7 0.6395 0.3412 0.1272
0.7532 49.0 637 0.6478 0.7 0.4968 2.4946 0.7 0.6386 0.3191 0.1309
0.7532 50.0 650 0.6500 0.69 0.5003 2.5629 0.69 0.6350 0.3098 0.1329

Framework versions

  • Transformers 4.33.3
  • PyTorch 2.2.0.dev20231002
  • Datasets 2.7.1
  • Tokenizers 0.13.3