---
license: apache-2.0
base_model: facebook/dinov2-giant
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: DinoVdeau-giant-2024_08_28-batch-size32_epochs150_freeze
    results: []
---

DinoVdeau-giant-2024_08_28-batch-size32_epochs150_freeze

This model is a fine-tuned version of facebook/dinov2-giant (the training dataset is not documented in this card). It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.1208
  • F1 Micro: 0.8209
  • F1 Macro: 0.7101
  • Roc Auc: 0.8812
  • Accuracy: 0.3080
  • Learning Rate: 0.0000
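
Usage instructions are not included in this card, so the following is only a minimal inference sketch. It assumes the checkpoint is published as lombardata/DinoVdeau-giant-2024_08_28-batch-size32_epochs150_freeze and that the classification head is multi-label (which the F1 micro/macro and ROC AUC metrics above suggest), so per-class probabilities are taken with a sigmoid rather than a softmax.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repository id, inferred from the model name above.
repo_id = "lombardata/DinoVdeau-giant-2024_08_28-batch-size32_epochs150_freeze"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: multi-label head -> sigmoid per class, thresholded at 0.5.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```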

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
  • mixed_precision_training: Native AMP
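
The original training script is not part of this card. The snippet below is a sketch of how the listed hyperparameters map onto the Hugging Face TrainingArguments API; the output_dir and the epoch-level evaluation/saving strategies are assumptions, not values reported above.

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="DinoVdeau-giant-2024_08_28-batch-size32_epochs150_freeze",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,  # Native AMP mixed-precision training
    evaluation_strategy="epoch",  # assumed: metrics are logged once per epoch below
    save_strategy="epoch",        # assumed
)
```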

Training results

Training Loss Epoch Step Accuracy F1 Macro F1 Micro Validation Loss Roc Auc Learning Rate
No log 1.0 273 0.2121 0.5175 0.7424 0.1744 0.8286 0.001
0.2593 2.0 546 0.2477 0.5913 0.7777 0.1514 0.8565 0.001
0.2593 3.0 819 0.2387 0.6203 0.7753 0.1557 0.8580 0.001
0.1694 4.0 1092 0.2495 0.6113 0.7691 0.1499 0.8373 0.001
0.1694 5.0 1365 0.2450 0.6317 0.7745 0.1577 0.8461 0.001
0.1637 6.0 1638 0.2574 0.6221 0.7803 0.1530 0.8509 0.001
0.1637 7.0 1911 0.2616 0.6318 0.7838 0.1423 0.8520 0.001
0.1598 8.0 2184 0.2592 0.6268 0.7825 0.1434 0.8521 0.001
0.1598 9.0 2457 0.2585 0.6407 0.7841 0.1432 0.8556 0.001
0.157 10.0 2730 0.2592 0.6350 0.7779 0.1507 0.8422 0.001
0.1564 11.0 3003 0.2685 0.6442 0.7906 0.1401 0.8599 0.001
0.1564 12.0 3276 0.2606 0.6413 0.7896 0.1404 0.8593 0.001
0.1556 13.0 3549 0.2696 0.6359 0.7822 0.1421 0.8492 0.001
0.1556 14.0 3822 0.2637 0.6460 0.7887 0.1394 0.8568 0.001
0.1547 15.0 4095 0.2554 0.6554 0.7915 0.1380 0.8576 0.001
0.1547 16.0 4368 0.2550 0.6453 0.7858 0.1441 0.8506 0.001
0.1539 17.0 4641 0.2678 0.6485 0.7904 0.1411 0.8607 0.001
0.1539 18.0 4914 0.2606 0.6549 0.7941 0.1381 0.8618 0.001
0.1552 19.0 5187 0.2654 0.6523 0.7937 0.1372 0.8604 0.001
0.1552 20.0 5460 0.2540 0.6515 0.7915 0.1396 0.8594 0.001
0.1531 21.0 5733 0.2578 0.6543 0.7925 0.1379 0.8593 0.001
0.1536 22.0 6006 0.2661 0.6524 0.7952 0.1363 0.8620 0.001
0.1536 23.0 6279 0.2710 0.6567 0.7962 0.1363 0.8595 0.001
0.1535 24.0 6552 0.2661 0.6439 0.7872 0.1401 0.8565 0.001
0.1535 25.0 6825 0.2755 0.6538 0.7961 0.1360 0.8589 0.001
0.153 26.0 7098 0.2692 0.6408 0.7942 0.1371 0.8612 0.001
0.153 27.0 7371 0.2654 0.6470 0.7902 0.1367 0.8539 0.001
0.1532 28.0 7644 0.2689 0.6427 0.7912 0.1371 0.8539 0.001
0.1532 29.0 7917 0.2692 0.6485 0.7944 0.1378 0.8597 0.001
0.1539 30.0 8190 0.2651 0.6472 0.7938 0.1364 0.8590 0.001
0.1539 31.0 8463 0.2748 0.6533 0.7999 0.1357 0.8673 0.001
0.1527 32.0 8736 0.2665 0.6620 0.7929 0.1379 0.8630 0.001
0.1524 33.0 9009 0.2730 0.6722 0.7990 0.1356 0.8643 0.001
0.1524 34.0 9282 0.2730 0.6706 0.7967 0.1347 0.8615 0.001
0.1516 35.0 9555 0.2772 0.6483 0.7947 0.1354 0.8588 0.001
0.1516 36.0 9828 0.2585 0.6553 0.7928 0.1376 0.8582 0.001
0.1527 37.0 10101 0.2748 0.6681 0.7992 0.1346 0.8638 0.001
0.1527 38.0 10374 0.2717 0.6543 0.7889 0.1378 0.8525 0.001
0.1503 39.0 10647 0.2665 0.6627 0.7965 0.1367 0.8659 0.001
0.1503 40.0 10920 0.2737 0.6702 0.8005 0.1373 0.8705 0.001
0.152 41.0 11193 0.2658 0.6610 0.7942 0.1377 0.8583 0.001
0.152 42.0 11466 0.2810 0.6706 0.8002 0.1354 0.8642 0.001
0.1515 43.0 11739 0.2651 0.6620 0.8000 0.1367 0.8699 0.001
0.147 44.0 12012 0.2869 0.6826 0.8087 0.1291 0.8724 0.0001
0.147 45.0 12285 0.2997 0.6939 0.8115 0.1276 0.8721 0.0001
0.139 46.0 12558 0.2959 0.6856 0.8103 0.1270 0.8700 0.0001
0.139 47.0 12831 0.2973 0.6943 0.8125 0.1269 0.8726 0.0001
0.1375 48.0 13104 0.2980 0.6942 0.8132 0.1262 0.8743 0.0001
0.1375 49.0 13377 0.2966 0.6956 0.8147 0.1263 0.8775 0.0001
0.1353 50.0 13650 0.2928 0.7007 0.8153 0.1258 0.8782 0.0001
0.1353 51.0 13923 0.2973 0.6995 0.8152 0.1257 0.8776 0.0001
0.1337 52.0 14196 0.2973 0.6975 0.8135 0.1250 0.8729 0.0001
0.1337 53.0 14469 0.2949 0.6962 0.8133 0.1248 0.8757 0.0001
0.1338 54.0 14742 0.3018 0.6981 0.8143 0.1247 0.8739 0.0001
0.1322 55.0 15015 0.3008 0.7020 0.8166 0.1245 0.8792 0.0001
0.1322 56.0 15288 0.3011 0.7041 0.8185 0.1244 0.8820 0.0001
0.1313 57.0 15561 0.3004 0.6984 0.8162 0.1239 0.8770 0.0001
0.1313 58.0 15834 0.3001 0.7041 0.8171 0.1236 0.8785 0.0001
0.1309 59.0 16107 0.3049 0.7019 0.8159 0.1237 0.8758 0.0001
0.1309 60.0 16380 0.2990 0.7008 0.8153 0.1234 0.8731 0.0001
0.13 61.0 16653 0.3025 0.7083 0.8189 0.1229 0.8791 0.0001
0.13 62.0 16926 0.3028 0.7055 0.8166 0.1227 0.8767 0.0001
0.1288 63.0 17199 0.3039 0.7106 0.8176 0.1230 0.8774 0.0001
0.1288 64.0 17472 0.3049 0.7086 0.8192 0.1233 0.8803 0.0001
0.1291 65.0 17745 0.3049 0.7104 0.8188 0.1231 0.8798 0.0001
0.1283 66.0 18018 0.3028 0.7061 0.8186 0.1219 0.8789 0.0001
0.1283 67.0 18291 0.3042 0.7155 0.8197 0.1229 0.8823 0.0001
0.1273 68.0 18564 0.3080 0.7153 0.8210 0.1225 0.8844 0.0001
0.1273 69.0 18837 0.3032 0.7102 0.8196 0.1222 0.8799 0.0001
0.1265 70.0 19110 0.3084 0.7109 0.8185 0.1223 0.8768 0.0001
0.1265 71.0 19383 0.3077 0.7120 0.8170 0.1224 0.8737 0.0001
0.1264 72.0 19656 0.3063 0.7204 0.8204 0.1221 0.8803 0.0001
0.1264 73.0 19929 0.3087 0.7144 0.8198 0.1217 0.8798 1e-05
0.1249 74.0 20202 0.3067 0.7124 0.8190 0.1215 0.8757 1e-05
0.1249 75.0 20475 0.3056 0.7145 0.8209 0.1212 0.8796 1e-05
0.1236 76.0 20748 0.3080 0.7191 0.8219 0.1216 0.8822 1e-05
0.1233 77.0 21021 0.3132 0.7203 0.8237 0.1214 0.8868 1e-05
0.1233 78.0 21294 0.3098 0.7168 0.8223 0.1211 0.8823 1e-05
0.123 79.0 21567 0.3067 0.7161 0.8203 0.1215 0.8783 1e-05
0.123 80.0 21840 0.3073 0.7151 0.8219 0.1216 0.8847 1e-05
0.123 81.0 22113 0.3115 0.7187 0.8216 0.1210 0.8808 1e-05
0.123 82.0 22386 0.3094 0.7157 0.8212 0.1208 0.8794 1e-05
0.1214 83.0 22659 0.3001 0.7102 0.8180 0.1215 0.8751 1e-05
0.1214 84.0 22932 0.3119 0.7196 0.8216 0.1210 0.8817 1e-05
0.1234 85.0 23205 0.3101 0.7201 0.8234 0.1208 0.8835 1e-05
0.1234 86.0 23478 0.3094 0.7215 0.8218 0.1210 0.8813 1e-05
0.1216 87.0 23751 0.3087 0.7142 0.8207 0.1212 0.8796 1e-05
0.1219 88.0 24024 0.3101 0.7125 0.8224 0.1210 0.8824 1e-05
0.1219 89.0 24297 0.3122 0.7250 0.8241 0.1214 0.8876 0.0000
0.1219 90.0 24570 0.3105 0.7199 0.8234 0.1212 0.8864 0.0000
0.1219 91.0 24843 0.3098 0.7160 0.8212 0.1208 0.8790 0.0000
0.1213 92.0 25116 0.3073 0.7144 0.8224 0.1207 0.8807 0.0000
0.1213 93.0 25389 0.3080 0.7189 0.8227 0.1209 0.8834 0.0000
0.122 94.0 25662 0.3098 0.7188 0.8223 0.1209 0.8828 0.0000
0.122 95.0 25935 0.3094 0.7127 0.8222 0.1207 0.8807 0.0000
0.1209 96.0 26208 0.3067 0.7160 0.8218 0.1214 0.8821 0.0000
0.1209 97.0 26481 0.3094 0.7159 0.8209 0.1226 0.8793 0.0000
0.122 98.0 26754 0.3119 0.7190 0.8225 0.1210 0.8843 0.0000
0.1218 99.0 27027 0.3098 0.7177 0.8214 0.1208 0.8803 0.0000
0.1218 100.0 27300 0.3108 0.7191 0.8219 0.1208 0.8794 0.0000
0.1222 101.0 27573 0.3098 0.7199 0.8231 0.1207 0.8825 0.0000
0.1222 102.0 27846 0.3101 0.7181 0.8216 0.1210 0.8797 0.0000
0.1212 103.0 28119 0.3112 0.7156 0.8219 0.1207 0.8799 0.0000
0.1212 104.0 28392 0.3091 0.7151 0.8214 0.1212 0.8810 0.0000
0.1204 105.0 28665 0.3084 0.7175 0.8216 0.1208 0.8822 0.0000
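
For reference, the columns above can be reproduced with standard scikit-learn metrics. The sketch below assumes multi-label targets, sigmoid probabilities thresholded at 0.5, and that "Accuracy" is subset (exact-match) accuracy, which would explain why it is much lower than the F1 scores.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def evaluation_metrics(y_true: np.ndarray, y_prob: np.ndarray) -> dict:
    """Compute the metrics reported in the table above (assumed definitions).

    y_true: binary label matrix of shape (n_samples, n_labels)
    y_prob: predicted probabilities of the same shape (assumed sigmoid outputs)
    """
    y_pred = (y_prob >= 0.5).astype(int)
    return {
        "f1_micro": f1_score(y_true, y_pred, average="micro"),
        "f1_macro": f1_score(y_true, y_pred, average="macro"),
        "roc_auc": roc_auc_score(y_true, y_prob, average="micro"),
        "accuracy": accuracy_score(y_true, y_pred),  # subset (exact-match) accuracy
    }
```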

Framework versions

  • Transformers 4.41.1
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1