---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224-pt22k-ft22k
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: finetuned-AffectNet
    results: []
---

# finetuned-AffectNet

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the AffectNet dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):

- Loss: 0.8122
- Accuracy: 0.7345
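
The sketch below shows one way to run inference with this checkpoint. The hub id `3una/finetuned-AffectNet` is inferred from this repository's owner and name, and `face.jpg` is a placeholder path; adjust both to your setup.

```python
# Minimal inference sketch (hub id inferred from this repo; image path is a placeholder).
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "3una/finetuned-AffectNet"  # assumed hub id for this checkpoint
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("face.jpg").convert("RGB")  # placeholder face image
inputs = processor(images=image, return_tensors="pt")

# Forward pass and top-1 prediction; label names come from the model config.
logits = model(**inputs).logits
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```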

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 5e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128 (train_batch_size × gradient_accumulation_steps = 32 × 4)
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
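
A sketch of how these values map onto `transformers.TrainingArguments`; the original training script is not published, so `output_dir` and the logging/evaluation strategies are assumptions, not taken from the actual run.

```python
# Sketch reproducing the hyperparameters above (output_dir is a placeholder).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finetuned-AffectNet",  # placeholder
    learning_rate=5e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # effective batch size: 32 * 4 = 128
    # The Trainer's default optimizer uses betas=(0.9, 0.999) and eps=1e-8,
    # matching the Adam settings listed above.
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumed; matches the per-epoch log below
    logging_strategy="epoch",     # assumed
)
```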

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 2.0686        | 1.0   | 163   | 2.0963          | 0.1549   |
| 1.7148        | 2.0   | 327   | 1.7250          | 0.2943   |
| 1.4591        | 3.0   | 490   | 1.4418          | 0.4204   |
| 1.3351        | 4.0   | 654   | 1.2648          | 0.5194   |
| 1.1343        | 5.0   | 817   | 1.0728          | 0.5908   |
| 1.1022        | 6.0   | 981   | 0.9741          | 0.6355   |
| 1.0476        | 7.0   | 1144  | 0.9203          | 0.6631   |
| 1.0049        | 8.0   | 1308  | 0.8769          | 0.6760   |
| 0.9561        | 9.0   | 1471  | 0.8438          | 0.6966   |
| 0.9409        | 10.0  | 1635  | 0.8283          | 0.6988   |
| 0.9419        | 11.0  | 1798  | 0.7867          | 0.7164   |
| 0.89          | 12.0  | 1962  | 0.7858          | 0.7139   |
| 0.8761        | 13.0  | 2125  | 0.7704          | 0.7147   |
| 0.8662        | 14.0  | 2289  | 0.7590          | 0.7225   |
| 0.8561        | 15.0  | 2452  | 0.7574          | 0.7199   |
| 0.8234        | 16.0  | 2616  | 0.7457          | 0.7238   |
| 0.844         | 17.0  | 2779  | 0.7416          | 0.7255   |
| 0.7908        | 18.0  | 2943  | 0.7485          | 0.7255   |
| 0.809         | 19.0  | 3106  | 0.7428          | 0.7250   |
| 0.7976        | 20.0  | 3270  | 0.7597          | 0.7203   |
| 0.7691        | 21.0  | 3433  | 0.7333          | 0.7345   |
| 0.7408        | 22.0  | 3597  | 0.7362          | 0.7246   |
| 0.7516        | 23.0  | 3760  | 0.7301          | 0.7298   |
| 0.7887        | 24.0  | 3924  | 0.7263          | 0.7332   |
| 0.7475        | 25.0  | 4087  | 0.7301          | 0.7293   |
| 0.7619        | 26.0  | 4251  | 0.7334          | 0.7298   |
| 0.7509        | 27.0  | 4414  | 0.7332          | 0.7345   |
| 0.7212        | 28.0  | 4578  | 0.7301          | 0.7367   |
| 0.7053        | 29.0  | 4741  | 0.7293          | 0.7328   |
| 0.6634        | 30.0  | 4905  | 0.7412          | 0.7298   |
| 0.677         | 31.0  | 5068  | 0.7221          | 0.7375   |
| 0.6453        | 32.0  | 5232  | 0.7281          | 0.7392   |
| 0.6961        | 33.0  | 5395  | 0.7280          | 0.7392   |
| 0.7135        | 34.0  | 5559  | 0.7348          | 0.7362   |
| 0.6871        | 35.0  | 5722  | 0.7334          | 0.7293   |
| 0.6829        | 36.0  | 5886  | 0.7281          | 0.7328   |
| 0.6742        | 37.0  | 6049  | 0.7332          | 0.7354   |
| 0.6167        | 38.0  | 6213  | 0.7274          | 0.7384   |
| 0.665         | 39.0  | 6376  | 0.7322          | 0.7311   |
| 0.6433        | 40.0  | 6540  | 0.7473          | 0.7345   |
| 0.6661        | 41.0  | 6703  | 0.7358          | 0.7341   |
| 0.6424        | 42.0  | 6867  | 0.7413          | 0.7324   |
| 0.6369        | 43.0  | 7030  | 0.7314          | 0.7414   |
| 0.611         | 44.0  | 7194  | 0.7325          | 0.7388   |
| 0.6556        | 45.0  | 7357  | 0.7485          | 0.7354   |
| 0.6524        | 46.0  | 7521  | 0.7434          | 0.7418   |
| 0.6176        | 47.0  | 7684  | 0.7402          | 0.7410   |
| 0.6142        | 48.0  | 7848  | 0.7480          | 0.7315   |
| 0.5968        | 49.0  | 8011  | 0.7457          | 0.7384   |
| 0.6132        | 50.0  | 8175  | 0.7514          | 0.7328   |
| 0.592         | 51.0  | 8338  | 0.7500          | 0.7375   |
| 0.6347        | 52.0  | 8502  | 0.7533          | 0.7345   |
| 0.5976        | 53.0  | 8665  | 0.7539          | 0.7324   |
| 0.5496        | 54.0  | 8829  | 0.7495          | 0.7388   |
| 0.5845        | 55.0  | 8992  | 0.7550          | 0.7367   |
| 0.5624        | 56.0  | 9156  | 0.7606          | 0.7362   |
| 0.5582        | 57.0  | 9319  | 0.7598          | 0.7341   |
| 0.6206        | 58.0  | 9483  | 0.7608          | 0.7345   |
| 0.5647        | 59.0  | 9646  | 0.7578          | 0.7388   |
| 0.6093        | 60.0  | 9810  | 0.7646          | 0.7358   |
| 0.5625        | 61.0  | 9973  | 0.7622          | 0.7388   |
| 0.6114        | 62.0  | 10137 | 0.7702          | 0.7324   |
| 0.5304        | 63.0  | 10300 | 0.7710          | 0.7367   |
| 0.5646        | 64.0  | 10464 | 0.7807          | 0.7298   |
| 0.5774        | 65.0  | 10627 | 0.7793          | 0.7328   |
| 0.5825        | 66.0  | 10791 | 0.7786          | 0.7375   |
| 0.5111        | 67.0  | 10954 | 0.7742          | 0.7380   |
| 0.5849        | 68.0  | 11118 | 0.7779          | 0.7349   |
| 0.5454        | 69.0  | 11281 | 0.7795          | 0.7367   |
| 0.5158        | 70.0  | 11445 | 0.7806          | 0.7345   |
| 0.5576        | 71.0  | 11608 | 0.7903          | 0.7345   |
| 0.5394        | 72.0  | 11772 | 0.7812          | 0.7380   |
| 0.5099        | 73.0  | 11935 | 0.7808          | 0.7354   |
| 0.5209        | 74.0  | 12099 | 0.7851          | 0.7319   |
| 0.5322        | 75.0  | 12262 | 0.7908          | 0.7401   |
| 0.5351        | 76.0  | 12426 | 0.7960          | 0.7306   |
| 0.5272        | 77.0  | 12589 | 0.7924          | 0.7324   |
| 0.477         | 78.0  | 12753 | 0.7981          | 0.7332   |
| 0.5186        | 79.0  | 12916 | 0.7942          | 0.7341   |
| 0.5366        | 80.0  | 13080 | 0.8016          | 0.7367   |
| 0.4809        | 81.0  | 13243 | 0.8014          | 0.7341   |
| 0.4889        | 82.0  | 13407 | 0.8008          | 0.7354   |
| 0.5287        | 83.0  | 13570 | 0.8010          | 0.7349   |
| 0.4926        | 84.0  | 13734 | 0.8047          | 0.7371   |
| 0.4989        | 85.0  | 13897 | 0.8046          | 0.7384   |
| 0.5483        | 86.0  | 14061 | 0.8022          | 0.7371   |
| 0.5157        | 87.0  | 14224 | 0.8055          | 0.7358   |
| 0.4999        | 88.0  | 14388 | 0.8071          | 0.7319   |
| 0.519         | 89.0  | 14551 | 0.8083          | 0.7362   |
| 0.4534        | 90.0  | 14715 | 0.8082          | 0.7384   |
| 0.429         | 91.0  | 14878 | 0.8103          | 0.7354   |
| 0.5073        | 92.0  | 15042 | 0.8116          | 0.7336   |
| 0.5358        | 93.0  | 15205 | 0.8106          | 0.7341   |
| 0.5049        | 94.0  | 15369 | 0.8111          | 0.7315   |
| 0.4745        | 95.0  | 15532 | 0.8118          | 0.7336   |
| 0.5052        | 96.0  | 15696 | 0.8104          | 0.7371   |
| 0.495         | 97.0  | 15859 | 0.8101          | 0.7354   |
| 0.4752        | 98.0  | 16023 | 0.8117          | 0.7349   |
| 0.4927        | 99.0  | 16186 | 0.8120          | 0.7336   |
| 0.4875        | 99.69 | 16300 | 0.8122          | 0.7345   |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0