---
license: apache-2.0
base_model: facebook/deit-small-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: smids_10x_deit_small_sgd_00001_fold5
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.5933333333333334
---

# smids_10x_deit_small_sgd_00001_fold5

This model is a fine-tuned version of [facebook/deit-small-patch16-224](https://huggingface.co/facebook/deit-small-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.9331
- Accuracy: 0.5933
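
For reference, a minimal inference sketch using the `transformers` auto classes is shown below. The repository id `hkivancoral/smids_10x_deit_small_sgd_00001_fold5` and the sample image path are assumptions and may need to be adjusted for your setup.

```python
# Minimal inference sketch (repo id and image path are assumptions).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "hkivancoral/smids_10x_deit_small_sgd_00001_fold5"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.png").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_label = model.config.id2label[logits.argmax(-1).item()]
print(predicted_label)
```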

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
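
As a rough sketch, the hyperparameters above map onto `TrainingArguments` roughly as follows. The output directory, evaluation strategy, and any settings not listed above are assumptions, not the original training script.

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters
# (output_dir and evaluation_strategy are assumptions).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_10x_deit_small_sgd_00001_fold5",  # assumed
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed; the log below reports per-epoch validation
)
```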

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.0687        | 1.0   | 750   | 1.0724          | 0.425    |
| 1.0435        | 2.0   | 1500  | 1.0669          | 0.42     |
| 1.0439        | 3.0   | 2250  | 1.0614          | 0.4283   |
| 1.0595        | 4.0   | 3000  | 1.0559          | 0.435    |
| 1.0216        | 5.0   | 3750  | 1.0506          | 0.4383   |
| 1.0179        | 6.0   | 4500  | 1.0454          | 0.4467   |
| 1.0048        | 7.0   | 5250  | 1.0402          | 0.4517   |
| 1.0171        | 8.0   | 6000  | 1.0351          | 0.4533   |
| 1.0075        | 9.0   | 6750  | 1.0302          | 0.4567   |
| 0.9942        | 10.0  | 7500  | 1.0255          | 0.4683   |
| 0.9968        | 11.0  | 8250  | 1.0209          | 0.4783   |
| 0.9853        | 12.0  | 9000  | 1.0163          | 0.485    |
| 0.9829        | 13.0  | 9750  | 1.0118          | 0.4933   |
| 0.9676        | 14.0  | 10500 | 1.0075          | 0.4933   |
| 0.9869        | 15.0  | 11250 | 1.0033          | 0.5      |
| 0.9385        | 16.0  | 12000 | 0.9992          | 0.5117   |
| 0.9422        | 17.0  | 12750 | 0.9953          | 0.5167   |
| 0.9475        | 18.0  | 13500 | 0.9914          | 0.5233   |
| 0.9706        | 19.0  | 14250 | 0.9876          | 0.5267   |
| 0.9823        | 20.0  | 15000 | 0.9840          | 0.53     |
| 0.9281        | 21.0  | 15750 | 0.9805          | 0.535    |
| 0.9429        | 22.0  | 16500 | 0.9770          | 0.54     |
| 0.9545        | 23.0  | 17250 | 0.9738          | 0.545    |
| 0.9266        | 24.0  | 18000 | 0.9706          | 0.545    |
| 0.943         | 25.0  | 18750 | 0.9675          | 0.545    |
| 0.9362        | 26.0  | 19500 | 0.9646          | 0.55     |
| 0.9017        | 27.0  | 20250 | 0.9618          | 0.5517   |
| 0.9415        | 28.0  | 21000 | 0.9592          | 0.555    |
| 0.9141        | 29.0  | 21750 | 0.9566          | 0.555    |
| 0.9329        | 30.0  | 22500 | 0.9543          | 0.5567   |
| 0.931         | 31.0  | 23250 | 0.9520          | 0.5617   |
| 0.9115        | 32.0  | 24000 | 0.9498          | 0.5633   |
| 0.9251        | 33.0  | 24750 | 0.9478          | 0.565    |
| 0.8996        | 34.0  | 25500 | 0.9460          | 0.5717   |
| 0.9232        | 35.0  | 26250 | 0.9442          | 0.5717   |
| 0.8817        | 36.0  | 27000 | 0.9427          | 0.5717   |
| 0.8794        | 37.0  | 27750 | 0.9412          | 0.575    |
| 0.8813        | 38.0  | 28500 | 0.9398          | 0.5767   |
| 0.8952        | 39.0  | 29250 | 0.9386          | 0.58     |
| 0.8846        | 40.0  | 30000 | 0.9375          | 0.5817   |
| 0.8967        | 41.0  | 30750 | 0.9366          | 0.5867   |
| 0.9065        | 42.0  | 31500 | 0.9358          | 0.5883   |
| 0.9123        | 43.0  | 32250 | 0.9351          | 0.59     |
| 0.8878        | 44.0  | 33000 | 0.9345          | 0.59     |
| 0.8772        | 45.0  | 33750 | 0.9340          | 0.59     |
| 0.9035        | 46.0  | 34500 | 0.9336          | 0.5933   |
| 0.9152        | 47.0  | 35250 | 0.9334          | 0.5933   |
| 0.8837        | 48.0  | 36000 | 0.9332          | 0.5933   |
| 0.8879        | 49.0  | 36750 | 0.9331          | 0.5933   |
| 0.8918        | 50.0  | 37500 | 0.9331          | 0.5933   |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2
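
To confirm your environment matches the versions listed above, a quick check is sketched below (package import names as published on PyPI).

```python
# Quick environment check against the versions listed above.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)
```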