jaffe_crop_200_iter

This model is a fine-tuned version of WinKawaks/vit-tiny-patch16-224 on a local image dataset loaded with the imagefolder builder. It achieves the following results on the evaluation set:

  • Loss: 0.6110
  • Accuracy: 0.8333

Model description

More information needed

Intended uses & limitations

More information needed
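
A minimal inference sketch using the transformers image-classification pipeline, assuming the checkpoint is published under ricardoSLabs/jaffe_crop_200_iter; the image path is a placeholder:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
classifier = pipeline("image-classification", model="ricardoSLabs/jaffe_crop_200_iter")

# "face.jpg" is a placeholder path to a cropped face image.
predictions = classifier("face.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts, highest score first
```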

Training and evaluation data

More information needed
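
The exact data is not documented, but the card references the imagefolder builder; a minimal sketch of that loading convention, assuming one subdirectory per class label (the data_dir path is a placeholder):

```python
from datasets import load_dataset

# "imagefolder" builds a dataset from a directory tree, inferring class
# labels from the subdirectory names under data_dir.
dataset = load_dataset("imagefolder", data_dir="path/to/cropped_faces")
print(dataset["train"].features)  # an "image" column plus a "label" ClassLabel
```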

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 200
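
A sketch of a TrainingArguments configuration matching the list above; output_dir and the per-epoch evaluation setting are assumptions, the remaining values mirror the listed hyperparameters:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="jaffe_crop_200_iter",  # assumption: named after the model
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 x 4 = total train batch size of 128
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=200,
    eval_strategy="epoch",  # assumption: the results table logs metrics once per epoch
)
```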

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 2.7256 | 0.1667 |
| No log | 2.0 | 2 | 2.5775 | 0.1667 |
| No log | 3.0 | 3 | 2.5633 | 0.1333 |
| No log | 4.0 | 4 | 2.3386 | 0.1333 |
| No log | 5.0 | 5 | 2.3028 | 0.1 |
| No log | 6.0 | 6 | 2.1672 | 0.2333 |
| No log | 7.0 | 7 | 2.1696 | 0.0333 |
| No log | 8.0 | 8 | 2.0133 | 0.0667 |
| No log | 9.0 | 9 | 2.0010 | 0.1 |
| 2.3074 | 10.0 | 10 | 1.9153 | 0.3333 |
| 2.3074 | 11.0 | 11 | 2.0177 | 0.3 |
| 2.3074 | 12.0 | 12 | 1.8241 | 0.3333 |
| 2.3074 | 13.0 | 13 | 1.7967 | 0.3 |
| 2.3074 | 14.0 | 14 | 1.7210 | 0.3333 |
| 2.3074 | 15.0 | 15 | 1.6551 | 0.3333 |
| 2.3074 | 16.0 | 16 | 1.5307 | 0.5333 |
| 2.3074 | 17.0 | 17 | 1.4263 | 0.4333 |
| 2.3074 | 18.0 | 18 | 1.4219 | 0.4333 |
| 2.3074 | 19.0 | 19 | 1.4402 | 0.4333 |
| 1.5864 | 20.0 | 20 | 1.3050 | 0.5 |
| 1.5864 | 21.0 | 21 | 1.4508 | 0.4 |
| 1.5864 | 22.0 | 22 | 1.3384 | 0.4667 |
| 1.5864 | 23.0 | 23 | 1.1632 | 0.6 |
| 1.5864 | 24.0 | 24 | 1.0711 | 0.6333 |
| 1.5864 | 25.0 | 25 | 1.1361 | 0.4667 |
| 1.5864 | 26.0 | 26 | 1.2230 | 0.5667 |
| 1.5864 | 27.0 | 27 | 1.0167 | 0.7 |
| 1.5864 | 28.0 | 28 | 0.8815 | 0.6667 |
| 1.5864 | 29.0 | 29 | 1.0191 | 0.6667 |
| 0.7909 | 30.0 | 30 | 0.9474 | 0.6667 |
| 0.7909 | 31.0 | 31 | 0.9328 | 0.5333 |
| 0.7909 | 32.0 | 32 | 0.9476 | 0.6 |
| 0.7909 | 33.0 | 33 | 0.7911 | 0.7 |
| 0.7909 | 34.0 | 34 | 0.7609 | 0.7333 |
| 0.7909 | 35.0 | 35 | 0.9489 | 0.6333 |
| 0.7909 | 36.0 | 36 | 0.7199 | 0.6667 |
| 0.7909 | 37.0 | 37 | 0.6195 | 0.7667 |
| 0.7909 | 38.0 | 38 | 0.7611 | 0.7333 |
| 0.7909 | 39.0 | 39 | 0.8063 | 0.7333 |
| 0.3677 | 40.0 | 40 | 0.8360 | 0.6667 |
| 0.3677 | 41.0 | 41 | 0.7250 | 0.7333 |
| 0.3677 | 42.0 | 42 | 0.6228 | 0.8333 |
| 0.3677 | 43.0 | 43 | 0.7757 | 0.7667 |
| 0.3677 | 44.0 | 44 | 0.6767 | 0.7667 |
| 0.3677 | 45.0 | 45 | 0.8128 | 0.6667 |
| 0.3677 | 46.0 | 46 | 0.8584 | 0.7333 |
| 0.3677 | 47.0 | 47 | 0.8866 | 0.6333 |
| 0.3677 | 48.0 | 48 | 0.5143 | 0.8 |
| 0.3677 | 49.0 | 49 | 0.7074 | 0.7 |
| 0.192 | 50.0 | 50 | 0.7536 | 0.7333 |
| 0.192 | 51.0 | 51 | 0.7113 | 0.7667 |
| 0.192 | 52.0 | 52 | 0.8321 | 0.7 |
| 0.192 | 53.0 | 53 | 0.7229 | 0.7333 |
| 0.192 | 54.0 | 54 | 0.8839 | 0.7 |
| 0.192 | 55.0 | 55 | 0.5496 | 0.8 |
| 0.192 | 56.0 | 56 | 0.6769 | 0.8 |
| 0.192 | 57.0 | 57 | 0.5740 | 0.8333 |
| 0.192 | 58.0 | 58 | 0.6321 | 0.7667 |
| 0.192 | 59.0 | 59 | 0.7635 | 0.7667 |
| 0.1225 | 60.0 | 60 | 0.6869 | 0.8 |
| 0.1225 | 61.0 | 61 | 0.6595 | 0.8333 |
| 0.1225 | 62.0 | 62 | 0.5097 | 0.8 |
| 0.1225 | 63.0 | 63 | 0.5642 | 0.8333 |
| 0.1225 | 64.0 | 64 | 0.6503 | 0.8 |
| 0.1225 | 65.0 | 65 | 0.6877 | 0.7667 |
| 0.1225 | 66.0 | 66 | 0.6811 | 0.7333 |
| 0.1225 | 67.0 | 67 | 0.7854 | 0.7 |
| 0.1225 | 68.0 | 68 | 0.6378 | 0.8 |
| 0.1225 | 69.0 | 69 | 0.5148 | 0.8667 |
| 0.088 | 70.0 | 70 | 0.6571 | 0.8 |
| 0.088 | 71.0 | 71 | 0.7112 | 0.7333 |
| 0.088 | 72.0 | 72 | 0.6602 | 0.7333 |
| 0.088 | 73.0 | 73 | 0.4259 | 0.9 |
| 0.088 | 74.0 | 74 | 0.3600 | 0.8667 |
| 0.088 | 75.0 | 75 | 0.5726 | 0.8667 |
| 0.088 | 76.0 | 76 | 0.6638 | 0.7667 |
| 0.088 | 77.0 | 77 | 0.4641 | 0.8667 |
| 0.088 | 78.0 | 78 | 0.5186 | 0.8667 |
| 0.088 | 79.0 | 79 | 0.6024 | 0.8333 |
| 0.0581 | 80.0 | 80 | 0.4450 | 0.8667 |
| 0.0581 | 81.0 | 81 | 0.3267 | 0.9 |
| 0.0581 | 82.0 | 82 | 0.6442 | 0.8 |
| 0.0581 | 83.0 | 83 | 0.8130 | 0.8 |
| 0.0581 | 84.0 | 84 | 0.4391 | 0.8667 |
| 0.0581 | 85.0 | 85 | 0.5013 | 0.8333 |
| 0.0581 | 86.0 | 86 | 0.6059 | 0.8 |
| 0.0581 | 87.0 | 87 | 0.4916 | 0.8333 |
| 0.0581 | 88.0 | 88 | 0.4384 | 0.8667 |
| 0.0581 | 89.0 | 89 | 0.3366 | 0.9 |
| 0.0359 | 90.0 | 90 | 0.6082 | 0.7667 |
| 0.0359 | 91.0 | 91 | 0.2967 | 0.8333 |
| 0.0359 | 92.0 | 92 | 0.3954 | 0.9 |
| 0.0359 | 93.0 | 93 | 0.5354 | 0.8333 |
| 0.0359 | 94.0 | 94 | 0.4755 | 0.8667 |
| 0.0359 | 95.0 | 95 | 0.5060 | 0.8 |
| 0.0359 | 96.0 | 96 | 0.3716 | 0.8333 |
| 0.0359 | 97.0 | 97 | 0.3547 | 0.8333 |
| 0.0359 | 98.0 | 98 | 0.6991 | 0.8333 |
| 0.0359 | 99.0 | 99 | 0.4895 | 0.8333 |
| 0.0251 | 100.0 | 100 | 0.5676 | 0.8 |
| 0.0251 | 101.0 | 101 | 0.3866 | 0.8667 |
| 0.0251 | 102.0 | 102 | 0.7054 | 0.7667 |
| 0.0251 | 103.0 | 103 | 0.5586 | 0.8667 |
| 0.0251 | 104.0 | 104 | 0.5839 | 0.8333 |
| 0.0251 | 105.0 | 105 | 0.5091 | 0.8333 |
| 0.0251 | 106.0 | 106 | 0.5459 | 0.9 |
| 0.0251 | 107.0 | 107 | 0.4228 | 0.9 |
| 0.0251 | 108.0 | 108 | 0.4447 | 0.9 |
| 0.0251 | 109.0 | 109 | 0.5052 | 0.8667 |
| 0.023 | 110.0 | 110 | 0.5494 | 0.8 |
| 0.023 | 111.0 | 111 | 0.6297 | 0.8333 |
| 0.023 | 112.0 | 112 | 0.6272 | 0.8667 |
| 0.023 | 113.0 | 113 | 0.6439 | 0.8333 |
| 0.023 | 114.0 | 114 | 0.3996 | 0.8667 |
| 0.023 | 115.0 | 115 | 0.4805 | 0.8333 |
| 0.023 | 116.0 | 116 | 0.4350 | 0.9 |
| 0.023 | 117.0 | 117 | 0.7719 | 0.8 |
| 0.023 | 118.0 | 118 | 0.5710 | 0.8333 |
| 0.023 | 119.0 | 119 | 0.3579 | 0.9 |
| 0.0292 | 120.0 | 120 | 0.3309 | 0.9333 |
| 0.0292 | 121.0 | 121 | 0.2649 | 0.9333 |
| 0.0292 | 122.0 | 122 | 0.3858 | 0.8333 |
| 0.0292 | 123.0 | 123 | 0.6109 | 0.8 |
| 0.0292 | 124.0 | 124 | 0.4734 | 0.8 |
| 0.0292 | 125.0 | 125 | 0.5604 | 0.8333 |
| 0.0292 | 126.0 | 126 | 0.3509 | 0.8667 |
| 0.0292 | 127.0 | 127 | 0.4752 | 0.9 |
| 0.0292 | 128.0 | 128 | 0.2966 | 0.8667 |
| 0.0292 | 129.0 | 129 | 0.5186 | 0.8667 |
| 0.0155 | 130.0 | 130 | 0.4547 | 0.8667 |
| 0.0155 | 131.0 | 131 | 0.3391 | 0.9 |
| 0.0155 | 132.0 | 132 | 0.4527 | 0.8667 |
| 0.0155 | 133.0 | 133 | 0.4476 | 0.8667 |
| 0.0155 | 134.0 | 134 | 0.5800 | 0.8667 |
| 0.0155 | 135.0 | 135 | 0.4653 | 0.8667 |
| 0.0155 | 136.0 | 136 | 0.3927 | 0.9 |
| 0.0155 | 137.0 | 137 | 0.4538 | 0.8333 |
| 0.0155 | 138.0 | 138 | 0.3952 | 0.9 |
| 0.0155 | 139.0 | 139 | 0.3949 | 0.9333 |
| 0.0112 | 140.0 | 140 | 0.5354 | 0.9 |
| 0.0112 | 141.0 | 141 | 0.5307 | 0.8333 |
| 0.0112 | 142.0 | 142 | 0.4757 | 0.9 |
| 0.0112 | 143.0 | 143 | 0.4342 | 0.9 |
| 0.0112 | 144.0 | 144 | 0.3978 | 0.9 |
| 0.0112 | 145.0 | 145 | 0.5695 | 0.8333 |
| 0.0112 | 146.0 | 146 | 0.5495 | 0.8 |
| 0.0112 | 147.0 | 147 | 0.4235 | 0.8667 |
| 0.0112 | 148.0 | 148 | 0.3878 | 0.9 |
| 0.0112 | 149.0 | 149 | 0.5536 | 0.9 |
| 0.0066 | 150.0 | 150 | 0.4343 | 0.8333 |
| 0.0066 | 151.0 | 151 | 0.5915 | 0.8667 |
| 0.0066 | 152.0 | 152 | 0.6523 | 0.8 |
| 0.0066 | 153.0 | 153 | 0.6478 | 0.8333 |
| 0.0066 | 154.0 | 154 | 0.6078 | 0.8333 |
| 0.0066 | 155.0 | 155 | 0.5253 | 0.9 |
| 0.0066 | 156.0 | 156 | 0.5023 | 0.9 |
| 0.0066 | 157.0 | 157 | 0.6782 | 0.8667 |
| 0.0066 | 158.0 | 158 | 0.4155 | 0.9 |
| 0.0066 | 159.0 | 159 | 0.6239 | 0.8667 |
| 0.0063 | 160.0 | 160 | 0.4657 | 0.8667 |
| 0.0063 | 161.0 | 161 | 0.3858 | 0.9 |
| 0.0063 | 162.0 | 162 | 0.4525 | 0.8667 |
| 0.0063 | 163.0 | 163 | 0.2853 | 0.9 |
| 0.0063 | 164.0 | 164 | 0.3835 | 0.8667 |
| 0.0063 | 165.0 | 165 | 0.3866 | 0.8333 |
| 0.0063 | 166.0 | 166 | 0.5272 | 0.8 |
| 0.0063 | 167.0 | 167 | 0.5175 | 0.8667 |
| 0.0063 | 168.0 | 168 | 0.5153 | 0.9 |
| 0.0063 | 169.0 | 169 | 0.4730 | 0.8667 |
| 0.0048 | 170.0 | 170 | 0.6657 | 0.8333 |
| 0.0048 | 171.0 | 171 | 0.5745 | 0.9 |
| 0.0048 | 172.0 | 172 | 0.7296 | 0.8 |
| 0.0048 | 173.0 | 173 | 0.4948 | 0.8333 |
| 0.0048 | 174.0 | 174 | 0.4577 | 0.8667 |
| 0.0048 | 175.0 | 175 | 0.5703 | 0.8667 |
| 0.0048 | 176.0 | 176 | 0.7392 | 0.8333 |
| 0.0048 | 177.0 | 177 | 0.7239 | 0.9 |
| 0.0048 | 178.0 | 178 | 0.3202 | 0.9 |
| 0.0048 | 179.0 | 179 | 0.4292 | 0.9 |
| 0.003 | 180.0 | 180 | 0.6611 | 0.8333 |
| 0.003 | 181.0 | 181 | 0.4829 | 0.9 |
| 0.003 | 182.0 | 182 | 0.4959 | 0.8667 |
| 0.003 | 183.0 | 183 | 0.2886 | 0.9 |
| 0.003 | 184.0 | 184 | 0.6587 | 0.8667 |
| 0.003 | 185.0 | 185 | 0.7223 | 0.8 |
| 0.003 | 186.0 | 186 | 0.6374 | 0.8667 |
| 0.003 | 187.0 | 187 | 0.5756 | 0.8333 |
| 0.003 | 188.0 | 188 | 0.6505 | 0.8667 |
| 0.003 | 189.0 | 189 | 0.5517 | 0.8667 |
| 0.0025 | 190.0 | 190 | 0.5176 | 0.8667 |
| 0.0025 | 191.0 | 191 | 0.4003 | 0.9 |
| 0.0025 | 192.0 | 192 | 0.5594 | 0.8667 |
| 0.0025 | 193.0 | 193 | 0.4229 | 0.9 |
| 0.0025 | 194.0 | 194 | 0.5938 | 0.8667 |
| 0.0025 | 195.0 | 195 | 0.6415 | 0.8667 |
| 0.0025 | 196.0 | 196 | 0.5056 | 0.8667 |
| 0.0025 | 197.0 | 197 | 0.5146 | 0.9 |
| 0.0025 | 198.0 | 198 | 0.5822 | 0.8667 |
| 0.0025 | 199.0 | 199 | 0.6066 | 0.8667 |
| 0.0021 | 200.0 | 200 | 0.6110 | 0.8333 |

Framework versions

  • Transformers 4.47.0
  • PyTorch 2.5.1+cu121
  • Datasets 3.3.1
  • Tokenizers 0.21.0