# swinv2-tiny-patch4-window8-256-dmae-humeda-DAV68

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.2868
- Accuracy: 0.9314

## Model description

More information needed

## Intended uses & limitations

More information needed
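
Until more details are documented, the sketch below shows one plausible way to run inference with this checkpoint, assuming it is used as a standard Transformers image classifier; the image path `example.jpg` is a placeholder, not part of the model card.

```python
# Minimal inference sketch (hedged): single-image classification with this
# checkpoint. The image path below is a placeholder.
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV68"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image
inputs = processor(images=image, return_tensors="pt")  # resizes/normalizes per the base model

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```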

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 45
- mixed_precision_training: Native AMP
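
A hedged sketch of these settings expressed as Transformers `TrainingArguments`; the output directory is a placeholder, and dataset loading plus the `Trainer` wiring are omitted.

```python
# Reproduction sketch of the hyperparameters above (assumptions noted inline).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV68",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # effective train batch size: 16 * 4 = 64
    num_train_epochs=45,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    optim="adamw_torch",             # AdamW with the default betas/epsilon listed above
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                       # native AMP mixed precision
)
```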

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.0651        | 1.0     | 18   | 1.0769          | 0.4686   |
| 0.9503        | 2.0     | 36   | 0.8111          | 0.7257   |
| 0.5745        | 3.0     | 54   | 0.4972          | 0.7314   |
| 0.4746        | 4.0     | 72   | 0.4788          | 0.7486   |
| 0.4363        | 5.0     | 90   | 0.5427          | 0.7314   |
| 0.4362        | 6.0     | 108  | 0.3581          | 0.8686   |
| 0.3476        | 7.0     | 126  | 0.3572          | 0.8686   |
| 0.3113        | 8.0     | 144  | 0.4335          | 0.7886   |
| 0.3943        | 9.0     | 162  | 0.2782          | 0.8686   |
| 0.2574        | 10.0    | 180  | 0.3320          | 0.8686   |
| 0.2345        | 11.0    | 198  | 0.4383          | 0.8343   |
| 0.3002        | 12.0    | 216  | 0.3053          | 0.8686   |
| 0.2038        | 13.0    | 234  | 0.3189          | 0.8743   |
| 0.2244        | 14.0    | 252  | 0.2766          | 0.8743   |
| 0.2277        | 15.0    | 270  | 0.2637          | 0.8857   |
| 0.2318        | 16.0    | 288  | 0.4612          | 0.8114   |
| 0.1908        | 17.0    | 306  | 0.3167          | 0.8857   |
| 0.1932        | 18.0    | 324  | 0.2949          | 0.9029   |
| 0.1676        | 19.0    | 342  | 0.2627          | 0.9086   |
| 0.1442        | 20.0    | 360  | 0.2584          | 0.9143   |
| 0.1606        | 21.0    | 378  | 0.2626          | 0.9143   |
| 0.1624        | 22.0    | 396  | 0.2351          | 0.9257   |
| 0.1735        | 23.0    | 414  | 0.2746          | 0.9257   |
| 0.1604        | 24.0    | 432  | 0.3237          | 0.8914   |
| 0.122         | 25.0    | 450  | 0.2852          | 0.8914   |
| 0.1447        | 26.0    | 468  | 0.2594          | 0.92     |
| 0.1265        | 27.0    | 486  | 0.2857          | 0.9029   |
| 0.1265        | 28.0    | 504  | 0.3238          | 0.8743   |
| 0.122         | 29.0    | 522  | 0.3029          | 0.8857   |
| 0.0929        | 30.0    | 540  | 0.2936          | 0.9029   |
| 0.1276        | 31.0    | 558  | 0.2777          | 0.9143   |
| 0.1118        | 32.0    | 576  | 0.2812          | 0.9143   |
| 0.1058        | 33.0    | 594  | 0.2925          | 0.92     |
| 0.0824        | 34.0    | 612  | 0.3519          | 0.8914   |
| 0.1084        | 35.0    | 630  | 0.2847          | 0.92     |
| 0.1074        | 36.0    | 648  | 0.2735          | 0.9143   |
| 0.1415        | 37.0    | 666  | 0.2724          | 0.9257   |
| 0.0702        | 38.0    | 684  | 0.2873          | 0.92     |
| 0.0987        | 39.0    | 702  | 0.2924          | 0.92     |
| 0.0637        | 40.0    | 720  | 0.2868          | 0.9314   |
| 0.1183        | 41.0    | 738  | 0.2892          | 0.92     |
| 0.096         | 42.0    | 756  | 0.2910          | 0.9143   |
| 0.0719        | 42.5217 | 765  | 0.2897          | 0.9143   |

### Framework versions

- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.4.1
- Tokenizers 0.21.1