# swin-tiny-patch4-window7-224-finetuned-st-wsdmhar
This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.1144
- Accuracy: 0.9711
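The card does not include a usage example. A minimal sketch of loading this checkpoint with the `transformers` image-classification pipeline (the model id is taken from the model tree at the bottom of this card; the image path is a placeholder you must replace):

```python
from transformers import pipeline

# Downloads the fine-tuned checkpoint from the Hugging Face Hub on first use.
classifier = pipeline(
    "image-classification",
    model="ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar",
)

# Replace with a path or URL to an image from the target domain.
predictions = classifier("path/to/image.jpg")
for p in predictions:
    print(p["label"], round(p["score"], 4))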
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
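The base checkpoint's name encodes the architecture's key input dimensions. As a quick sketch of what those numbers mean at the first stage of the network (pure arithmetic derived from the name, not from the card):

```python
image_size = 224    # input resolution is 224x224 pixels, per the model name
patch_size = 4      # each patch embedding covers a 4x4 pixel region
window_size = 7     # self-attention is computed within 7x7-patch windows

patches_per_side = image_size // patch_size          # 56 patches per side
windows_per_side = patches_per_side // window_size   # 8 windows per side at stage 1
num_tokens = patches_per_side ** 2                   # 3136 tokens at stage 1

print(patches_per_side, windows_per_side, num_tokens)  # 56 8 3136
```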
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
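Several of the values above are derived from the others; a small sanity check ties them together (steps per epoch comes from the results table below, which logs 53 optimizer steps per epoch and 5300 in total):

```python
train_batch_size = 32
gradient_accumulation_steps = 4
# Effective batch size: gradients from 4 micro-batches of 32 are accumulated
# before each optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128

num_epochs = 100
steps_per_epoch = 53                         # from the training-results table
total_steps = steps_per_epoch * num_epochs   # 5300 optimizer steps overall

warmup_ratio = 0.1
warmup_steps = int(warmup_ratio * total_steps)  # 530 linear-warmup steps

print(total_train_batch_size, total_steps, warmup_steps)  # 128 5300 530
```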
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
1.4641 | 1.0 | 53 | 1.2510 | 0.5768 |
0.7713 | 2.0 | 106 | 0.6282 | 0.7435 |
0.5822 | 3.0 | 159 | 0.4503 | 0.8251 |
0.5086 | 4.0 | 212 | 0.4028 | 0.8306 |
0.4499 | 5.0 | 265 | 0.3399 | 0.8709 |
0.4171 | 6.0 | 318 | 0.3024 | 0.8898 |
0.3597 | 7.0 | 371 | 0.2584 | 0.9029 |
0.3201 | 8.0 | 424 | 0.2467 | 0.9143 |
0.2974 | 9.0 | 477 | 0.2784 | 0.8981 |
0.3323 | 10.0 | 530 | 0.2414 | 0.9101 |
0.2717 | 11.0 | 583 | 0.2051 | 0.9349 |
0.2647 | 12.0 | 636 | 0.1944 | 0.9294 |
0.296 | 13.0 | 689 | 0.1871 | 0.9329 |
0.2434 | 14.0 | 742 | 0.1701 | 0.9411 |
0.2293 | 15.0 | 795 | 0.1685 | 0.9435 |
0.2196 | 16.0 | 848 | 0.1486 | 0.9446 |
0.2249 | 17.0 | 901 | 0.1438 | 0.9456 |
0.2142 | 18.0 | 954 | 0.1529 | 0.9449 |
0.2024 | 19.0 | 1007 | 0.1361 | 0.9532 |
0.2177 | 20.0 | 1060 | 0.1906 | 0.9315 |
0.1812 | 21.0 | 1113 | 0.1333 | 0.9532 |
0.1771 | 22.0 | 1166 | 0.1424 | 0.9528 |
0.1718 | 23.0 | 1219 | 0.1370 | 0.9545 |
0.1333 | 24.0 | 1272 | 0.1550 | 0.9466 |
0.1592 | 25.0 | 1325 | 0.1344 | 0.9535 |
0.1279 | 26.0 | 1378 | 0.1201 | 0.9590 |
0.1547 | 27.0 | 1431 | 0.1225 | 0.9597 |
0.1443 | 28.0 | 1484 | 0.1526 | 0.9480 |
0.1306 | 29.0 | 1537 | 0.1188 | 0.9597 |
0.129 | 30.0 | 1590 | 0.1204 | 0.9614 |
0.1354 | 31.0 | 1643 | 0.1351 | 0.9570 |
0.135 | 32.0 | 1696 | 0.1010 | 0.9676 |
0.137 | 33.0 | 1749 | 0.1381 | 0.9566 |
0.101 | 34.0 | 1802 | 0.1119 | 0.9642 |
0.1118 | 35.0 | 1855 | 0.1056 | 0.9656 |
0.1206 | 36.0 | 1908 | 0.0975 | 0.9663 |
0.1028 | 37.0 | 1961 | 0.1265 | 0.9635 |
0.1058 | 38.0 | 2014 | 0.0958 | 0.9680 |
0.1013 | 39.0 | 2067 | 0.1060 | 0.9642 |
0.0765 | 40.0 | 2120 | 0.1024 | 0.9659 |
0.0997 | 41.0 | 2173 | 0.1116 | 0.9642 |
0.0933 | 42.0 | 2226 | 0.1082 | 0.9666 |
0.0937 | 43.0 | 2279 | 0.1095 | 0.9680 |
0.0831 | 44.0 | 2332 | 0.1034 | 0.9683 |
0.0849 | 45.0 | 2385 | 0.0995 | 0.9690 |
0.0739 | 46.0 | 2438 | 0.1042 | 0.9666 |
0.0944 | 47.0 | 2491 | 0.1068 | 0.9697 |
0.0821 | 48.0 | 2544 | 0.1087 | 0.9690 |
0.0835 | 49.0 | 2597 | 0.0975 | 0.9721 |
0.0666 | 50.0 | 2650 | 0.1189 | 0.9638 |
0.0614 | 51.0 | 2703 | 0.1421 | 0.9611 |
0.0738 | 52.0 | 2756 | 0.1253 | 0.9638 |
0.0949 | 53.0 | 2809 | 0.1274 | 0.9663 |
0.068 | 54.0 | 2862 | 0.1051 | 0.9669 |
0.0626 | 55.0 | 2915 | 0.1102 | 0.9673 |
0.0647 | 56.0 | 2968 | 0.1096 | 0.9673 |
0.0803 | 57.0 | 3021 | 0.1049 | 0.9683 |
0.0744 | 58.0 | 3074 | 0.1039 | 0.9697 |
0.0769 | 59.0 | 3127 | 0.1060 | 0.9690 |
0.0763 | 60.0 | 3180 | 0.1077 | 0.9680 |
0.0591 | 61.0 | 3233 | 0.1165 | 0.9680 |
0.0649 | 62.0 | 3286 | 0.1109 | 0.9694 |
0.0557 | 63.0 | 3339 | 0.1162 | 0.9680 |
0.0644 | 64.0 | 3392 | 0.1039 | 0.9718 |
0.0558 | 65.0 | 3445 | 0.1182 | 0.9687 |
0.0633 | 66.0 | 3498 | 0.1151 | 0.9680 |
0.0586 | 67.0 | 3551 | 0.1147 | 0.9694 |
0.0651 | 68.0 | 3604 | 0.1124 | 0.9711 |
0.0693 | 69.0 | 3657 | 0.1104 | 0.9687 |
0.0584 | 70.0 | 3710 | 0.1177 | 0.9697 |
0.0471 | 71.0 | 3763 | 0.1160 | 0.9690 |
0.0614 | 72.0 | 3816 | 0.1220 | 0.9680 |
0.0583 | 73.0 | 3869 | 0.1236 | 0.9656 |
0.0495 | 74.0 | 3922 | 0.1076 | 0.9718 |
0.0574 | 75.0 | 3975 | 0.1163 | 0.9673 |
0.0399 | 76.0 | 4028 | 0.1126 | 0.9683 |
0.0357 | 77.0 | 4081 | 0.1064 | 0.9728 |
0.0441 | 78.0 | 4134 | 0.1139 | 0.9694 |
0.0504 | 79.0 | 4187 | 0.1083 | 0.9707 |
0.0546 | 80.0 | 4240 | 0.1167 | 0.9676 |
0.0528 | 81.0 | 4293 | 0.1143 | 0.9697 |
0.0385 | 82.0 | 4346 | 0.1226 | 0.9676 |
0.0511 | 83.0 | 4399 | 0.1199 | 0.9694 |
0.0533 | 84.0 | 4452 | 0.1279 | 0.9673 |
0.043 | 85.0 | 4505 | 0.1161 | 0.9714 |
0.0231 | 86.0 | 4558 | 0.1166 | 0.9728 |
0.0426 | 87.0 | 4611 | 0.1239 | 0.9690 |
0.0565 | 88.0 | 4664 | 0.1189 | 0.9687 |
0.0378 | 89.0 | 4717 | 0.1186 | 0.9697 |
0.0406 | 90.0 | 4770 | 0.1209 | 0.9718 |
0.0306 | 91.0 | 4823 | 0.1189 | 0.9721 |
0.0354 | 92.0 | 4876 | 0.1244 | 0.9687 |
0.0293 | 93.0 | 4929 | 0.1235 | 0.9697 |
0.0381 | 94.0 | 4982 | 0.1186 | 0.9711 |
0.0372 | 95.0 | 5035 | 0.1172 | 0.9714 |
0.0469 | 96.0 | 5088 | 0.1180 | 0.9711 |
0.0535 | 97.0 | 5141 | 0.1152 | 0.9718 |
0.0496 | 98.0 | 5194 | 0.1157 | 0.9714 |
0.034 | 99.0 | 5247 | 0.1145 | 0.9714 |
0.0348 | 100.0 | 5300 | 0.1144 | 0.9711 |
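The summary at the top reports the final (epoch 100) checkpoint, which is not the best one in the table. A quick scan over a few of the strongest rows (epoch/accuracy pairs copied verbatim from the table above) finds the peak:

```python
# (epoch, validation accuracy) pairs copied from the training-results table
checkpoints = [
    (49, 0.9721),
    (77, 0.9728),
    (86, 0.9728),
    (91, 0.9721),
    (100, 0.9711),
]

# max() returns the first pair with the highest accuracy
best_epoch, best_acc = max(checkpoints, key=lambda c: c[1])
print(best_epoch, best_acc)  # 77 0.9728
```

If early stopping or best-checkpoint selection had been used, the epoch-77 weights (validation accuracy 0.9728) would have been kept instead of the final ones.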
### Framework versions
- Transformers 4.43.2
- Pytorch 2.3.1+cu118
- Datasets 2.20.0
- Tokenizers 0.19.1
## Model tree for ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-wsdmhar

- Base model: microsoft/swin-tiny-patch4-window7-224