vit-base-patch16-224-in21k-finetuned-papsmear

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2825
  • Accuracy: 0.9338
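
A minimal inference sketch, assuming the checkpoint is available on the Hub under MakAIHealthLab/vit-base-patch16-224-in21k-finetuned-papsmear and using the standard transformers image-classification pipeline; the label names depend on the imagefolder class directories used during fine-tuning:

```python
from transformers import pipeline
from PIL import Image

# Checkpoint ID as listed on this model card.
MODEL_ID = "MakAIHealthLab/vit-base-patch16-224-in21k-finetuned-papsmear"

# The image-classification pipeline bundles the ViT image processor and the
# fine-tuned classification head.
classifier = pipeline("image-classification", model=MODEL_ID)

# Replace with a path to a Pap smear image; returned labels mirror the
# imagefolder directory names from training.
image = Image.open("example_pap_smear.jpg")
print(classifier(image, top_k=3))
```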

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
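
A minimal sketch of how the hyperparameters above map onto transformers TrainingArguments; the dataset loading, image processor, and metric code are omitted, output_dir is a placeholder, and the exact training script may have differed:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-in21k-finetuned-papsmear",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 x 4 = total train batch size of 128
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # Adam betas and epsilon below are the transformers defaults,
    # matching the values reported on this card.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```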

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:--------------|:--------|:-----|:----------------|:---------|
| No log        | 0.9231  | 9    | 1.7346          | 0.2647   |
| 1.7645        | 1.9487  | 19   | 1.6152          | 0.3088   |
| 1.661         | 2.9744  | 29   | 1.4663          | 0.4118   |
| 1.496         | 4.0     | 39   | 1.2989          | 0.4853   |
| 1.3097        | 4.9231  | 48   | 1.1491          | 0.5588   |
| 1.091         | 5.9487  | 58   | 0.9933          | 0.7206   |
| 0.9088        | 6.9744  | 68   | 0.9171          | 0.6985   |
| 0.7858        | 8.0     | 78   | 0.8301          | 0.7721   |
| 0.7016        | 8.9231  | 87   | 0.7925          | 0.7353   |
| 0.6136        | 9.9487  | 97   | 0.6992          | 0.7647   |
| 0.532         | 10.9744 | 107  | 0.6401          | 0.8309   |
| 0.5018        | 12.0    | 117  | 0.5787          | 0.8382   |
| 0.4279        | 12.9231 | 126  | 0.6130          | 0.8088   |
| 0.4116        | 13.9487 | 136  | 0.5090          | 0.8382   |
| 0.3848        | 14.9744 | 146  | 0.5165          | 0.8676   |
| 0.3449        | 16.0    | 156  | 0.4843          | 0.8382   |
| 0.3008        | 16.9231 | 165  | 0.5460          | 0.8456   |
| 0.2797        | 17.9487 | 175  | 0.4985          | 0.8309   |
| 0.2696        | 18.9744 | 185  | 0.5586          | 0.8456   |
| 0.2633        | 20.0    | 195  | 0.4349          | 0.9044   |
| 0.2569        | 20.9231 | 204  | 0.4017          | 0.8897   |
| 0.27          | 21.9487 | 214  | 0.4758          | 0.8603   |
| 0.2706        | 22.9744 | 224  | 0.4133          | 0.8897   |
| 0.2211        | 24.0    | 234  | 0.3844          | 0.9118   |
| 0.1977        | 24.9231 | 243  | 0.3497          | 0.9265   |
| 0.1969        | 25.9487 | 253  | 0.3736          | 0.9044   |
| 0.1776        | 26.9744 | 263  | 0.3797          | 0.9044   |
| 0.1787        | 28.0    | 273  | 0.3949          | 0.8897   |
| 0.18          | 28.9231 | 282  | 0.3278          | 0.9265   |
| 0.1797        | 29.9487 | 292  | 0.3615          | 0.9044   |
| 0.1665        | 30.9744 | 302  | 0.4174          | 0.8603   |
| 0.163         | 32.0    | 312  | 0.3574          | 0.8971   |
| 0.1498        | 32.9231 | 321  | 0.3591          | 0.9044   |
| 0.1405        | 33.9487 | 331  | 0.3017          | 0.9191   |
| 0.155         | 34.9744 | 341  | 0.3303          | 0.9265   |
| 0.1519        | 36.0    | 351  | 0.3559          | 0.8971   |
| 0.1415        | 36.9231 | 360  | 0.2890          | 0.9191   |
| 0.1256        | 37.9487 | 370  | 0.3445          | 0.8897   |
| 0.1217        | 38.9744 | 380  | 0.3435          | 0.9118   |
| 0.1285        | 40.0    | 390  | 0.3025          | 0.9191   |
| 0.1285        | 40.9231 | 399  | 0.3602          | 0.8824   |
| 0.1301        | 41.9487 | 409  | 0.3336          | 0.8897   |
| 0.1243        | 42.9744 | 419  | 0.2825          | 0.9338   |
| 0.1191        | 44.0    | 429  | 0.2835          | 0.9265   |
| 0.1221        | 44.9231 | 438  | 0.2724          | 0.9191   |
| 0.1151        | 45.9487 | 448  | 0.2708          | 0.9191   |
| 0.1195        | 46.1538 | 450  | 0.2707          | 0.9191   |

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.4.1+cu121
  • Datasets 3.0.2
  • Tokenizers 0.20.1
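
A quick sketch for checking that a local environment matches the pins above before expecting identical results; it only reads the installed package versions:

```python
import transformers
import torch
import datasets
import tokenizers

# Compare against the "Framework versions" list on this card.
print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)
```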