---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: vit-base-patch16-224-in21k-finetuned-biopsy
    results: []
---

# vit-base-patch16-224-in21k-finetuned-biopsy

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1249
- Accuracy: 0.9682

These figures correspond to the epoch 38 checkpoint in the training results table below.
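
As a usage sketch (the original card does not include one): the checkpoint can be loaded with the transformers `image-classification` pipeline. The repository id below is inferred from the author and model name, and `biopsy.png` is a placeholder path for a local histopathology image; both are assumptions.

```python
from transformers import pipeline

# Hypothetical repository id inferred from the model name; adjust as needed.
classifier = pipeline(
    "image-classification",
    model="wcosmas/vit-base-patch16-224-in21k-finetuned-biopsy",
)

# "biopsy.png" is a placeholder for a local image file.
predictions = classifier("biopsy.png")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```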

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
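
As a reproducibility aid (not part of the original card), the hyperparameters above map onto `transformers.TrainingArguments` roughly as below. The `output_dir` name is an assumption, and the model, image processor, and dataset wiring for an actual `Trainer` run is omitted. Note that on a single device, a per-device batch size of 32 with 4 gradient-accumulation steps yields the reported total train batch size of 128.

```python
from transformers import TrainingArguments

# A minimal sketch matching the hyperparameters listed above; the Adam
# betas/epsilon and linear scheduler shown in the list are the defaults.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-in21k-finetuned-biopsy",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```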

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.1332 | 1.0 | 42 | 1.0712 | 0.5494 |
| 0.6556 | 2.0 | 84 | 0.5742 | 0.8492 |
| 0.3987 | 3.0 | 126 | 0.3950 | 0.8894 |
| 0.2825 | 4.0 | 168 | 0.3924 | 0.8777 |
| 0.3662 | 5.0 | 210 | 0.3622 | 0.8861 |
| 0.2218 | 6.0 | 252 | 0.2706 | 0.9246 |
| 0.2236 | 7.0 | 294 | 0.2283 | 0.9347 |
| 0.2224 | 8.0 | 336 | 0.2367 | 0.9313 |
| 0.1754 | 9.0 | 378 | 0.2139 | 0.9296 |
| 0.1707 | 10.0 | 420 | 0.1829 | 0.9497 |
| 0.1619 | 11.0 | 462 | 0.2172 | 0.9464 |
| 0.1547 | 12.0 | 504 | 0.1960 | 0.9380 |
| 0.1213 | 13.0 | 546 | 0.1484 | 0.9581 |
| 0.1388 | 14.0 | 588 | 0.1689 | 0.9581 |
| 0.1009 | 15.0 | 630 | 0.1494 | 0.9581 |
| 0.124 | 16.0 | 672 | 0.1564 | 0.9581 |
| 0.1078 | 17.0 | 714 | 0.1728 | 0.9514 |
| 0.102 | 18.0 | 756 | 0.1669 | 0.9447 |
| 0.1006 | 19.0 | 798 | 0.1405 | 0.9581 |
| 0.0791 | 20.0 | 840 | 0.1179 | 0.9665 |
| 0.0694 | 21.0 | 882 | 0.1424 | 0.9631 |
| 0.0627 | 22.0 | 924 | 0.1224 | 0.9665 |
| 0.0883 | 23.0 | 966 | 0.1602 | 0.9447 |
| 0.074 | 24.0 | 1008 | 0.1315 | 0.9615 |
| 0.0708 | 25.0 | 1050 | 0.1331 | 0.9631 |
| 0.06 | 26.0 | 1092 | 0.1191 | 0.9665 |
| 0.083 | 27.0 | 1134 | 0.1583 | 0.9531 |
| 0.0584 | 28.0 | 1176 | 0.1348 | 0.9564 |
| 0.0627 | 29.0 | 1218 | 0.1270 | 0.9564 |
| 0.0627 | 30.0 | 1260 | 0.1411 | 0.9564 |
| 0.038 | 31.0 | 1302 | 0.1208 | 0.9665 |
| 0.0569 | 32.0 | 1344 | 0.1587 | 0.9514 |
| 0.0502 | 33.0 | 1386 | 0.1501 | 0.9497 |
| 0.0464 | 34.0 | 1428 | 0.1508 | 0.9615 |
| 0.0317 | 35.0 | 1470 | 0.1309 | 0.9631 |
| 0.0552 | 36.0 | 1512 | 0.1372 | 0.9598 |
| 0.031 | 37.0 | 1554 | 0.1258 | 0.9598 |
| 0.0383 | 38.0 | 1596 | 0.1249 | 0.9682 |
| 0.036 | 39.0 | 1638 | 0.1312 | 0.9665 |
| 0.0405 | 40.0 | 1680 | 0.1207 | 0.9665 |
| 0.0343 | 41.0 | 1722 | 0.1233 | 0.9648 |
| 0.0325 | 42.0 | 1764 | 0.1286 | 0.9631 |
| 0.0293 | 43.0 | 1806 | 0.1135 | 0.9682 |
| 0.0306 | 44.0 | 1848 | 0.1258 | 0.9615 |
| 0.0267 | 45.0 | 1890 | 0.1261 | 0.9648 |
| 0.0338 | 46.0 | 1932 | 0.1209 | 0.9665 |
| 0.0213 | 47.0 | 1974 | 0.1157 | 0.9665 |
| 0.0285 | 48.0 | 2016 | 0.1203 | 0.9631 |
| 0.0287 | 49.0 | 2058 | 0.1240 | 0.9648 |
| 0.0183 | 50.0 | 2100 | 0.1224 | 0.9665 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.19.1
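
Since the card pins exact library versions, one way to verify a matching environment is to compare installed versions against the list above. This is a small illustrative check, not part of the original card; newer versions often work, but exact pins are the safest way to reproduce the run.

```python
import datasets
import tokenizers
import torch
import transformers

# Versions the card was produced with (from the list above).
expected = {
    "transformers": "4.44.2",
    "torch": "2.4.1+cu121",
    "datasets": "3.0.1",
    "tokenizers": "0.19.1",
}

installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}

for name, want in expected.items():
    have = installed[name]
    status = "OK" if have == want else f"expected {want}"
    print(f"{name}: {have} ({status})")
```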