dwiedarioo/vit-base-patch16-224-in21k-final2multibrainmri

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on an unspecified dataset. It achieves the following results at the final epoch (49):

  • Train Loss: 0.0072
  • Train Accuracy: 1.0
  • Train Top-3-accuracy: 1.0
  • Validation Loss: 0.1111
  • Validation Accuracy: 0.9719
  • Validation Top-3-accuracy: 0.9914
  • Epoch: 49
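
The snippet below is a minimal inference sketch, assuming the checkpoint can be loaded as a standard TensorFlow ViT image classifier through the transformers Auto classes; the image path is a placeholder, and the actual class names come from the label mapping stored in the model config.

```python
# Minimal inference sketch (assumption: a single RGB image file as input).
from PIL import Image
import tensorflow as tf
from transformers import AutoImageProcessor, TFAutoModelForImageClassification

repo = "dwiedarioo/vit-base-patch16-224-in21k-final2multibrainmri"
processor = AutoImageProcessor.from_pretrained(repo)
model = TFAutoModelForImageClassification.from_pretrained(repo)

image = Image.open("example_mri.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="tf")  # resize + normalize to 224x224

logits = model(**inputs).logits
pred_id = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[pred_id])  # label names are defined in the checkpoint config
```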

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: AdamWeightDecay (transformers.optimization_tf), wrapped in a dynamic loss-scale optimizer (initial_scale: 32768.0, dynamic_growth_steps: 2000)
      • learning_rate: PolynomialDecay schedule (keras.optimizers.schedules) with initial_learning_rate: 3e-05, decay_steps: 8200, end_learning_rate: 0.0, power: 1.0, cycle: False
      • beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False
      • weight_decay_rate: 0.01, decay: 0.0
  • training_precision: mixed_float16
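
For readers who prefer code to the serialized config, the sketch below rebuilds an equivalent optimizer with transformers.create_optimizer (a linear PolynomialDecay from 3e-05 to 0.0 over 8200 steps driving AdamWeightDecay with weight decay 0.01) and enables the mixed_float16 policy. This is a reconstruction under the assumption of zero warmup steps, not the original training script.

```python
# Rough reconstruction of the optimizer described above (a sketch, not the exact script).
import tensorflow as tf
from transformers import create_optimizer

# Mixed-precision policy used during training; Keras wraps the optimizer in a
# dynamic LossScaleOptimizer automatically when compiling under this policy.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# Linear (power=1.0) PolynomialDecay from 3e-05 to 0.0 over 8200 steps,
# AdamWeightDecay with beta_1=0.9, beta_2=0.999, epsilon=1e-08, weight decay 0.01.
optimizer, lr_schedule = create_optimizer(
    init_lr=3e-5,
    num_train_steps=8200,
    num_warmup_steps=0,        # assumption: no warmup appears in the config
    weight_decay_rate=0.01,
)

# Typical usage with a TF ViT model:
# model.compile(optimizer=optimizer, metrics=["accuracy"])
```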

Training results

| Train Loss | Train Accuracy | Train Top-3-accuracy | Validation Loss | Validation Accuracy | Validation Top-3-accuracy | Epoch |
|:----------:|:--------------:|:--------------------:|:---------------:|:-------------------:|:-------------------------:|:-----:|
| 2.2742 | 0.3856 | 0.6522 | 1.8596 | 0.6112 | 0.8337 | 0 |
| 1.5673 | 0.6919 | 0.8778 | 1.3120 | 0.7883 | 0.9136 | 1 |
| 1.0377 | 0.8622 | 0.9576 | 0.9078 | 0.8661 | 0.9611 | 2 |
| 0.6816 | 0.9511 | 0.9859 | 0.6497 | 0.9222 | 0.9849 | 3 |
| 0.4698 | 0.9805 | 0.9939 | 0.5104 | 0.9395 | 0.9870 | 4 |
| 0.3375 | 0.9897 | 0.9973 | 0.3975 | 0.9590 | 0.9892 | 5 |
| 0.2554 | 0.9966 | 0.9992 | 0.3107 | 0.9676 | 0.9978 | 6 |
| 0.2346 | 0.9905 | 0.9992 | 0.3804 | 0.9287 | 0.9914 | 7 |
| 0.1976 | 0.9935 | 0.9989 | 0.3250 | 0.9546 | 0.9914 | 8 |
| 0.1686 | 0.9939 | 0.9992 | 0.4980 | 0.8920 | 0.9762 | 9 |
| 0.1423 | 0.9969 | 0.9996 | 0.2129 | 0.9654 | 0.9957 | 10 |
| 0.1073 | 0.9992 | 1.0 | 0.1840 | 0.9741 | 0.9978 | 11 |
| 0.0925 | 0.9992 | 1.0 | 0.1714 | 0.9719 | 0.9978 | 12 |
| 0.0809 | 0.9992 | 1.0 | 0.1595 | 0.9719 | 0.9978 | 13 |
| 0.0715 | 0.9992 | 1.0 | 0.1503 | 0.9719 | 0.9978 | 14 |
| 0.0637 | 1.0 | 1.0 | 0.1426 | 0.9762 | 0.9978 | 15 |
| 0.0573 | 0.9996 | 1.0 | 0.1361 | 0.9784 | 0.9978 | 16 |
| 0.0516 | 1.0 | 1.0 | 0.1325 | 0.9784 | 0.9957 | 17 |
| 0.0469 | 1.0 | 1.0 | 0.1279 | 0.9784 | 0.9957 | 18 |
| 0.0427 | 1.0 | 1.0 | 0.1248 | 0.9784 | 0.9957 | 19 |
| 0.0392 | 1.0 | 1.0 | 0.1224 | 0.9784 | 0.9957 | 20 |
| 0.0359 | 1.0 | 1.0 | 0.1191 | 0.9784 | 0.9957 | 21 |
| 0.0331 | 1.0 | 1.0 | 0.1178 | 0.9762 | 0.9914 | 22 |
| 0.0306 | 1.0 | 1.0 | 0.1162 | 0.9784 | 0.9957 | 23 |
| 0.0284 | 1.0 | 1.0 | 0.1144 | 0.9784 | 0.9957 | 24 |
| 0.0264 | 1.0 | 1.0 | 0.1143 | 0.9741 | 0.9957 | 25 |
| 0.0246 | 1.0 | 1.0 | 0.1126 | 0.9762 | 0.9957 | 26 |
| 0.0230 | 1.0 | 1.0 | 0.1104 | 0.9784 | 0.9957 | 27 |
| 0.0215 | 1.0 | 1.0 | 0.1110 | 0.9762 | 0.9935 | 28 |
| 0.0201 | 1.0 | 1.0 | 0.1091 | 0.9762 | 0.9957 | 29 |
| 0.0189 | 1.0 | 1.0 | 0.1101 | 0.9741 | 0.9957 | 30 |
| 0.0178 | 1.0 | 1.0 | 0.1099 | 0.9762 | 0.9914 | 31 |
| 0.0167 | 1.0 | 1.0 | 0.1091 | 0.9762 | 0.9935 | 32 |
| 0.0158 | 1.0 | 1.0 | 0.1091 | 0.9762 | 0.9914 | 33 |
| 0.0149 | 1.0 | 1.0 | 0.1094 | 0.9741 | 0.9914 | 34 |
| 0.0141 | 1.0 | 1.0 | 0.1088 | 0.9719 | 0.9914 | 35 |
| 0.0134 | 1.0 | 1.0 | 0.1089 | 0.9762 | 0.9914 | 36 |
| 0.0127 | 1.0 | 1.0 | 0.1084 | 0.9741 | 0.9935 | 37 |
| 0.0120 | 1.0 | 1.0 | 0.1087 | 0.9741 | 0.9914 | 38 |
| 0.0114 | 1.0 | 1.0 | 0.1078 | 0.9741 | 0.9914 | 39 |
| 0.0109 | 1.0 | 1.0 | 0.1088 | 0.9719 | 0.9914 | 40 |
| 0.0104 | 1.0 | 1.0 | 0.1087 | 0.9719 | 0.9914 | 41 |
| 0.0099 | 1.0 | 1.0 | 0.1094 | 0.9719 | 0.9935 | 42 |
| 0.0094 | 1.0 | 1.0 | 0.1095 | 0.9719 | 0.9914 | 43 |
| 0.0090 | 1.0 | 1.0 | 0.1099 | 0.9719 | 0.9914 | 44 |
| 0.0086 | 1.0 | 1.0 | 0.1112 | 0.9719 | 0.9914 | 45 |
| 0.0082 | 1.0 | 1.0 | 0.1104 | 0.9719 | 0.9914 | 46 |
| 0.0079 | 1.0 | 1.0 | 0.1107 | 0.9719 | 0.9914 | 47 |
| 0.0075 | 1.0 | 1.0 | 0.1102 | 0.9741 | 0.9914 | 48 |
| 0.0072 | 1.0 | 1.0 | 0.1111 | 0.9719 | 0.9914 | 49 |

Framework versions

  • Transformers 4.35.0
  • TensorFlow 2.14.0
  • Datasets 2.14.6
  • Tokenizers 0.14.1