---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
- f1
- recall
- precision
model-index:
- name: deit-base-distilled-patch16-224-Brain_Tumors_Image_Classification
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8045685279187818
language:
- en
pipeline_tag: image-classification
---

# deit-base-distilled-patch16-224-Brain_Tumors_Image_Classification
This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset.
It achieves the following results on the evaluation set (a short usage sketch follows the metrics below):
- Loss: 1.8587
- Accuracy: 0.8046
- Weighted f1: 0.7749
- Micro f1: 0.8046
- Macro f1: 0.7814
- Weighted recall: 0.8046
- Micro recall: 0.8046
- Macro recall: 0.7996
- Weighted precision: 0.8567
- Micro precision: 0.8046
- Macro precision: 0.8710
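
The snippet below is a minimal inference sketch, assuming the fine-tuned weights are published on the Hugging Face Hub under a repository named after this card; the repository id and the image path are placeholders, not details taken from the card.

```python
from transformers import pipeline

# Placeholder repository id; substitute the actual Hub id of this fine-tuned model.
classifier = pipeline(
    "image-classification",
    model="<hub-username>/deit-base-distilled-patch16-224-Brain_Tumors_Image_Classification",
)

# "scan.jpg" is a placeholder path to a brain MRI image.
for prediction in classifier("scan.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```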
## Model description
Click here for the code that I used to create this model. This project is part of a comparison of seventeen (17) transformers. Click here to see the README markdown file for the full project.
## Intended uses & limitations
This model is intended to demonstrate my ability to solve a complex problem using technology.
## Training and evaluation data

Brain Tumor Image Classification Dataset

### Sample Images

### Class Distribution

#### Training Dataset

#### Evaluation Dataset
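
Since the card's metadata lists the generic `imagefolder` loader, the data was presumably organized as one folder per class. Below is a minimal loading sketch under that assumption; the directory name and the 80/20 split ratio are placeholders, not details taken from the card.

```python
from datasets import load_dataset

# Assumed layout: brain_tumor_images/<class_name>/<image file>
dataset = load_dataset("imagefolder", data_dir="brain_tumor_images")

# Hold out an evaluation split if the folder is not already split;
# the split ratio here is an assumption.
dataset = dataset["train"].train_test_split(test_size=0.2, seed=42)

print(dataset)
print(dataset["train"].features["label"].names)  # class names inferred from folder names
```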
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `Trainer` setup follows the list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
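
A hedged sketch of how these hyperparameters map onto `TrainingArguments` and `Trainer`; the number of labels, output directory, and the preprocessed dataset variables are placeholders rather than details taken from the card, and image preprocessing with the checkpoint's image processor is omitted for brevity.

```python
from transformers import (
    DeiTForImageClassification,
    Trainer,
    TrainingArguments,
)

checkpoint = "facebook/deit-base-distilled-patch16-224"

# Assumption: four tumor classes; the original classification head is replaced.
model = DeiTForImageClassification.from_pretrained(
    checkpoint,
    num_labels=4,
    ignore_mismatched_sizes=True,
)

training_args = TrainingArguments(
    output_dir="deit-brain-tumors",    # placeholder output directory
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    evaluation_strategy="epoch",       # assumption: evaluate once per epoch
    remove_unused_columns=False,       # keep the raw image column for on-the-fly preprocessing
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,       # placeholder: preprocessed training split
    eval_dataset=eval_dataset,         # placeholder: preprocessed evaluation split
)
trainer.train()
```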
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Weighted f1 | Micro f1 | Macro f1 | Weighted recall | Micro recall | Macro recall | Weighted precision | Micro precision | Macro precision |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.6561 | 1.0 | 180 | 1.5974 | 0.7792 | 0.7454 | 0.7792 | 0.7524 | 0.7792 | 0.7792 | 0.7722 | 0.8318 | 0.7792 | 0.8488 |
| 1.6561 | 2.0 | 360 | 1.7614 | 0.7944 | 0.7575 | 0.7944 | 0.7633 | 0.7944 | 0.7944 | 0.7896 | 0.8458 | 0.7944 | 0.8582 |
| 0.172 | 3.0 | 540 | 1.8587 | 0.8046 | 0.7749 | 0.8046 | 0.7814 | 0.8046 | 0.8046 | 0.7996 | 0.8567 | 0.8046 | 0.8710 |
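
For reference, below is a sketch of a `compute_metrics` function that produces the weighted, micro, and macro averages reported in the table above, using scikit-learn; this is an assumption about how such metrics are commonly computed, not necessarily the exact code behind this card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair passed by the Trainer during evaluation.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)

    metrics = {"accuracy": accuracy_score(labels, preds)}
    for average in ("weighted", "micro", "macro"):
        metrics[f"{average} f1"] = f1_score(labels, preds, average=average)
        metrics[f"{average} recall"] = recall_score(labels, preds, average=average)
        metrics[f"{average} precision"] = precision_score(labels, preds, average=average)
    return metrics
```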
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.11.0
- Tokenizers 0.13.3