---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: >-
      vit-base-patch16-224-in21k-FINALLaneClassifier-VIT30epochsAUGMENTEDWITHTEST
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 1.0
          - name: F1
            type: f1
            value: 1.0
          - name: Precision
            type: precision
            value: 1.0
          - name: Recall
            type: recall
            value: 1.0
---

# vit-base-patch16-224-in21k-FINALLaneClassifier-VIT30epochsAUGMENTEDWITHTEST

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the results):

- Loss: 0.0000
- Accuracy: 1.0
- F1: 1.0
- Precision: 1.0
- Recall: 1.0
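
The sketch below shows how the checkpoint could be loaded with the Transformers `image-classification` pipeline. The repository id (`mmomm25/vit-base-patch16-224-in21k-FINALLaneClassifier-VIT30epochsAUGMENTEDWITHTEST`) and the example image path are assumptions based on this card, not tested commands.

```python
from transformers import pipeline
from PIL import Image

# Assumed Hub repository id for this checkpoint (derived from the card title and uploader).
classifier = pipeline(
    "image-classification",
    model="mmomm25/vit-base-patch16-224-in21k-FINALLaneClassifier-VIT30epochsAUGMENTEDWITHTEST",
)

# Placeholder image path; any RGB lane image should work.
image = Image.open("example_lane.jpg")
print(classifier(image))  # list of {'label': ..., 'score': ...} dicts
```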

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch reconstructing them with `TrainingArguments` follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
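
A minimal sketch of how these settings map onto `transformers.TrainingArguments` is given below, assuming a standard `Trainer` setup; `output_dir` is a hypothetical path, and the Adam betas/epsilon are left at the library defaults listed above.

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="vit-lane-classifier",   # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,      # 32 x 4 = total train batch size 128
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    eval_strategy="epoch",
    save_strategy="epoch",
    # Adam betas (0.9, 0.999) and epsilon 1e-08 are the Transformers defaults.
)
# training_args would then be passed to a Trainer together with the ViT model,
# the imagefolder train/eval datasets, and a compute_metrics function.
```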

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy           | F1                 | Precision          | Recall             |
|:-------------:|:-------:|:----:|:---------------:|:------------------:|:------------------:|:------------------:|:------------------:|
| 0.0229        | 0.9973  | 274  | 0.0166          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0083        | 1.9982  | 549  | 0.0062          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0055        | 2.9991  | 824  | 0.0032          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0025        | 4.0     | 1099 | 0.0019          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.004         | 4.9973  | 1373 | 0.0013          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.001         | 5.9982  | 1648 | 0.0009          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0032        | 6.9991  | 1923 | 0.0014          | 0.9998862343572241 | 0.9998861783406705 | 0.9998887157801024 | 0.9998836668217777 |
| 0.0011        | 8.0     | 2198 | 0.0005          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0035        | 8.9973  | 2472 | 0.0004          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0004        | 9.9982  | 2747 | 0.0003          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0003        | 10.9991 | 3022 | 0.0003          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0004        | 12.0    | 3297 | 0.0003          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0002        | 12.9973 | 3571 | 0.0002          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0005        | 13.9982 | 3846 | 0.0002          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.006         | 14.9991 | 4121 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 16.0    | 4396 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 16.9973 | 4670 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 17.9982 | 4945 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0004        | 18.9991 | 5220 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 20.0    | 5495 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 20.9973 | 5769 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0012        | 21.9982 | 6044 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 22.9991 | 6319 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 24.0    | 6594 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 24.9973 | 6868 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0002        | 25.9982 | 7143 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 26.9991 | 7418 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 28.0    | 7693 | 0.0001          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 28.9973 | 7967 | 0.0000          | 1.0                | 1.0                | 1.0                | 1.0                |
| 0.0001        | 29.9181 | 8220 | 0.0000          | 1.0                | 1.0                | 1.0                | 1.0                |
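
The per-epoch metrics above (accuracy, F1, precision, recall) match what a standard `compute_metrics` callback built on the `evaluate` library would report; a hedged sketch follows. The `weighted` averaging choice is an assumption, and returning the raw `.compute()` dictionaries would explain the dict-shaped values (e.g. `{'accuracy': 1.0}`) in the auto-generated logs, shown as plain numbers in the table above.

```python
import numpy as np
import evaluate

# Hedged sketch of a compute_metrics function consistent with the logged metrics.
accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")
precision = evaluate.load("precision")
recall = evaluate.load("recall")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "weighted" averaging is an assumption for the multi-class case.
    return {
        "accuracy": accuracy.compute(predictions=preds, references=labels),
        "f1": f1.compute(predictions=preds, references=labels, average="weighted"),
        "precision": precision.compute(predictions=preds, references=labels, average="weighted"),
        "recall": recall.compute(predictions=preds, references=labels, average="weighted"),
    }
```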

### Framework versions

- Transformers 4.43.3
- PyTorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1