---

license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: swin-tiny-patch4-window7-224-FINALConcreteClassifier-SWIN50epochsAUGMENTED
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 1.0
    - name: F1
      type: f1
      value: 1.0
    - name: Precision
      type: precision
      value: 1.0
    - name: Recall
      type: recall
      value: 1.0
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# swin-tiny-patch4-window7-224-FINALConcreteClassifier-SWIN50epochsAUGMENTED

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Accuracy: 1.0
- F1: 1.0
- Precision: 1.0
- Recall: 1.0

## Model description

More information needed

## Intended uses & limitations

More information needed
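
As an illustrative sketch (not part of the auto-generated card), the checkpoint can typically be loaded with the standard `transformers` image-classification pipeline. The model path and the input image name below are assumptions; substitute the actual Hub repository id or local directory for this checkpoint.

```python
from transformers import pipeline

# Hypothetical path: replace with the real repo id or local folder of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="swin-tiny-patch4-window7-224-FINALConcreteClassifier-SWIN50epochsAUGMENTED",
)

# Hypothetical input file; the pipeline also accepts PIL images and URLs.
for prediction in classifier("concrete_sample.jpg"):
    print(prediction["label"], round(prediction["score"], 4))
```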

## Training and evaluation data

More information needed
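
The dataset type recorded above is `imagefolder`. As a hedged sketch (the directory path and layout are assumptions), such a dataset is usually loaded with the `datasets` library, with class labels inferred from sub-folder names:

```python
from datasets import load_dataset

# Hypothetical data_dir; expects one sub-folder per class, each containing its images.
dataset = load_dataset("imagefolder", data_dir="path/to/concrete_images")
print(dataset)
```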

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
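
A hedged reconstruction of the `TrainingArguments` implied by these hyperparameters is shown below; the output directory and the per-epoch evaluation cadence are assumptions, not taken from the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-FINALConcreteClassifier-SWIN50epochsAUGMENTED",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,   # 16 x 4 = 64 effective train batch size
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    eval_strategy="epoch",           # assumption: one evaluation per epoch, as the results table suggests
)
```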

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--:|:---------:|:------:|
| 0.3875 | 0.9994 | 407 | 0.2752 | 0.9272224781206817 | 0.9299076962468851 | 0.9308936484753314 | 0.9298532516284993 |
| 0.2001 | 1.9988 | 814 | 0.0583 | 0.983110701673576 | 0.9837765293059086 | 0.9846788595224822 | 0.9836211079426627 |
| 0.1626 | 2.9982 | 1221 | 0.0207 | 0.9938584369722094 | 0.9941597712458348 | 0.9943896461187967 | 0.9941051527238169 |
| 0.088 | 4.0 | 1629 | 0.0088 | 0.9969292184861047 | 0.9970539871142889 | 0.9970656946831583 | 0.9970776666292009 |
| 0.1079 | 4.9994 | 2036 | 0.0046 | 0.9987716873944419 | 0.9988142853329625 | 0.9988066339632395 | 0.99882263684388 |
| 0.102 | 5.9988 | 2443 | 0.0034 | 0.9989252264701366 | 0.9989565946802677 | 0.998933981872335 | 0.9989857043158454 |
| 0.0594 | 6.9982 | 2850 | 0.0118 | 0.9972362966374942 | 0.9973346644159505 | 0.9973144572332442 | 0.9974051297029489 |
| 0.0335 | 8.0 | 3258 | 0.0030 | 0.9987716873944419 | 0.9988034164628696 | 0.9987863396601946 | 0.9988260749455921 |
| 0.0368 | 8.9994 | 3665 | 0.0036 | 0.9990787655458314 | 0.999110823927686 | 0.99909200968523 | 0.9991359447004609 |
| 0.0564 | 9.9988 | 4072 | 0.0040 | 0.9984646092430524 | 0.998509715288995 | 0.9984881711855396 | 0.9985402551521871 |
| 0.052 | 10.9982 | 4479 | 0.0021 | 0.9989252264701366 | 0.998956584824745 | 0.9989419496612204 | 0.9989777168523596 |
| 0.0429 | 12.0 | 4887 | 0.0033 | 0.9983110701673575 | 0.9983570515623278 | 0.9984174575960668 | 0.9983115930842853 |
| 0.047 | 12.9994 | 5294 | 0.0008 | 0.9998464609243052 | 0.9998504202011455 | 0.9998534583821805 | 0.9998475609756098 |
| 0.0391 | 13.9988 | 5701 | 0.0005 | 0.9998464609243052 | 0.999851770829272 | 0.9998561565017261 | 0.9998475609756098 |
| 0.0499 | 14.9982 | 6108 | 0.0011 | 0.9995393827729157 | 0.9995512387635233 | 0.9995614035087719 | 0.9995426829268292 |
| 0.0351 | 16.0 | 6516 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.021 | 16.9994 | 6923 | 0.0054 | 0.9984646092430524 | 0.9985038406196534 | 0.9985498839907192 | 0.9984756097560976 |
| 0.0384 | 17.9988 | 7330 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0093 | 18.9982 | 7737 | 0.0007 | 0.9995393827729157 | 0.999555371210602 | 0.9995443499392467 | 0.9995679723502304 |
| 0.0264 | 20.0 | 8145 | 0.0004 | 0.9998464609243052 | 0.9998528788154148 | 0.9998499399759904 | 0.9998559907834101 |
| 0.0191 | 20.9994 | 8552 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.05 | 21.9988 | 8959 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0155 | 22.9982 | 9366 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0164 | 24.0 | 9774 | 0.0038 | 0.9987716873944419 | 0.998813860406548 | 0.9988584474885844 | 0.998780487804878 |
| 0.0202 | 24.9994 | 10181 | 0.0004 | 0.9998464609243052 | 0.9998504202011455 | 0.9998534583821805 | 0.9998475609756098 |
| 0.0576 | 25.9988 | 10588 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0098 | 26.9982 | 10995 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0091 | 28.0 | 11403 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0259 | 28.9994 | 11810 | 0.0004 | 0.9995393827729157 | 0.999555371210602 | 0.9995443499392467 | 0.9995679723502304 |
| 0.0064 | 29.9988 | 12217 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0097 | 30.9982 | 12624 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0102 | 32.0 | 13032 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0082 | 32.9994 | 13439 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0094 | 33.9988 | 13846 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0085 | 34.9982 | 14253 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0079 | 36.0 | 14661 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.006 | 36.9994 | 15068 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0039 | 37.9988 | 15475 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.023 | 38.9982 | 15882 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0026 | 40.0 | 16290 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0289 | 40.9994 | 16697 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0026 | 41.9988 | 17104 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0155 | 42.9982 | 17511 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0016 | 44.0 | 17919 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0005 | 44.9994 | 18326 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0058 | 45.9988 | 18733 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0012 | 46.9982 | 19140 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.001 | 48.0 | 19548 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0016 | 48.9994 | 19955 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0015 | 49.9693 | 20350 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |

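A hedged sketch of the kind of `compute_metrics` function that produces these four metrics with the `evaluate` library is shown below; the weighted averaging mode is an assumption, not stated in the card.

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")
precision = evaluate.load("precision")
recall = evaluate.load("recall")

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair passed by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        **accuracy.compute(predictions=preds, references=labels),
        **f1.compute(predictions=preds, references=labels, average="weighted"),
        **precision.compute(predictions=preds, references=labels, average="weighted"),
        **recall.compute(predictions=preds, references=labels, average="weighted"),
    }
```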

### Framework versions

- Transformers 4.43.3
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1