Augusto777 committed on
Commit
e06f9cc
1 Parent(s): 616d069

Model save

README.md ADDED
@@ -0,0 +1,153 @@
---
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-base-patch16-224-dmae-va-U5-100-iN
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-dmae-va-U5-100-iN

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9858
- Accuracy: 0.7833

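A checkpoint fine-tuned this way can be loaded for inference as a standard `ViTForImageClassification`. A minimal sketch, assuming the checkpoint is hosted under the hypothetical hub id `Augusto777/vit-base-patch16-224-dmae-va-U5-100-iN` and that `scan.png` is a placeholder input image:

```python
import torch
from PIL import Image
from transformers import ViTForImageClassification, ViTImageProcessor

# Hypothetical hub id -- adjust to wherever this checkpoint is actually hosted.
repo_id = "Augusto777/vit-base-patch16-224-dmae-va-U5-100-iN"
processor = ViTImageProcessor.from_pretrained(repo_id)
model = ViTForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("scan.png").convert("RGB")  # placeholder input file
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Turn raw logits into a predicted label and a confidence score.
probs = logits.softmax(dim=-1)[0]
pred = int(probs.argmax())
print(model.config.id2label[pred], float(probs[pred]))
```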
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 100

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.9   | 7    | 1.3812          | 0.45     |
| 1.3848        | 1.94  | 15   | 1.3606          | 0.5      |
| 1.3686        | 2.97  | 23   | 1.3075          | 0.5333   |
| 1.2965        | 4.0   | 31   | 1.2370          | 0.4667   |
| 1.2965        | 4.9   | 38   | 1.1168          | 0.5333   |
| 1.1753        | 5.94  | 46   | 1.0310          | 0.5667   |
| 1.0294        | 6.97  | 54   | 0.9316          | 0.6      |
| 0.902         | 8.0   | 62   | 0.8728          | 0.6833   |
| 0.902         | 8.9   | 69   | 0.8129          | 0.7667   |
| 0.7812        | 9.94  | 77   | 0.7006          | 0.8      |
| 0.6419        | 10.97 | 85   | 0.6381          | 0.8667   |
| 0.5109        | 12.0  | 93   | 0.6327          | 0.8167   |
| 0.3838        | 12.9  | 100  | 0.5442          | 0.8667   |
| 0.3838        | 13.94 | 108  | 0.6755          | 0.75     |
| 0.285         | 14.97 | 116  | 0.7756          | 0.7167   |
| 0.2672        | 16.0  | 124  | 0.8107          | 0.7167   |
| 0.2466        | 16.9  | 131  | 0.5219          | 0.8333   |
| 0.2466        | 17.94 | 139  | 0.7041          | 0.7833   |
| 0.2312        | 18.97 | 147  | 0.7879          | 0.75     |
| 0.1933        | 20.0  | 155  | 0.7090          | 0.8      |
| 0.1692        | 20.9  | 162  | 0.5395          | 0.8333   |
| 0.1578        | 21.94 | 170  | 0.6419          | 0.8167   |
| 0.1578        | 22.97 | 178  | 0.5736          | 0.8333   |
| 0.1321        | 24.0  | 186  | 0.7471          | 0.75     |
| 0.1114        | 24.9  | 193  | 0.6447          | 0.7667   |
| 0.1385        | 25.94 | 201  | 0.6158          | 0.8167   |
| 0.1385        | 26.97 | 209  | 0.6467          | 0.8      |
| 0.1136        | 28.0  | 217  | 0.6180          | 0.85     |
| 0.0997        | 28.9  | 224  | 0.8578          | 0.75     |
| 0.1064        | 29.94 | 232  | 0.6778          | 0.8167   |
| 0.0775        | 30.97 | 240  | 0.8124          | 0.8      |
| 0.0775        | 32.0  | 248  | 0.7783          | 0.8      |
| 0.0921        | 32.9  | 255  | 0.8320          | 0.7333   |
| 0.0919        | 33.94 | 263  | 0.8310          | 0.7833   |
| 0.0888        | 34.97 | 271  | 0.6576          | 0.85     |
| 0.0888        | 36.0  | 279  | 0.7044          | 0.8333   |
| 0.0693        | 36.9  | 286  | 0.7608          | 0.8167   |
| 0.061         | 37.94 | 294  | 0.7802          | 0.8      |
| 0.0699        | 38.97 | 302  | 0.7762          | 0.8167   |
| 0.0652        | 40.0  | 310  | 0.7579          | 0.8      |
| 0.0652        | 40.9  | 317  | 0.9985          | 0.75     |
| 0.0562        | 41.94 | 325  | 0.8027          | 0.8167   |
| 0.0534        | 42.97 | 333  | 0.9705          | 0.7833   |
| 0.0519        | 44.0  | 341  | 0.7301          | 0.8333   |
| 0.0519        | 44.9  | 348  | 0.8433          | 0.8      |
| 0.0529        | 45.94 | 356  | 0.8534          | 0.8      |
| 0.0772        | 46.97 | 364  | 0.8562          | 0.8      |
| 0.0644        | 48.0  | 372  | 0.8419          | 0.8      |
| 0.0644        | 48.9  | 379  | 1.1251          | 0.7667   |
| 0.0467        | 49.94 | 387  | 0.7537          | 0.8333   |
| 0.0576        | 50.97 | 395  | 0.7517          | 0.8333   |
| 0.0344        | 52.0  | 403  | 0.8343          | 0.8      |
| 0.0663        | 52.9  | 410  | 0.7636          | 0.8      |
| 0.0663        | 53.94 | 418  | 0.8253          | 0.8167   |
| 0.0353        | 54.97 | 426  | 0.9348          | 0.8      |
| 0.0524        | 56.0  | 434  | 0.8217          | 0.8167   |
| 0.0479        | 56.9  | 441  | 0.7586          | 0.8167   |
| 0.0479        | 57.94 | 449  | 0.8147          | 0.8      |
| 0.0595        | 58.97 | 457  | 1.0000          | 0.7833   |
| 0.0475        | 60.0  | 465  | 0.9291          | 0.7833   |
| 0.049         | 60.9  | 472  | 0.9588          | 0.7833   |
| 0.0398        | 61.94 | 480  | 0.9501          | 0.8      |
| 0.0398        | 62.97 | 488  | 0.9499          | 0.8      |
| 0.0496        | 64.0  | 496  | 0.9279          | 0.8      |
| 0.0354        | 64.9  | 503  | 0.9677          | 0.75     |
| 0.0325        | 65.94 | 511  | 0.8371          | 0.8333   |
| 0.0325        | 66.97 | 519  | 0.9683          | 0.8      |
| 0.0335        | 68.0  | 527  | 1.0455          | 0.7833   |
| 0.0375        | 68.9  | 534  | 0.9027          | 0.8167   |
| 0.0424        | 69.94 | 542  | 0.8043          | 0.85     |
| 0.0383        | 70.97 | 550  | 0.9035          | 0.7833   |
| 0.0383        | 72.0  | 558  | 0.9360          | 0.7833   |
| 0.0295        | 72.9  | 565  | 0.9841          | 0.7833   |
| 0.0307        | 73.94 | 573  | 0.9300          | 0.8      |
| 0.0376        | 74.97 | 581  | 0.9630          | 0.7833   |
| 0.0376        | 76.0  | 589  | 0.9777          | 0.7833   |
| 0.0259        | 76.9  | 596  | 0.9323          | 0.8      |
| 0.0345        | 77.94 | 604  | 0.9075          | 0.8      |
| 0.0346        | 78.97 | 612  | 0.8951          | 0.8      |
| 0.0319        | 80.0  | 620  | 0.9676          | 0.8      |
| 0.0319        | 80.9  | 627  | 0.9884          | 0.8      |
| 0.0226        | 81.94 | 635  | 0.9851          | 0.7833   |
| 0.033         | 82.97 | 643  | 0.9710          | 0.7833   |
| 0.0262        | 84.0  | 651  | 0.9851          | 0.7833   |
| 0.0262        | 84.9  | 658  | 0.9868          | 0.7833   |
| 0.0345        | 85.94 | 666  | 0.9702          | 0.7833   |
| 0.0299        | 86.97 | 674  | 0.9889          | 0.7833   |
| 0.0347        | 88.0  | 682  | 1.0003          | 0.7833   |
| 0.0347        | 88.9  | 689  | 0.9913          | 0.7833   |
| 0.0288        | 89.94 | 697  | 0.9859          | 0.7833   |
| 0.0198        | 90.32 | 700  | 0.9858          | 0.7833   |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0
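An environment close to the one above can be recreated by pinning these versions. This is a sketch: the CUDA 11.8 wheel index for PyTorch is an assumption inferred from the `+cu118` build tag, not something stated in the card.

```shell
# Pin the library versions reported under "Framework versions".
pip install transformers==4.36.2 datasets==2.16.1 tokenizers==0.15.0
# The cu118 index is assumed from the "2.1.2+cu118" build tag.
pip install torch==2.1.2 --index-url https://download.pytorch.org/whl/cu118
```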
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e9ec76dcd225b4317424ae013f2a05cdbd3ab12c613c91ef9caa412f334214dd
+oid sha256:80c8beceb2965d2c29daa3e99b61a92d795d92b3c1190a2259e67e9b0b1b89e4
 size 343230128
runs/Apr26_08-18-02_DESKTOP-SKBE9FB/events.out.tfevents.1714141084.DESKTOP-SKBE9FB.7308.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:88c5eb34adec62864c23d1921ed95b6a3e25d54631fb9f5a6d3edfd0c1f7611d
-size 44772
+oid sha256:e14bf4f32df12dafef316c16eb7bf7ce1dcfbc1a023762c887b2eb14a75c8d55
+size 45126