PushkarA07 committed
Commit 22bfd5c · verified · 1 parent: dd9392a

End of training

Files changed (4)
  1. README.md +24 -248
  2. config.json +3 -1
  3. model.safetensors +2 -2
  4. training_args.bin +1 -1
README.md CHANGED
@@ -16,12 +16,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 # segformer-b0-finetuned-oldapp-oct-1
 
- This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the PushkarA07/batch1-tiles dataset.
+ This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the PushkarA07/oldapptiles5 dataset.
 It achieves the following results on the evaluation set:
- - Loss: -434.0362
- - Mean Iou: 0.0000
- - Mean Accuracy: 1.0
- - Overall Accuracy: 1.0
+ - Loss: 0.2630
+ - Mean Iou: 0.4984
+ - Mean Accuracy: 0.4989
+ - Overall Accuracy: 0.9967
+ - Accuracy Abnormality: 0.0
+ - Iou Abnormality: 0.0
 
 ## Model description
 
@@ -46,252 +48,26 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
- - num_epochs: 100
+ - num_epochs: 10
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy |
- |:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|
- | 9.1177 | 0.4167 | 10 | 3.1747 | 0.0000 | 1.0 | 1.0 |
- | -7.5948 | 0.8333 | 20 | 3.4239 | 0.0000 | 1.0 | 1.0 |
- | -36.4006 | 1.25 | 30 | 4.9370 | 0.0000 | 1.0 | 1.0 |
- | -68.5634 | 1.6667 | 40 | -11.8052 | 0.0000 | 1.0 | 1.0 |
- | -72.8737 | 2.0833 | 50 | -10.8982 | 0.0000 | 1.0 | 1.0 |
- | -60.664 | 2.5 | 60 | -46.7299 | 0.0000 | 1.0 | 1.0 |
- | -106.3157 | 2.9167 | 70 | -44.4029 | 0.0000 | 1.0 | 1.0 |
- | -95.8069 | 3.3333 | 80 | -61.6743 | 0.0000 | 1.0 | 1.0 |
- | -132.0845 | 3.75 | 90 | -57.1445 | 0.0000 | 1.0 | 1.0 |
- | -165.5214 | 4.1667 | 100 | -54.9286 | 0.0000 | 1.0 | 1.0 |
- | -160.2556 | 4.5833 | 110 | -94.0934 | 0.0000 | 1.0 | 1.0 |
- | -183.3402 | 5.0 | 120 | -61.7439 | 0.0000 | 1.0 | 1.0 |
- | -228.4443 | 5.4167 | 130 | -58.0895 | 0.0000 | 1.0 | 1.0 |
- | -190.4287 | 5.8333 | 140 | -109.6511 | 0.0000 | 1.0 | 1.0 |
- | -212.9903 | 6.25 | 150 | -102.2368 | 0.0000 | 1.0 | 1.0 |
- | -227.9513 | 6.6667 | 160 | -93.2920 | 0.0000 | 1.0 | 1.0 |
- | -225.2955 | 7.0833 | 170 | -86.0104 | 0.0000 | 1.0 | 1.0 |
- | -248.1907 | 7.5 | 180 | -110.2658 | 0.0000 | 1.0 | 1.0 |
- | -237.4253 | 7.9167 | 190 | -112.2953 | 0.0000 | 1.0 | 1.0 |
- | -249.7178 | 8.3333 | 200 | -144.2673 | 0.0000 | 1.0 | 1.0 |
- | -291.5307 | 8.75 | 210 | -117.0896 | 0.0000 | 1.0 | 1.0 |
- | -263.4248 | 9.1667 | 220 | -150.6111 | 0.0000 | 1.0 | 1.0 |
- | -290.6562 | 9.5833 | 230 | -186.4456 | 0.0000 | 1.0 | 1.0 |
- | -291.933 | 10.0 | 240 | -136.8444 | 0.0000 | 1.0 | 1.0 |
- | -330.3426 | 10.4167 | 250 | -134.6306 | 0.0000 | 1.0 | 1.0 |
- | -302.9307 | 10.8333 | 260 | -164.5891 | 0.0000 | 1.0 | 1.0 |
- | -303.2674 | 11.25 | 270 | -184.5806 | 0.0000 | 1.0 | 1.0 |
- | -334.0839 | 11.6667 | 280 | -94.3697 | 0.0000 | 1.0 | 1.0 |
- | -349.2104 | 12.0833 | 290 | -143.0347 | 0.0000 | 1.0 | 1.0 |
- | -367.3026 | 12.5 | 300 | -130.5752 | 0.0000 | 1.0 | 1.0 |
- | -376.3403 | 12.9167 | 310 | -135.2888 | 0.0000 | 1.0 | 1.0 |
- | -369.0916 | 13.3333 | 320 | -121.6049 | 0.0000 | 1.0 | 1.0 |
- | -380.9592 | 13.75 | 330 | -196.7207 | 0.0000 | 1.0 | 1.0 |
- | -379.2241 | 14.1667 | 340 | -150.6844 | 0.0000 | 1.0 | 1.0 |
- | -390.002 | 14.5833 | 350 | -213.3066 | 0.0000 | 1.0 | 1.0 |
- | -428.7926 | 15.0 | 360 | -129.6799 | 0.0000 | 1.0 | 1.0 |
- | -399.4067 | 15.4167 | 370 | -172.0849 | 0.0000 | 1.0 | 1.0 |
- | -425.9248 | 15.8333 | 380 | -85.1861 | 0.0000 | 1.0 | 1.0 |
- | -452.1733 | 16.25 | 390 | -116.3002 | 0.0000 | 1.0 | 1.0 |
- | -438.8081 | 16.6667 | 400 | -162.4811 | 0.0000 | 1.0 | 1.0 |
- | -442.2211 | 17.0833 | 410 | -310.0329 | 0.0000 | 1.0 | 1.0 |
- | -481.9792 | 17.5 | 420 | -208.4158 | 0.0000 | 1.0 | 1.0 |
- | -471.0272 | 17.9167 | 430 | -139.1131 | 0.0000 | 1.0 | 1.0 |
- | -480.2365 | 18.3333 | 440 | -138.3438 | 0.0000 | 1.0 | 1.0 |
- | -532.4087 | 18.75 | 450 | -145.5148 | 0.0000 | 1.0 | 1.0 |
- | -459.1265 | 19.1667 | 460 | -151.2145 | 0.0000 | 1.0 | 1.0 |
- | -495.9417 | 19.5833 | 470 | -290.5081 | 0.0000 | 1.0 | 1.0 |
- | -526.197 | 20.0 | 480 | -114.2300 | 0.0000 | 1.0 | 1.0 |
- | -567.6094 | 20.4167 | 490 | -199.9692 | 0.0000 | 1.0 | 1.0 |
- | -588.7917 | 20.8333 | 500 | -223.3900 | 0.0000 | 1.0 | 1.0 |
- | -522.3259 | 21.25 | 510 | -225.7774 | 0.0000 | 1.0 | 1.0 |
- | -538.589 | 21.6667 | 520 | -203.5701 | 0.0000 | 1.0 | 1.0 |
- | -570.6451 | 22.0833 | 530 | -170.6749 | 0.0000 | 1.0 | 1.0 |
- | -592.1448 | 22.5 | 540 | -227.9248 | 0.0000 | 1.0 | 1.0 |
- | -618.1144 | 22.9167 | 550 | -228.8316 | 0.0000 | 1.0 | 1.0 |
- | -678.1088 | 23.3333 | 560 | -226.7061 | 0.0000 | 1.0 | 1.0 |
- | -599.4714 | 23.75 | 570 | -275.0445 | 0.0000 | 1.0 | 1.0 |
- | -637.0996 | 24.1667 | 580 | -218.5857 | 0.0000 | 1.0 | 1.0 |
- | -642.3939 | 24.5833 | 590 | -314.6775 | 0.0000 | 1.0 | 1.0 |
- | -648.5231 | 25.0 | 600 | -163.4852 | 0.0000 | 1.0 | 1.0 |
- | -685.6893 | 25.4167 | 610 | -235.4895 | 0.0000 | 1.0 | 1.0 |
- | -698.37 | 25.8333 | 620 | -172.1634 | 0.0000 | 1.0 | 1.0 |
- | -765.9796 | 26.25 | 630 | -211.5028 | 0.0000 | 1.0 | 1.0 |
- | -674.8707 | 26.6667 | 640 | -261.2273 | 0.0000 | 1.0 | 1.0 |
- | -656.3912 | 27.0833 | 650 | -229.8386 | 0.0000 | 1.0 | 1.0 |
- | -678.0515 | 27.5 | 660 | -320.2002 | 0.0000 | 1.0 | 1.0 |
- | -718.6143 | 27.9167 | 670 | -305.6219 | 0.0000 | 1.0 | 1.0 |
- | -704.337 | 28.3333 | 680 | -233.4762 | 0.0000 | 1.0 | 1.0 |
- | -725.5703 | 28.75 | 690 | -264.2842 | 0.0000 | 1.0 | 1.0 |
- | -721.0368 | 29.1667 | 700 | -221.6762 | 0.0000 | 1.0 | 1.0 |
- | -715.1995 | 29.5833 | 710 | -270.3821 | 0.0000 | 1.0 | 1.0 |
- | -726.7212 | 30.0 | 720 | -366.0666 | 0.0000 | 1.0 | 1.0 |
- | -776.2849 | 30.4167 | 730 | -349.2112 | 0.0000 | 1.0 | 1.0 |
- | -775.3517 | 30.8333 | 740 | -338.3333 | 0.0000 | 1.0 | 1.0 |
- | -799.2989 | 31.25 | 750 | -336.1113 | 0.0000 | 1.0 | 1.0 |
- | -781.5917 | 31.6667 | 760 | -432.8610 | 0.0000 | 1.0 | 1.0 |
- | -782.3719 | 32.0833 | 770 | -392.7339 | 0.0000 | 1.0 | 1.0 |
- | -805.8129 | 32.5 | 780 | -374.5876 | 0.0000 | 1.0 | 1.0 |
- | -802.058 | 32.9167 | 790 | -253.8625 | 0.0000 | 1.0 | 1.0 |
- | -878.0176 | 33.3333 | 800 | -376.2746 | 0.0000 | 1.0 | 1.0 |
- | -824.186 | 33.75 | 810 | -493.8241 | 0.0000 | 1.0 | 1.0 |
- | -844.0652 | 34.1667 | 820 | -271.3152 | 0.0000 | 1.0 | 1.0 |
- | -838.5065 | 34.5833 | 830 | -374.8744 | 0.0000 | 1.0 | 1.0 |
- | -856.5721 | 35.0 | 840 | -618.2783 | 0.0000 | 1.0 | 1.0 |
- | -809.7913 | 35.4167 | 850 | -523.4037 | 0.0000 | 1.0 | 1.0 |
- | -840.6163 | 35.8333 | 860 | -318.4138 | 0.0000 | 1.0 | 1.0 |
- | -864.1099 | 36.25 | 870 | -225.3786 | 0.0000 | 1.0 | 1.0 |
- | -894.3154 | 36.6667 | 880 | -252.0861 | 0.0000 | 1.0 | 1.0 |
- | -868.4321 | 37.0833 | 890 | -285.0858 | 0.0000 | 1.0 | 1.0 |
- | -889.9147 | 37.5 | 900 | -393.6808 | 0.0000 | 1.0 | 1.0 |
- | -905.4691 | 37.9167 | 910 | -228.8913 | 0.0000 | 1.0 | 1.0 |
- | -928.3636 | 38.3333 | 920 | -267.1704 | 0.0000 | 1.0 | 1.0 |
- | -883.4459 | 38.75 | 930 | -350.1960 | 0.0000 | 1.0 | 1.0 |
- | -910.781 | 39.1667 | 940 | -372.5999 | 0.0000 | 1.0 | 1.0 |
- | -986.6632 | 39.5833 | 950 | -263.4685 | 0.0000 | 1.0 | 1.0 |
- | -942.8903 | 40.0 | 960 | -322.1251 | 0.0000 | 1.0 | 1.0 |
- | -933.802 | 40.4167 | 970 | -363.5819 | 0.0000 | 1.0 | 1.0 |
- | -936.968 | 40.8333 | 980 | -332.9326 | 0.0000 | 1.0 | 1.0 |
- | -881.9745 | 41.25 | 990 | -408.6155 | 0.0000 | 1.0 | 1.0 |
- | -989.938 | 41.6667 | 1000 | -276.6076 | 0.0000 | 1.0 | 1.0 |
- | -990.3528 | 42.0833 | 1010 | -436.2013 | 0.0000 | 1.0 | 1.0 |
- | -1010.2992 | 42.5 | 1020 | -730.1772 | 0.0000 | 1.0 | 1.0 |
- | -1015.8688 | 42.9167 | 1030 | -905.7499 | 0.0000 | 1.0 | 1.0 |
- | -1194.1057 | 43.3333 | 1040 | -550.5521 | 0.0000 | 1.0 | 1.0 |
- | -1046.6821 | 43.75 | 1050 | -424.3086 | 0.0000 | 1.0 | 1.0 |
- | -1042.28 | 44.1667 | 1060 | -589.9116 | 0.0000 | 1.0 | 1.0 |
- | -1064.3098 | 44.5833 | 1070 | -379.3090 | 0.0000 | 1.0 | 1.0 |
- | -1035.7889 | 45.0 | 1080 | -394.1577 | 0.0000 | 1.0 | 1.0 |
- | -1016.9705 | 45.4167 | 1090 | -413.3618 | 0.0000 | 1.0 | 1.0 |
- | -1050.0835 | 45.8333 | 1100 | -284.9713 | 0.0000 | 1.0 | 1.0 |
- | -1101.0037 | 46.25 | 1110 | -319.4436 | 0.0000 | 1.0 | 1.0 |
- | -1074.5059 | 46.6667 | 1120 | -309.4717 | 0.0000 | 1.0 | 1.0 |
- | -1116.697 | 47.0833 | 1130 | -285.6954 | 0.0000 | 1.0 | 1.0 |
- | -1033.0059 | 47.5 | 1140 | -413.5283 | 0.0000 | 1.0 | 1.0 |
- | -1107.1597 | 47.9167 | 1150 | -345.1506 | 0.0000 | 1.0 | 1.0 |
- | -1129.9175 | 48.3333 | 1160 | -395.6399 | 0.0000 | 1.0 | 1.0 |
- | -1084.7349 | 48.75 | 1170 | -494.9291 | 0.0000 | 1.0 | 1.0 |
- | -1155.866 | 49.1667 | 1180 | -699.7101 | 0.0000 | 1.0 | 1.0 |
- | -1142.4673 | 49.5833 | 1190 | -588.2758 | 0.0000 | 1.0 | 1.0 |
- | -1112.2432 | 50.0 | 1200 | -662.2170 | 0.0000 | 1.0 | 1.0 |
- | -1143.2042 | 50.4167 | 1210 | -588.6242 | 0.0000 | 1.0 | 1.0 |
- | -1118.0652 | 50.8333 | 1220 | -633.6115 | 0.0000 | 1.0 | 1.0 |
- | -1172.4246 | 51.25 | 1230 | -446.0499 | 0.0000 | 1.0 | 1.0 |
- | -1144.6307 | 51.6667 | 1240 | -384.4327 | 0.0000 | 1.0 | 1.0 |
- | -1110.9465 | 52.0833 | 1250 | -580.2764 | 0.0000 | 1.0 | 1.0 |
- | -1253.3848 | 52.5 | 1260 | -711.7065 | 0.0000 | 1.0 | 1.0 |
- | -1146.2629 | 52.9167 | 1270 | -588.7798 | 0.0000 | 1.0 | 1.0 |
- | -1210.6774 | 53.3333 | 1280 | -437.9136 | 0.0000 | 1.0 | 1.0 |
- | -1195.0886 | 53.75 | 1290 | -311.9930 | 0.0000 | 1.0 | 1.0 |
- | -1196.9758 | 54.1667 | 1300 | -433.6264 | 0.0000 | 1.0 | 1.0 |
- | -1240.3611 | 54.5833 | 1310 | -346.8630 | 0.0000 | 1.0 | 1.0 |
- | -1244.5226 | 55.0 | 1320 | -312.5194 | 0.0000 | 1.0 | 1.0 |
- | -1183.843 | 55.4167 | 1330 | -524.9532 | 0.0000 | 1.0 | 1.0 |
- | -1198.5249 | 55.8333 | 1340 | -435.6567 | 0.0000 | 1.0 | 1.0 |
- | -1209.6641 | 56.25 | 1350 | -315.1211 | 0.0000 | 1.0 | 1.0 |
- | -1176.8691 | 56.6667 | 1360 | -354.5521 | 0.0000 | 1.0 | 1.0 |
- | -1225.2603 | 57.0833 | 1370 | -344.3314 | 0.0000 | 1.0 | 1.0 |
- | -1337.3123 | 57.5 | 1380 | -303.1938 | 0.0000 | 1.0 | 1.0 |
- | -1177.5667 | 57.9167 | 1390 | -387.6610 | 0.0000 | 1.0 | 1.0 |
- | -1255.4438 | 58.3333 | 1400 | -408.7074 | 0.0000 | 1.0 | 1.0 |
- | -1285.6516 | 58.75 | 1410 | -386.6487 | 0.0000 | 1.0 | 1.0 |
- | -1415.5852 | 59.1667 | 1420 | -591.8176 | 0.0000 | 1.0 | 1.0 |
- | -1242.7576 | 59.5833 | 1430 | -366.7484 | 0.0000 | 1.0 | 1.0 |
- | -1355.8105 | 60.0 | 1440 | -742.4871 | 0.0000 | 1.0 | 1.0 |
- | -1307.8296 | 60.4167 | 1450 | -584.0139 | 0.0000 | 1.0 | 1.0 |
- | -1354.4441 | 60.8333 | 1460 | -381.6559 | 0.0000 | 1.0 | 1.0 |
- | -1491.1709 | 61.25 | 1470 | -571.5823 | 0.0000 | 1.0 | 1.0 |
- | -1325.4679 | 61.6667 | 1480 | -835.5530 | 0.0000 | 1.0 | 1.0 |
- | -1347.5428 | 62.0833 | 1490 | -374.3561 | 0.0000 | 1.0 | 1.0 |
- | -1316.9006 | 62.5 | 1500 | -353.7957 | 0.0000 | 1.0 | 1.0 |
- | -1300.4648 | 62.9167 | 1510 | -792.2424 | 0.0000 | 1.0 | 1.0 |
- | -1317.0828 | 63.3333 | 1520 | -473.5695 | 0.0000 | 1.0 | 1.0 |
- | -1397.0754 | 63.75 | 1530 | -379.3622 | 0.0000 | 1.0 | 1.0 |
- | -1286.0005 | 64.1667 | 1540 | -334.6458 | 0.0000 | 1.0 | 1.0 |
- | -1431.1766 | 64.5833 | 1550 | -585.5664 | 0.0000 | 1.0 | 1.0 |
- | -1330.3528 | 65.0 | 1560 | -500.5868 | 0.0000 | 1.0 | 1.0 |
- | -1468.8844 | 65.4167 | 1570 | -462.8506 | 0.0000 | 1.0 | 1.0 |
- | -1352.9685 | 65.8333 | 1580 | -414.3020 | 0.0000 | 1.0 | 1.0 |
- | -1533.5442 | 66.25 | 1590 | -751.7989 | 0.0000 | 1.0 | 1.0 |
- | -1425.9828 | 66.6667 | 1600 | -438.8446 | 0.0000 | 1.0 | 1.0 |
- | -1409.2983 | 67.0833 | 1610 | -527.6069 | 0.0000 | 1.0 | 1.0 |
- | -1390.3611 | 67.5 | 1620 | -780.3410 | 0.0000 | 1.0 | 1.0 |
- | -1477.4402 | 67.9167 | 1630 | -471.2637 | 0.0000 | 1.0 | 1.0 |
- | -1374.2109 | 68.3333 | 1640 | -441.2887 | 0.0000 | 1.0 | 1.0 |
- | -1385.1523 | 68.75 | 1650 | -444.8388 | 0.0000 | 1.0 | 1.0 |
- | -1394.7666 | 69.1667 | 1660 | -333.1129 | 0.0000 | 1.0 | 1.0 |
- | -1404.7578 | 69.5833 | 1670 | -708.2736 | 0.0000 | 1.0 | 1.0 |
- | -1405.9553 | 70.0 | 1680 | -417.8641 | 0.0000 | 1.0 | 1.0 |
- | -1439.1917 | 70.4167 | 1690 | -881.1912 | 0.0000 | 1.0 | 1.0 |
- | -1432.6394 | 70.8333 | 1700 | -682.9276 | 0.0000 | 1.0 | 1.0 |
- | -1426.861 | 71.25 | 1710 | -348.2993 | 0.0000 | 1.0 | 1.0 |
- | -1397.6794 | 71.6667 | 1720 | -613.1935 | 0.0000 | 1.0 | 1.0 |
- | -1397.5701 | 72.0833 | 1730 | -697.1166 | 0.0000 | 1.0 | 1.0 |
- | -1388.8636 | 72.5 | 1740 | -492.6042 | 0.0000 | 1.0 | 1.0 |
- | -1368.7869 | 72.9167 | 1750 | -606.6381 | 0.0000 | 1.0 | 1.0 |
- | -1464.3013 | 73.3333 | 1760 | -545.0535 | 0.0000 | 1.0 | 1.0 |
- | -1481.2261 | 73.75 | 1770 | -456.0034 | 0.0000 | 1.0 | 1.0 |
- | -1457.8424 | 74.1667 | 1780 | -573.8081 | 0.0000 | 1.0 | 1.0 |
- | -1417.1211 | 74.5833 | 1790 | -703.7387 | 0.0000 | 1.0 | 1.0 |
- | -1515.972 | 75.0 | 1800 | -390.8029 | 0.0000 | 1.0 | 1.0 |
- | -1504.0477 | 75.4167 | 1810 | -434.6499 | 0.0000 | 1.0 | 1.0 |
- | -1409.7925 | 75.8333 | 1820 | -340.4917 | 0.0000 | 1.0 | 1.0 |
- | -1488.1987 | 76.25 | 1830 | -526.0972 | 0.0000 | 1.0 | 1.0 |
- | -1423.2494 | 76.6667 | 1840 | -342.7698 | 0.0000 | 1.0 | 1.0 |
- | -1412.7407 | 77.0833 | 1850 | -409.7189 | 0.0000 | 1.0 | 1.0 |
- | -1603.1936 | 77.5 | 1860 | -375.4607 | 0.0000 | 1.0 | 1.0 |
- | -1446.4662 | 77.9167 | 1870 | -349.1838 | 0.0000 | 1.0 | 1.0 |
- | -1452.2996 | 78.3333 | 1880 | -382.6560 | 0.0000 | 1.0 | 1.0 |
- | -1512.5234 | 78.75 | 1890 | -526.1674 | 0.0000 | 1.0 | 1.0 |
- | -1532.6047 | 79.1667 | 1900 | -487.4632 | 0.0000 | 1.0 | 1.0 |
- | -1450.5111 | 79.5833 | 1910 | -436.1980 | 0.0000 | 1.0 | 1.0 |
- | -1596.5342 | 80.0 | 1920 | -710.6350 | 0.0000 | 1.0 | 1.0 |
- | -1516.173 | 80.4167 | 1930 | -467.0529 | 0.0000 | 1.0 | 1.0 |
- | -1543.4387 | 80.8333 | 1940 | -379.2095 | 0.0000 | 1.0 | 1.0 |
- | -1546.9175 | 81.25 | 1950 | -713.6882 | 0.0000 | 1.0 | 1.0 |
- | -1482.7693 | 81.6667 | 1960 | -826.5051 | 0.0000 | 1.0 | 1.0 |
- | -1489.7949 | 82.0833 | 1970 | -473.3296 | 0.0000 | 1.0 | 1.0 |
- | -1557.4614 | 82.5 | 1980 | -642.0671 | 0.0000 | 1.0 | 1.0 |
- | -1510.1143 | 82.9167 | 1990 | -851.6248 | 0.0000 | 1.0 | 1.0 |
- | -1579.5544 | 83.3333 | 2000 | -816.4890 | 0.0000 | 1.0 | 1.0 |
- | -1533.7808 | 83.75 | 2010 | -411.4224 | 0.0000 | 1.0 | 1.0 |
- | -1642.6189 | 84.1667 | 2020 | -432.7599 | 0.0000 | 1.0 | 1.0 |
- | -1494.0178 | 84.5833 | 2030 | -879.1949 | 0.0000 | 1.0 | 1.0 |
- | -1798.853 | 85.0 | 2040 | -907.0826 | 0.0000 | 1.0 | 1.0 |
- | -1553.5759 | 85.4167 | 2050 | -844.8438 | 0.0000 | 1.0 | 1.0 |
- | -1535.3077 | 85.8333 | 2060 | -687.7254 | 0.0000 | 1.0 | 1.0 |
- | -1646.7931 | 86.25 | 2070 | -411.8168 | 0.0000 | 1.0 | 1.0 |
- | -1491.8165 | 86.6667 | 2080 | -715.3487 | 0.0000 | 1.0 | 1.0 |
- | -1521.231 | 87.0833 | 2090 | -843.9893 | 0.0000 | 1.0 | 1.0 |
- | -1623.2894 | 87.5 | 2100 | -644.5621 | 0.0000 | 1.0 | 1.0 |
- | -1580.976 | 87.9167 | 2110 | -395.0671 | 0.0000 | 1.0 | 1.0 |
- | -1613.3984 | 88.3333 | 2120 | -425.4142 | 0.0000 | 1.0 | 1.0 |
- | -1558.4185 | 88.75 | 2130 | -677.3099 | 0.0000 | 1.0 | 1.0 |
- | -1533.4421 | 89.1667 | 2140 | -555.4234 | 0.0000 | 1.0 | 1.0 |
- | -1503.1881 | 89.5833 | 2150 | -479.6435 | 0.0000 | 1.0 | 1.0 |
- | -1616.7373 | 90.0 | 2160 | -552.3852 | 0.0000 | 1.0 | 1.0 |
- | -1653.3994 | 90.4167 | 2170 | -554.8867 | 0.0000 | 1.0 | 1.0 |
- | -1617.019 | 90.8333 | 2180 | -452.9881 | 0.0000 | 1.0 | 1.0 |
- | -1484.0178 | 91.25 | 2190 | -368.5315 | 0.0000 | 1.0 | 1.0 |
- | -1613.8223 | 91.6667 | 2200 | -681.4908 | 0.0000 | 1.0 | 1.0 |
- | -1636.2089 | 92.0833 | 2210 | -401.3724 | 0.0000 | 1.0 | 1.0 |
- | -1605.0856 | 92.5 | 2220 | -449.5143 | 0.0000 | 1.0 | 1.0 |
- | -1508.0002 | 92.9167 | 2230 | -444.6108 | 0.0000 | 1.0 | 1.0 |
- | -1596.582 | 93.3333 | 2240 | -350.9911 | 0.0000 | 1.0 | 1.0 |
- | -1807.7092 | 93.75 | 2250 | -374.4726 | 0.0000 | 1.0 | 1.0 |
- | -1672.6074 | 94.1667 | 2260 | -328.1292 | 0.0000 | 1.0 | 1.0 |
- | -1533.2402 | 94.5833 | 2270 | -544.1064 | 0.0000 | 1.0 | 1.0 |
- | -1594.6263 | 95.0 | 2280 | -409.3813 | 0.0000 | 1.0 | 1.0 |
- | -1614.0004 | 95.4167 | 2290 | -417.8049 | 0.0000 | 1.0 | 1.0 |
- | -1626.0732 | 95.8333 | 2300 | -459.6439 | 0.0000 | 1.0 | 1.0 |
- | -1620.7437 | 96.25 | 2310 | -363.0363 | 0.0000 | 1.0 | 1.0 |
- | -1556.9773 | 96.6667 | 2320 | -424.9406 | 0.0000 | 1.0 | 1.0 |
- | -1570.3413 | 97.0833 | 2330 | -349.0226 | 0.0000 | 1.0 | 1.0 |
- | -1577.0763 | 97.5 | 2340 | -363.4831 | 0.0000 | 1.0 | 1.0 |
- | -1790.7856 | 97.9167 | 2350 | -440.4658 | 0.0000 | 1.0 | 1.0 |
- | -1582.803 | 98.3333 | 2360 | -937.8145 | 0.0000 | 1.0 | 1.0 |
- | -1558.3069 | 98.75 | 2370 | -881.9451 | 0.0000 | 1.0 | 1.0 |
- | -1572.5793 | 99.1667 | 2380 | -814.5864 | 0.0000 | 1.0 | 1.0 |
- | -1536.8755 | 99.5833 | 2390 | -563.1765 | 0.0000 | 1.0 | 1.0 |
- | -1548.0015 | 100.0 | 2400 | -434.0362 | 0.0000 | 1.0 | 1.0 |
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
+ |:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------------:|:---------------:|
+ | 0.6912 | 0.7143 | 10 | 0.6575 | 0.4714 | 0.4784 | 0.9426 | 0.0133 | 0.0002 |
+ | 0.5802 | 1.4286 | 20 | 0.5878 | 0.4942 | 0.4947 | 0.9884 | 0.0 | 0.0 |
+ | 0.5498 | 2.1429 | 30 | 0.4770 | 0.4984 | 0.4989 | 0.9968 | 0.0 | 0.0 |
+ | 0.6084 | 2.8571 | 40 | 0.4125 | 0.4971 | 0.4976 | 0.9941 | 0.0 | 0.0 |
+ | 0.4675 | 3.5714 | 50 | 0.4355 | 0.4885 | 0.4992 | 0.9761 | 0.0213 | 0.0009 |
+ | 0.3863 | 4.2857 | 60 | 0.3699 | 0.4965 | 0.5005 | 0.9920 | 0.0081 | 0.0010 |
+ | 0.3954 | 5.0 | 70 | 0.3401 | 0.4983 | 0.4989 | 0.9967 | 0.0 | 0.0 |
+ | 0.3286 | 5.7143 | 80 | 0.3279 | 0.4983 | 0.4988 | 0.9967 | 0.0 | 0.0 |
+ | 0.3458 | 6.4286 | 90 | 0.2908 | 0.4974 | 0.4979 | 0.9948 | 0.0 | 0.0 |
+ | 0.3559 | 7.1429 | 100 | 0.2693 | 0.4989 | 0.4994 | 0.9978 | 0.0 | 0.0 |
+ | 0.3196 | 7.8571 | 110 | 0.2596 | 0.4977 | 0.4982 | 0.9954 | 0.0 | 0.0 |
+ | 0.3109 | 8.5714 | 120 | 0.2915 | 0.4958 | 0.4963 | 0.9916 | 0.0 | 0.0 |
+ | 0.2711 | 9.2857 | 130 | 0.2720 | 0.4991 | 0.4997 | 0.9983 | 0.0 | 0.0 |
+ | 0.3051 | 10.0 | 140 | 0.2630 | 0.4984 | 0.4989 | 0.9967 | 0.0 | 0.0 |
 
 
 ### Framework versions
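
The new evaluation numbers are internally consistent for a segmenter that predicts the majority "normal" class almost everywhere: Overall Accuracy tracks the background fraction, while Accuracy Abnormality and Iou Abnormality sit at 0.0 and Mean Iou hovers near half the normal-class IoU. A minimal sketch of that arithmetic (plain Python, hypothetical 1-D masks — not the repository's evaluation code):

```python
# Why an "always predict normal" segmenter scores Overall Accuracy ~0.99
# while Iou Abnormality stays 0.0. Labels: 0 = normal, 1 = abnormality.

def per_class_iou(preds, labels, num_classes=2):
    """Intersection-over-union per class (0.0 when the class never appears in the union)."""
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, l in zip(preds, labels) if p == c and l == c)
        union = sum(1 for p, l in zip(preds, labels) if p == c or l == c)
        ious.append(inter / union if union else 0.0)
    return ious

# Hypothetical tile: 1000 pixels, 10 abnormal; the model predicts all-normal.
labels = [0] * 990 + [1] * 10
preds = [0] * 1000

iou_normal, iou_abnormal = per_class_iou(preds, labels)
overall_acc = sum(p == l for p, l in zip(preds, labels)) / len(labels)
mean_iou = (iou_normal + iou_abnormal) / 2

print(iou_normal)    # 0.99 — normal-class IoU dominated by background
print(iou_abnormal)  # 0.0  — abnormality is never predicted
print(overall_acc)   # 0.99 — high despite missing every abnormal pixel
print(mean_iou)      # 0.495 — roughly half the normal IoU, as in the card
```

With a 0.33% abnormal-pixel fraction this reproduces the card's pattern (Mean Iou ≈ 0.498, Overall Accuracy ≈ 0.9967, abnormality metrics 0.0), which suggests the reported scores reflect severe class imbalance rather than learned abnormality detection.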
config.json CHANGED
@@ -28,12 +28,14 @@
     256
   ],
   "id2label": {
+    "0": "normal",
     "1": "abnormality"
   },
   "image_size": 224,
   "initializer_range": 0.02,
   "label2id": {
-    "abnormality": "1"
+    "abnormality": 1,
+    "normal": 0
   },
   "layer_norm_eps": 1e-06,
   "mlp_ratios": [
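
The config change adds the missing "normal" class and makes `label2id` the integer-valued inverse of `id2label`. A quick consistency check of the corrected fragment, using only the JSON shown in the diff (plain `json`, no transformers dependency assumed):

```python
import json

# The corrected mapping fragment from this commit's config.json.
config_fragment = json.loads("""
{
  "id2label": {"0": "normal", "1": "abnormality"},
  "label2id": {"abnormality": 1, "normal": 0}
}
""")

# JSON object keys are strings; cast id2label keys back to ints.
id2label = {int(k): v for k, v in config_fragment["id2label"].items()}
label2id = config_fragment["label2id"]

# The two maps must be exact inverses, or downstream tools that map
# predicted class indices to names will disagree with the loss labels.
assert label2id == {name: idx for idx, name in id2label.items()}
print(id2label)  # {0: 'normal', 1: 'abnormality'}
```

The pre-commit config would fail this check twice: `id2label` had no entry for class 0, and `label2id` mapped "abnormality" to the string "1" rather than the integer 1.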
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:310372825b6fe4078ce266ac325f2a35da652780f2765fec5d6cead08340f946
- size 14883748
+ oid sha256:61221ef83e5a5fa192c0256567a785dadbf7706d8891f1967fe91589366a4990
+ size 14884776
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:f3da6cde2f6c0720725812758eab66008339f7db1d9623a4d0a4a1d71354b37f
+ oid sha256:3ede39d16b21c4d141e34f15da15cc0ff8c9858a703ccd0dc4fc6ea95a7a0a15
 size 5304