joe611 committed on
Commit
60d5b23
·
verified ·
1 Parent(s): e5bbcf9

End of training

Files changed (1)
  1. README.md +67 -124
README.md CHANGED
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.1949
+ - Loss: 0.1692
 
  ## Model description
 
@@ -41,132 +41,75 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: cosine
- - num_epochs: 120
+ - training_steps: 1440
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss |
- |:-------------:|:-----:|:----:|:---------------:|
- | No log | 1.0 | 12 | 1.3837 |
- | No log | 2.0 | 24 | 0.9192 |
- | 1.2199 | 3.0 | 36 | 0.9478 |
- | 1.2199 | 4.0 | 48 | 0.8188 |
- | 0.9301 | 5.0 | 60 | 0.8648 |
- | 0.9301 | 6.0 | 72 | 0.7913 |
- | 0.9301 | 7.0 | 84 | 0.8269 |
- | 0.834 | 8.0 | 96 | 0.7546 |
- | 0.834 | 9.0 | 108 | 0.7128 |
- | 0.7676 | 10.0 | 120 | 0.6706 |
- | 0.7676 | 11.0 | 132 | 0.6042 |
- | 0.7676 | 12.0 | 144 | 0.5586 |
- | 0.6807 | 13.0 | 156 | 0.5129 |
- | 0.6807 | 14.0 | 168 | 0.4815 |
- | 0.5846 | 15.0 | 180 | 0.4724 |
- | 0.5846 | 16.0 | 192 | 0.4970 |
- | 0.5846 | 17.0 | 204 | 0.4900 |
- | 0.5437 | 18.0 | 216 | 0.4985 |
- | 0.5437 | 19.0 | 228 | 0.6295 |
- | 0.5232 | 20.0 | 240 | 0.5023 |
- | 0.5232 | 21.0 | 252 | 0.4312 |
- | 0.5232 | 22.0 | 264 | 0.4583 |
- | 0.5147 | 23.0 | 276 | 0.4499 |
- | 0.5147 | 24.0 | 288 | 0.3438 |
- | 0.4613 | 25.0 | 300 | 0.3953 |
- | 0.4613 | 26.0 | 312 | 0.3916 |
- | 0.4613 | 27.0 | 324 | 0.4285 |
- | 0.4288 | 28.0 | 336 | 0.3532 |
- | 0.4288 | 29.0 | 348 | 0.3513 |
- | 0.4251 | 30.0 | 360 | 0.3761 |
- | 0.4251 | 31.0 | 372 | 0.3183 |
- | 0.4251 | 32.0 | 384 | 0.3419 |
- | 0.3963 | 33.0 | 396 | 0.3186 |
- | 0.3963 | 34.0 | 408 | 0.2799 |
- | 0.3684 | 35.0 | 420 | 0.3688 |
- | 0.3684 | 36.0 | 432 | 0.4035 |
- | 0.3684 | 37.0 | 444 | 0.3491 |
- | 0.4062 | 38.0 | 456 | 0.3147 |
- | 0.4062 | 39.0 | 468 | 0.3333 |
- | 0.3745 | 40.0 | 480 | 0.2822 |
- | 0.3745 | 41.0 | 492 | 0.2734 |
- | 0.3745 | 42.0 | 504 | 0.2816 |
- | 0.3461 | 43.0 | 516 | 0.3289 |
- | 0.3461 | 44.0 | 528 | 0.3538 |
- | 0.3707 | 45.0 | 540 | 0.2969 |
- | 0.3707 | 46.0 | 552 | 0.3335 |
- | 0.3707 | 47.0 | 564 | 0.3201 |
- | 0.3906 | 48.0 | 576 | 0.3262 |
- | 0.3906 | 49.0 | 588 | 0.3213 |
- | 0.3622 | 50.0 | 600 | 0.2825 |
- | 0.3622 | 51.0 | 612 | 0.3111 |
- | 0.3622 | 52.0 | 624 | 0.2814 |
- | 0.336 | 53.0 | 636 | 0.3242 |
- | 0.336 | 54.0 | 648 | 0.2615 |
- | 0.3326 | 55.0 | 660 | 0.3107 |
- | 0.3326 | 56.0 | 672 | 0.2904 |
- | 0.3326 | 57.0 | 684 | 0.2967 |
- | 0.3407 | 58.0 | 696 | 0.2818 |
- | 0.3407 | 59.0 | 708 | 0.2759 |
- | 0.3467 | 60.0 | 720 | 0.2862 |
- | 0.3467 | 61.0 | 732 | 0.3529 |
- | 0.3467 | 62.0 | 744 | 0.3559 |
- | 0.354 | 63.0 | 756 | 0.2403 |
- | 0.354 | 64.0 | 768 | 0.2815 |
- | 0.3237 | 65.0 | 780 | 0.2819 |
- | 0.3237 | 66.0 | 792 | 0.2476 |
- | 0.3237 | 67.0 | 804 | 0.3193 |
- | 0.3182 | 68.0 | 816 | 0.2444 |
- | 0.3182 | 69.0 | 828 | 0.2510 |
- | 0.3151 | 70.0 | 840 | 0.2951 |
- | 0.3151 | 71.0 | 852 | 0.2389 |
- | 0.3151 | 72.0 | 864 | 0.2657 |
- | 0.3173 | 73.0 | 876 | 0.2783 |
- | 0.3173 | 74.0 | 888 | 0.2791 |
- | 0.3283 | 75.0 | 900 | 0.2445 |
- | 0.3283 | 76.0 | 912 | 0.2507 |
- | 0.3283 | 77.0 | 924 | 0.2778 |
- | 0.3041 | 78.0 | 936 | 0.2471 |
- | 0.3041 | 79.0 | 948 | 0.2219 |
- | 0.2925 | 80.0 | 960 | 0.2767 |
- | 0.2925 | 81.0 | 972 | 0.3046 |
- | 0.2925 | 82.0 | 984 | 0.2837 |
- | 0.3112 | 83.0 | 996 | 0.2710 |
- | 0.3112 | 84.0 | 1008 | 0.2399 |
- | 0.282 | 85.0 | 1020 | 0.2388 |
- | 0.282 | 86.0 | 1032 | 0.2401 |
- | 0.282 | 87.0 | 1044 | 0.2302 |
- | 0.2806 | 88.0 | 1056 | 0.1975 |
- | 0.2806 | 89.0 | 1068 | 0.2154 |
- | 0.271 | 90.0 | 1080 | 0.1875 |
- | 0.271 | 91.0 | 1092 | 0.2032 |
- | 0.271 | 92.0 | 1104 | 0.2198 |
- | 0.2695 | 93.0 | 1116 | 0.2018 |
- | 0.2695 | 94.0 | 1128 | 0.2124 |
- | 0.2593 | 95.0 | 1140 | 0.2150 |
- | 0.2593 | 96.0 | 1152 | 0.1841 |
- | 0.2593 | 97.0 | 1164 | 0.2062 |
- | 0.2643 | 98.0 | 1176 | 0.1977 |
- | 0.2643 | 99.0 | 1188 | 0.1847 |
- | 0.2508 | 100.0 | 1200 | 0.1939 |
- | 0.2508 | 101.0 | 1212 | 0.2070 |
- | 0.2508 | 102.0 | 1224 | 0.1943 |
- | 0.2547 | 103.0 | 1236 | 0.1911 |
- | 0.2547 | 104.0 | 1248 | 0.1922 |
- | 0.2512 | 105.0 | 1260 | 0.1988 |
- | 0.2512 | 106.0 | 1272 | 0.1968 |
- | 0.2512 | 107.0 | 1284 | 0.1984 |
- | 0.2465 | 108.0 | 1296 | 0.2030 |
- | 0.2465 | 109.0 | 1308 | 0.1995 |
- | 0.2428 | 110.0 | 1320 | 0.1948 |
- | 0.2428 | 111.0 | 1332 | 0.1969 |
- | 0.2428 | 112.0 | 1344 | 0.1969 |
- | 0.2432 | 113.0 | 1356 | 0.1943 |
- | 0.2432 | 114.0 | 1368 | 0.1949 |
- | 0.2428 | 115.0 | 1380 | 0.1930 |
- | 0.2428 | 116.0 | 1392 | 0.1924 |
- | 0.2428 | 117.0 | 1404 | 0.1948 |
- | 0.2452 | 118.0 | 1416 | 0.1967 |
- | 0.2452 | 119.0 | 1428 | 0.1943 |
- | 0.245 | 120.0 | 1440 | 0.1949 |
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-------:|:----:|:---------------:|
+ | No log | 1.0 | 23 | 0.9258 |
+ | 1.1129 | 2.0 | 46 | 0.8241 |
+ | 0.8989 | 3.0 | 69 | 0.8080 |
+ | 0.8799 | 4.0 | 92 | 0.9306 |
+ | 0.8799 | 5.0 | 115 | 0.5536 |
+ | 0.6986 | 6.0 | 138 | 0.5199 |
+ | 0.6088 | 7.0 | 161 | 0.4739 |
+ | 0.631 | 8.0 | 184 | 0.5007 |
+ | 0.631 | 9.0 | 207 | 0.6598 |
+ | 0.5804 | 10.0 | 230 | 0.4345 |
+ | 0.5656 | 11.0 | 253 | 0.4864 |
+ | 0.5725 | 12.0 | 276 | 0.3707 |
+ | 0.5725 | 13.0 | 299 | 0.3357 |
+ | 0.4953 | 14.0 | 322 | 0.4104 |
+ | 0.4619 | 15.0 | 345 | 0.3681 |
+ | 0.4463 | 16.0 | 368 | 0.3045 |
+ | 0.433 | 17.0 | 391 | 0.3330 |
+ | 0.433 | 18.0 | 414 | 0.3561 |
+ | 0.396 | 19.0 | 437 | 0.2583 |
+ | 0.3845 | 20.0 | 460 | 0.2699 |
+ | 0.3569 | 21.0 | 483 | 0.2714 |
+ | 0.3569 | 22.0 | 506 | 0.2978 |
+ | 0.3574 | 23.0 | 529 | 0.2844 |
+ | 0.3424 | 24.0 | 552 | 0.2650 |
+ | 0.35 | 25.0 | 575 | 0.2829 |
+ | 0.35 | 26.0 | 598 | 0.2533 |
+ | 0.34 | 27.0 | 621 | 0.2306 |
+ | 0.3309 | 28.0 | 644 | 0.2348 |
+ | 0.3297 | 29.0 | 667 | 0.2912 |
+ | 0.3357 | 30.0 | 690 | 0.2679 |
+ | 0.3357 | 31.0 | 713 | 0.2685 |
+ | 0.3267 | 32.0 | 736 | 0.2384 |
+ | 0.3102 | 33.0 | 759 | 0.2346 |
+ | 0.3204 | 34.0 | 782 | 0.2850 |
+ | 0.3204 | 35.0 | 805 | 0.2969 |
+ | 0.3191 | 36.0 | 828 | 0.2315 |
+ | 0.3051 | 37.0 | 851 | 0.1958 |
+ | 0.2825 | 38.0 | 874 | 0.2211 |
+ | 0.2825 | 39.0 | 897 | 0.2309 |
+ | 0.2895 | 40.0 | 920 | 0.2610 |
+ | 0.2891 | 41.0 | 943 | 0.2334 |
+ | 0.279 | 42.0 | 966 | 0.2149 |
+ | 0.279 | 43.0 | 989 | 0.2017 |
+ | 0.2735 | 44.0 | 1012 | 0.2445 |
+ | 0.2688 | 45.0 | 1035 | 0.2164 |
+ | 0.2602 | 46.0 | 1058 | 0.1995 |
+ | 0.2644 | 47.0 | 1081 | 0.1936 |
+ | 0.2644 | 48.0 | 1104 | 0.1884 |
+ | 0.2634 | 49.0 | 1127 | 0.1974 |
+ | 0.2568 | 50.0 | 1150 | 0.1981 |
+ | 0.2456 | 51.0 | 1173 | 0.1799 |
+ | 0.2456 | 52.0 | 1196 | 0.1777 |
+ | 0.2479 | 53.0 | 1219 | 0.1915 |
+ | 0.2529 | 54.0 | 1242 | 0.1928 |
+ | 0.2533 | 55.0 | 1265 | 0.1772 |
+ | 0.2533 | 56.0 | 1288 | 0.1863 |
+ | 0.2516 | 57.0 | 1311 | 0.1775 |
+ | 0.2495 | 58.0 | 1334 | 0.1808 |
+ | 0.2428 | 59.0 | 1357 | 0.1734 |
+ | 0.2454 | 60.0 | 1380 | 0.1696 |
+ | 0.2454 | 61.0 | 1403 | 0.1766 |
+ | 0.2452 | 62.0 | 1426 | 0.1718 |
+ | 0.2367 | 62.6087 | 1440 | 0.1692 |
 
 
  ### Framework versions
  ### Framework versions
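
The diff above swaps `num_epochs: 120` for `training_steps: 1440` while keeping `lr_scheduler_type: cosine`, so the learning rate now anneals over a fixed step budget. As a rough illustration, a cosine schedule with no warmup over 1440 steps can be sketched as below; the base learning rate of 1e-4 is an assumption (the `learning_rate` line is outside the hunk shown), and the exact curve used in training comes from the `transformers` scheduler, not this helper.

```python
import math

def cosine_lr(step: int, total_steps: int = 1440, base_lr: float = 1e-4) -> float:
    """Cosine annealing from base_lr down to ~0 over total_steps (no warmup).

    total_steps mirrors training_steps from the card; base_lr is assumed.
    """
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * step / total_steps))

# Starts at base_lr, halves at the midpoint, reaches ~0 at step 1440.
for step in (0, 720, 1440):
    print(f"step {step:4d}: lr = {cosine_lr(step):.6g}")
```

This matches why the final logged epoch is fractional (62.6087): with a fixed `training_steps` budget of 1440 and 23 optimizer steps per epoch, training stops mid-epoch at 1440 / 23 ≈ 62.6 epochs rather than on an epoch boundary.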