End of training
README.md CHANGED
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
+- Loss: 0.1692
 
 ## Model description
 
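For reference, a minimal inference sketch using the standard `transformers` object-detection API for DETR is shown below. The repository id `your-username/detr-resnet-50-finetuned` and the image path are placeholders, since the card does not name the final Hub repository or the dataset, and the 0.5 score threshold is an arbitrary choice.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id: substitute the actual Hub id of this fine-tuned model.
checkpoint = "your-username/detr-resnet-50-finetuned"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert class logits and normalized boxes into labelled boxes in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes  # threshold chosen arbitrarily
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score.item():.2f} {box.tolist()}")
```

Because the dataset is listed as None above, the label map (`model.config.id2label`) is whatever was attached to the checkpoint during fine-tuning.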
@@ -41,132 +41,75 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
+- training_steps: 1440
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss |
-|:-------------:|:-----:|:----:|:---------------:|
-| No log | 1.0
-| 0.354 | 64.0 | 768 | 0.2815 |
-| 0.3237 | 65.0 | 780 | 0.2819 |
-| 0.3237 | 66.0 | 792 | 0.2476 |
-| 0.3237 | 67.0 | 804 | 0.3193 |
-| 0.3182 | 68.0 | 816 | 0.2444 |
-| 0.3182 | 69.0 | 828 | 0.2510 |
-| 0.3151 | 70.0 | 840 | 0.2951 |
-| 0.3151 | 71.0 | 852 | 0.2389 |
-| 0.3151 | 72.0 | 864 | 0.2657 |
-| 0.3173 | 73.0 | 876 | 0.2783 |
-| 0.3173 | 74.0 | 888 | 0.2791 |
-| 0.3283 | 75.0 | 900 | 0.2445 |
-| 0.3283 | 76.0 | 912 | 0.2507 |
-| 0.3283 | 77.0 | 924 | 0.2778 |
-| 0.3041 | 78.0 | 936 | 0.2471 |
-| 0.3041 | 79.0 | 948 | 0.2219 |
-| 0.2925 | 80.0 | 960 | 0.2767 |
-| 0.2925 | 81.0 | 972 | 0.3046 |
-| 0.2925 | 82.0 | 984 | 0.2837 |
-| 0.3112 | 83.0 | 996 | 0.2710 |
-| 0.3112 | 84.0 | 1008 | 0.2399 |
-| 0.282 | 85.0 | 1020 | 0.2388 |
-| 0.282 | 86.0 | 1032 | 0.2401 |
-| 0.282 | 87.0 | 1044 | 0.2302 |
-| 0.2806 | 88.0 | 1056 | 0.1975 |
-| 0.2806 | 89.0 | 1068 | 0.2154 |
-| 0.271 | 90.0 | 1080 | 0.1875 |
-| 0.271 | 91.0 | 1092 | 0.2032 |
-| 0.271 | 92.0 | 1104 | 0.2198 |
-| 0.2695 | 93.0 | 1116 | 0.2018 |
-| 0.2695 | 94.0 | 1128 | 0.2124 |
-| 0.2593 | 95.0 | 1140 | 0.2150 |
-| 0.2593 | 96.0 | 1152 | 0.1841 |
-| 0.2593 | 97.0 | 1164 | 0.2062 |
-| 0.2643 | 98.0 | 1176 | 0.1977 |
-| 0.2643 | 99.0 | 1188 | 0.1847 |
-| 0.2508 | 100.0 | 1200 | 0.1939 |
-| 0.2508 | 101.0 | 1212 | 0.2070 |
-| 0.2508 | 102.0 | 1224 | 0.1943 |
-| 0.2547 | 103.0 | 1236 | 0.1911 |
-| 0.2547 | 104.0 | 1248 | 0.1922 |
-| 0.2512 | 105.0 | 1260 | 0.1988 |
-| 0.2512 | 106.0 | 1272 | 0.1968 |
-| 0.2512 | 107.0 | 1284 | 0.1984 |
-| 0.2465 | 108.0 | 1296 | 0.2030 |
-| 0.2465 | 109.0 | 1308 | 0.1995 |
-| 0.2428 | 110.0 | 1320 | 0.1948 |
-| 0.2428 | 111.0 | 1332 | 0.1969 |
-| 0.2428 | 112.0 | 1344 | 0.1969 |
-| 0.2432 | 113.0 | 1356 | 0.1943 |
-| 0.2432 | 114.0 | 1368 | 0.1949 |
-| 0.2428 | 115.0 | 1380 | 0.1930 |
-| 0.2428 | 116.0 | 1392 | 0.1924 |
-| 0.2428 | 117.0 | 1404 | 0.1948 |
-| 0.2452 | 118.0 | 1416 | 0.1967 |
-| 0.2452 | 119.0 | 1428 | 0.1943 |
-| 0.245 | 120.0 | 1440 | 0.1949 |
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:-------:|:----:|:---------------:|
+| No log | 1.0 | 23 | 0.9258 |
+| 1.1129 | 2.0 | 46 | 0.8241 |
+| 0.8989 | 3.0 | 69 | 0.8080 |
+| 0.8799 | 4.0 | 92 | 0.9306 |
+| 0.8799 | 5.0 | 115 | 0.5536 |
+| 0.6986 | 6.0 | 138 | 0.5199 |
+| 0.6088 | 7.0 | 161 | 0.4739 |
+| 0.631 | 8.0 | 184 | 0.5007 |
+| 0.631 | 9.0 | 207 | 0.6598 |
+| 0.5804 | 10.0 | 230 | 0.4345 |
+| 0.5656 | 11.0 | 253 | 0.4864 |
+| 0.5725 | 12.0 | 276 | 0.3707 |
+| 0.5725 | 13.0 | 299 | 0.3357 |
+| 0.4953 | 14.0 | 322 | 0.4104 |
+| 0.4619 | 15.0 | 345 | 0.3681 |
+| 0.4463 | 16.0 | 368 | 0.3045 |
+| 0.433 | 17.0 | 391 | 0.3330 |
+| 0.433 | 18.0 | 414 | 0.3561 |
+| 0.396 | 19.0 | 437 | 0.2583 |
+| 0.3845 | 20.0 | 460 | 0.2699 |
+| 0.3569 | 21.0 | 483 | 0.2714 |
+| 0.3569 | 22.0 | 506 | 0.2978 |
+| 0.3574 | 23.0 | 529 | 0.2844 |
+| 0.3424 | 24.0 | 552 | 0.2650 |
+| 0.35 | 25.0 | 575 | 0.2829 |
+| 0.35 | 26.0 | 598 | 0.2533 |
+| 0.34 | 27.0 | 621 | 0.2306 |
+| 0.3309 | 28.0 | 644 | 0.2348 |
+| 0.3297 | 29.0 | 667 | 0.2912 |
+| 0.3357 | 30.0 | 690 | 0.2679 |
+| 0.3357 | 31.0 | 713 | 0.2685 |
+| 0.3267 | 32.0 | 736 | 0.2384 |
+| 0.3102 | 33.0 | 759 | 0.2346 |
+| 0.3204 | 34.0 | 782 | 0.2850 |
+| 0.3204 | 35.0 | 805 | 0.2969 |
+| 0.3191 | 36.0 | 828 | 0.2315 |
+| 0.3051 | 37.0 | 851 | 0.1958 |
+| 0.2825 | 38.0 | 874 | 0.2211 |
+| 0.2825 | 39.0 | 897 | 0.2309 |
+| 0.2895 | 40.0 | 920 | 0.2610 |
+| 0.2891 | 41.0 | 943 | 0.2334 |
+| 0.279 | 42.0 | 966 | 0.2149 |
+| 0.279 | 43.0 | 989 | 0.2017 |
+| 0.2735 | 44.0 | 1012 | 0.2445 |
+| 0.2688 | 45.0 | 1035 | 0.2164 |
+| 0.2602 | 46.0 | 1058 | 0.1995 |
+| 0.2644 | 47.0 | 1081 | 0.1936 |
+| 0.2644 | 48.0 | 1104 | 0.1884 |
+| 0.2634 | 49.0 | 1127 | 0.1974 |
+| 0.2568 | 50.0 | 1150 | 0.1981 |
+| 0.2456 | 51.0 | 1173 | 0.1799 |
+| 0.2456 | 52.0 | 1196 | 0.1777 |
+| 0.2479 | 53.0 | 1219 | 0.1915 |
+| 0.2529 | 54.0 | 1242 | 0.1928 |
+| 0.2533 | 55.0 | 1265 | 0.1772 |
+| 0.2533 | 56.0 | 1288 | 0.1863 |
+| 0.2516 | 57.0 | 1311 | 0.1775 |
+| 0.2495 | 58.0 | 1334 | 0.1808 |
+| 0.2428 | 59.0 | 1357 | 0.1734 |
+| 0.2454 | 60.0 | 1380 | 0.1696 |
+| 0.2454 | 61.0 | 1403 | 0.1766 |
+| 0.2452 | 62.0 | 1426 | 0.1718 |
+| 0.2367 | 62.6087 | 1440 | 0.1692 |
 
 
 ### Framework versions
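For orientation, the hyperparameters visible in the second hunk (seed 42, Adam with betas=(0.9,0.999) and epsilon=1e-08, a cosine learning-rate schedule, and 1440 training steps) map onto a `transformers` `TrainingArguments` configuration roughly as sketched below. The learning rate, batch sizes, and output directory do not appear in this excerpt, so those values are placeholders rather than the settings actually used.

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments setup matching the hyperparameters shown in the
# diff above. Values marked "placeholder" are NOT taken from this excerpt.
training_args = TrainingArguments(
    output_dir="detr-resnet-50-finetuned",  # placeholder
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    max_steps=1440,                  # corresponds to "training_steps: 1440"
    learning_rate=1e-4,              # placeholder
    per_device_train_batch_size=8,   # placeholder
    per_device_eval_batch_size=8,    # placeholder
    eval_strategy="epoch",           # evaluation is reported once per epoch in the table;
                                     # older transformers releases call this `evaluation_strategy`
    remove_unused_columns=False,     # commonly needed for DETR-style pixel_values/labels batches
)
```

The final table row also explains the fractional epoch: with 23 optimizer steps per epoch, 1440 steps corresponds to 1440 / 23 ≈ 62.6087 epochs, which is where training stopped and where the reported evaluation loss of 0.1692 comes from.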