---

base_model: microsoft/conditional-detr-resnet-50
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: queue_detection
  results: []
---



# queue_detection



This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unspecified queue-detection dataset with two object classes, `cashier` and `cx`.

It achieves the following results on the evaluation set (a value of -1.0 means no ground-truth objects fell into that size bucket):

- Loss: 1.4822
- mAP: 0.2108
- mAP@50: 0.3072
- mAP@75: 0.2725
- mAP (small): -1.0
- mAP (medium): -1.0
- mAP (large): 0.2219
- mAR@1: 0.1833
- mAR@10: 0.4195
- mAR@100: 0.736
- mAR (small): -1.0
- mAR (medium): -1.0
- mAR (large): 0.736
- mAP (cashier): 0.0544
- mAR@100 (cashier): 0.8053
- mAP (cx): 0.3672
- mAR@100 (cx): 0.6667
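
The metric names mirror the output keys of `torchmetrics.detection.MeanAveragePrecision` (an assumption based on the naming pattern; the size buckets and per-class entries match its `class_metrics=True` output). A minimal sketch of computing these metrics from predictions, with hypothetical boxes:

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True adds per-class entries (map_per_class, mar_100_per_class),
# which this card reports as "mAP (cashier)", "mAR@100 (cx)", etc.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# Hypothetical prediction and ground truth for a single image;
# label 0 = cashier, 1 = cx is an assumed mapping.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([1]),
}]
target = [{
    "boxes": torch.tensor([[12.0, 25.0, 105.0, 215.0]]),
    "labels": torch.tensor([1]),
}]

metric.update(preds, target)
print(metric.compute())  # map, map_50, map_75, map_small, ..., mar_100, ...
```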



## Model description



A Conditional DETR object detector with a ResNet-50 backbone, fine-tuned to localize two classes in queue scenes: `cashier` and `cx` (presumably customers). Further details are not documented.



## Intended uses & limitations



More information needed
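
Pending proper documentation, the checkpoint can be loaded through the standard `transformers` object-detection API. A minimal inference sketch, assuming a local checkpoint directory named `queue_detection`, a hypothetical input image, and an arbitrary 0.5 score threshold:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "queue_detection"  # hypothetical local path to this checkpoint

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)
model.eval()

image = Image.open("checkout_lane.jpg")  # hypothetical input image

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to scored detections in absolute pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```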



## Training and evaluation data



More information needed



## Training procedure



### Training hyperparameters



The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
- mixed_precision_training: Native AMP
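
A sketch of how these hyperparameters map onto `transformers.TrainingArguments`; the datasets and output directory are hypothetical, and the Trainer's default AdamW already uses the listed betas and epsilon:

```python
import torch
from transformers import (
    AutoImageProcessor,
    AutoModelForObjectDetection,
    Trainer,
    TrainingArguments,
)

base = "microsoft/conditional-detr-resnet-50"
processor = AutoImageProcessor.from_pretrained(base)
model = AutoModelForObjectDetection.from_pretrained(
    base,
    num_labels=2,                  # cashier, cx
    ignore_mismatched_sizes=True,  # swap out the COCO classification head
)

def collate_fn(batch):
    # DETR-style batching: stack pixel values, keep target dicts as a list.
    return {
        "pixel_values": torch.stack([x["pixel_values"] for x in batch]),
        "labels": [x["labels"] for x in batch],
    }

args = TrainingArguments(
    output_dir="queue_detection",   # hypothetical
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    fp16=True,                      # Native AMP mixed precision
    eval_strategy="epoch",          # the table below evaluates once per epoch
    remove_unused_columns=False,    # keep image/annotation columns intact
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,    # hypothetical preprocessed datasets
    eval_dataset=eval_dataset,
    data_collator=collate_fn,
    tokenizer=processor,
)
trainer.train()
```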



### Training results



| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP Small | mAP Medium | mAP Large | mAR@1 | mAR@10 | mAR@100 | mAR Small | mAR Medium | mAR Large | mAP Cashier | mAR@100 Cashier | mAP Cx | mAR@100 Cx |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 5 | 38.9601 | 0.0007 | 0.0042 | 0.0 | -1.0 | -1.0 | 0.001 | 0.0 | 0.0186 | 0.053 | -1.0 | -1.0 | 0.053 | 0.0004 | 0.0526 | 0.001 | 0.0533 |
| No log | 2.0 | 10 | 20.3884 | 0.0053 | 0.0378 | 0.001 | -1.0 | -1.0 | 0.0119 | 0.0 | 0.0133 | 0.11 | -1.0 | -1.0 | 0.11 | 0.0 | 0.0 | 0.0106 | 0.22 |
| No log | 3.0 | 15 | 8.6229 | 0.0257 | 0.0582 | 0.0072 | -1.0 | -1.0 | 0.0573 | 0.0333 | 0.12 | 0.1737 | -1.0 | -1.0 | 0.1737 | 0.0002 | 0.0474 | 0.0512 | 0.3 |
| No log | 4.0 | 20 | 5.4986 | 0.0061 | 0.02 | 0.0018 | -1.0 | -1.0 | 0.0175 | 0.0 | 0.0833 | 0.1 | -1.0 | -1.0 | 0.1 | 0.0 | 0.0 | 0.0122 | 0.2 |
| No log | 5.0 | 25 | 3.2810 | 0.0152 | 0.0644 | 0.0 | -1.0 | -1.0 | 0.0182 | 0.0067 | 0.0933 | 0.1989 | -1.0 | -1.0 | 0.1989 | 0.004 | 0.1579 | 0.0264 | 0.24 |
| No log | 6.0 | 30 | 2.4527 | 0.0174 | 0.0707 | 0.0008 | -1.0 | -1.0 | 0.0201 | 0.0067 | 0.0667 | 0.3286 | -1.0 | -1.0 | 0.3286 | 0.0076 | 0.3105 | 0.0272 | 0.3467 |
| No log | 7.0 | 35 | 2.1059 | 0.0222 | 0.0715 | 0.0077 | -1.0 | 0.0 | 0.0244 | 0.0233 | 0.0993 | 0.4733 | -1.0 | 0.0 | 0.4857 | 0.017 | 0.6 | 0.0273 | 0.3467 |
| No log | 8.0 | 40 | 2.1013 | 0.035 | 0.117 | 0.0054 | -1.0 | -1.0 | 0.0354 | 0.0333 | 0.0467 | 0.3653 | -1.0 | -1.0 | 0.3653 | 0.0204 | 0.6105 | 0.0497 | 0.12 |
| No log | 9.0 | 45 | 1.9401 | 0.0412 | 0.1081 | 0.0058 | -1.0 | -1.0 | 0.042 | 0.0267 | 0.04 | 0.3639 | -1.0 | -1.0 | 0.3639 | 0.0184 | 0.6211 | 0.064 | 0.1067 |
| No log | 10.0 | 50 | 2.2227 | 0.0071 | 0.0254 | 0.0017 | -1.0 | -1.0 | 0.0075 | 0.0 | 0.0433 | 0.256 | -1.0 | -1.0 | 0.256 | 0.0044 | 0.3053 | 0.0098 | 0.2067 |
| No log | 11.0 | 55 | 1.9404 | 0.0126 | 0.033 | 0.01 | -1.0 | -1.0 | 0.0127 | 0.0 | 0.08 | 0.333 | -1.0 | -1.0 | 0.333 | 0.0082 | 0.4526 | 0.0169 | 0.2133 |
| No log | 12.0 | 60 | 1.8410 | 0.0186 | 0.0658 | 0.0078 | -1.0 | -1.0 | 0.0187 | 0.0 | 0.0733 | 0.3591 | -1.0 | -1.0 | 0.3591 | 0.0125 | 0.5316 | 0.0247 | 0.1867 |
| No log | 13.0 | 65 | 1.7476 | 0.0735 | 0.1829 | 0.0136 | -1.0 | -1.0 | 0.0736 | 0.0633 | 0.1133 | 0.4335 | -1.0 | -1.0 | 0.4335 | 0.0105 | 0.4737 | 0.1366 | 0.3933 |
| No log | 14.0 | 70 | 1.8939 | 0.0634 | 0.1492 | 0.0276 | -1.0 | -1.0 | 0.0638 | 0.05 | 0.2033 | 0.3719 | -1.0 | -1.0 | 0.3719 | 0.0032 | 0.2105 | 0.1236 | 0.5333 |
| No log | 15.0 | 75 | 1.7653 | 0.0535 | 0.1438 | 0.0387 | -1.0 | -1.0 | 0.0554 | 0.0964 | 0.1929 | 0.4917 | -1.0 | -1.0 | 0.4917 | 0.0103 | 0.4263 | 0.0968 | 0.5571 |
| No log | 16.0 | 80 | 1.6493 | 0.0988 | 0.2205 | 0.0787 | -1.0 | -1.0 | 0.1014 | 0.0733 | 0.2419 | 0.6505 | -1.0 | -1.0 | 0.6505 | 0.0225 | 0.7211 | 0.1751 | 0.58 |
| No log | 17.0 | 85 | 1.7624 | 0.1198 | 0.2464 | 0.0858 | 0.0 | -1.0 | 0.1289 | 0.1167 | 0.2333 | 0.6149 | 0.0 | -1.0 | 0.6352 | 0.0285 | 0.6632 | 0.211 | 0.5667 |
| No log | 18.0 | 90 | 1.8609 | 0.149 | 0.3181 | 0.0874 | -1.0 | -1.0 | 0.1517 | 0.13 | 0.2861 | 0.5789 | -1.0 | -1.0 | 0.5789 | 0.0403 | 0.6579 | 0.2576 | 0.5 |
| No log | 19.0 | 95 | 1.6860 | 0.1455 | 0.257 | 0.1026 | -1.0 | -1.0 | 0.1497 | 0.1393 | 0.2697 | 0.6496 | -1.0 | -1.0 | 0.6496 | 0.039 | 0.7421 | 0.2521 | 0.5571 |
| No log | 20.0 | 100 | 1.7541 | 0.1966 | 0.3274 | 0.2313 | -1.0 | -1.0 | 0.1994 | 0.17 | 0.317 | 0.6182 | -1.0 | -1.0 | 0.6182 | 0.0379 | 0.6632 | 0.3553 | 0.5733 |
| No log | 21.0 | 105 | 1.7043 | 0.1735 | 0.2572 | 0.1786 | -1.0 | 0.0 | 0.1913 | 0.1821 | 0.3295 | 0.6252 | -1.0 | 0.0 | 0.6472 | 0.0459 | 0.6789 | 0.3012 | 0.5714 |
| No log | 22.0 | 110 | 1.5489 | 0.1959 | 0.3051 | 0.264 | -1.0 | -1.0 | 0.2039 | 0.17 | 0.3568 | 0.6896 | -1.0 | -1.0 | 0.6896 | 0.046 | 0.7526 | 0.3458 | 0.6267 |
| No log | 23.0 | 115 | 1.6402 | 0.166 | 0.293 | 0.1791 | -1.0 | -1.0 | 0.1773 | 0.1432 | 0.2851 | 0.6496 | -1.0 | -1.0 | 0.6496 | 0.048 | 0.7526 | 0.284 | 0.5467 |
| No log | 24.0 | 120 | 1.5800 | 0.188 | 0.3099 | 0.1815 | -1.0 | -1.0 | 0.1965 | 0.1791 | 0.3337 | 0.693 | -1.0 | -1.0 | 0.693 | 0.0408 | 0.7526 | 0.3352 | 0.6333 |
| No log | 25.0 | 125 | 1.5566 | 0.1921 | 0.3038 | 0.2227 | -1.0 | -1.0 | 0.2037 | 0.17 | 0.4244 | 0.6928 | -1.0 | -1.0 | 0.6928 | 0.0642 | 0.7789 | 0.32 | 0.6067 |
| No log | 26.0 | 130 | 1.7227 | 0.2044 | 0.3337 | 0.2155 | -1.0 | -1.0 | 0.2145 | 0.1667 | 0.3319 | 0.5968 | -1.0 | -1.0 | 0.5968 | 0.0484 | 0.6737 | 0.3604 | 0.52 |
| No log | 27.0 | 135 | 1.5184 | 0.2095 | 0.3161 | 0.2389 | -1.0 | -1.0 | 0.2211 | 0.1877 | 0.3163 | 0.7026 | -1.0 | -1.0 | 0.7026 | 0.0603 | 0.8053 | 0.3586 | 0.6 |
| No log | 28.0 | 140 | 1.5156 | 0.2172 | 0.3273 | 0.2672 | -1.0 | -1.0 | 0.2286 | 0.2077 | 0.4402 | 0.7226 | -1.0 | -1.0 | 0.7226 | 0.0681 | 0.8053 | 0.3664 | 0.64 |
| No log | 29.0 | 145 | 1.6211 | 0.1652 | 0.2652 | 0.2007 | -1.0 | -1.0 | 0.1737 | 0.15 | 0.4081 | 0.677 | -1.0 | -1.0 | 0.677 | 0.0463 | 0.7474 | 0.2841 | 0.6067 |
| No log | 30.0 | 150 | 1.4822 | 0.2108 | 0.3072 | 0.2725 | -1.0 | -1.0 | 0.2219 | 0.1833 | 0.4195 | 0.736 | -1.0 | -1.0 | 0.736 | 0.0544 | 0.8053 | 0.3672 | 0.6667 |

"No log" in the Training Loss column means no training loss was recorded, likely because the full run (150 steps) is shorter than the Trainer's default logging interval of 500 steps.

### Framework versions



- Transformers 4.42.3
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1