detr_domain_shift

This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4246
  • mAP: 0.8353
  • mAP@50: 0.9529
  • mAP@75: 0.9029
  • mAP (small): 0.0159
  • mAP (medium): 0.6742
  • mAP (large): 0.8707
  • mAR@1: 0.7106
  • mAR@10: 0.8977
  • mAR@100: 0.9152
  • mAR (small): 0.2984
  • mAR (medium): 0.8276
  • mAR (large): 0.9365
  • mAP (Garbage bag): 0.8185
  • mAR@100 (Garbage bag): 0.9059
  • mAP (Paper bag): 0.8446
  • mAR@100 (Paper bag): 0.9239
  • mAP (Plastic bag): 0.8429
  • mAR@100 (Plastic bag): 0.9159
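
For reference, a minimal inference sketch using the standard Transformers object-detection API. The repo id toukapy/detr_domain_shift is taken from this card's model tree; the image path and detection threshold are illustrative assumptions.

```python
# Minimal inference sketch; image path and threshold are illustrative.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

processor = AutoImageProcessor.from_pretrained("toukapy/detr_domain_shift")
model = AutoModelForObjectDetection.from_pretrained("toukapy/detr_domain_shift")

image = Image.open("bags.jpg")  # hypothetical test image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert logits and normalized boxes to scored detections in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```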

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • num_epochs: 50
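
A sketch of how these settings map onto the Trainer API; the output directory and evaluation cadence are assumptions, and the dataset, collator, and model wiring are omitted.

```python
# Sketch of the reported hyperparameters via transformers.TrainingArguments;
# output_dir and eval_strategy are assumptions, the rest comes from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_domain_shift",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",             # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    num_train_epochs=50,
    eval_strategy="epoch",           # assumed: the results table reports per-epoch eval
)
```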

Training results

| Training Loss | Epoch | Step | Validation Loss | mAP | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1 | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) | mAP Garbage bag | mAR@100 Garbage bag | mAP Paper bag | mAR@100 Paper bag | mAP Plastic bag | mAR@100 Plastic bag |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.9775 | 1.0 | 1557 | 0.8739 | 0.2419 | 0.3276 | 0.2719 | 0.0003 | 0.1808 | 0.2566 | 0.4246 | 0.773 | 0.8186 | 0.0271 | 0.6849 | 0.8501 | 0.1999 | 0.808 | 0.3045 | 0.8292 | 0.2213 | 0.8187 |
| 0.919 | 2.0 | 3114 | 0.8527 | 0.4465 | 0.6037 | 0.5018 | 0.0043 | 0.2877 | 0.4787 | 0.5098 | 0.7627 | 0.8137 | 0.0123 | 0.6656 | 0.8474 | 0.5496 | 0.8075 | 0.4518 | 0.8235 | 0.3382 | 0.8101 |
| 0.85 | 3.0 | 4671 | 0.7649 | 0.5508 | 0.7268 | 0.6183 | 0.0013 | 0.3684 | 0.5898 | 0.5619 | 0.7944 | 0.8335 | 0.0652 | 0.7147 | 0.8613 | 0.5635 | 0.8206 | 0.5877 | 0.8471 | 0.5012 | 0.8327 |
| 0.844 | 4.0 | 6228 | 0.7562 | 0.605 | 0.7924 | 0.6857 | 0.0023 | 0.4402 | 0.6405 | 0.5732 | 0.7901 | 0.8296 | 0.0579 | 0.7174 | 0.856 | 0.6113 | 0.8213 | 0.6125 | 0.8364 | 0.5912 | 0.8313 |
| 0.8332 | 5.0 | 7785 | 0.7811 | 0.5982 | 0.8004 | 0.6799 | 0.007 | 0.4141 | 0.6365 | 0.5683 | 0.7801 | 0.8214 | 0.0194 | 0.6914 | 0.852 | 0.5785 | 0.8091 | 0.6326 | 0.8361 | 0.5834 | 0.8188 |
| 0.9303 | 6.0 | 9342 | 0.9325 | 0.5179 | 0.7395 | 0.5906 | 0.0001 | 0.3665 | 0.5506 | 0.5192 | 0.7288 | 0.7756 | 0.0097 | 0.6358 | 0.8074 | 0.52 | 0.7664 | 0.5744 | 0.7825 | 0.4593 | 0.7778 |
| 0.8773 | 7.0 | 10899 | 0.8869 | 0.5561 | 0.7748 | 0.626 | 0.0 | 0.374 | 0.5948 | 0.547 | 0.748 | 0.7895 | 0.0 | 0.6483 | 0.8223 | 0.5435 | 0.7752 | 0.5909 | 0.8132 | 0.5339 | 0.7801 |
| 1.011 | 8.0 | 12456 | 0.8729 | 0.5604 | 0.7945 | 0.6437 | 0.0013 | 0.3991 | 0.5954 | 0.538 | 0.7475 | 0.7971 | 0.0218 | 0.6489 | 0.831 | 0.5251 | 0.7729 | 0.6152 | 0.8222 | 0.541 | 0.7962 |
| 1.0019 | 9.0 | 14013 | 1.2187 | 0.313 | 0.5258 | 0.3396 | 0.0 | 0.2026 | 0.337 | 0.3859 | 0.6118 | 0.6892 | 0.0 | 0.5141 | 0.7282 | 0.4395 | 0.7076 | 0.2496 | 0.6916 | 0.25 | 0.6685 |
| 0.9286 | 10.0 | 15570 | 0.8059 | 0.5848 | 0.7852 | 0.6584 | 0.0 | 0.3874 | 0.6257 | 0.5595 | 0.7725 | 0.8196 | 0.0179 | 0.6689 | 0.8539 | 0.5762 | 0.8015 | 0.6239 | 0.8429 | 0.5544 | 0.8143 |
| 0.8782 | 11.0 | 17127 | 0.8458 | 0.5587 | 0.7551 | 0.627 | 0.01 | 0.377 | 0.596 | 0.5551 | 0.7678 | 0.812 | 0.0516 | 0.6635 | 0.8453 | 0.5726 | 0.7966 | 0.6164 | 0.8355 | 0.4872 | 0.804 |
| 0.8493 | 12.0 | 18684 | 0.7872 | 0.5899 | 0.7787 | 0.6622 | 0.0012 | 0.3803 | 0.6329 | 0.57 | 0.7832 | 0.8287 | 0.019 | 0.6719 | 0.8643 | 0.5813 | 0.8215 | 0.6336 | 0.8412 | 0.5548 | 0.8235 |
| 0.793 | 13.0 | 20241 | 0.7784 | 0.6112 | 0.8128 | 0.6924 | 0.0023 | 0.4275 | 0.6502 | 0.5785 | 0.7783 | 0.8249 | 0.0434 | 0.6872 | 0.8566 | 0.6157 | 0.8253 | 0.6326 | 0.8344 | 0.5852 | 0.8151 |
| 0.7712 | 14.0 | 21798 | 0.7881 | 0.6089 | 0.8162 | 0.688 | 0.0001 | 0.4211 | 0.6497 | 0.5772 | 0.7785 | 0.8184 | 0.039 | 0.6874 | 0.8491 | 0.5984 | 0.803 | 0.6429 | 0.84 | 0.5853 | 0.8123 |
| 0.7892 | 15.0 | 23355 | 0.7204 | 0.6548 | 0.8515 | 0.742 | 0.0007 | 0.486 | 0.6916 | 0.6018 | 0.7991 | 0.8376 | 0.0359 | 0.7139 | 0.8668 | 0.653 | 0.8377 | 0.6716 | 0.8447 | 0.6399 | 0.8302 |
| 0.7735 | 16.0 | 24912 | 0.7768 | 0.604 | 0.7908 | 0.6807 | 0.0152 | 0.4142 | 0.6445 | 0.5803 | 0.7794 | 0.8258 | 0.043 | 0.7167 | 0.8522 | 0.5854 | 0.8359 | 0.6421 | 0.8307 | 0.5844 | 0.8106 |
| 0.8052 | 17.0 | 26469 | 0.7466 | 0.6389 | 0.8417 | 0.7325 | 0.0001 | 0.4658 | 0.6767 | 0.5883 | 0.7847 | 0.8231 | 0.0245 | 0.7291 | 0.8469 | 0.6493 | 0.8262 | 0.6572 | 0.8329 | 0.6101 | 0.8101 |
| 0.7366 | 18.0 | 28026 | 0.7490 | 0.6372 | 0.8435 | 0.7198 | 0.0069 | 0.4492 | 0.6783 | 0.5883 | 0.7881 | 0.8287 | 0.0271 | 0.7084 | 0.8572 | 0.6427 | 0.8307 | 0.6486 | 0.8376 | 0.6202 | 0.8178 |
| 0.7316 | 19.0 | 29583 | 0.6964 | 0.6697 | 0.869 | 0.7578 | 0.0028 | 0.5095 | 0.704 | 0.6068 | 0.8029 | 0.839 | 0.0705 | 0.7283 | 0.8651 | 0.6529 | 0.8277 | 0.6915 | 0.8538 | 0.6646 | 0.8353 |
| 0.7243 | 20.0 | 31140 | 0.7165 | 0.6616 | 0.8605 | 0.753 | 0.0033 | 0.4782 | 0.7003 | 0.6046 | 0.7972 | 0.8299 | 0.0581 | 0.7148 | 0.8574 | 0.6366 | 0.8126 | 0.6791 | 0.8452 | 0.6692 | 0.832 |
| 0.7189 | 21.0 | 32697 | 0.6921 | 0.6748 | 0.8694 | 0.7621 | 0.001 | 0.5019 | 0.7121 | 0.6105 | 0.8042 | 0.8382 | 0.0788 | 0.7253 | 0.8649 | 0.6422 | 0.8241 | 0.7087 | 0.8581 | 0.6735 | 0.8324 |
| 0.6802 | 22.0 | 34254 | 0.6381 | 0.7091 | 0.886 | 0.7875 | 0.0007 | 0.5366 | 0.7465 | 0.6318 | 0.8291 | 0.8618 | 0.0716 | 0.752 | 0.8883 | 0.6953 | 0.8572 | 0.7253 | 0.8718 | 0.7067 | 0.8565 |
| 0.6676 | 23.0 | 35811 | 0.6252 | 0.7186 | 0.8865 | 0.7994 | 0.0034 | 0.5423 | 0.7573 | 0.6391 | 0.8373 | 0.8649 | 0.0665 | 0.7597 | 0.8908 | 0.7021 | 0.8567 | 0.7358 | 0.8752 | 0.7181 | 0.8628 |
| 0.6624 | 24.0 | 37368 | 0.6432 | 0.7117 | 0.8986 | 0.8041 | 0.0019 | 0.5384 | 0.7488 | 0.63 | 0.8205 | 0.8535 | 0.0974 | 0.7492 | 0.8786 | 0.6871 | 0.8394 | 0.7287 | 0.8639 | 0.7194 | 0.8572 |
| 0.6356 | 25.0 | 38925 | 0.6101 | 0.7284 | 0.905 | 0.8178 | 0.004 | 0.5444 | 0.7676 | 0.6448 | 0.8322 | 0.863 | 0.1108 | 0.7616 | 0.8874 | 0.7125 | 0.8556 | 0.7417 | 0.8709 | 0.7311 | 0.8626 |
| 0.6319 | 26.0 | 40482 | 0.6330 | 0.7191 | 0.9005 | 0.8113 | 0.0085 | 0.557 | 0.754 | 0.6392 | 0.8262 | 0.8513 | 0.106 | 0.7566 | 0.8745 | 0.702 | 0.849 | 0.7371 | 0.8558 | 0.7182 | 0.8489 |
| 0.6069 | 27.0 | 42039 | 0.5855 | 0.7461 | 0.9072 | 0.824 | 0.0082 | 0.5615 | 0.7852 | 0.6512 | 0.845 | 0.8742 | 0.2484 | 0.7625 | 0.9003 | 0.7159 | 0.8596 | 0.7615 | 0.884 | 0.7609 | 0.8791 |
| 0.5898 | 28.0 | 43596 | 0.5582 | 0.7581 | 0.9158 | 0.8405 | 0.0091 | 0.5844 | 0.7957 | 0.6649 | 0.8552 | 0.878 | 0.2474 | 0.785 | 0.9008 | 0.7384 | 0.8677 | 0.7742 | 0.8882 | 0.7616 | 0.8782 |
| 0.5777 | 29.0 | 45153 | 0.5412 | 0.7706 | 0.9226 | 0.8492 | 0.0029 | 0.6117 | 0.8054 | 0.6718 | 0.8591 | 0.8878 | 0.2147 | 0.7882 | 0.9118 | 0.7542 | 0.8826 | 0.7799 | 0.8953 | 0.7777 | 0.8854 |
| 0.5461 | 30.0 | 46710 | 0.5424 | 0.7714 | 0.9259 | 0.8563 | 0.0024 | 0.6033 | 0.8081 | 0.6708 | 0.8571 | 0.8829 | 0.206 | 0.7823 | 0.9069 | 0.7602 | 0.8784 | 0.7814 | 0.8906 | 0.7726 | 0.8798 |
| 0.5392 | 31.0 | 48267 | 0.5274 | 0.7773 | 0.9219 | 0.8591 | 0.0026 | 0.5986 | 0.8159 | 0.6752 | 0.8648 | 0.8888 | 0.2832 | 0.795 | 0.9111 | 0.7549 | 0.8835 | 0.7902 | 0.894 | 0.7867 | 0.8889 |
| 0.5367 | 32.0 | 49824 | 0.5181 | 0.7863 | 0.9312 | 0.8654 | 0.0038 | 0.6234 | 0.8216 | 0.678 | 0.8663 | 0.8882 | 0.2332 | 0.7956 | 0.9109 | 0.7689 | 0.8805 | 0.7985 | 0.8962 | 0.7915 | 0.8879 |
| 0.5187 | 33.0 | 51381 | 0.5079 | 0.7853 | 0.934 | 0.8672 | 0.0187 | 0.6245 | 0.8206 | 0.6814 | 0.8681 | 0.8917 | 0.2412 | 0.7984 | 0.9143 | 0.7692 | 0.89 | 0.7937 | 0.8966 | 0.7931 | 0.8884 |
| 0.5102 | 34.0 | 52938 | 0.4861 | 0.8049 | 0.9381 | 0.8761 | 0.0318 | 0.6481 | 0.8388 | 0.6912 | 0.8811 | 0.9047 | 0.2659 | 0.818 | 0.9262 | 0.7893 | 0.8991 | 0.8191 | 0.914 | 0.8062 | 0.901 |
| 0.4868 | 35.0 | 54495 | 0.4753 | 0.8046 | 0.9392 | 0.8827 | 0.0084 | 0.6389 | 0.8413 | 0.6933 | 0.8796 | 0.9015 | 0.2495 | 0.8056 | 0.9249 | 0.7884 | 0.8981 | 0.811 | 0.9036 | 0.8143 | 0.903 |
| 0.4821 | 36.0 | 56052 | 0.4714 | 0.8096 | 0.9427 | 0.8861 | 0.0225 | 0.6485 | 0.8448 | 0.6955 | 0.8828 | 0.9042 | 0.278 | 0.8159 | 0.9258 | 0.7923 | 0.8981 | 0.8221 | 0.9121 | 0.8143 | 0.9024 |
| 0.4714 | 37.0 | 57609 | 0.4447 | 0.8232 | 0.9433 | 0.8911 | 0.0219 | 0.6619 | 0.8599 | 0.7032 | 0.8935 | 0.9126 | 0.269 | 0.8235 | 0.9344 | 0.8072 | 0.9051 | 0.8295 | 0.9173 | 0.8329 | 0.9154 |
| 0.4653 | 38.0 | 59166 | 0.4554 | 0.819 | 0.946 | 0.8912 | 0.0178 | 0.6567 | 0.8548 | 0.7021 | 0.8873 | 0.9059 | 0.3004 | 0.8138 | 0.9281 | 0.8025 | 0.899 | 0.8281 | 0.9122 | 0.8264 | 0.9065 |
| 0.4494 | 39.0 | 60723 | 0.4310 | 0.8308 | 0.9451 | 0.8939 | 0.0243 | 0.6685 | 0.8663 | 0.7091 | 0.8985 | 0.918 | 0.3242 | 0.8358 | 0.9381 | 0.8127 | 0.9114 | 0.8401 | 0.9237 | 0.8397 | 0.919 |
| 0.4389 | 40.0 | 62280 | 0.4289 | 0.8336 | 0.9489 | 0.8967 | 0.0177 | 0.6757 | 0.8684 | 0.7102 | 0.8976 | 0.9165 | 0.2958 | 0.8329 | 0.9369 | 0.8167 | 0.9102 | 0.8425 | 0.9219 | 0.8416 | 0.9174 |
| 0.4376 | 41.0 | 63837 | 0.4245 | 0.8365 | 0.9498 | 0.9011 | 0.0179 | 0.6795 | 0.8709 | 0.7122 | 0.8985 | 0.9182 | 0.3335 | 0.8382 | 0.9377 | 0.8167 | 0.9067 | 0.8477 | 0.9272 | 0.845 | 0.9205 |
| 0.4252 | 42.0 | 65394 | 0.4244 | 0.8368 | 0.9511 | 0.8998 | 0.0175 | 0.6802 | 0.8713 | 0.7106 | 0.8988 | 0.9166 | 0.3604 | 0.8331 | 0.9369 | 0.8183 | 0.9089 | 0.8467 | 0.9229 | 0.8456 | 0.9179 |
| 0.4215 | 43.0 | 66951 | 0.4281 | 0.8342 | 0.9517 | 0.8987 | 0.0297 | 0.6731 | 0.8696 | 0.7088 | 0.8985 | 0.9159 | 0.3176 | 0.83 | 0.9369 | 0.8177 | 0.9092 | 0.8431 | 0.9217 | 0.8418 | 0.9169 |
| 0.4279 | 44.0 | 68508 | 0.4235 | 0.8375 | 0.9527 | 0.9012 | 0.0191 | 0.6777 | 0.873 | 0.7116 | 0.8984 | 0.9164 | 0.32 | 0.8288 | 0.9376 | 0.8191 | 0.9094 | 0.8479 | 0.9226 | 0.8453 | 0.9172 |
| 0.4133 | 45.0 | 70065 | 0.4220 | 0.837 | 0.9525 | 0.9014 | 0.0168 | 0.6764 | 0.8719 | 0.7117 | 0.9 | 0.9175 | 0.3059 | 0.8353 | 0.9377 | 0.8168 | 0.9081 | 0.8488 | 0.9244 | 0.8455 | 0.9199 |
| 0.4085 | 46.0 | 71622 | 0.4231 | 0.837 | 0.9538 | 0.9026 | 0.0158 | 0.679 | 0.8721 | 0.7118 | 0.8986 | 0.9157 | 0.2905 | 0.8276 | 0.9371 | 0.8196 | 0.908 | 0.8463 | 0.9227 | 0.8451 | 0.9165 |
| 0.4138 | 47.0 | 73179 | 0.4269 | 0.8335 | 0.9529 | 0.9019 | 0.014 | 0.6735 | 0.8689 | 0.7097 | 0.8965 | 0.9141 | 0.2954 | 0.8246 | 0.9358 | 0.8145 | 0.9048 | 0.8436 | 0.9226 | 0.8424 | 0.9149 |
| 0.4147 | 48.0 | 74736 | 0.4246 | 0.8353 | 0.9529 | 0.9023 | 0.0158 | 0.6736 | 0.8709 | 0.7108 | 0.8979 | 0.9155 | 0.2982 | 0.8266 | 0.937 | 0.8182 | 0.9062 | 0.8447 | 0.9242 | 0.843 | 0.916 |
| 0.4145 | 49.0 | 76293 | 0.4237 | 0.8362 | 0.9531 | 0.9027 | 0.0159 | 0.6751 | 0.8715 | 0.7108 | 0.8984 | 0.916 | 0.2984 | 0.8286 | 0.9373 | 0.8192 | 0.9067 | 0.8454 | 0.9248 | 0.8439 | 0.9167 |
| 0.406 | 50.0 | 77850 | 0.4246 | 0.8353 | 0.9529 | 0.9029 | 0.0159 | 0.6742 | 0.8707 | 0.7106 | 0.8977 | 0.9152 | 0.2984 | 0.8276 | 0.9365 | 0.8185 | 0.9059 | 0.8446 | 0.9239 | 0.8429 | 0.9159 |
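
The metric columns follow COCO conventions: mAP averaged over IoU thresholds 0.50:0.95, mAP at fixed IoU 0.50 and 0.75, small/medium/large size buckets, and per-class breakdowns. The card does not state the evaluation tooling, so purely as a hedged sketch, metrics of this shape can be computed with torchmetrics:

```python
# Hedged sketch of COCO-style mAP/mAR computation with torchmetrics;
# the card does not state its evaluation tooling, and the boxes are dummy values.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 60.0]]),  # xyxy, pixels
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),  # e.g. 0 = Garbage bag (assumed label mapping)
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 62.0]]),
    "labels": torch.tensor([0]),
}]

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)
metric.update(preds, targets)
results = metric.compute()  # keys include map, map_50, map_75, map_small,
print(results)              # mar_1, mar_10, mar_100, and per-class tensors
```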

Framework versions

  • Transformers 4.49.0
  • PyTorch 2.5.1+cu118
  • Datasets 3.3.2
  • Tokenizers 0.21.0