furion-123 committed
Commit 39b73fa
1 Parent(s): a3dd19b

End of training

README.md ADDED
@@ -0,0 +1,102 @@
---
license: apache-2.0
base_model: SenseTime/deformable-detr
tags:
- generated_from_trainer
datasets:
- imagefolder
model-index:
- name: deformable_detr_finetuned_rsna_2018
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# deformable_detr_finetuned_rsna_2018

This model is a fine-tuned version of [SenseTime/deformable-detr](https://huggingface.co/SenseTime/deformable-detr) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1784
- Map: 0.1070
- Map 50: 0.2451
- Map 75: 0.0775
- Map Small: 0.0225
- Map Medium: 0.1112
- Map Large: 0.1525
- Mar 1: 0.1582
- Mar 10: 0.4060
- Mar 100: 0.6497
- Mar Small: 0.3167
- Mar Medium: 0.6607
- Mar Large: 0.7767
- Map Pneumonia: 0.1070
- Mar 100 Pneumonia: 0.6497

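As a usage illustration that is not part of the auto-generated card, the sketch below loads the fine-tuned checkpoint for object-detection inference with the Transformers API. The Hub repo id, the input image path, and the score threshold are assumptions; adjust them to your setup.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, DeformableDetrForObjectDetection

# Assumed repo id; replace with the actual Hub path or a local checkpoint directory.
checkpoint = "furion-123/deformable_detr_finetuned_rsna_2018"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = DeformableDetrForObjectDetection.from_pretrained(checkpoint)

image = Image.open("chest_xray.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and normalized boxes into absolute-coordinate detections
# above a score threshold (0.5 is an arbitrary starting point).
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.3f} at {box.tolist()}")
```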
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30

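For readers who want to reproduce this setup, here is a minimal `Trainer` sketch wired to the hyperparameters above. The dataset objects and the detection collator (`train_dataset`, `eval_dataset`, `collate_fn`) are assumed to be prepared elsewhere and are not described by this card; the Adam betas and epsilon listed above are the Transformers defaults, so they are not passed explicitly.

```python
from transformers import (
    AutoImageProcessor,
    AutoModelForObjectDetection,
    Trainer,
    TrainingArguments,
)

base = "SenseTime/deformable-detr"
processor = AutoImageProcessor.from_pretrained(base)
model = AutoModelForObjectDetection.from_pretrained(
    base,
    num_labels=1,                  # single "Pneumonia" class (assumed label set)
    ignore_mismatched_sizes=True,  # reinitialize the classification head
)

args = TrainingArguments(
    output_dir="deformable_detr_finetuned_rsna_2018",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    remove_unused_columns=False,   # keep the detection annotation columns
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,   # assumed: preprocessed imagefolder train split
    eval_dataset=eval_dataset,     # assumed: preprocessed evaluation split
    data_collator=collate_fn,      # assumed: pads pixel values, keeps label dicts
    tokenizer=processor,
)
trainer.train()
```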
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Pneumonia | Mar 100 Pneumonia |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------------:|:-----------------:|
| No log | 1.0 | 301 | 1.5776 | 0.0290 | 0.0947 | 0.0092 | 0.0027 | 0.0320 | 0.0329 | 0.0576 | 0.2553 | 0.4849 | 0.0667 | 0.4968 | 0.6628 | 0.0290 | 0.4849 |
| 1.8843 | 2.0 | 602 | 1.5194 | 0.0623 | 0.1806 | 0.0287 | 0.0285 | 0.0565 | 0.2087 | 0.0932 | 0.2880 | 0.4894 | 0.0900 | 0.5034 | 0.6349 | 0.0623 | 0.4894 |
| 1.8843 | 3.0 | 903 | 1.4116 | 0.0840 | 0.2315 | 0.0439 | 0.0162 | 0.0785 | 0.2462 | 0.1110 | 0.3184 | 0.5424 | 0.1300 | 0.5573 | 0.6884 | 0.0840 | 0.5424 |
| 1.4595 | 4.0 | 1204 | 1.4335 | 0.0640 | 0.1683 | 0.0328 | 0.0366 | 0.0652 | 0.1351 | 0.1041 | 0.3313 | 0.5402 | 0.0900 | 0.5573 | 0.6907 | 0.0640 | 0.5402 |
| 1.4125 | 5.0 | 1505 | 1.4249 | 0.0681 | 0.1879 | 0.0344 | 0.0517 | 0.0653 | 0.1600 | 0.1137 | 0.3120 | 0.5298 | 0.1133 | 0.5424 | 0.7000 | 0.0681 | 0.5298 |
| 1.4125 | 6.0 | 1806 | 1.4030 | 0.0620 | 0.1636 | 0.0316 | 0.0570 | 0.0615 | 0.1675 | 0.1269 | 0.3244 | 0.5518 | 0.1067 | 0.5659 | 0.7279 | 0.0620 | 0.5518 |
| 1.396 | 7.0 | 2107 | 1.3371 | 0.0755 | 0.2010 | 0.0418 | 0.0611 | 0.0794 | 0.2684 | 0.1149 | 0.3116 | 0.5708 | 0.2333 | 0.5834 | 0.6860 | 0.0755 | 0.5708 |
| 1.396 | 8.0 | 2408 | 1.3428 | 0.0761 | 0.2083 | 0.0380 | 0.0438 | 0.0785 | 0.2026 | 0.1238 | 0.3190 | 0.5712 | 0.1633 | 0.5868 | 0.7070 | 0.0761 | 0.5712 |
| 1.3424 | 9.0 | 2709 | 1.3000 | 0.0720 | 0.1792 | 0.0420 | 0.0746 | 0.0739 | 0.1360 | 0.1348 | 0.3385 | 0.5867 | 0.2000 | 0.5985 | 0.7442 | 0.0720 | 0.5867 |
| 1.3113 | 10.0 | 3010 | 1.2955 | 0.0774 | 0.1925 | 0.0482 | 0.0407 | 0.0759 | 0.1505 | 0.1356 | 0.3511 | 0.5915 | 0.2233 | 0.6027 | 0.7419 | 0.0774 | 0.5915 |
| 1.3113 | 11.0 | 3311 | 1.2706 | 0.0902 | 0.2191 | 0.0595 | 0.0668 | 0.0904 | 0.1601 | 0.1418 | 0.3588 | 0.6133 | 0.1933 | 0.6305 | 0.7419 | 0.0902 | 0.6133 |
| 1.2808 | 12.0 | 3612 | 1.2623 | 0.0870 | 0.2131 | 0.0593 | 0.0560 | 0.0900 | 0.1509 | 0.1356 | 0.3449 | 0.6041 | 0.2500 | 0.6141 | 0.7558 | 0.0870 | 0.6041 |
| 1.2808 | 13.0 | 3913 | 1.2412 | 0.0940 | 0.2286 | 0.0562 | 0.0285 | 0.0996 | 0.1058 | 0.1470 | 0.3843 | 0.6159 | 0.2567 | 0.6283 | 0.7488 | 0.0940 | 0.6159 |
| 1.2554 | 14.0 | 4214 | 1.2547 | 0.0838 | 0.2087 | 0.0590 | 0.0256 | 0.0949 | 0.1144 | 0.1482 | 0.3704 | 0.6174 | 0.2567 | 0.6305 | 0.7442 | 0.0838 | 0.6174 |
| 1.2427 | 15.0 | 4515 | 1.2478 | 0.0905 | 0.2212 | 0.0563 | 0.0814 | 0.1071 | 0.1178 | 0.1474 | 0.3770 | 0.6203 | 0.2200 | 0.6354 | 0.7558 | 0.0905 | 0.6203 |
| 1.2427 | 16.0 | 4816 | 1.2225 | 0.0972 | 0.2415 | 0.0638 | 0.0354 | 0.1005 | 0.1391 | 0.1513 | 0.3754 | 0.6246 | 0.2833 | 0.6378 | 0.7372 | 0.0972 | 0.6246 |
| 1.2246 | 17.0 | 5117 | 1.2105 | 0.0999 | 0.2357 | 0.0755 | 0.0248 | 0.1057 | 0.1150 | 0.1530 | 0.3981 | 0.6244 | 0.2800 | 0.6339 | 0.7744 | 0.0999 | 0.6244 |
| 1.2246 | 18.0 | 5418 | 1.2324 | 0.0813 | 0.1970 | 0.0535 | 0.0207 | 0.0854 | 0.1488 | 0.1209 | 0.3621 | 0.6176 | 0.2967 | 0.6293 | 0.7302 | 0.0813 | 0.6176 |
| 1.2085 | 19.0 | 5719 | 1.1992 | 0.1033 | 0.2369 | 0.0718 | 0.0431 | 0.1040 | 0.1668 | 0.1598 | 0.4002 | 0.6366 | 0.2900 | 0.6490 | 0.7605 | 0.1033 | 0.6366 |
| 1.1883 | 20.0 | 6020 | 1.2163 | 0.0978 | 0.2356 | 0.0640 | 0.0291 | 0.1030 | 0.1321 | 0.1545 | 0.3994 | 0.6282 | 0.2767 | 0.6410 | 0.7512 | 0.0978 | 0.6282 |
| 1.1883 | 21.0 | 6321 | 1.2100 | 0.0995 | 0.2346 | 0.0701 | 0.0353 | 0.1059 | 0.1336 | 0.1547 | 0.3983 | 0.6311 | 0.2567 | 0.6446 | 0.7628 | 0.0995 | 0.6311 |
| 1.1775 | 22.0 | 6622 | 1.1979 | 0.1014 | 0.2343 | 0.0735 | 0.0210 | 0.1080 | 0.1287 | 0.1636 | 0.3915 | 0.6333 | 0.3000 | 0.6446 | 0.7581 | 0.1014 | 0.6333 |
| 1.1775 | 23.0 | 6923 | 1.1840 | 0.1019 | 0.2410 | 0.0697 | 0.0213 | 0.1050 | 0.1525 | 0.1615 | 0.4000 | 0.6427 | 0.3100 | 0.6551 | 0.7558 | 0.1019 | 0.6427 |
| 1.1678 | 24.0 | 7224 | 1.1822 | 0.1066 | 0.2445 | 0.0802 | 0.0245 | 0.1113 | 0.1572 | 0.1689 | 0.4021 | 0.6487 | 0.3200 | 0.6612 | 0.7581 | 0.1066 | 0.6487 |
| 1.1604 | 25.0 | 7525 | 1.1796 | 0.1042 | 0.2409 | 0.0773 | 0.0253 | 0.1082 | 0.1478 | 0.1660 | 0.3954 | 0.6482 | 0.2933 | 0.6605 | 0.7791 | 0.1042 | 0.6482 |
| 1.1604 | 26.0 | 7826 | 1.1830 | 0.1035 | 0.2390 | 0.0771 | 0.0189 | 0.1084 | 0.1452 | 0.1615 | 0.3921 | 0.6395 | 0.3200 | 0.6502 | 0.7605 | 0.1035 | 0.6395 |
| 1.1551 | 27.0 | 8127 | 1.1789 | 0.1082 | 0.2489 | 0.0823 | 0.0236 | 0.1139 | 0.1521 | 0.1636 | 0.3959 | 0.6497 | 0.3033 | 0.6624 | 0.7698 | 0.1082 | 0.6497 |
| 1.1551 | 28.0 | 8428 | 1.1784 | 0.1057 | 0.2429 | 0.0764 | 0.0223 | 0.1104 | 0.1471 | 0.1561 | 0.4033 | 0.6482 | 0.3133 | 0.6600 | 0.7698 | 0.1057 | 0.6482 |
| 1.1471 | 29.0 | 8729 | 1.1783 | 0.1068 | 0.2452 | 0.0773 | 0.0222 | 0.1114 | 0.1501 | 0.1576 | 0.4054 | 0.6480 | 0.3133 | 0.6595 | 0.7721 | 0.1068 | 0.6480 |
| 1.1449 | 30.0 | 9030 | 1.1784 | 0.1070 | 0.2451 | 0.0775 | 0.0225 | 0.1112 | 0.1525 | 0.1582 | 0.4060 | 0.6497 | 0.3167 | 0.6607 | 0.7767 | 0.1070 | 0.6497 |

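The Map and Mar columns are COCO-style mean average precision and mean average recall at the usual IoU thresholds and detection budgets. As an illustration only (not the exact evaluation code used for this run), the sketch below computes the same family of metrics with `torchmetrics` on toy boxes:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# Toy prediction/target pair in absolute xyxy pixel coordinates,
# with a single class (label 0 standing in for "Pneumonia").
preds = [{
    "boxes": torch.tensor([[100.0, 120.0, 300.0, 400.0]]),
    "scores": torch.tensor([0.82]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[110.0, 130.0, 290.0, 390.0]]),
    "labels": torch.tensor([0]),
}]

metric = MeanAveragePrecision(box_format="xyxy", iou_type="bbox", class_metrics=True)
metric.update(preds, targets)
results = metric.compute()

# Keys such as "map", "map_50", "map_75", "mar_1", "mar_10", "mar_100",
# and "map_small"/"map_medium"/"map_large" line up with the table columns above.
print({k: float(v) for k, v in results.items() if k.startswith(("map", "mar"))})
```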
### Framework versions

- Transformers 4.43.3
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e4ec4f6a7dd2576721a9b4f3284d0d2b615864c8e8f1b91a044788190ca6df3f
+ oid sha256:10f2834614aa5a8b7972e5b476991ae45ed7eb34b48286503bc9745cf0166a7f
  size 160676460
runs/Aug19_23-09-35_big-desktop/events.out.tfevents.1724089182.big-desktop.235564.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:bbecf0600ddab86662fd03142e7fa838a1143bedff726ef5fe213020574219d5
- size 38812
+ oid sha256:0cb1a55050daaeccb251f917767742ca4618fd63128195bfcb900667ffbe81b3
+ size 40172