---
language:
- eng
license: cc0-1.0
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-large
model-index:
- name: drone-DinoVdeau-produttoria-probabilities-large-2024_11_04-batch-size64_freeze_probs
  results: []
---

drone-DinoVdeau-produttoria-probabilities is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

- Loss: 0.3194
- F1 Micro: 0.8663
- F1 Macro: 0.8311
- Accuracy: 0.1799
- RMSE: 0.2404
- MAE: 0.1536
- R2: 0.4282

| Class                    | F1 per class |
|:-------------------------|-------------:|
| Acropore_branched        | 0.8010 |
| Acropore_digitised       | 0.7454 |
| Acropore_tabular         | 0.6426 |
| Algae                    | 0.9852 |
| Dead_coral               | 0.8448 |
| Fish                     | 0.7497 |
| Millepore                | 0.6641 |
| No_acropore_encrusting   | 0.7391 |
| No_acropore_massive      | 0.8688 |
| No_acropore_sub_massive  | 0.8137 |
| Rock                     | 0.9924 |
| Rubble                   | 0.9691 |
| Sand                     | 0.9888 |
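
These are standard multilabel scores. The sketch below shows how such numbers can be computed with scikit-learn; the 0.5 decision threshold, the use of exact-match (subset) accuracy, and the computation of MAE, RMSE and R2 directly between predicted and annotated class probabilities are assumptions rather than details stated in this card.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, mean_absolute_error,
                             mean_squared_error, r2_score)

# Dummy data standing in for the real test set: 13 classes, probability targets.
rng = np.random.default_rng(0)
y_true_prob = rng.random((8, 13))   # annotated per-class probabilities
y_prob = rng.random((8, 13))        # model outputs after a sigmoid

# Binarize at an assumed 0.5 threshold for the classification metrics.
y_true = (y_true_prob > 0.5).astype(int)
y_pred = (y_prob > 0.5).astype(int)

f1_micro = f1_score(y_true, y_pred, average="micro")
f1_macro = f1_score(y_true, y_pred, average="macro")
accuracy = accuracy_score(y_true, y_pred)   # exact-match accuracy, hence much lower than F1
mae = mean_absolute_error(y_true_prob, y_prob)
rmse = np.sqrt(mean_squared_error(y_true_prob, y_prob))
r2 = r2_score(y_true_prob, y_prob)
```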

---

# Model description

drone-DinoVdeau-produttoria-probabilities is a model built on top of the [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) backbone for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
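
As an illustration only, a head of the shape described above might look like the following sketch; the actual layer sizes, ordering and dropout rate are defined in the DinoVdeau training code, and the 1024-dimensional input (the DINOv2-large hidden size) and 0.5 dropout below are assumptions.

```python
import torch.nn as nn

# Hypothetical classification head: linear + ReLU + batch norm + dropout,
# mapping DINOv2-large features (1024-d, assumed) to the 13 classes of this card.
hidden_size, num_labels = 1024, 13
classification_head = nn.Sequential(
    nn.Linear(hidden_size, hidden_size),
    nn.ReLU(),
    nn.BatchNorm1d(hidden_size),
    nn.Dropout(p=0.5),
    nn.Linear(hidden_size, num_labels),
)
```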

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)

---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes and seagrass species.
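
A minimal inference sketch is shown below, assuming the checkpoint loads through the standard `transformers` image-classification classes; the repository id, the example image path and the 0.5 decision threshold are illustrative choices rather than values documented in this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "lombardata/drone-DinoVdeau-produttoria-probabilities-large-2024_11_04-batch-size64_freeze_probs"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("drone_image.jpg")  # replace with your own image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel setting: score each class independently with a sigmoid.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```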

---

# Training and evaluation data

Details on the estimated number of images for each class are given in the following table:

| Class                    |   train |   test |   val |   Total |
|:-------------------------|--------:|-------:|------:|--------:|
| Acropore_branched        |    2028 |    684 |   686 |    3398 |
| Acropore_digitised       |    2006 |    735 |   717 |    3458 |
| Acropore_tabular         |    1237 |    461 |   451 |    2149 |
| Algae                    |   11086 |   3671 |  3675 |   18432 |
| Dead_coral               |    6354 |   2161 |  2147 |   10662 |
| Fish                     |    4032 |   1430 |  1430 |    6892 |
| Millepore                |    1943 |    783 |   772 |    3498 |
| No_acropore_encrusting   |    2663 |    986 |   957 |    4606 |
| No_acropore_massive      |    6897 |   2375 |  2375 |   11647 |
| No_acropore_sub_massive  |    5416 |   1988 |  1958 |    9362 |
| Rock                     |   11164 |   3726 |  3725 |   18615 |
| Rubble                   |   10687 |   3570 |  3572 |   17829 |
| Sand                     |   11151 |   3726 |  3723 |   18600 |

---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 69.0
- **Learning Rate**: 0.001
- **Train Batch Size**: 64
- **Eval Batch Size**: 64
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
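
The optimizer and scheduler settings above correspond roughly to the PyTorch sketch below; the `model.dinov2` attribute name and the surrounding scaffolding are assumptions, and the authoritative training loop is the DinoVdeau repository linked earlier.

```python
import torch
from transformers import AutoModelForImageClassification

# Backbone plus a freshly initialized classification head (13 classes in this card).
model = AutoModelForImageClassification.from_pretrained("facebook/dinov2-large", num_labels=13)

# "Freeze Encoder: Yes" -- train only the classification head.
for param in model.dinov2.parameters():
    param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
# Divide the learning rate by 10 after 5 epochs without validation-loss improvement.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

# In the training loop, after each validation pass:
# scheduler.step(val_loss)
```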

## Data Augmentation

Data were augmented using the following transformations:

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
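
For reference, a roughly equivalent train-time pipeline can be written with `torchvision` as sketched below (ColorJiggle is the Kornia name for a colour-jitter transform); the resize target, jitter magnitudes and normalization statistics are placeholders, since this card does not record them.

```python
from torchvision import transforms

# Illustrative stand-in for the augmentations listed above; magnitudes are placeholders.
train_transforms = transforms.Compose([
    transforms.Resize((518, 518)),                 # placeholder input resolution
    transforms.RandomHorizontalFlip(p=0.25),
    transforms.RandomVerticalFlip(p=0.25),
    transforms.RandomApply(                        # stands in for ColorJiggle, p=0.25
        [transforms.ColorJitter(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1)],
        p=0.25,
    ),
    transforms.RandomPerspective(distortion_scale=0.5, p=0.25),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # placeholder ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])
```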

## Training results

Epoch | Validation Loss | MAE | RMSE | R2 | Learning Rate
--- | --- | --- | --- | --- | ---
1 | 0.36101797223091125 | 0.1878 | 0.2645 | 0.2818 | 0.001
2 | 0.3464529514312744 | 0.1778 | 0.2566 | 0.3349 | 0.001
3 | 0.34133487939834595 | 0.1731 | 0.2532 | 0.3536 | 0.001
4 | 0.3406243324279785 | 0.1743 | 0.2532 | 0.3519 | 0.001
5 | 0.3341675400733948 | 0.1661 | 0.2496 | 0.3702 | 0.001
6 | 0.33848145604133606 | 0.1739 | 0.2512 | 0.3651 | 0.001
7 | 0.3320676386356354 | 0.1650 | 0.2476 | 0.3836 | 0.001
8 | 0.3332081437110901 | 0.1629 | 0.2484 | 0.3802 | 0.001
9 | 0.3305376172065735 | 0.1652 | 0.2468 | 0.3859 | 0.001
10 | 0.33136793971061707 | 0.1655 | 0.2476 | 0.3827 | 0.001
11 | 0.3319685757160187 | 0.1602 | 0.2474 | 0.3840 | 0.001
12 | 0.3341500759124756 | 0.1683 | 0.2494 | 0.3761 | 0.001
13 | 0.33248215913772583 | 0.1649 | 0.2480 | 0.3821 | 0.001
14 | 0.33228376507759094 | 0.1700 | 0.2472 | 0.3878 | 0.001
15 | 0.334873229265213 | 0.1703 | 0.2493 | 0.3749 | 0.001
16 | 0.3279329538345337 | 0.1649 | 0.2448 | 0.3983 | 0.0001
17 | 0.3279244005680084 | 0.1648 | 0.2448 | 0.3984 | 0.0001
18 | 0.3262367248535156 | 0.1622 | 0.2440 | 0.4025 | 0.0001
19 | 0.3247373402118683 | 0.1588 | 0.2432 | 0.4046 | 0.0001
20 | 0.32612329721450806 | 0.1625 | 0.2433 | 0.4059 | 0.0001
21 | 0.3241129517555237 | 0.1606 | 0.2424 | 0.4095 | 0.0001
22 | 0.32355180382728577 | 0.1587 | 0.2422 | 0.4111 | 0.0001
23 | 0.3242079019546509 | 0.1601 | 0.2423 | 0.4107 | 0.0001
24 | 0.3227241337299347 | 0.1586 | 0.2414 | 0.4150 | 0.0001
25 | 0.3223778307437897 | 0.1587 | 0.2413 | 0.4148 | 0.0001
26 | 0.3217927813529968 | 0.1557 | 0.2413 | 0.4143 | 0.0001
27 | 0.3227355182170868 | 0.1603 | 0.2416 | 0.4138 | 0.0001
28 | 0.32067713141441345 | 0.1562 | 0.2405 | 0.4186 | 0.0001
29 | 0.32205939292907715 | 0.1597 | 0.2411 | 0.4163 | 0.0001
30 | 0.32246074080467224 | 0.1608 | 0.2413 | 0.4164 | 0.0001
31 | 0.3223503530025482 | 0.1535 | 0.2416 | 0.4134 | 0.0001
32 | 0.3212696313858032 | 0.1553 | 0.2408 | 0.4180 | 0.0001
33 | 0.32156360149383545 | 0.1583 | 0.2414 | 0.4123 | 0.0001
34 | 0.3205103278160095 | 0.1562 | 0.2406 | 0.4172 | 0.0001
35 | 0.3197581171989441 | 0.1535 | 0.2399 | 0.4215 | 0.0001
36 | 0.3211075961589813 | 0.1577 | 0.2406 | 0.4187 | 0.0001
37 | 0.3203599154949188 | 0.1520 | 0.2403 | 0.4188 | 0.0001
38 | 0.32143038511276245 | 0.1560 | 0.2409 | 0.4170 | 0.0001
39 | 0.3195198178291321 | 0.1520 | 0.2397 | 0.4226 | 0.0001
40 | 0.3207896649837494 | 0.1577 | 0.2404 | 0.4204 | 0.0001
41 | 0.3197501003742218 | 0.1547 | 0.2398 | 0.4217 | 0.0001
42 | 0.32175716757774353 | 0.1589 | 0.2410 | 0.4174 | 0.0001
43 | 0.3189575970172882 | 0.1544 | 0.2396 | 0.4235 | 0.0001
44 | 0.31898385286331177 | 0.1534 | 0.2396 | 0.4230 | 0.0001
45 | 0.31977778673171997 | 0.1566 | 0.2397 | 0.4239 | 0.0001
46 | 0.3193351626396179 | 0.1556 | 0.2398 | 0.4213 | 0.0001
47 | 0.31895366311073303 | 0.1524 | 0.2393 | 0.4245 | 0.0001
48 | 0.3192996680736542 | 0.1525 | 0.2398 | 0.4215 | 0.0001
49 | 0.32073548436164856 | 0.1558 | 0.2405 | 0.4187 | 0.0001
50 | 0.3198453485965729 | 0.1572 | 0.2400 | 0.4218 | 1e-05
51 | 0.32436585426330566 | 0.1602 | 0.2426 | 0.4092 | 1e-05
52 | 0.31899821758270264 | 0.1550 | 0.2396 | 0.4227 | 1e-05
53 | 0.31892043352127075 | 0.1552 | 0.2394 | 0.4249 | 1e-05
54 | 0.3194037675857544 | 0.1540 | 0.2396 | 0.4227 | 1e-05
55 | 0.3184601366519928 | 0.1539 | 0.2391 | 0.4250 | 1e-05
56 | 0.318115234375 | 0.1527 | 0.2388 | 0.4273 | 1e-05
57 | 0.31871330738067627 | 0.1532 | 0.2392 | 0.4259 | 1e-05
58 | 0.32010164856910706 | 0.1567 | 0.2401 | 0.4217 | 1e-05
59 | 0.31807705760002136 | 0.1528 | 0.2388 | 0.4270 | 1e-05
60 | 0.3181913495063782 | 0.1534 | 0.2389 | 0.4256 | 1e-05
61 | 0.3185857832431793 | 0.1510 | 0.2391 | 0.4255 | 1e-05
62 | 0.32031872868537903 | 0.1596 | 0.2398 | 0.4240 | 1e-05
63 | 0.31964218616485596 | 0.1570 | 0.2397 | 0.4242 | 1e-05
64 | 0.31808170676231384 | 0.1527 | 0.2391 | 0.4244 | 1e-05
65 | 0.31850185990333557 | 0.1550 | 0.2390 | 0.4259 | 1e-05
66 | 0.3186076879501343 | 0.1562 | 0.2389 | 0.4278 | 1e-06
67 | 0.3181016743183136 | 0.1526 | 0.2387 | 0.4270 | 1e-06
68 | 0.3194774389266968 | 0.1549 | 0.2397 | 0.4221 | 1e-06
69 | 0.3183264136314392 | 0.1540 | 0.2390 | 0.4259 | 1e-06

---

# Framework Versions

- **Transformers**: 4.41.0
- **Pytorch**: 2.5.0+cu124
- **Datasets**: 3.0.2
- **Tokenizers**: 0.19.1