|
|
|
--- |
|
language: |
|
- eng |
|
license: wtfpl |
|
tags: |
|
- multilabel-image-classification |
|
- multilabel |
|
- generated_from_trainer |
|
base_model: facebook/dinov2-large |
|
model-index: |
|
- name: DinoVdeauTest-large-2024_09_24-batch-size32_freeze |
|
results: [] |
|
--- |
|
|
|
DinoVdeauTest is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set (metric computation is sketched after the list):
|
|
|
- Loss: 0.1204 |
|
- F1 Micro: 0.8214 |
|
- F1 Macro: 0.7191 |
|
- Accuracy: 0.3135 |
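
The accuracy being much lower than the F1 scores is expected if, as the gap suggests, it is exact-match (subset) accuracy, which counts an image as correct only when every label is predicted correctly. A minimal sketch of how such multilabel metrics can be computed with scikit-learn, using illustrative toy arrays rather than this model's outputs:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Toy multilabel indicator arrays of shape (n_samples, n_classes);
# in practice these would be the test labels and thresholded predictions.
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 1, 0]])

print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
# For multilabel inputs, accuracy_score is the exact-match (subset) accuracy.
print("Accuracy:", accuracy_score(y_true, y_pred))
```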
|
|
|
--- |
|
|
|
# Model description |
|
DinoVdeauTest is built on top of the facebook/dinov2-large backbone for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers, as sketched below.
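
The exact layer sizes live in the training repository linked below; a minimal PyTorch sketch of a head with this structure might look like the following, where 1024 matches dinov2-large's embedding size and 31 the number of classes listed further down, while the hidden width and dropout rate are assumptions:

```python
import torch.nn as nn

# Illustrative head: linear -> ReLU -> batch norm -> dropout -> linear.
# The hidden width (512) and dropout rate (0.5) are assumptions, not the
# repository's exact values.
head = nn.Sequential(
    nn.Linear(1024, 512),
    nn.ReLU(),
    nn.BatchNorm1d(512),
    nn.Dropout(0.5),
    nn.Linear(512, 31),  # one logit per label; a sigmoid is applied downstream
)
```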
|
|
|
The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau). |
|
|
|
- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg) |
|
|
|
--- |
|
|
|
# Intended uses & limitations |
|
You can use the raw model to classify diverse marine species, including coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
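
A minimal inference sketch with the `transformers` Auto classes, assuming the checkpoint is published under the model-index name above and using an illustrative 0.5 sigmoid threshold (if the checkpoint uses a custom architecture, the training repository shows the exact loading code):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repository id assumed from the model name and developer handle above.
model_id = "lombardata/DinoVdeauTest-large-2024_09_24-batch-size32_freeze"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("reef_quadrat.jpg")  # any underwater survey image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel output: one sigmoid per class, keep labels above the threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```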
|
|
|
--- |
|
|
|
# Training and evaluation data |
|
Details on the number of images for each class are given in the following table: |
|
| Class | train | val | test | Total |
|:-------------------------|--------:|------:|-------:|--------:|
| Acropore_branched | 1469 | 464 | 475 | 2408 |
| Acropore_digitised | 568 | 160 | 160 | 888 |
| Acropore_sub_massive | 150 | 50 | 43 | 243 |
| Acropore_tabular | 999 | 297 | 293 | 1589 |
| Algae_assembly | 2546 | 847 | 845 | 4238 |
| Algae_drawn_up | 367 | 126 | 127 | 620 |
| Algae_limestone | 1652 | 557 | 563 | 2772 |
| Algae_sodding | 3148 | 984 | 985 | 5117 |
| Atra/Leucospilota | 1084 | 348 | 360 | 1792 |
| Bleached_coral | 219 | 71 | 70 | 360 |
| Blurred | 191 | 67 | 62 | 320 |
| Dead_coral | 1979 | 642 | 643 | 3264 |
| Fish | 2018 | 656 | 647 | 3321 |
| Homo_sapiens | 161 | 62 | 59 | 282 |
| Human_object | 157 | 58 | 55 | 270 |
| Living_coral | 406 | 154 | 141 | 701 |
| Millepore | 385 | 127 | 125 | 637 |
| No_acropore_encrusting | 441 | 130 | 154 | 725 |
| No_acropore_foliaceous | 204 | 36 | 46 | 286 |
| No_acropore_massive | 1031 | 336 | 338 | 1705 |
| No_acropore_solitary | 202 | 53 | 48 | 303 |
| No_acropore_sub_massive | 1401 | 433 | 422 | 2256 |
| Rock | 4489 | 1495 | 1473 | 7457 |
| Rubble | 3092 | 1030 | 1001 | 5123 |
| Sand | 5842 | 1939 | 1938 | 9719 |
| Sea_cucumber | 1408 | 439 | 447 | 2294 |
| Sea_urchins | 327 | 107 | 111 | 545 |
| Sponge | 269 | 96 | 105 | 470 |
| Syringodium_isoetifolium | 1212 | 392 | 391 | 1995 |
| Thalassodendron_ciliatum | 782 | 261 | 260 | 1303 |
| Useless | 579 | 193 | 193 | 965 |
|
|
|
--- |
|
|
|
# Training procedure |
|
|
|
## Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a sketch of the corresponding PyTorch setup follows the list):

- **Number of Epochs**: 106
|
- **Learning Rate**: 0.001 |
|
- **Train Batch Size**: 32 |
|
- **Eval Batch Size**: 32 |
|
- **Optimizer**: Adam |
|
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 |
|
- **Freeze Encoder**: Yes |
|
- **Data Augmentation**: Yes |
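
A minimal sketch of this setup in PyTorch, using a stand-in module since the real classifier lives in the training repository (the attribute name `backbone` is an assumption):

```python
import torch
import torch.nn as nn

# Stand-in for the real model: a placeholder backbone plus a trainable head.
class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(1024, 1024)  # placeholder for the DINOv2 encoder
        self.head = nn.Linear(1024, 31)

model = Classifier()

# Freeze Encoder: only the classification head receives gradients.
for param in model.backbone.parameters():
    param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
# ReduceLROnPlateau with patience 5 and factor 0.1, stepped on validation loss.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)
# At the end of each epoch: scheduler.step(val_loss)
```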
|
|
|
|
|
## Data Augmentation |
|
Data were augmented using the following transformations (a sketch of the train pipeline follows the lists):
|
|
|
Train Transforms |
|
- **PreProcess**: No additional parameters |
|
- **Resize**: probability=1.00 |
|
- **RandomHorizontalFlip**: probability=0.25 |
|
- **RandomVerticalFlip**: probability=0.25 |
|
- **ColorJiggle**: probability=0.25 |
|
- **RandomPerspective**: probability=0.25 |
|
- **Normalize**: probability=1.00 |
|
|
|
Val Transforms |
|
- **PreProcess**: No additional parameters |
|
- **Resize**: probability=1.00 |
|
- **Normalize**: probability=1.00 |
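
The transform names match the `kornia.augmentation` API, which the sketch below assumes; the resize target, jitter magnitudes, and normalization statistics are illustrative rather than the training values:

```python
import torch
import torch.nn as nn
import kornia.augmentation as K

# Train-time pipeline sketch; the probabilities come from the list above.
train_transforms = nn.Sequential(
    K.Resize((224, 224)),                       # target size assumed
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.25),  # brightness/contrast/saturation/hue
    K.RandomPerspective(p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),  # ImageNet stats, assumed
                std=torch.tensor([0.229, 0.224, 0.225])),
)
```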
|
|
|
|
|
|
|
## Training results |
|
| Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
|:-----:|:---------------:|:--------:|:--------:|:--------:|:-------------:|
| 1 | 0.1776 | 0.2169 | 0.7486 | 0.5520 | 0.001 |
| 2 | 0.1528 | 0.2405 | 0.7693 | 0.5698 | 0.001 |
| 3 | 0.1483 | 0.2495 | 0.7774 | 0.6217 | 0.001 |
| 4 | 0.1467 | 0.2554 | 0.7772 | 0.6272 | 0.001 |
| 5 | 0.1453 | 0.2464 | 0.7772 | 0.6281 | 0.001 |
| 6 | 0.1437 | 0.2630 | 0.7810 | 0.6191 | 0.001 |
| 7 | 0.1428 | 0.2599 | 0.7811 | 0.6172 | 0.001 |
| 8 | 0.1433 | 0.2557 | 0.7799 | 0.6253 | 0.001 |
| 9 | 0.1420 | 0.2519 | 0.7893 | 0.6530 | 0.001 |
| 10 | 0.2462 | 0.2554 | 0.7441 | 0.6022 | 0.001 |
| 11 | 0.1400 | 0.2703 | 0.7883 | 0.6483 | 0.001 |
| 12 | 0.1400 | 0.2599 | 0.7906 | 0.6590 | 0.001 |
| 13 | 0.1394 | 0.2644 | 0.7876 | 0.6467 | 0.001 |
| 14 | 0.1399 | 0.2640 | 0.7879 | 0.6469 | 0.001 |
| 15 | 0.1391 | 0.2512 | 0.7881 | 0.6413 | 0.001 |
| 16 | 0.1414 | 0.2633 | 0.7907 | 0.6348 | 0.001 |
| 17 | 0.1403 | 0.2616 | 0.7903 | 0.6445 | 0.001 |
| 18 | 0.1404 | 0.2602 | 0.7929 | 0.6464 | 0.001 |
| 19 | 0.1404 | 0.2585 | 0.7936 | 0.6526 | 0.001 |
| 20 | 0.1391 | 0.2550 | 0.7900 | 0.6491 | 0.001 |
| 21 | 0.1383 | 0.2581 | 0.7917 | 0.6508 | 0.001 |
| 22 | 0.1389 | 0.2685 | 0.7935 | 0.6489 | 0.001 |
| 23 | 0.1385 | 0.2561 | 0.7843 | 0.6550 | 0.001 |
| 24 | 0.1367 | 0.2696 | 0.7921 | 0.6507 | 0.001 |
| 25 | 0.1379 | 0.2751 | 0.7894 | 0.6408 | 0.001 |
| 26 | 0.1375 | 0.2710 | 0.7943 | 0.6469 | 0.001 |
| 27 | 0.1392 | 0.2644 | 0.7921 | 0.6516 | 0.001 |
| 28 | 0.1385 | 0.2637 | 0.7918 | 0.6425 | 0.001 |
| 29 | 0.1402 | 0.2626 | 0.7883 | 0.6497 | 0.001 |
| 30 | 0.1377 | 0.2668 | 0.7887 | 0.6553 | 0.001 |
| 31 | 0.1313 | 0.2789 | 0.8018 | 0.6718 | 0.0001 |
| 32 | 0.1318 | 0.2807 | 0.8061 | 0.6772 | 0.0001 |
| 33 | 0.1309 | 0.2775 | 0.8050 | 0.6792 | 0.0001 |
| 34 | 0.1296 | 0.2821 | 0.8049 | 0.6775 | 0.0001 |
| 35 | 0.1282 | 0.2893 | 0.8085 | 0.6865 | 0.0001 |
| 36 | 0.1289 | 0.2831 | 0.8055 | 0.6828 | 0.0001 |
| 37 | 0.1277 | 0.2831 | 0.8055 | 0.6775 | 0.0001 |
| 38 | 0.1275 | 0.2890 | 0.8084 | 0.6883 | 0.0001 |
| 39 | 0.1266 | 0.2879 | 0.8099 | 0.6854 | 0.0001 |
| 40 | 0.1282 | 0.2886 | 0.8117 | 0.6981 | 0.0001 |
| 41 | 0.1267 | 0.2883 | 0.8082 | 0.6851 | 0.0001 |
| 42 | 0.1262 | 0.2907 | 0.8112 | 0.6942 | 0.0001 |
| 43 | 0.1259 | 0.2911 | 0.8107 | 0.6908 | 0.0001 |
| 44 | 0.1264 | 0.2931 | 0.8115 | 0.6925 | 0.0001 |
| 45 | 0.1258 | 0.2966 | 0.8110 | 0.6975 | 0.0001 |
| 46 | 0.1254 | 0.2987 | 0.8109 | 0.6941 | 0.0001 |
| 47 | 0.1257 | 0.2921 | 0.8098 | 0.6937 | 0.0001 |
| 48 | 0.1254 | 0.2914 | 0.8107 | 0.6905 | 0.0001 |
| 49 | 0.1252 | 0.2945 | 0.8137 | 0.6974 | 0.0001 |
| 50 | 0.1248 | 0.2983 | 0.8150 | 0.7026 | 0.0001 |
| 51 | 0.1246 | 0.2959 | 0.8158 | 0.7067 | 0.0001 |
| 52 | 0.1246 | 0.2952 | 0.8121 | 0.7009 | 0.0001 |
| 53 | 0.1242 | 0.2990 | 0.8143 | 0.6974 | 0.0001 |
| 54 | 0.1241 | 0.2966 | 0.8135 | 0.7001 | 0.0001 |
| 55 | 0.1242 | 0.2952 | 0.8131 | 0.6997 | 0.0001 |
| 56 | 0.1235 | 0.3021 | 0.8179 | 0.7064 | 0.0001 |
| 57 | 0.1235 | 0.2994 | 0.8150 | 0.6963 | 0.0001 |
| 58 | 0.1231 | 0.2983 | 0.8145 | 0.7012 | 0.0001 |
| 59 | 0.1234 | 0.3001 | 0.8153 | 0.7022 | 0.0001 |
| 60 | 0.1239 | 0.2973 | 0.8122 | 0.6978 | 0.0001 |
| 61 | 0.1236 | 0.3015 | 0.8158 | 0.7114 | 0.0001 |
| 62 | 0.1227 | 0.3032 | 0.8168 | 0.7120 | 0.0001 |
| 63 | 0.1231 | 0.2949 | 0.8137 | 0.7077 | 0.0001 |
| 64 | 0.1228 | 0.3056 | 0.8172 | 0.7084 | 0.0001 |
| 65 | 0.1232 | 0.3077 | 0.8183 | 0.7103 | 0.0001 |
| 66 | 0.1226 | 0.3035 | 0.8179 | 0.7065 | 0.0001 |
| 67 | 0.1228 | 0.3053 | 0.8185 | 0.7105 | 0.0001 |
| 68 | 0.1228 | 0.3042 | 0.8181 | 0.7128 | 0.0001 |
| 69 | 0.1228 | 0.3053 | 0.8137 | 0.7038 | 0.0001 |
| 70 | 0.1232 | 0.3018 | 0.8155 | 0.7080 | 0.0001 |
| 71 | 0.1231 | 0.2990 | 0.8156 | 0.7111 | 0.0001 |
| 72 | 0.1223 | 0.3008 | 0.8162 | 0.7150 | 0.0001 |
| 73 | 0.1223 | 0.3049 | 0.8174 | 0.7042 | 0.0001 |
| 74 | 0.1237 | 0.2963 | 0.8125 | 0.7009 | 0.0001 |
| 75 | 0.1225 | 0.3046 | 0.8152 | 0.7045 | 0.0001 |
| 76 | 0.1247 | 0.3008 | 0.8160 | 0.7099 | 0.0001 |
| 77 | 0.1225 | 0.2990 | 0.8179 | 0.7139 | 0.0001 |
| 78 | 0.1222 | 0.3046 | 0.8188 | 0.7061 | 0.0001 |
| 79 | 0.1246 | 0.3018 | 0.8152 | 0.7101 | 0.0001 |
| 80 | 0.1221 | 0.3039 | 0.8180 | 0.7103 | 0.0001 |
| 81 | 0.1212 | 0.3018 | 0.8185 | 0.7157 | 0.0001 |
| 82 | 0.1216 | 0.3080 | 0.8152 | 0.7089 | 0.0001 |
| 83 | 0.1214 | 0.3080 | 0.8165 | 0.7090 | 0.0001 |
| 84 | 0.1216 | 0.3070 | 0.8169 | 0.7100 | 0.0001 |
| 85 | 0.1216 | 0.3053 | 0.8188 | 0.7109 | 0.0001 |
| 86 | 0.1219 | 0.3070 | 0.8191 | 0.7176 | 0.0001 |
| 87 | 0.1220 | 0.3063 | 0.8177 | 0.7080 | 0.0001 |
| 88 | 0.1210 | 0.3049 | 0.8211 | 0.7158 | 1e-05 |
| 89 | 0.1210 | 0.3073 | 0.8241 | 0.7312 | 1e-05 |
| 90 | 0.1206 | 0.3070 | 0.8235 | 0.7232 | 1e-05 |
| 91 | 0.1204 | 0.3087 | 0.8191 | 0.7148 | 1e-05 |
| 92 | 0.1203 | 0.3087 | 0.8194 | 0.7132 | 1e-05 |
| 93 | 0.1204 | 0.3084 | 0.8215 | 0.7184 | 1e-05 |
| 94 | 0.1201 | 0.3067 | 0.8207 | 0.7195 | 1e-05 |
| 95 | 0.1201 | 0.3084 | 0.8197 | 0.7158 | 1e-05 |
| 96 | 0.1198 | 0.3053 | 0.8217 | 0.7193 | 1e-05 |
| 97 | 0.1201 | 0.3063 | 0.8205 | 0.7211 | 1e-05 |
| 98 | 0.1201 | 0.3080 | 0.8227 | 0.7250 | 1e-05 |
| 99 | 0.1200 | 0.3073 | 0.8206 | 0.7226 | 1e-05 |
| 100 | 0.1200 | 0.3080 | 0.8216 | 0.7191 | 1e-05 |
| 101 | 0.1199 | 0.3098 | 0.8228 | 0.7242 | 1e-05 |
| 102 | 0.1204 | 0.3073 | 0.8192 | 0.7216 | 1e-05 |
| 103 | 0.1200 | 0.3067 | 0.8222 | 0.7232 | 1e-06 |
| 104 | 0.1204 | 0.3101 | 0.8218 | 0.7235 | 1e-06 |
| 105 | 0.1199 | 0.3077 | 0.8219 | 0.7206 | 1e-06 |
| 106 | 0.1198 | 0.3105 | 0.8228 | 0.7270 | 1e-06 |
|
|
|
|
|
--- |
|
|
|
# CO2 Emissions |
|
|
|
The estimated CO2 emissions for training this model are documented below (a tracking sketch follows the list):

- **Emissions**: 1.1531 grams of CO2
|
- **Source**: Code Carbon |
|
- **Training Type**: fine-tuning |
|
- **Geographical Location**: Brest, France |
|
- **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
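
A minimal sketch of how Code Carbon tracks emissions around a training run:

```python
from codecarbon import EmissionsTracker

# Wrap the training loop in a tracker; stop() returns kg of CO2-equivalent.
tracker = EmissionsTracker()
tracker.start()
# ... training loop ...
emissions_kg = tracker.stop()
print(f"{emissions_kg * 1000:.4f} g CO2eq")
```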
|
|
|
|
|
--- |
|
|
|
# Framework Versions |
|
|
|
- **Transformers**: 4.44.2 |
|
- **Pytorch**: 2.4.1+cu121 |
|
- **Datasets**: 3.0.0 |
|
- **Tokenizers**: 0.19.1 |
|
|
|
|