lombardata committed · Commit 4c22101 · verified · 1 parent: 9c02ff4

Upload README.md

Files changed (1):
  1. README.md +226 -151

README.md CHANGED
@@ -1,175 +1,250 @@
 
  ---
- license: apache-2.0
- base_model: facebook/dinov2-base
  tags:
  - generated_from_trainer
- metrics:
- - accuracy
  model-index:
  - name: DinoVdeau-base-2024_09_03-batch-size32_epochs150_freeze
    results: []
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->

- # DinoVdeau-base-2024_09_03-batch-size32_epochs150_freeze
-
- This model is a fine-tuned version of [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base) on the None dataset.
- It achieves the following results on the evaluation set:
  - Loss: 0.1260
  - F1 Micro: 0.8131
  - F1 Macro: 0.6976
  - Roc Auc: 0.8760
  - Accuracy: 0.3014
- - Learning Rate: 0.0000

- ## Model description

- More information needed

- ## Intended uses & limitations

- More information needed

- ## Training and evaluation data

- More information needed

- ## Training procedure

- ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 0.001
- - train_batch_size: 32
- - eval_batch_size: 32
- - seed: 42
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: linear
- - num_epochs: 150
- - mixed_precision_training: Native AMP
-
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Rate |
- |:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:------:|
- | No log | 1.0 | 273 | 0.1752 | 0.7311 | 0.5105 | 0.8187 | 0.2079 | 0.001 |
- | 0.2857 | 2.0 | 546 | 0.1578 | 0.7583 | 0.5498 | 0.8363 | 0.2349 | 0.001 |
- | 0.2857 | 3.0 | 819 | 0.1516 | 0.7722 | 0.6037 | 0.8505 | 0.2315 | 0.001 |
- | 0.1764 | 4.0 | 1092 | 0.1522 | 0.7650 | 0.6140 | 0.8387 | 0.2422 | 0.001 |
- | 0.1764 | 5.0 | 1365 | 0.1484 | 0.7720 | 0.6162 | 0.8403 | 0.2422 | 0.001 |
- | 0.1677 | 6.0 | 1638 | 0.1482 | 0.7750 | 0.6052 | 0.8435 | 0.2561 | 0.001 |
- | 0.1677 | 7.0 | 1911 | 0.1486 | 0.7729 | 0.6177 | 0.8431 | 0.2419 | 0.001 |
- | 0.1652 | 8.0 | 2184 | 0.1486 | 0.7767 | 0.6172 | 0.8485 | 0.2512 | 0.001 |
- | 0.1652 | 9.0 | 2457 | 0.1483 | 0.7805 | 0.6366 | 0.8570 | 0.2512 | 0.001 |
- | 0.1617 | 10.0 | 2730 | 0.1503 | 0.7683 | 0.6081 | 0.8352 | 0.2453 | 0.001 |
- | 0.1615 | 11.0 | 3003 | 0.1441 | 0.7757 | 0.6200 | 0.8409 | 0.2609 | 0.001 |
- | 0.1615 | 12.0 | 3276 | 0.1487 | 0.7815 | 0.6299 | 0.8543 | 0.2495 | 0.001 |
- | 0.1614 | 13.0 | 3549 | 0.1490 | 0.7779 | 0.6242 | 0.8446 | 0.2519 | 0.001 |
- | 0.1614 | 14.0 | 3822 | 0.1434 | 0.7826 | 0.6379 | 0.8475 | 0.2606 | 0.001 |
- | 0.1599 | 15.0 | 4095 | 0.1435 | 0.7874 | 0.6397 | 0.8552 | 0.2554 | 0.001 |
- | 0.1599 | 16.0 | 4368 | 0.1439 | 0.7793 | 0.6344 | 0.8464 | 0.2568 | 0.001 |
- | 0.1589 | 17.0 | 4641 | 0.1448 | 0.7878 | 0.6422 | 0.8596 | 0.2543 | 0.001 |
- | 0.1589 | 18.0 | 4914 | 0.1440 | 0.7865 | 0.6417 | 0.8552 | 0.2568 | 0.001 |
- | 0.1604 | 19.0 | 5187 | 0.1420 | 0.7864 | 0.6318 | 0.8550 | 0.2540 | 0.001 |
- | 0.1604 | 20.0 | 5460 | 0.1409 | 0.7869 | 0.6409 | 0.8522 | 0.2588 | 0.001 |
- | 0.1586 | 21.0 | 5733 | 0.1425 | 0.7865 | 0.6413 | 0.8561 | 0.2620 | 0.001 |
- | 0.1587 | 22.0 | 6006 | 0.1538 | 0.7854 | 0.6371 | 0.8608 | 0.2370 | 0.001 |
- | 0.1587 | 23.0 | 6279 | 0.1419 | 0.7842 | 0.6390 | 0.8497 | 0.2557 | 0.001 |
- | 0.1592 | 24.0 | 6552 | 0.1414 | 0.7870 | 0.6459 | 0.8561 | 0.2599 | 0.001 |
- | 0.1592 | 25.0 | 6825 | 0.1399 | 0.7868 | 0.6263 | 0.8523 | 0.2685 | 0.001 |
- | 0.1586 | 26.0 | 7098 | 0.1465 | 0.7847 | 0.6238 | 0.8561 | 0.2592 | 0.001 |
- | 0.1586 | 27.0 | 7371 | 0.1551 | 0.7720 | 0.6344 | 0.8433 | 0.2380 | 0.001 |
- | 0.16 | 28.0 | 7644 | 0.1443 | 0.7891 | 0.6430 | 0.8550 | 0.2616 | 0.001 |
- | 0.16 | 29.0 | 7917 | 0.1428 | 0.7874 | 0.6416 | 0.8565 | 0.2568 | 0.001 |
- | 0.1589 | 30.0 | 8190 | 0.1416 | 0.7799 | 0.6308 | 0.8425 | 0.2526 | 0.001 |
- | 0.1589 | 31.0 | 8463 | 0.1398 | 0.7895 | 0.6431 | 0.8566 | 0.2689 | 0.001 |
- | 0.1588 | 32.0 | 8736 | 0.1448 | 0.7891 | 0.6521 | 0.8601 | 0.2568 | 0.001 |
- | 0.1581 | 33.0 | 9009 | 0.1404 | 0.7896 | 0.6497 | 0.8582 | 0.2640 | 0.001 |
- | 0.1581 | 34.0 | 9282 | 0.1426 | 0.7871 | 0.6449 | 0.8537 | 0.2557 | 0.001 |
- | 0.1578 | 35.0 | 9555 | 0.1414 | 0.7846 | 0.6428 | 0.8487 | 0.2630 | 0.001 |
- | 0.1578 | 36.0 | 9828 | 0.1465 | 0.7834 | 0.6434 | 0.8484 | 0.2678 | 0.001 |
- | 0.1576 | 37.0 | 10101 | 0.1380 | 0.7924 | 0.6438 | 0.8577 | 0.2668 | 0.001 |
- | 0.1576 | 38.0 | 10374 | 0.1392 | 0.7892 | 0.6475 | 0.8555 | 0.2637 | 0.001 |
- | 0.1556 | 39.0 | 10647 | 0.1458 | 0.7872 | 0.6592 | 0.8680 | 0.2460 | 0.001 |
- | 0.1556 | 40.0 | 10920 | 0.1389 | 0.7946 | 0.6469 | 0.8660 | 0.2699 | 0.001 |
- | 0.1577 | 41.0 | 11193 | 0.1402 | 0.7848 | 0.6510 | 0.8491 | 0.2616 | 0.001 |
- | 0.1577 | 42.0 | 11466 | 0.1404 | 0.7928 | 0.6609 | 0.8625 | 0.2717 | 0.001 |
- | 0.1576 | 43.0 | 11739 | 0.1394 | 0.7931 | 0.6427 | 0.8593 | 0.2696 | 0.001 |
- | 0.1543 | 44.0 | 12012 | 0.1367 | 0.7989 | 0.6568 | 0.8632 | 0.2755 | 0.0001 |
- | 0.1543 | 45.0 | 12285 | 0.1362 | 0.8018 | 0.6686 | 0.8652 | 0.2827 | 0.0001 |
- | 0.1481 | 46.0 | 12558 | 0.1338 | 0.8022 | 0.6640 | 0.8656 | 0.2852 | 0.0001 |
- | 0.1481 | 47.0 | 12831 | 0.1410 | 0.7999 | 0.6573 | 0.8621 | 0.2786 | 0.0001 |
- | 0.1472 | 48.0 | 13104 | 0.1338 | 0.8044 | 0.6728 | 0.8675 | 0.2848 | 0.0001 |
- | 0.1472 | 49.0 | 13377 | 0.1322 | 0.8058 | 0.6742 | 0.8724 | 0.2855 | 0.0001 |
- | 0.1448 | 50.0 | 13650 | 0.1332 | 0.8063 | 0.6739 | 0.8703 | 0.2897 | 0.0001 |
- | 0.1448 | 51.0 | 13923 | 0.1306 | 0.8063 | 0.6771 | 0.8702 | 0.2897 | 0.0001 |
- | 0.1432 | 52.0 | 14196 | 0.1311 | 0.8044 | 0.6727 | 0.8654 | 0.2872 | 0.0001 |
- | 0.1432 | 53.0 | 14469 | 0.1316 | 0.8071 | 0.6703 | 0.8713 | 0.2872 | 0.0001 |
- | 0.1438 | 54.0 | 14742 | 0.1316 | 0.8064 | 0.6788 | 0.8688 | 0.2883 | 0.0001 |
- | 0.1417 | 55.0 | 15015 | 0.1308 | 0.8061 | 0.6699 | 0.8686 | 0.2876 | 0.0001 |
- | 0.1417 | 56.0 | 15288 | 0.1297 | 0.8094 | 0.6800 | 0.8744 | 0.2942 | 0.0001 |
- | 0.1415 | 57.0 | 15561 | 0.1296 | 0.8087 | 0.6717 | 0.8711 | 0.2935 | 0.0001 |
- | 0.1415 | 58.0 | 15834 | 0.1297 | 0.8069 | 0.6785 | 0.8708 | 0.2924 | 0.0001 |
- | 0.1413 | 59.0 | 16107 | 0.1300 | 0.8087 | 0.6811 | 0.8707 | 0.2911 | 0.0001 |
- | 0.1413 | 60.0 | 16380 | 0.1302 | 0.8056 | 0.6726 | 0.8658 | 0.2879 | 0.0001 |
- | 0.1404 | 61.0 | 16653 | 0.1287 | 0.8096 | 0.6843 | 0.8721 | 0.2949 | 0.0001 |
- | 0.1404 | 62.0 | 16926 | 0.1291 | 0.8080 | 0.6822 | 0.8690 | 0.2900 | 0.0001 |
- | 0.1393 | 63.0 | 17199 | 0.1287 | 0.8076 | 0.6813 | 0.8685 | 0.2980 | 0.0001 |
- | 0.1393 | 64.0 | 17472 | 0.1286 | 0.8091 | 0.6806 | 0.8722 | 0.2959 | 0.0001 |
- | 0.1395 | 65.0 | 17745 | 0.1280 | 0.8093 | 0.6838 | 0.8704 | 0.2931 | 0.0001 |
- | 0.1389 | 66.0 | 18018 | 0.1278 | 0.8108 | 0.6855 | 0.8744 | 0.2959 | 0.0001 |
- | 0.1389 | 67.0 | 18291 | 0.1282 | 0.8098 | 0.6849 | 0.8746 | 0.2949 | 0.0001 |
- | 0.1376 | 68.0 | 18564 | 0.1280 | 0.8123 | 0.6903 | 0.8771 | 0.2980 | 0.0001 |
- | 0.1376 | 69.0 | 18837 | 0.1280 | 0.8105 | 0.6800 | 0.8711 | 0.2952 | 0.0001 |
- | 0.1375 | 70.0 | 19110 | 0.1276 | 0.8096 | 0.6848 | 0.8709 | 0.2931 | 0.0001 |
- | 0.1375 | 71.0 | 19383 | 0.1279 | 0.8073 | 0.6797 | 0.8675 | 0.2904 | 0.0001 |
- | 0.1368 | 72.0 | 19656 | 0.1278 | 0.8103 | 0.6802 | 0.8719 | 0.2938 | 0.0001 |
- | 0.1368 | 73.0 | 19929 | 0.1272 | 0.8091 | 0.6806 | 0.8683 | 0.2976 | 0.0001 |
- | 0.137 | 74.0 | 20202 | 0.1280 | 0.8064 | 0.6777 | 0.8648 | 0.2935 | 0.0001 |
- | 0.137 | 75.0 | 20475 | 0.1273 | 0.8110 | 0.6885 | 0.8731 | 0.2924 | 0.0001 |
- | 0.1367 | 76.0 | 20748 | 0.1273 | 0.8089 | 0.6811 | 0.8696 | 0.2973 | 0.0001 |
- | 0.1358 | 77.0 | 21021 | 0.1275 | 0.8102 | 0.6863 | 0.8739 | 0.2924 | 0.0001 |
- | 0.1358 | 78.0 | 21294 | 0.1271 | 0.8122 | 0.6897 | 0.8765 | 0.2945 | 0.0001 |
- | 0.1352 | 79.0 | 21567 | 0.1271 | 0.8098 | 0.6882 | 0.8697 | 0.2935 | 0.0001 |
- | 0.1352 | 80.0 | 21840 | 0.1272 | 0.8124 | 0.6914 | 0.8773 | 0.2983 | 0.0001 |
- | 0.1353 | 81.0 | 22113 | 0.1265 | 0.8104 | 0.6899 | 0.8716 | 0.2966 | 0.0001 |
- | 0.1353 | 82.0 | 22386 | 0.1264 | 0.8105 | 0.6845 | 0.8694 | 0.2914 | 0.0001 |
- | 0.1337 | 83.0 | 22659 | 0.1273 | 0.8100 | 0.6832 | 0.8701 | 0.2935 | 0.0001 |
- | 0.1337 | 84.0 | 22932 | 0.1264 | 0.8124 | 0.6944 | 0.8756 | 0.2959 | 0.0001 |
- | 0.1354 | 85.0 | 23205 | 0.1265 | 0.8127 | 0.6880 | 0.8750 | 0.2973 | 0.0001 |
- | 0.1354 | 86.0 | 23478 | 0.1259 | 0.8136 | 0.6933 | 0.8746 | 0.2952 | 0.0001 |
- | 0.1334 | 87.0 | 23751 | 0.1264 | 0.8111 | 0.6882 | 0.8738 | 0.2966 | 0.0001 |
- | 0.1335 | 88.0 | 24024 | 0.1264 | 0.8127 | 0.6860 | 0.8754 | 0.2990 | 0.0001 |
- | 0.1335 | 89.0 | 24297 | 0.1269 | 0.8140 | 0.6990 | 0.8792 | 0.2983 | 0.0001 |
- | 0.1332 | 90.0 | 24570 | 0.1261 | 0.8155 | 0.6994 | 0.8798 | 0.2980 | 0.0001 |
- | 0.1332 | 91.0 | 24843 | 0.1268 | 0.8109 | 0.6828 | 0.8728 | 0.2893 | 0.0001 |
- | 0.1326 | 92.0 | 25116 | 0.1261 | 0.8124 | 0.6858 | 0.8724 | 0.2952 | 0.0001 |
- | 0.1326 | 93.0 | 25389 | 0.1258 | 0.8138 | 0.6897 | 0.8759 | 0.2966 | 1e-05 |
- | 0.132 | 94.0 | 25662 | 0.1268 | 0.8138 | 0.6941 | 0.8755 | 0.2976 | 1e-05 |
- | 0.132 | 95.0 | 25935 | 0.1257 | 0.8134 | 0.6913 | 0.8750 | 0.2949 | 1e-05 |
- | 0.1294 | 96.0 | 26208 | 0.1259 | 0.8147 | 0.6957 | 0.8763 | 0.2976 | 1e-05 |
- | 0.1294 | 97.0 | 26481 | 0.1256 | 0.8126 | 0.6941 | 0.8720 | 0.2945 | 1e-05 |
- | 0.1302 | 98.0 | 26754 | 0.1253 | 0.8159 | 0.6951 | 0.8785 | 0.2994 | 1e-05 |
- | 0.1298 | 99.0 | 27027 | 0.1249 | 0.8142 | 0.6968 | 0.8752 | 0.2994 | 1e-05 |
- | 0.1298 | 100.0 | 27300 | 0.1252 | 0.8135 | 0.6936 | 0.8732 | 0.2973 | 1e-05 |
- | 0.1304 | 101.0 | 27573 | 0.1248 | 0.8149 | 0.6961 | 0.8765 | 0.2990 | 1e-05 |
- | 0.1304 | 102.0 | 27846 | 0.1266 | 0.8137 | 0.6927 | 0.8738 | 0.2963 | 1e-05 |
- | 0.1287 | 103.0 | 28119 | 0.1249 | 0.8146 | 0.6954 | 0.8754 | 0.2990 | 1e-05 |
- | 0.1287 | 104.0 | 28392 | 0.1252 | 0.8149 | 0.6927 | 0.8770 | 0.2976 | 1e-05 |
- | 0.1282 | 105.0 | 28665 | 0.1251 | 0.8152 | 0.6962 | 0.8773 | 0.2990 | 1e-05 |
- | 0.1282 | 106.0 | 28938 | 0.1251 | 0.8147 | 0.6964 | 0.8770 | 0.2997 | 1e-05 |
- | 0.1293 | 107.0 | 29211 | 0.1250 | 0.8145 | 0.6946 | 0.8759 | 0.2980 | 1e-05 |
- | 0.1293 | 108.0 | 29484 | 0.1249 | 0.8145 | 0.6935 | 0.8751 | 0.2997 | 0.0000 |
- | 0.129 | 109.0 | 29757 | 0.1253 | 0.8116 | 0.6901 | 0.8713 | 0.2952 | 0.0000 |
- | 0.1293 | 110.0 | 30030 | 0.1252 | 0.8144 | 0.6949 | 0.8768 | 0.2980 | 0.0000 |
- | 0.1293 | 111.0 | 30303 | 0.1250 | 0.8137 | 0.6932 | 0.8755 | 0.2983 | 0.0000 |
-
- ### Framework versions
-
- - Transformers 4.41.1
- - Pytorch 2.3.0+cu121
- - Datasets 2.19.1
- - Tokenizers 0.19.1
 
  ---
+ language:
+ - eng
+ license: wtfpl
  tags:
+ - multilabel-image-classification
+ - multilabel
  - generated_from_trainer
+ base_model: facebook/dinov2-base
  model-index:
  - name: DinoVdeau-base-2024_09_03-batch-size32_epochs150_freeze
    results: []
  ---

+ DinoVd'eau is a fine-tuned version of [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base). It achieves the following results on the test set:
 
  - Loss: 0.1260
  - F1 Micro: 0.8131
  - F1 Macro: 0.6976
  - Roc Auc: 0.8760
  - Accuracy: 0.3014
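+
+ Note that for multilabel prediction, accuracy is typically computed as exact-match (subset) accuracy, which is why it sits far below the F1 scores. A sketch of how such numbers can be computed (the exact evaluation code lives in the training repository; the 0.5 threshold is an assumption):
+
+ ```python
+ import numpy as np
+ from sklearn.metrics import accuracy_score, f1_score, roc_auc_score
+
+ y_true = np.array([[1, 0, 1], [0, 1, 1]])              # hypothetical label matrix
+ y_prob = np.array([[0.9, 0.6, 0.7], [0.4, 0.8, 0.2]])  # per-label sigmoid outputs
+ y_pred = (y_prob > 0.5).astype(int)
+
+ f1_micro = f1_score(y_true, y_pred, average="micro")
+ f1_macro = f1_score(y_true, y_pred, average="macro")
+ roc_auc = roc_auc_score(y_true, y_prob, average="micro")
+ subset_accuracy = accuracy_score(y_true, y_pred)  # every label of a sample must match
+ ```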
+ ---
+
+ # Model description
+
+ DinoVd'eau is a model built on top of the DINOv2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers; a minimal sketch is given below.
+
+ The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).
+
+ - **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
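+
+ A minimal PyTorch sketch of a head of this shape (the layer order, hidden size, and dropout rate here are assumptions; the exact architecture is defined in the repository above):
+
+ ```python
+ import torch.nn as nn
+
+ # Hypothetical head: hidden_size matches dinov2-base (768), num_labels the
+ # 31 annotation classes listed in the data table below.
+ def make_classification_head(hidden_size: int = 768, num_labels: int = 31) -> nn.Sequential:
+     return nn.Sequential(
+         nn.Linear(hidden_size, hidden_size),
+         nn.BatchNorm1d(hidden_size),
+         nn.ReLU(),
+         nn.Dropout(p=0.5),                   # dropout rate is an assumption
+         nn.Linear(hidden_size, num_labels),  # one logit per label (sigmoid at inference)
+     )
+ ```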
+
+ ---
+
+ # Intended uses & limitations
+
+ You can use the raw model to classify diverse marine species, including coral morphotype classes from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
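+
+ A minimal inference sketch, assuming the checkpoint follows the standard transformers image-classification conventions (the repo id, input file name, and the 0.5 threshold are assumptions):
+
+ ```python
+ import torch
+ from PIL import Image
+ from transformers import AutoImageProcessor, AutoModelForImageClassification
+
+ model_id = "lombardata/DinoVdeau-base-2024_09_03-batch-size32_epochs150_freeze"
+ processor = AutoImageProcessor.from_pretrained(model_id)
+ model = AutoModelForImageClassification.from_pretrained(model_id)
+
+ image = Image.open("reef_quadrat.jpg")  # hypothetical underwater image
+ inputs = processor(images=image, return_tensors="pt")
+ with torch.no_grad():
+     logits = model(**inputs).logits
+
+ # Multilabel output: one sigmoid per class, thresholded independently.
+ probs = torch.sigmoid(logits)[0]
+ predictions = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
+ print(predictions)
+ ```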
+
+ ---
+ # Training and evaluation data
+
+ Details on the number of images for each class are given in the following table:
+
+ | Class | train | val | test | Total |
+ |:-------------------------|--------:|------:|-------:|--------:|
+ | Acropore_branched | 1469 | 464 | 475 | 2408 |
+ | Acropore_digitised | 568 | 160 | 160 | 888 |
+ | Acropore_sub_massive | 150 | 50 | 43 | 243 |
+ | Acropore_tabular | 999 | 297 | 293 | 1589 |
+ | Algae_assembly | 2546 | 847 | 845 | 4238 |
+ | Algae_drawn_up | 367 | 126 | 127 | 620 |
+ | Algae_limestone | 1652 | 557 | 563 | 2772 |
+ | Algae_sodding | 3148 | 984 | 985 | 5117 |
+ | Atra/Leucospilota | 1084 | 348 | 360 | 1792 |
+ | Bleached_coral | 219 | 71 | 70 | 360 |
+ | Blurred | 191 | 67 | 62 | 320 |
+ | Dead_coral | 1979 | 642 | 643 | 3264 |
+ | Fish | 2018 | 656 | 647 | 3321 |
+ | Homo_sapiens | 161 | 62 | 59 | 282 |
+ | Human_object | 157 | 58 | 55 | 270 |
+ | Living_coral | 406 | 154 | 141 | 701 |
+ | Millepore | 385 | 127 | 125 | 637 |
+ | No_acropore_encrusting | 441 | 130 | 154 | 725 |
+ | No_acropore_foliaceous | 204 | 36 | 46 | 286 |
+ | No_acropore_massive | 1031 | 336 | 338 | 1705 |
+ | No_acropore_solitary | 202 | 53 | 48 | 303 |
+ | No_acropore_sub_massive | 1401 | 433 | 422 | 2256 |
+ | Rock | 4489 | 1495 | 1473 | 7457 |
+ | Rubble | 3092 | 1030 | 1001 | 5123 |
+ | Sand | 5842 | 1939 | 1938 | 9719 |
+ | Sea_cucumber | 1408 | 439 | 447 | 2294 |
+ | Sea_urchins | 327 | 107 | 111 | 545 |
+ | Sponge | 269 | 96 | 105 | 470 |
+ | Syringodium_isoetifolium | 1212 | 392 | 391 | 1995 |
+ | Thalassodendron_ciliatum | 782 | 261 | 260 | 1303 |
+ | Useless | 579 | 193 | 193 | 965 |
+ ---
+
+ # Training procedure
+
+ ## Training hyperparameters
+
  The following hyperparameters were used during training:
+
+ - **Number of Epochs**: 150
+ - **Learning Rate**: 0.001
+ - **Train Batch Size**: 32
+ - **Eval Batch Size**: 32
+ - **Optimizer**: Adam
+ - **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (see the sketch after this list)
+ - **Freeze Encoder**: Yes
+ - **Data Augmentation**: Yes
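+
+ A minimal sketch of this optimizer/scheduler pairing (values taken from the list above; the stand-in model and toy loss values are illustrative only), reproducing the 0.001 → 0.0001 → 1e-05 schedule visible in the results table:
+
+ ```python
+ import torch
+ import torch.nn as nn
+
+ model = nn.Linear(768, 31)  # stand-in for the frozen-backbone classifier
+ optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
+ scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
+     optimizer, mode="min", factor=0.1, patience=5
+ )
+
+ # The scheduler steps on the validation loss; after more than 5 epochs
+ # without improvement, the learning rate is multiplied by 0.1.
+ for val_loss in [0.150, 0.148, 0.148, 0.148, 0.148, 0.148, 0.148, 0.148]:
+     scheduler.step(val_loss)
+ print(optimizer.param_groups[0]["lr"])  # 0.0001
+ ```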
+
+ ## Data Augmentation
+
+ Data were augmented using the following transformations:
+
+ Train Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **RandomHorizontalFlip**: probability=0.25
+ - **RandomVerticalFlip**: probability=0.25
+ - **ColorJiggle**: probability=0.25
+ - **RandomPerspective**: probability=0.25
+ - **Normalize**: probability=1.00
+
+ Val Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **Normalize**: probability=1.00
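+
+ The transform names match the Kornia augmentation API, so a train-time pipeline of this shape might look like the sketch below (the jitter strengths, perspective distortion, 224×224 size, and ImageNet statistics are assumptions, not values from the training config):
+
+ ```python
+ import torch
+ import kornia.augmentation as K
+
+ train_transforms = torch.nn.Sequential(
+     K.Resize((224, 224)),
+     K.RandomHorizontalFlip(p=0.25),
+     K.RandomVerticalFlip(p=0.25),
+     K.ColorJiggle(brightness=0.2, contrast=0.2, saturation=0.2, hue=0.1, p=0.25),
+     K.RandomPerspective(distortion_scale=0.5, p=0.25),
+     K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
+                 std=torch.tensor([0.229, 0.224, 0.225])),
+ )
+
+ batch = torch.rand(8, 3, 256, 256)   # hypothetical image batch in [0, 1]
+ augmented = train_transforms(batch)  # -> (8, 3, 224, 224)
+ ```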
+
+ ## Training results
+
+ Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate
+ --- | --- | --- | --- | --- | ---
+ 1 | 0.17516958713531494 | 0.2079002079002079 | 0.73108765167112 | 0.5105390450682302 | 0.001
+ 2 | 0.1577771008014679 | 0.23492723492723494 | 0.7582569600553347 | 0.5498069096094584 | 0.001
+ 3 | 0.15162432193756104 | 0.23146223146223147 | 0.7721545657578696 | 0.6037272443934714 | 0.001
+ 4 | 0.15218119323253632 | 0.24220374220374222 | 0.7649537378914902 | 0.613953187695023 | 0.001
+ 5 | 0.14836864173412323 | 0.24220374220374222 | 0.7719928186714542 | 0.6161642626912543 | 0.001
+ 6 | 0.14818257093429565 | 0.2560637560637561 | 0.775030471878809 | 0.6051867487843677 | 0.001
+ 7 | 0.1486394852399826 | 0.24185724185724186 | 0.7729166666666668 | 0.617739220969942 | 0.001
+ 8 | 0.14861202239990234 | 0.2512127512127512 | 0.7767065175472426 | 0.6171646674895677 | 0.001
+ 9 | 0.14834754168987274 | 0.2512127512127512 | 0.7805490458654168 | 0.6366264906922544 | 0.001
+ 10 | 0.15029709041118622 | 0.24532224532224534 | 0.7682759232167399 | 0.6081428044829309 | 0.001
+ 11 | 0.14407172799110413 | 0.2609147609147609 | 0.7756647297059341 | 0.6199915554248129 | 0.001
+ 12 | 0.14866559207439423 | 0.2494802494802495 | 0.781485559413907 | 0.6299207659511814 | 0.001
+ 13 | 0.14902691543102264 | 0.25190575190575193 | 0.7779037321241716 | 0.6241659824597257 | 0.001
+ 14 | 0.14337006211280823 | 0.26056826056826055 | 0.7826389795829524 | 0.6378982802249643 | 0.001
+ 15 | 0.14354591071605682 | 0.2553707553707554 | 0.7873585308562887 | 0.639716503598517 | 0.001
+ 16 | 0.1439499706029892 | 0.25675675675675674 | 0.7792974686292388 | 0.6343613127126344 | 0.001
+ 17 | 0.14478015899658203 | 0.2543312543312543 | 0.787784461363732 | 0.6422270697798029 | 0.001
+ 18 | 0.14397625625133514 | 0.25675675675675674 | 0.786493860845839 | 0.6417123667888478 | 0.001
+ 19 | 0.14199253916740417 | 0.253984753984754 | 0.7863510343356792 | 0.6317583185615991 | 0.001
+ 20 | 0.14092272520065308 | 0.2588357588357588 | 0.7868513006341401 | 0.6408966299078661 | 0.001
+ 21 | 0.1425119787454605 | 0.26195426195426197 | 0.7864882090503504 | 0.6412583916380257 | 0.001
+ 22 | 0.15379400551319122 | 0.23700623700623702 | 0.7854284761587195 | 0.6371452798177432 | 0.001
+ 23 | 0.1418805718421936 | 0.25571725571725573 | 0.7841676771176165 | 0.6390434486158698 | 0.001
+ 24 | 0.14135514199733734 | 0.2598752598752599 | 0.7869535635312129 | 0.6458978920546691 | 0.001
+ 25 | 0.13985148072242737 | 0.26853776853776856 | 0.786773581652009 | 0.6262981090846956 | 0.001
+ 26 | 0.14649754762649536 | 0.2591822591822592 | 0.7846557710221018 | 0.6237830069375186 | 0.001
+ 27 | 0.15506784617900848 | 0.23804573804573806 | 0.7719951506754418 | 0.6344307952131357 | 0.001
+ 28 | 0.14431345462799072 | 0.2616077616077616 | 0.7891238152420981 | 0.6429949936408241 | 0.001
+ 29 | 0.14275498688220978 | 0.25675675675675674 | 0.7873995663818392 | 0.6415824285032449 | 0.001
+ 30 | 0.14164045453071594 | 0.2525987525987526 | 0.7798808735936467 | 0.6308133523221491 | 0.001
+ 31 | 0.13976627588272095 | 0.26888426888426886 | 0.7895365707945718 | 0.6431010910213645 | 0.001
+ 32 | 0.1448184847831726 | 0.25675675675675674 | 0.7891036166898235 | 0.6520927708015384 | 0.001
+ 33 | 0.14042973518371582 | 0.26403326403326405 | 0.7895652173913044 | 0.6496848321151188 | 0.001
+ 34 | 0.1426127403974533 | 0.25571725571725573 | 0.7870906828033133 | 0.6448790211155284 | 0.001
+ 35 | 0.14135821163654327 | 0.262993762993763 | 0.7846327880264532 | 0.6428423378015612 | 0.001
+ 36 | 0.14652539789676666 | 0.26784476784476785 | 0.7834209497328063 | 0.6434020884943297 | 0.001
+ 37 | 0.13795886933803558 | 0.2668052668052668 | 0.792425408224331 | 0.6438477431550106 | 0.001
+ 38 | 0.13921019434928894 | 0.2636867636867637 | 0.7892280686732029 | 0.6475331965590188 | 0.001
+ 39 | 0.14584119617938995 | 0.24601524601524602 | 0.7871620243872598 | 0.659217552215385 | 0.001
+ 40 | 0.1389026641845703 | 0.26992376992376993 | 0.79463243873979 | 0.6469476365862663 | 0.001
+ 41 | 0.14020991325378418 | 0.2616077616077616 | 0.784842032071618 | 0.6509894683187031 | 0.001
+ 42 | 0.14042720198631287 | 0.27165627165627165 | 0.7927685516081564 | 0.6608924914997423 | 0.001
+ 43 | 0.13943640887737274 | 0.2695772695772696 | 0.7930726352070125 | 0.6427022769326964 | 0.001
+ 44 | 0.1367315948009491 | 0.27546777546777546 | 0.7989137353078458 | 0.6567716426576066 | 0.0001
+ 45 | 0.13616175949573517 | 0.28274428274428276 | 0.8018308187828446 | 0.6686203083248894 | 0.0001
+ 46 | 0.13375289738178253 | 0.2851697851697852 | 0.8021852369457503 | 0.6640104860714046 | 0.0001
+ 47 | 0.14095526933670044 | 0.2785862785862786 | 0.7998804746862461 | 0.65726703563479 | 0.0001
+ 48 | 0.13375185430049896 | 0.28482328482328484 | 0.8044442566853957 | 0.6728387979723557 | 0.0001
+ 49 | 0.13221527636051178 | 0.2855162855162855 | 0.8058309037900874 | 0.674164075762875 | 0.0001
+ 50 | 0.13315953314304352 | 0.28967428967428965 | 0.8062985513331933 | 0.6738599949249782 | 0.0001
+ 51 | 0.13057135045528412 | 0.28967428967428965 | 0.8062836021505377 | 0.6770873238469556 | 0.0001
+ 52 | 0.13108478486537933 | 0.2872487872487873 | 0.8043922369765066 | 0.6726562275384118 | 0.0001
+ 53 | 0.13161474466323853 | 0.2872487872487873 | 0.8070734160241367 | 0.6702824874792834 | 0.0001
+ 54 | 0.1315840184688568 | 0.2882882882882883 | 0.8064162093710426 | 0.6787531928667037 | 0.0001
+ 55 | 0.13084293901920319 | 0.2875952875952876 | 0.8061478697800111 | 0.6698514928377199 | 0.0001
+ 56 | 0.12969879806041718 | 0.29417879417879417 | 0.8094286190238215 | 0.6799502024965028 | 0.0001
+ 57 | 0.1296372264623642 | 0.2934857934857935 | 0.8086806577785254 | 0.6716759101412201 | 0.0001
+ 58 | 0.12973745167255402 | 0.29244629244629244 | 0.8068982880161129 | 0.6784509633805341 | 0.0001
+ 59 | 0.12995606660842896 | 0.2910602910602911 | 0.8087436297013858 | 0.6811347101829983 | 0.0001
+ 60 | 0.13024823367595673 | 0.28794178794178793 | 0.8056052474657126 | 0.6725887638706813 | 0.0001
+ 61 | 0.12872998416423798 | 0.2948717948717949 | 0.8095537925534148 | 0.6842961167409227 | 0.0001
+ 62 | 0.12909561395645142 | 0.29002079002079 | 0.8079526226734349 | 0.6821531683206365 | 0.0001
+ 63 | 0.12872986495494843 | 0.29799029799029797 | 0.8075538806791719 | 0.6812919501021206 | 0.0001
+ 64 | 0.12864243984222412 | 0.2959112959112959 | 0.8090726144558109 | 0.6805602232602442 | 0.0001
+ 65 | 0.12800218164920807 | 0.29313929313929316 | 0.809268560334276 | 0.6837997472607307 | 0.0001
+ 66 | 0.12777170538902283 | 0.2959112959112959 | 0.8107521495951249 | 0.685457875933014 | 0.0001
+ 67 | 0.12816764414310455 | 0.2948717948717949 | 0.8098450774612694 | 0.6849396578990685 | 0.0001
+ 68 | 0.12804801762104034 | 0.29799029799029797 | 0.8123470107455503 | 0.6903099963278952 | 0.0001
+ 69 | 0.12803924083709717 | 0.29521829521829523 | 0.8104663431103608 | 0.6800351861453543 | 0.0001
+ 70 | 0.12764029204845428 | 0.29313929313929316 | 0.8096462751380749 | 0.684802818649885 | 0.0001
+ 71 | 0.12794704735279083 | 0.29036729036729036 | 0.8072724183339705 | 0.6796736257485385 | 0.0001
+ 72 | 0.12780210375785828 | 0.2938322938322938 | 0.8102650399663442 | 0.6802343842914587 | 0.0001
+ 73 | 0.12723641097545624 | 0.29764379764379767 | 0.8091473263623224 | 0.6805723882610378 | 0.0001
+ 74 | 0.12804573774337769 | 0.2934857934857935 | 0.8064391831142698 | 0.6777188921642516 | 0.0001
+ 75 | 0.1273234635591507 | 0.29244629244629244 | 0.8109922383050138 | 0.6885203936930924 | 0.0001
+ 76 | 0.1272992193698883 | 0.2972972972972973 | 0.8088975345709815 | 0.6810578369044884 | 0.0001
+ 77 | 0.12745273113250732 | 0.29244629244629244 | 0.8102101349375445 | 0.6863183190306963 | 0.0001
+ 78 | 0.12705788016319275 | 0.2945252945252945 | 0.8121675531914894 | 0.6897104532016692 | 0.0001
+ 79 | 0.12710121273994446 | 0.2934857934857935 | 0.809842452990005 | 0.6881838490414868 | 0.0001
+ 80 | 0.12715762853622437 | 0.2983367983367983 | 0.8123911420751431 | 0.6914032136958002 | 0.0001
+ 81 | 0.12650521099567413 | 0.2966042966042966 | 0.810378232667846 | 0.6899389708752343 | 0.0001
+ 82 | 0.12635371088981628 | 0.29140679140679143 | 0.8105446364138047 | 0.6844864031747653 | 0.0001
+ 83 | 0.1272997260093689 | 0.2934857934857935 | 0.8099670022844573 | 0.6832392344549459 | 0.0001
+ 84 | 0.12640425562858582 | 0.2959112959112959 | 0.8124478558318038 | 0.6944491344986764 | 0.0001
+ 85 | 0.12647400796413422 | 0.2972972972972973 | 0.812659392115055 | 0.6879519222426981 | 0.0001
+ 86 | 0.12585221230983734 | 0.29521829521829523 | 0.8135877542461731 | 0.6933253774763921 | 0.0001
+ 87 | 0.12641744315624237 | 0.2966042966042966 | 0.8111366966715512 | 0.6882459007361815 | 0.0001
+ 88 | 0.1263686865568161 | 0.29902979902979904 | 0.8126931106471816 | 0.6859575429209334 | 0.0001
+ 89 | 0.12690132856369019 | 0.2983367983367983 | 0.8140188460902628 | 0.6990366097632199 | 0.0001
+ 90 | 0.12612390518188477 | 0.29799029799029797 | 0.8155163144617673 | 0.6994167448254883 | 0.0001
+ 91 | 0.1268243044614792 | 0.28932778932778935 | 0.8108811552831535 | 0.6827913109763548 | 0.0001
+ 92 | 0.12613284587860107 | 0.29521829521829523 | 0.8123787840458724 | 0.6858483939371968 | 0.0001
+ 93 | 0.1258293092250824 | 0.2966042966042966 | 0.8138213420238991 | 0.6897216822080747 | 1e-05
+ 94 | 0.12682591378688812 | 0.29764379764379767 | 0.8137706015226304 | 0.6940665827082791 | 1e-05
+ 95 | 0.1256789118051529 | 0.2948717948717949 | 0.8133975298304374 | 0.6913394393323408 | 1e-05
+ 96 | 0.12587758898735046 | 0.29764379764379767 | 0.8147281313996739 | 0.6957055849225957 | 1e-05
+ 97 | 0.1256256103515625 | 0.2945252945252945 | 0.8126029480086159 | 0.6940781337907567 | 1e-05
+ 98 | 0.1253080666065216 | 0.2993762993762994 | 0.8158955813276801 | 0.6951390304078455 | 1e-05
+ 99 | 0.12485036998987198 | 0.2993762993762994 | 0.8141971169963125 | 0.6968244403216463 | 1e-05
+ 100 | 0.12519583106040955 | 0.2972972972972973 | 0.8134507606084869 | 0.693647218520028 | 1e-05
+ 101 | 0.12475299090147018 | 0.29902979902979904 | 0.8148550421923302 | 0.6961023046950545 | 1e-05
+ 102 | 0.12659381330013275 | 0.29625779625779625 | 0.81366198367965 | 0.692743439816851 | 1e-05
+ 103 | 0.124935083091259 | 0.29902979902979904 | 0.8146347596496376 | 0.6954353634647259 | 1e-05
+ 104 | 0.12519653141498566 | 0.29764379764379767 | 0.8148796863922599 | 0.692659001716947 | 1e-05
+ 105 | 0.12513257563114166 | 0.29902979902979904 | 0.8152223750573132 | 0.6961790886935857 | 1e-05
+ 106 | 0.12511174380779266 | 0.29972279972279975 | 0.8147252563995664 | 0.6963861386142265 | 1e-05
+ 107 | 0.12498941272497177 | 0.29799029799029797 | 0.8144894800685992 | 0.694620567930595 | 1e-05
+ 108 | 0.1248873621225357 | 0.29972279972279975 | 0.8144792584203683 | 0.6934713387989168 | 1e-06
+ 109 | 0.12527066469192505 | 0.29521829521829523 | 0.8116150302210575 | 0.6900779361953018 | 1e-06
+ 110 | 0.125152125954628 | 0.29799029799029797 | 0.8143917285082964 | 0.69491512245201 | 1e-06
+ 111 | 0.12495684623718262 | 0.2983367983367983 | 0.8137025263510123 | 0.6932228755688746 | 1e-06
+
+ ---
+
+ # CO2 Emissions
+
+ The estimated CO2 emissions for training this model are documented below:
+
+ - **Emissions**: 1.3368314147555413 grams of CO2
+ - **Source**: Code Carbon
+ - **Training Type**: fine-tuning
+ - **Geographical Location**: Brest, France
+ - **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
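+
+ Estimates like this are typically produced by wrapping the training run in a Code Carbon tracker; a minimal sketch (the tracker configuration is an assumption):
+
+ ```python
+ from codecarbon import EmissionsTracker
+
+ tracker = EmissionsTracker()
+ tracker.start()
+ result = sum(i * i for i in range(10_000_000))  # stand-in for the training run
+ emissions_kg = tracker.stop()                   # estimated kg of CO2-equivalent
+ print(f"{emissions_kg * 1000:.4f} g CO2eq")
+ ```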
+
+ ---
+
+ # Framework Versions
+
+ - **Transformers**: 4.41.1
+ - **PyTorch**: 2.3.0+cu121
+ - **Datasets**: 2.19.1
+ - **Tokenizers**: 0.19.1