---
language:
- eng
license: wtfpl
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-small
model-index:
- name: DinoVdeau-small-2024_08_31-batch-size32_epochs150_freeze
  results: []
---

DinoVd'eau is a fine-tuned version of [facebook/dinov2-small](https://huggingface.co/facebook/dinov2-small). It achieves the following results on the test set:

- Loss: 0.1320
- F1 Micro: 0.8009
- F1 Macro: 0.6614
- Roc Auc: 0.8649
- Accuracy: 0.2903

---

# Model description

DinoVd'eau is built on top of the DINOv2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
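
A hypothetical sketch of such a head in PyTorch, stacking the layer types named above. The hidden size, dropout rate, and exact layer order are assumptions for illustration, not the repository's configuration; only the 384-dim DINOv2-small embedding and the 31 classes come from this card.

```python
import torch
from torch import nn

class MultilabelHead(nn.Module):
    """Illustrative classification head: linear -> ReLU -> batch norm -> dropout -> linear."""

    def __init__(self, embed_dim=384, hidden_dim=512, num_classes=31, dropout=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(embed_dim, hidden_dim),
            nn.ReLU(),
            nn.BatchNorm1d(hidden_dim),
            nn.Dropout(dropout),
            nn.Linear(hidden_dim, num_classes),  # one logit per label
        )

    def forward(self, x):
        return self.net(x)

head = MultilabelHead()
logits = head(torch.randn(2, 384))  # a batch of 2 DINOv2-small embeddings
print(logits.shape)  # torch.Size([2, 31])
```

With the encoder frozen (see the hyperparameters below), only parameters like these head weights would be updated during fine-tuning.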

---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, including coral morphotypes from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
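
Because this is a multilabel classifier, each output logit is decoded independently rather than with a softmax over classes. A pure-Python sketch of that decoding step (the 0.5 threshold and the sample logits are assumptions; the training code may use per-class thresholds):

```python
import math

def sigmoid(x):
    # Independent per-label probability for a multilabel classifier.
    return 1.0 / (1.0 + math.exp(-x))

def decode(logits, class_names, threshold=0.5):
    # Keep every class whose sigmoid probability clears the threshold.
    return [name for logit, name in zip(logits, class_names)
            if sigmoid(logit) >= threshold]

# Hypothetical logits for three of the card's classes.
print(decode([2.3, -1.1, 0.4], ["Rock", "Sand", "Fish"]))  # ['Rock', 'Fish']
```

Several labels (or none) can be active for a single frame, which is what distinguishes this setup from single-label classification.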

---

# Training and evaluation data

Details on the number of images for each class are given in the following table:

| Class                    |   train |   val |   test |   Total |
|:-------------------------|--------:|------:|-------:|--------:|
| Acropore_branched        |    1469 |   464 |    475 |    2408 |
| Acropore_digitised       |     568 |   160 |    160 |     888 |
| Acropore_sub_massive     |     150 |    50 |     43 |     243 |
| Acropore_tabular         |     999 |   297 |    293 |    1589 |
| Algae_assembly           |    2546 |   847 |    845 |    4238 |
| Algae_drawn_up           |     367 |   126 |    127 |     620 |
| Algae_limestone          |    1652 |   557 |    563 |    2772 |
| Algae_sodding            |    3148 |   984 |    985 |    5117 |
| Atra/Leucospilota        |    1084 |   348 |    360 |    1792 |
| Bleached_coral           |     219 |    71 |     70 |     360 |
| Blurred                  |     191 |    67 |     62 |     320 |
| Dead_coral               |    1979 |   642 |    643 |    3264 |
| Fish                     |    2018 |   656 |    647 |    3321 |
| Homo_sapiens             |     161 |    62 |     59 |     282 |
| Human_object             |     157 |    58 |     55 |     270 |
| Living_coral             |     406 |   154 |    141 |     701 |
| Millepore                |     385 |   127 |    125 |     637 |
| No_acropore_encrusting   |     441 |   130 |    154 |     725 |
| No_acropore_foliaceous   |     204 |    36 |     46 |     286 |
| No_acropore_massive      |    1031 |   336 |    338 |    1705 |
| No_acropore_solitary     |     202 |    53 |     48 |     303 |
| No_acropore_sub_massive  |    1401 |   433 |    422 |    2256 |
| Rock                     |    4489 |  1495 |   1473 |    7457 |
| Rubble                   |    3092 |  1030 |   1001 |    5123 |
| Sand                     |    5842 |  1939 |   1938 |    9719 |
| Sea_cucumber             |    1408 |   439 |    447 |    2294 |
| Sea_urchins              |     327 |   107 |    111 |     545 |
| Sponge                   |     269 |    96 |    105 |     470 |
| Syringodium_isoetifolium |    1212 |   392 |    391 |    1995 |
| Thalassodendron_ciliatum |     782 |   261 |    260 |    1303 |
| Useless                  |     579 |   193 |    193 |     965 |

---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 150
- **Learning Rate**: 0.001
- **Train Batch Size**: 32
- **Eval Batch Size**: 32
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
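
The ReduceLROnPlateau policy can be sketched in a few lines: the learning rate is multiplied by the factor whenever the validation loss has not improved for more than `patience` consecutive epochs. This is a simplified pure-Python sketch (it ignores the threshold and cooldown options a real scheduler exposes):

```python
def schedule(val_losses, lr=1e-3, patience=5, factor=0.1):
    # Track the best validation loss; cut the LR by `factor` once the loss
    # has failed to improve for more than `patience` epochs in a row.
    best = float("inf")
    bad_epochs = 0
    lrs = []
    for loss in val_losses:
        if loss < best:
            best = loss
            bad_epochs = 0
        else:
            bad_epochs += 1
            if bad_epochs > patience:
                lr *= factor
                bad_epochs = 0
        lrs.append(lr)
    return lrs

# Hypothetical losses: progress stalls after epoch 3, so the LR drops by 10x
# once the patience window is exhausted.
losses = [0.20, 0.18, 0.17, 0.17, 0.17, 0.17, 0.17, 0.17, 0.17, 0.17]
print(schedule(losses))
```

This is the mechanism behind the learning-rate column in the training results below, where the rate steps down from 0.001 through 0.0001 and 1e-05 to 1e-06 as the validation loss plateaus.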

## Data Augmentation

Data were augmented using the following transformations:

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
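
The probabilistic flips above can be sketched in pure Python on an image represented as a list of pixel rows; the actual training pipeline presumably uses a GPU augmentation library, so this only illustrates the sampling logic:

```python
import random

def random_horizontal_flip(image, p=0.25, rng=random):
    # Mirror left-right (reverse each row) with probability p.
    if rng.random() < p:
        return [row[::-1] for row in image]
    return image

def random_vertical_flip(image, p=0.25, rng=random):
    # Mirror top-bottom (reverse the row order) with probability p.
    if rng.random() < p:
        return image[::-1]
    return image

img = [[1, 2], [3, 4]]
print(random_horizontal_flip(img, p=1.0))  # [[2, 1], [4, 3]] (flip forced)
print(random_vertical_flip(img, p=1.0))    # [[3, 4], [1, 2]] (flip forced)
```

With p=0.25, each flip fires on roughly a quarter of the training batches, so the model sees both original and mirrored views of the same scenes.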

## Training results

| Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
|------:|----------------:|---------:|---------:|---------:|--------------:|
| 1 | 0.1957 | 0.1906 | 0.7089 | 0.4059 | 0.001 |
| 2 | 0.1720 | 0.2193 | 0.7381 | 0.4868 | 0.001 |
| 3 | 0.1621 | 0.2322 | 0.7579 | 0.5587 | 0.001 |
| 4 | 0.1595 | 0.2249 | 0.7463 | 0.5562 | 0.001 |
| 5 | 0.1569 | 0.2315 | 0.7511 | 0.5723 | 0.001 |
| 6 | 0.1530 | 0.2363 | 0.7635 | 0.5787 | 0.001 |
| 7 | 0.1523 | 0.2335 | 0.7652 | 0.5982 | 0.001 |
| 8 | 0.1531 | 0.2419 | 0.7655 | 0.5880 | 0.001 |
| 9 | 0.1499 | 0.2401 | 0.7700 | 0.6069 | 0.001 |
| 10 | 0.1510 | 0.2439 | 0.7606 | 0.5829 | 0.001 |
| 11 | 0.1521 | 0.2505 | 0.7690 | 0.5976 | 0.001 |
| 12 | 0.1503 | 0.2443 | 0.7760 | 0.6074 | 0.001 |
| 13 | 0.1504 | 0.2439 | 0.7624 | 0.6003 | 0.001 |
| 14 | 0.1497 | 0.2446 | 0.7644 | 0.6028 | 0.001 |
| 15 | 0.1475 | 0.2512 | 0.7752 | 0.6066 | 0.001 |
| 16 | 0.1500 | 0.2464 | 0.7646 | 0.5838 | 0.001 |
| 17 | 0.1530 | 0.2457 | 0.7720 | 0.6073 | 0.001 |
| 18 | 0.1491 | 0.2439 | 0.7752 | 0.6143 | 0.001 |
| 19 | 0.1495 | 0.2346 | 0.7740 | 0.6075 | 0.001 |
| 20 | 0.1487 | 0.2453 | 0.7637 | 0.5956 | 0.001 |
| 21 | 0.1471 | 0.2474 | 0.7805 | 0.6165 | 0.001 |
| 22 | 0.1509 | 0.2453 | 0.7754 | 0.6074 | 0.001 |
| 23 | 0.1502 | 0.2429 | 0.7719 | 0.6127 | 0.001 |
| 24 | 0.1497 | 0.2401 | 0.7699 | 0.5849 | 0.001 |
| 25 | 0.1470 | 0.2426 | 0.7761 | 0.6035 | 0.001 |
| 26 | 0.1481 | 0.2422 | 0.7751 | 0.6065 | 0.001 |
| 27 | 0.1458 | 0.2474 | 0.7689 | 0.6136 | 0.001 |
| 28 | 0.1454 | 0.2446 | 0.7751 | 0.6077 | 0.001 |
| 29 | 0.1494 | 0.2457 | 0.7735 | 0.6108 | 0.001 |
| 30 | 0.1455 | 0.2498 | 0.7705 | 0.5983 | 0.001 |
| 31 | 0.1454 | 0.2533 | 0.7785 | 0.6069 | 0.001 |
| 32 | 0.1466 | 0.2453 | 0.7746 | 0.6145 | 0.001 |
| 33 | 0.1446 | 0.2540 | 0.7770 | 0.6125 | 0.001 |
| 34 | 0.1468 | 0.2446 | 0.7781 | 0.6168 | 0.001 |
| 35 | 0.1486 | 0.2495 | 0.7767 | 0.6193 | 0.001 |
| 36 | 0.1464 | 0.2488 | 0.7719 | 0.6093 | 0.001 |
| 37 | 0.1448 | 0.2498 | 0.7734 | 0.6127 | 0.001 |
| 38 | 0.1451 | 0.2523 | 0.7815 | 0.6110 | 0.001 |
| 39 | 0.1447 | 0.2498 | 0.7824 | 0.6272 | 0.001 |
| 40 | 0.1482 | 0.2536 | 0.7837 | 0.6266 | 0.0001 |
| 41 | 0.1414 | 0.2616 | 0.7833 | 0.6324 | 0.0001 |
| 42 | 0.1398 | 0.2620 | 0.7884 | 0.6372 | 0.0001 |
| 43 | 0.1411 | 0.2640 | 0.7871 | 0.6367 | 0.0001 |
| 44 | 0.1390 | 0.2613 | 0.7879 | 0.6257 | 0.0001 |
| 45 | 0.1386 | 0.2665 | 0.7894 | 0.6421 | 0.0001 |
| 46 | 0.1396 | 0.2665 | 0.7874 | 0.6283 | 0.0001 |
| 47 | 0.1387 | 0.2637 | 0.7864 | 0.6287 | 0.0001 |
| 48 | 0.1378 | 0.2678 | 0.7913 | 0.6335 | 0.0001 |
| 49 | 0.1377 | 0.2640 | 0.7934 | 0.6382 | 0.0001 |
| 50 | 0.1376 | 0.2675 | 0.7918 | 0.6363 | 0.0001 |
| 51 | 0.1375 | 0.2661 | 0.7929 | 0.6427 | 0.0001 |
| 52 | 0.1377 | 0.2658 | 0.7871 | 0.6368 | 0.0001 |
| 53 | 0.1374 | 0.2692 | 0.7929 | 0.6406 | 0.0001 |
| 54 | 0.1369 | 0.2717 | 0.7921 | 0.6412 | 0.0001 |
| 55 | 0.1370 | 0.2703 | 0.7914 | 0.6378 | 0.0001 |
| 56 | 0.1365 | 0.2644 | 0.7931 | 0.6425 | 0.0001 |
| 57 | 0.1368 | 0.2675 | 0.7926 | 0.6382 | 0.0001 |
| 58 | 0.1365 | 0.2675 | 0.7916 | 0.6374 | 0.0001 |
| 59 | 0.1364 | 0.2675 | 0.7922 | 0.6429 | 0.0001 |
| 60 | 0.1369 | 0.2651 | 0.7883 | 0.6358 | 0.0001 |
| 61 | 0.1364 | 0.2713 | 0.7946 | 0.6504 | 0.0001 |
| 62 | 0.1356 | 0.2751 | 0.7932 | 0.6442 | 0.0001 |
| 63 | 0.1355 | 0.2737 | 0.7966 | 0.6516 | 0.0001 |
| 64 | 0.1359 | 0.2678 | 0.7934 | 0.6450 | 0.0001 |
| 65 | 0.1357 | 0.2706 | 0.7936 | 0.6455 | 0.0001 |
| 66 | 0.1357 | 0.2713 | 0.7946 | 0.6477 | 0.0001 |
| 67 | 0.1353 | 0.2755 | 0.7966 | 0.6544 | 0.0001 |
| 68 | 0.1353 | 0.2734 | 0.7956 | 0.6519 | 0.0001 |
| 69 | 0.1347 | 0.2699 | 0.7966 | 0.6516 | 0.0001 |
| 70 | 0.1350 | 0.2720 | 0.7945 | 0.6442 | 0.0001 |
| 71 | 0.1350 | 0.2723 | 0.7933 | 0.6442 | 0.0001 |
| 72 | 0.1345 | 0.2758 | 0.7970 | 0.6485 | 0.0001 |
| 73 | 0.1342 | 0.2762 | 0.7977 | 0.6519 | 0.0001 |
| 74 | 0.1350 | 0.2751 | 0.7915 | 0.6413 | 0.0001 |
| 75 | 0.1346 | 0.2751 | 0.7947 | 0.6485 | 0.0001 |
| 76 | 0.1344 | 0.2758 | 0.7965 | 0.6478 | 0.0001 |
| 77 | 0.1346 | 0.2775 | 0.7978 | 0.6537 | 0.0001 |
| 78 | 0.1341 | 0.2775 | 0.7978 | 0.6543 | 0.0001 |
| 79 | 0.1340 | 0.2741 | 0.7953 | 0.6523 | 0.0001 |
| 80 | 0.1344 | 0.2782 | 0.7993 | 0.6546 | 0.0001 |
| 81 | 0.1341 | 0.2758 | 0.7967 | 0.6560 | 0.0001 |
| 82 | 0.1341 | 0.2765 | 0.7948 | 0.6454 | 0.0001 |
| 83 | 0.1351 | 0.2703 | 0.7924 | 0.6460 | 0.0001 |
| 84 | 0.1339 | 0.2755 | 0.7957 | 0.6512 | 0.0001 |
| 85 | 0.1334 | 0.2793 | 0.7991 | 0.6532 | 0.0001 |
| 86 | 0.1334 | 0.2748 | 0.7988 | 0.6596 | 0.0001 |
| 87 | 0.1340 | 0.2744 | 0.7956 | 0.6467 | 0.0001 |
| 88 | 0.1336 | 0.2748 | 0.7982 | 0.6483 | 0.0001 |
| 89 | 0.1337 | 0.2807 | 0.8015 | 0.6585 | 0.0001 |
| 90 | 0.1333 | 0.2772 | 0.8011 | 0.6621 | 0.0001 |
| 91 | 0.1337 | 0.2782 | 0.7957 | 0.6529 | 0.0001 |
| 92 | 0.1335 | 0.2755 | 0.7961 | 0.6514 | 0.0001 |
| 93 | 0.1331 | 0.2758 | 0.8002 | 0.6560 | 0.0001 |
| 94 | 0.1333 | 0.2758 | 0.7995 | 0.6554 | 0.0001 |
| 95 | 0.1331 | 0.2758 | 0.7980 | 0.6580 | 0.0001 |
| 96 | 0.1328 | 0.2751 | 0.7993 | 0.6556 | 0.0001 |
| 97 | 0.1333 | 0.2782 | 0.7977 | 0.6493 | 0.0001 |
| 98 | 0.1327 | 0.2755 | 0.7996 | 0.6600 | 0.0001 |
| 99 | 0.1325 | 0.2717 | 0.7979 | 0.6590 | 0.0001 |
| 100 | 0.1329 | 0.2762 | 0.7971 | 0.6570 | 0.0001 |
| 101 | 0.1327 | 0.2821 | 0.7992 | 0.6580 | 0.0001 |
| 102 | 0.1326 | 0.2817 | 0.7987 | 0.6543 | 0.0001 |
| 103 | 0.1325 | 0.2803 | 0.7994 | 0.6518 | 0.0001 |
| 104 | 0.1332 | 0.2775 | 0.8011 | 0.6613 | 0.0001 |
| 105 | 0.1322 | 0.2831 | 0.8013 | 0.6636 | 0.0001 |
| 106 | 0.1324 | 0.2831 | 0.8010 | 0.6588 | 0.0001 |
| 107 | 0.1336 | 0.2786 | 0.7986 | 0.6506 | 0.0001 |
| 108 | 0.1327 | 0.2796 | 0.7996 | 0.6501 | 0.0001 |
| 109 | 0.1318 | 0.2807 | 0.8000 | 0.6580 | 0.0001 |
| 110 | 0.1326 | 0.2803 | 0.7997 | 0.6582 | 0.0001 |
| 111 | 0.1319 | 0.2786 | 0.8013 | 0.6609 | 0.0001 |
| 112 | 0.1322 | 0.2810 | 0.8019 | 0.6595 | 0.0001 |
| 113 | 0.1321 | 0.2800 | 0.8025 | 0.6592 | 0.0001 |
| 114 | 0.1320 | 0.2824 | 0.8025 | 0.6631 | 0.0001 |
| 115 | 0.1319 | 0.2838 | 0.8004 | 0.6598 | 0.0001 |
| 116 | 0.1319 | 0.2845 | 0.8022 | 0.6627 | 1e-05 |
| 117 | 0.1318 | 0.2821 | 0.8013 | 0.6604 | 1e-05 |
| 118 | 0.1316 | 0.2796 | 0.8002 | 0.6590 | 1e-05 |
| 119 | 0.1319 | 0.2827 | 0.8037 | 0.6608 | 1e-05 |
| 120 | 0.1316 | 0.2814 | 0.8036 | 0.6615 | 1e-05 |
| 121 | 0.1318 | 0.2810 | 0.8013 | 0.6611 | 1e-05 |
| 122 | 0.1322 | 0.2817 | 0.8050 | 0.6647 | 1e-05 |
| 123 | 0.1319 | 0.2817 | 0.8010 | 0.6605 | 1e-05 |
| 124 | 0.1314 | 0.2807 | 0.8019 | 0.6622 | 1e-05 |
| 125 | 0.1314 | 0.2862 | 0.8043 | 0.6641 | 1e-05 |
| 126 | 0.1310 | 0.2862 | 0.8042 | 0.6630 | 1e-05 |
| 127 | 0.1315 | 0.2859 | 0.8038 | 0.6598 | 1e-05 |
| 128 | 0.1311 | 0.2869 | 0.8042 | 0.6682 | 1e-05 |
| 129 | 0.1310 | 0.2827 | 0.8035 | 0.6653 | 1e-05 |
| 130 | 0.1310 | 0.2866 | 0.8034 | 0.6657 | 1e-05 |
| 131 | 0.1313 | 0.2834 | 0.8052 | 0.6709 | 1e-05 |
| 132 | 0.1315 | 0.2807 | 0.7986 | 0.6558 | 1e-05 |
| 133 | 0.1311 | 0.2879 | 0.8052 | 0.6689 | 1e-05 |
| 134 | 0.1309 | 0.2827 | 0.8021 | 0.6648 | 1e-05 |
| 135 | 0.1315 | 0.2869 | 0.8038 | 0.6684 | 1e-05 |
| 136 | 0.1315 | 0.2827 | 0.8025 | 0.6590 | 1e-05 |
| 137 | 0.1311 | 0.2859 | 0.8036 | 0.6667 | 1e-05 |
| 138 | 0.1312 | 0.2845 | 0.8035 | 0.6666 | 1e-05 |
| 139 | 0.1310 | 0.2897 | 0.8053 | 0.6661 | 1e-05 |
| 140 | 0.1317 | 0.2834 | 0.8020 | 0.6635 | 1e-05 |
| 141 | 0.1309 | 0.2876 | 0.8047 | 0.6688 | 1e-06 |
| 142 | 0.1310 | 0.2859 | 0.8042 | 0.6643 | 1e-06 |
| 143 | 0.1314 | 0.2872 | 0.8019 | 0.6623 | 1e-06 |
| 144 | 0.1312 | 0.2838 | 0.8025 | 0.6648 | 1e-06 |

---

# CO2 Emissions

The estimated CO2 emissions for training this model are documented below:

- **Emissions**: 1.6282 grams of CO2
- **Source**: Code Carbon
- **Training Type**: fine-tuning
- **Geographical Location**: Brest, France
- **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB

---

# Framework Versions

- **Transformers**: 4.41.1
- **Pytorch**: 2.3.0+cu121
- **Datasets**: 2.19.1
- **Tokenizers**: 0.19.1