groderg committed on
Commit 4c96f1c
1 Parent(s): 9f49f20

Upload README.md

Files changed (1)
  1. README.md +221 -147
README.md CHANGED
@@ -1,170 +1,244 @@
 
  ---
- library_name: transformers
- license: apache-2.0
- base_model: facebook/dinov2-large
  tags:
  - generated_from_trainer
- metrics:
- - accuracy
  model-index:
  - name: DinoVdeauTest-large-2024_09_24-batch-size32_freeze
  results: []
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->

- # DinoVdeauTest-large-2024_09_24-batch-size32_freeze
-
- This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on the None dataset.
- It achieves the following results on the evaluation set:
  - Loss: 0.1204
  - F1 Micro: 0.8214
  - F1 Macro: 0.7191
  - Accuracy: 0.3135
- - Learning Rate: 0.0000

- ## Model description

- More information needed

- ## Intended uses & limitations

- More information needed

- ## Training and evaluation data

- More information needed

- ## Training procedure

- ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 0.001
- - train_batch_size: 32
- - eval_batch_size: 32
- - seed: 42
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: linear
- - num_epochs: 150
- - mixed_precision_training: Native AMP
-
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Accuracy | Rate |
- |:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:--------:|:------:|
- | No log | 1.0 | 273 | 0.1776 | 0.7486 | 0.5520 | 0.2169 | 0.001 |
- | 0.2736 | 2.0 | 546 | 0.1528 | 0.7693 | 0.5698 | 0.2405 | 0.001 |
- | 0.2736 | 3.0 | 819 | 0.1483 | 0.7774 | 0.6217 | 0.2495 | 0.001 |
- | 0.1699 | 4.0 | 1092 | 0.1467 | 0.7772 | 0.6272 | 0.2554 | 0.001 |
- | 0.1699 | 5.0 | 1365 | 0.1453 | 0.7772 | 0.6281 | 0.2464 | 0.001 |
- | 0.1622 | 6.0 | 1638 | 0.1437 | 0.7810 | 0.6191 | 0.2630 | 0.001 |
- | 0.1622 | 7.0 | 1911 | 0.1428 | 0.7811 | 0.6172 | 0.2599 | 0.001 |
- | 0.1593 | 8.0 | 2184 | 0.1433 | 0.7799 | 0.6253 | 0.2557 | 0.001 |
- | 0.1593 | 9.0 | 2457 | 0.1420 | 0.7893 | 0.6530 | 0.2519 | 0.001 |
- | 0.1569 | 10.0 | 2730 | 0.2462 | 0.7441 | 0.6022 | 0.2554 | 0.001 |
- | 0.156 | 11.0 | 3003 | 0.1400 | 0.7883 | 0.6483 | 0.2703 | 0.001 |
- | 0.156 | 12.0 | 3276 | 0.1400 | 0.7906 | 0.6590 | 0.2599 | 0.001 |
- | 0.1547 | 13.0 | 3549 | 0.1394 | 0.7876 | 0.6467 | 0.2644 | 0.001 |
- | 0.1547 | 14.0 | 3822 | 0.1399 | 0.7879 | 0.6469 | 0.2640 | 0.001 |
- | 0.1543 | 15.0 | 4095 | 0.1391 | 0.7881 | 0.6413 | 0.2512 | 0.001 |
- | 0.1543 | 16.0 | 4368 | 0.1414 | 0.7907 | 0.6348 | 0.2633 | 0.001 |
- | 0.1536 | 17.0 | 4641 | 0.1403 | 0.7903 | 0.6445 | 0.2616 | 0.001 |
- | 0.1536 | 18.0 | 4914 | 0.1404 | 0.7929 | 0.6464 | 0.2602 | 0.001 |
- | 0.1556 | 19.0 | 5187 | 0.1404 | 0.7936 | 0.6526 | 0.2585 | 0.001 |
- | 0.1556 | 20.0 | 5460 | 0.1391 | 0.7900 | 0.6491 | 0.2550 | 0.001 |
- | 0.1534 | 21.0 | 5733 | 0.1383 | 0.7917 | 0.6508 | 0.2581 | 0.001 |
- | 0.1533 | 22.0 | 6006 | 0.1389 | 0.7935 | 0.6489 | 0.2685 | 0.001 |
- | 0.1533 | 23.0 | 6279 | 0.1385 | 0.7843 | 0.6550 | 0.2561 | 0.001 |
- | 0.1531 | 24.0 | 6552 | 0.1367 | 0.7921 | 0.6507 | 0.2696 | 0.001 |
- | 0.1531 | 25.0 | 6825 | 0.1379 | 0.7894 | 0.6408 | 0.2751 | 0.001 |
- | 0.1533 | 26.0 | 7098 | 0.1375 | 0.7943 | 0.6469 | 0.2710 | 0.001 |
- | 0.1533 | 27.0 | 7371 | 0.1392 | 0.7921 | 0.6516 | 0.2644 | 0.001 |
- | 0.1529 | 28.0 | 7644 | 0.1385 | 0.7918 | 0.6425 | 0.2637 | 0.001 |
- | 0.1529 | 29.0 | 7917 | 0.1402 | 0.7883 | 0.6497 | 0.2626 | 0.001 |
- | 0.153 | 30.0 | 8190 | 0.1377 | 0.7887 | 0.6553 | 0.2668 | 0.001 |
- | 0.153 | 31.0 | 8463 | 0.1313 | 0.8018 | 0.6718 | 0.2789 | 0.0001 |
- | 0.1486 | 32.0 | 8736 | 0.1318 | 0.8061 | 0.6772 | 0.2807 | 0.0001 |
- | 0.1415 | 33.0 | 9009 | 0.1309 | 0.8050 | 0.6792 | 0.2775 | 0.0001 |
- | 0.1415 | 34.0 | 9282 | 0.1296 | 0.8049 | 0.6775 | 0.2821 | 0.0001 |
- | 0.1395 | 35.0 | 9555 | 0.1282 | 0.8085 | 0.6865 | 0.2893 | 0.0001 |
- | 0.1395 | 36.0 | 9828 | 0.1289 | 0.8055 | 0.6828 | 0.2831 | 0.0001 |
- | 0.1387 | 37.0 | 10101 | 0.1277 | 0.8055 | 0.6775 | 0.2831 | 0.0001 |
- | 0.1387 | 38.0 | 10374 | 0.1275 | 0.8084 | 0.6883 | 0.2890 | 0.0001 |
- | 0.1354 | 39.0 | 10647 | 0.1266 | 0.8099 | 0.6854 | 0.2879 | 0.0001 |
- | 0.1354 | 40.0 | 10920 | 0.1282 | 0.8117 | 0.6981 | 0.2886 | 0.0001 |
- | 0.1355 | 41.0 | 11193 | 0.1267 | 0.8082 | 0.6851 | 0.2883 | 0.0001 |
- | 0.1355 | 42.0 | 11466 | 0.1262 | 0.8112 | 0.6942 | 0.2907 | 0.0001 |
- | 0.1347 | 43.0 | 11739 | 0.1259 | 0.8107 | 0.6908 | 0.2911 | 0.0001 |
- | 0.1337 | 44.0 | 12012 | 0.1264 | 0.8115 | 0.6925 | 0.2931 | 0.0001 |
- | 0.1337 | 45.0 | 12285 | 0.1258 | 0.8110 | 0.6975 | 0.2966 | 0.0001 |
- | 0.1329 | 46.0 | 12558 | 0.1254 | 0.8109 | 0.6941 | 0.2987 | 0.0001 |
- | 0.1329 | 47.0 | 12831 | 0.1257 | 0.8098 | 0.6937 | 0.2921 | 0.0001 |
- | 0.1331 | 48.0 | 13104 | 0.1254 | 0.8107 | 0.6905 | 0.2914 | 0.0001 |
- | 0.1331 | 49.0 | 13377 | 0.1252 | 0.8137 | 0.6974 | 0.2945 | 0.0001 |
- | 0.1309 | 50.0 | 13650 | 0.1248 | 0.8150 | 0.7026 | 0.2983 | 0.0001 |
- | 0.1309 | 51.0 | 13923 | 0.1246 | 0.8158 | 0.7067 | 0.2959 | 0.0001 |
- | 0.1304 | 52.0 | 14196 | 0.1246 | 0.8121 | 0.7009 | 0.2952 | 0.0001 |
- | 0.1304 | 53.0 | 14469 | 0.1242 | 0.8143 | 0.6974 | 0.2990 | 0.0001 |
- | 0.1309 | 54.0 | 14742 | 0.1241 | 0.8135 | 0.7001 | 0.2966 | 0.0001 |
- | 0.1289 | 55.0 | 15015 | 0.1242 | 0.8131 | 0.6997 | 0.2952 | 0.0001 |
- | 0.1289 | 56.0 | 15288 | 0.1235 | 0.8179 | 0.7064 | 0.3021 | 0.0001 |
- | 0.1286 | 57.0 | 15561 | 0.1235 | 0.8150 | 0.6963 | 0.2994 | 0.0001 |
- | 0.1286 | 58.0 | 15834 | 0.1231 | 0.8145 | 0.7012 | 0.2983 | 0.0001 |
- | 0.1282 | 59.0 | 16107 | 0.1234 | 0.8153 | 0.7022 | 0.3001 | 0.0001 |
- | 0.1282 | 60.0 | 16380 | 0.1239 | 0.8122 | 0.6978 | 0.2973 | 0.0001 |
- | 0.1282 | 61.0 | 16653 | 0.1236 | 0.8158 | 0.7114 | 0.3015 | 0.0001 |
- | 0.1282 | 62.0 | 16926 | 0.1227 | 0.8168 | 0.7120 | 0.3032 | 0.0001 |
- | 0.1265 | 63.0 | 17199 | 0.1231 | 0.8137 | 0.7077 | 0.2949 | 0.0001 |
- | 0.1265 | 64.0 | 17472 | 0.1228 | 0.8172 | 0.7084 | 0.3056 | 0.0001 |
- | 0.1273 | 65.0 | 17745 | 0.1232 | 0.8183 | 0.7103 | 0.3077 | 0.0001 |
- | 0.1258 | 66.0 | 18018 | 0.1226 | 0.8179 | 0.7065 | 0.3035 | 0.0001 |
- | 0.1258 | 67.0 | 18291 | 0.1228 | 0.8185 | 0.7105 | 0.3053 | 0.0001 |
- | 0.125 | 68.0 | 18564 | 0.1228 | 0.8181 | 0.7128 | 0.3042 | 0.0001 |
- | 0.125 | 69.0 | 18837 | 0.1228 | 0.8137 | 0.7038 | 0.3053 | 0.0001 |
- | 0.125 | 70.0 | 19110 | 0.1232 | 0.8155 | 0.7080 | 0.3018 | 0.0001 |
- | 0.125 | 71.0 | 19383 | 0.1231 | 0.8156 | 0.7111 | 0.2990 | 0.0001 |
- | 0.1245 | 72.0 | 19656 | 0.1223 | 0.8162 | 0.7150 | 0.3008 | 0.0001 |
- | 0.1245 | 73.0 | 19929 | 0.1223 | 0.8174 | 0.7042 | 0.3049 | 0.0001 |
- | 0.1248 | 74.0 | 20202 | 0.1237 | 0.8125 | 0.7009 | 0.2963 | 0.0001 |
- | 0.1248 | 75.0 | 20475 | 0.1225 | 0.8152 | 0.7045 | 0.3046 | 0.0001 |
- | 0.1249 | 76.0 | 20748 | 0.1247 | 0.8160 | 0.7099 | 0.3008 | 0.0001 |
- | 0.1238 | 77.0 | 21021 | 0.1225 | 0.8179 | 0.7139 | 0.2990 | 0.0001 |
- | 0.1238 | 78.0 | 21294 | 0.1222 | 0.8188 | 0.7061 | 0.3046 | 0.0001 |
- | 0.1233 | 79.0 | 21567 | 0.1246 | 0.8152 | 0.7101 | 0.3018 | 0.0001 |
- | 0.1233 | 80.0 | 21840 | 0.1221 | 0.8180 | 0.7103 | 0.3039 | 0.0001 |
- | 0.1225 | 81.0 | 22113 | 0.1212 | 0.8185 | 0.7157 | 0.3018 | 0.0001 |
- | 0.1225 | 82.0 | 22386 | 0.1216 | 0.8152 | 0.7089 | 0.3080 | 0.0001 |
- | 0.1216 | 83.0 | 22659 | 0.1214 | 0.8165 | 0.7090 | 0.3080 | 0.0001 |
- | 0.1216 | 84.0 | 22932 | 0.1216 | 0.8169 | 0.7100 | 0.3070 | 0.0001 |
- | 0.1232 | 85.0 | 23205 | 0.1216 | 0.8188 | 0.7109 | 0.3053 | 0.0001 |
- | 0.1232 | 86.0 | 23478 | 0.1219 | 0.8191 | 0.7176 | 0.3070 | 0.0001 |
- | 0.1221 | 87.0 | 23751 | 0.1220 | 0.8177 | 0.7080 | 0.3063 | 0.0001 |
- | 0.1208 | 88.0 | 24024 | 0.1210 | 0.8211 | 0.7158 | 0.3049 | 1e-05 |
- | 0.1208 | 89.0 | 24297 | 0.1210 | 0.8241 | 0.7312 | 0.3073 | 1e-05 |
- | 0.1189 | 90.0 | 24570 | 0.1206 | 0.8235 | 0.7232 | 0.3070 | 1e-05 |
- | 0.1189 | 91.0 | 24843 | 0.1204 | 0.8191 | 0.7148 | 0.3087 | 1e-05 |
- | 0.1181 | 92.0 | 25116 | 0.1203 | 0.8194 | 0.7132 | 0.3087 | 1e-05 |
- | 0.1181 | 93.0 | 25389 | 0.1204 | 0.8215 | 0.7184 | 0.3084 | 1e-05 |
- | 0.1183 | 94.0 | 25662 | 0.1201 | 0.8207 | 0.7195 | 0.3067 | 1e-05 |
- | 0.1183 | 95.0 | 25935 | 0.1201 | 0.8197 | 0.7158 | 0.3084 | 1e-05 |
- | 0.117 | 96.0 | 26208 | 0.1198 | 0.8217 | 0.7193 | 0.3053 | 1e-05 |
- | 0.117 | 97.0 | 26481 | 0.1201 | 0.8205 | 0.7211 | 0.3063 | 1e-05 |
- | 0.1176 | 98.0 | 26754 | 0.1201 | 0.8227 | 0.7250 | 0.3080 | 1e-05 |
- | 0.1176 | 99.0 | 27027 | 0.1200 | 0.8206 | 0.7226 | 0.3073 | 1e-05 |
- | 0.1176 | 100.0 | 27300 | 0.1200 | 0.8216 | 0.7191 | 0.3080 | 1e-05 |
- | 0.117 | 101.0 | 27573 | 0.1199 | 0.8228 | 0.7242 | 0.3098 | 1e-05 |
- | 0.117 | 102.0 | 27846 | 0.1204 | 0.8192 | 0.7216 | 0.3073 | 1e-05 |
- | 0.1159 | 103.0 | 28119 | 0.1200 | 0.8222 | 0.7232 | 0.3067 | 0.0000 |
- | 0.1159 | 104.0 | 28392 | 0.1204 | 0.8218 | 0.7235 | 0.3101 | 0.0000 |
- | 0.1151 | 105.0 | 28665 | 0.1199 | 0.8219 | 0.7206 | 0.3077 | 0.0000 |
- | 0.1151 | 106.0 | 28938 | 0.1198 | 0.8228 | 0.7270 | 0.3105 | 0.0000 |
-
-
- ### Framework versions
-
- - Transformers 4.44.2
- - Pytorch 2.4.1+cu121
- - Datasets 3.0.0
- - Tokenizers 0.19.1

+
  ---
+ language:
+ - eng
+ license: wtfpl
  tags:
+ - multilabel-image-classification
+ - multilabel
  - generated_from_trainer
+ base_model: facebook/dinov2-large
  model-index:
  - name: DinoVdeauTest-large-2024_09_24-batch-size32_freeze
  results: []
  ---

+ DinoVdeauTest is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

  - Loss: 0.1204
  - F1 Micro: 0.8214
  - F1 Macro: 0.7191
  - Accuracy: 0.3135
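+
+ In this multilabel setting, "Accuracy" is plausibly subset accuracy (the whole predicted label set must match exactly), which would explain why it sits far below the F1 scores. A sketch of how such metrics are typically computed, using hypothetical toy arrays rather than this model's outputs:
+
+ ```python
+ import numpy as np
+ from sklearn.metrics import accuracy_score, f1_score
+
+ # Toy multilabel ground truth and predictions (2 samples, 3 labels).
+ y_true = np.array([[1, 0, 1], [0, 1, 0]])
+ y_pred = np.array([[1, 0, 0], [0, 1, 0]])
+
+ print(f1_score(y_true, y_pred, average="micro"))  # pools TP/FP/FN over all labels
+ print(f1_score(y_true, y_pred, average="macro"))  # unweighted mean of per-label F1
+ print(accuracy_score(y_true, y_pred))             # exact-match (subset) accuracy: 0.5
+ ```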
 
+ ---
+
+ # Model description
+ DinoVdeauTest is built on top of the facebook/dinov2-large model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
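+
+ A minimal sketch of what such a head could look like in PyTorch (layer sizes are illustrative assumptions, not taken from the training code):
+
+ ```python
+ import torch.nn as nn
+
+ NUM_LABELS = 31     # number of classes in the table below
+ HIDDEN_SIZE = 1024  # DINOv2-large embedding size
+
+ # Linear -> BatchNorm -> ReLU -> Dropout, then a final linear layer producing
+ # one logit per label; a sigmoid (not softmax) turns logits into per-label scores.
+ head = nn.Sequential(
+     nn.Linear(HIDDEN_SIZE, 512),
+     nn.BatchNorm1d(512),
+     nn.ReLU(),
+     nn.Dropout(p=0.3),
+     nn.Linear(512, NUM_LABELS),
+ )
+ ```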

+ The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

+ - **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)

+ ---
+
+ # Intended uses & limitations
+ You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
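+
+ A minimal inference sketch with the transformers library, assuming the checkpoint loads with the standard image-classification classes (the repository id, image path, and 0.5 decision threshold are assumptions; adjust them to your setup):
+
+ ```python
+ import torch
+ from PIL import Image
+ from transformers import AutoImageProcessor, AutoModelForImageClassification
+
+ repo = "groderg/DinoVdeauTest-large-2024_09_24-batch-size32_freeze"  # hypothetical repo id
+ processor = AutoImageProcessor.from_pretrained(repo)
+ model = AutoModelForImageClassification.from_pretrained(repo)
+
+ image = Image.open("reef_quadrat.jpg")  # placeholder image path
+ inputs = processor(images=image, return_tensors="pt")
+ with torch.no_grad():
+     logits = model(**inputs).logits
+
+ # Multilabel: apply a sigmoid per label and keep scores above the threshold.
+ probs = torch.sigmoid(logits)[0]
+ labels = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
+ print(labels)
+ ```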
+
+ ---

+ # Training and evaluation data
+ Details on the number of images for each class are given in the following table:
+ | Class | train | val | test | Total |
+ |:-------------------------|--------:|------:|-------:|--------:|
+ | Acropore_branched | 1469 | 464 | 475 | 2408 |
+ | Acropore_digitised | 568 | 160 | 160 | 888 |
+ | Acropore_sub_massive | 150 | 50 | 43 | 243 |
+ | Acropore_tabular | 999 | 297 | 293 | 1589 |
+ | Algae_assembly | 2546 | 847 | 845 | 4238 |
+ | Algae_drawn_up | 367 | 126 | 127 | 620 |
+ | Algae_limestone | 1652 | 557 | 563 | 2772 |
+ | Algae_sodding | 3148 | 984 | 985 | 5117 |
+ | Atra/Leucospilota | 1084 | 348 | 360 | 1792 |
+ | Bleached_coral | 219 | 71 | 70 | 360 |
+ | Blurred | 191 | 67 | 62 | 320 |
+ | Dead_coral | 1979 | 642 | 643 | 3264 |
+ | Fish | 2018 | 656 | 647 | 3321 |
+ | Homo_sapiens | 161 | 62 | 59 | 282 |
+ | Human_object | 157 | 58 | 55 | 270 |
+ | Living_coral | 406 | 154 | 141 | 701 |
+ | Millepore | 385 | 127 | 125 | 637 |
+ | No_acropore_encrusting | 441 | 130 | 154 | 725 |
+ | No_acropore_foliaceous | 204 | 36 | 46 | 286 |
+ | No_acropore_massive | 1031 | 336 | 338 | 1705 |
+ | No_acropore_solitary | 202 | 53 | 48 | 303 |
+ | No_acropore_sub_massive | 1401 | 433 | 422 | 2256 |
+ | Rock | 4489 | 1495 | 1473 | 7457 |
+ | Rubble | 3092 | 1030 | 1001 | 5123 |
+ | Sand | 5842 | 1939 | 1938 | 9719 |
+ | Sea_cucumber | 1408 | 439 | 447 | 2294 |
+ | Sea_urchins | 327 | 107 | 111 | 545 |
+ | Sponge | 269 | 96 | 105 | 470 |
+ | Syringodium_isoetifolium | 1212 | 392 | 391 | 1995 |
+ | Thalassodendron_ciliatum | 782 | 261 | 260 | 1303 |
+ | Useless | 579 | 193 | 193 | 965 |

+ ---

+ # Training procedure

+ ## Training hyperparameters

  The following hyperparameters were used during training:
+
+ - **Number of Epochs**: 106
+ - **Learning Rate**: 0.001
+ - **Train Batch Size**: 32
+ - **Eval Batch Size**: 32
+ - **Optimizer**: Adam
+ - **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (see the sketch after this list)
+ - **Freeze Encoder**: Yes
+ - **Data Augmentation**: Yes
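+
+ A sketch of this optimizer/scheduler combination in PyTorch (the stand-in model and the placeholder validation loss are hypothetical; the real training loop lives in the linked repository):
+
+ ```python
+ import torch
+ import torch.nn as nn
+
+ model = nn.Linear(1024, 31)  # stand-in for the classification head
+ optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
+ scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
+     optimizer, mode="min", factor=0.1, patience=5
+ )
+
+ for epoch in range(106):
+     val_loss = 0.12  # placeholder: compute the real validation loss here
+     scheduler.step(val_loss)  # LR decays 0.001 -> 0.0001 -> 1e-05 -> 1e-06 on plateaus
+ ```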
+
+ ## Data Augmentation
+ Data were augmented using the following transformations:
+
+ Train Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **RandomHorizontalFlip**: probability=0.25
+ - **RandomVerticalFlip**: probability=0.25
+ - **ColorJiggle**: probability=0.25
+ - **RandomPerspective**: probability=0.25
+ - **Normalize**: probability=1.00
+
+ Val Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **Normalize**: probability=1.00
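+
+ The transform names match Kornia's augmentation API (ColorJiggle in particular), so the train-time pipeline plausibly looks like the sketch below; the image size, ColorJiggle strengths, and normalization statistics are assumptions, and only the listed probabilities come from this card:
+
+ ```python
+ import torch
+ import kornia.augmentation as K
+
+ train_transforms = torch.nn.Sequential(
+     K.Resize((518, 518)),  # assumed DINOv2 input resolution
+     K.RandomHorizontalFlip(p=0.25),
+     K.RandomVerticalFlip(p=0.25),
+     K.ColorJiggle(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1, p=0.25),
+     K.RandomPerspective(p=0.25),
+     K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
+                 std=torch.tensor([0.229, 0.224, 0.225])),
+ )
+ ```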
+
+ ## Training results
+ | Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
+ |:-----:|:---------------:|:--------:|:--------:|:--------:|:-------------:|
+ | 1 | 0.1776 | 0.2169 | 0.7486 | 0.5520 | 0.001 |
+ | 2 | 0.1528 | 0.2405 | 0.7693 | 0.5698 | 0.001 |
+ | 3 | 0.1483 | 0.2495 | 0.7774 | 0.6217 | 0.001 |
+ | 4 | 0.1467 | 0.2554 | 0.7772 | 0.6272 | 0.001 |
+ | 5 | 0.1453 | 0.2464 | 0.7772 | 0.6281 | 0.001 |
+ | 6 | 0.1437 | 0.2630 | 0.7810 | 0.6191 | 0.001 |
+ | 7 | 0.1428 | 0.2599 | 0.7811 | 0.6172 | 0.001 |
+ | 8 | 0.1433 | 0.2557 | 0.7799 | 0.6253 | 0.001 |
+ | 9 | 0.1420 | 0.2519 | 0.7893 | 0.6530 | 0.001 |
+ | 10 | 0.2462 | 0.2554 | 0.7441 | 0.6022 | 0.001 |
+ | 11 | 0.1400 | 0.2703 | 0.7883 | 0.6483 | 0.001 |
+ | 12 | 0.1400 | 0.2599 | 0.7906 | 0.6590 | 0.001 |
+ | 13 | 0.1394 | 0.2644 | 0.7876 | 0.6467 | 0.001 |
+ | 14 | 0.1399 | 0.2640 | 0.7879 | 0.6469 | 0.001 |
+ | 15 | 0.1391 | 0.2512 | 0.7881 | 0.6413 | 0.001 |
+ | 16 | 0.1414 | 0.2633 | 0.7907 | 0.6348 | 0.001 |
+ | 17 | 0.1403 | 0.2616 | 0.7903 | 0.6445 | 0.001 |
+ | 18 | 0.1404 | 0.2602 | 0.7929 | 0.6464 | 0.001 |
+ | 19 | 0.1404 | 0.2585 | 0.7936 | 0.6526 | 0.001 |
+ | 20 | 0.1391 | 0.2550 | 0.7900 | 0.6491 | 0.001 |
+ | 21 | 0.1383 | 0.2581 | 0.7917 | 0.6508 | 0.001 |
+ | 22 | 0.1389 | 0.2685 | 0.7935 | 0.6489 | 0.001 |
+ | 23 | 0.1385 | 0.2561 | 0.7843 | 0.6550 | 0.001 |
+ | 24 | 0.1367 | 0.2696 | 0.7921 | 0.6507 | 0.001 |
+ | 25 | 0.1379 | 0.2751 | 0.7894 | 0.6408 | 0.001 |
+ | 26 | 0.1375 | 0.2710 | 0.7943 | 0.6469 | 0.001 |
+ | 27 | 0.1392 | 0.2644 | 0.7921 | 0.6516 | 0.001 |
+ | 28 | 0.1385 | 0.2637 | 0.7918 | 0.6425 | 0.001 |
+ | 29 | 0.1402 | 0.2626 | 0.7883 | 0.6497 | 0.001 |
+ | 30 | 0.1377 | 0.2668 | 0.7887 | 0.6553 | 0.001 |
+ | 31 | 0.1313 | 0.2789 | 0.8018 | 0.6718 | 0.0001 |
+ | 32 | 0.1318 | 0.2807 | 0.8061 | 0.6772 | 0.0001 |
+ | 33 | 0.1309 | 0.2775 | 0.8050 | 0.6792 | 0.0001 |
+ | 34 | 0.1296 | 0.2821 | 0.8049 | 0.6775 | 0.0001 |
+ | 35 | 0.1282 | 0.2893 | 0.8085 | 0.6865 | 0.0001 |
+ | 36 | 0.1289 | 0.2831 | 0.8055 | 0.6828 | 0.0001 |
+ | 37 | 0.1277 | 0.2831 | 0.8055 | 0.6775 | 0.0001 |
+ | 38 | 0.1275 | 0.2890 | 0.8084 | 0.6883 | 0.0001 |
+ | 39 | 0.1266 | 0.2879 | 0.8099 | 0.6854 | 0.0001 |
+ | 40 | 0.1282 | 0.2886 | 0.8117 | 0.6981 | 0.0001 |
+ | 41 | 0.1267 | 0.2883 | 0.8082 | 0.6851 | 0.0001 |
+ | 42 | 0.1262 | 0.2907 | 0.8112 | 0.6942 | 0.0001 |
+ | 43 | 0.1259 | 0.2911 | 0.8107 | 0.6908 | 0.0001 |
+ | 44 | 0.1264 | 0.2931 | 0.8115 | 0.6925 | 0.0001 |
+ | 45 | 0.1258 | 0.2966 | 0.8110 | 0.6975 | 0.0001 |
+ | 46 | 0.1254 | 0.2987 | 0.8109 | 0.6941 | 0.0001 |
+ | 47 | 0.1257 | 0.2921 | 0.8098 | 0.6937 | 0.0001 |
+ | 48 | 0.1254 | 0.2914 | 0.8107 | 0.6905 | 0.0001 |
+ | 49 | 0.1252 | 0.2945 | 0.8137 | 0.6974 | 0.0001 |
+ | 50 | 0.1248 | 0.2983 | 0.8150 | 0.7026 | 0.0001 |
+ | 51 | 0.1246 | 0.2959 | 0.8158 | 0.7067 | 0.0001 |
+ | 52 | 0.1246 | 0.2952 | 0.8121 | 0.7009 | 0.0001 |
+ | 53 | 0.1242 | 0.2990 | 0.8143 | 0.6974 | 0.0001 |
+ | 54 | 0.1241 | 0.2966 | 0.8135 | 0.7001 | 0.0001 |
+ | 55 | 0.1242 | 0.2952 | 0.8131 | 0.6997 | 0.0001 |
+ | 56 | 0.1235 | 0.3021 | 0.8179 | 0.7064 | 0.0001 |
+ | 57 | 0.1235 | 0.2994 | 0.8150 | 0.6963 | 0.0001 |
+ | 58 | 0.1231 | 0.2983 | 0.8145 | 0.7012 | 0.0001 |
+ | 59 | 0.1234 | 0.3001 | 0.8153 | 0.7022 | 0.0001 |
+ | 60 | 0.1239 | 0.2973 | 0.8122 | 0.6978 | 0.0001 |
+ | 61 | 0.1236 | 0.3015 | 0.8158 | 0.7114 | 0.0001 |
+ | 62 | 0.1227 | 0.3032 | 0.8168 | 0.7120 | 0.0001 |
+ | 63 | 0.1231 | 0.2949 | 0.8137 | 0.7077 | 0.0001 |
+ | 64 | 0.1228 | 0.3056 | 0.8172 | 0.7084 | 0.0001 |
+ | 65 | 0.1232 | 0.3077 | 0.8183 | 0.7103 | 0.0001 |
+ | 66 | 0.1226 | 0.3035 | 0.8179 | 0.7065 | 0.0001 |
+ | 67 | 0.1228 | 0.3053 | 0.8185 | 0.7105 | 0.0001 |
+ | 68 | 0.1228 | 0.3042 | 0.8181 | 0.7128 | 0.0001 |
+ | 69 | 0.1228 | 0.3053 | 0.8137 | 0.7038 | 0.0001 |
+ | 70 | 0.1232 | 0.3018 | 0.8155 | 0.7080 | 0.0001 |
+ | 71 | 0.1231 | 0.2990 | 0.8156 | 0.7111 | 0.0001 |
+ | 72 | 0.1223 | 0.3008 | 0.8162 | 0.7150 | 0.0001 |
+ | 73 | 0.1223 | 0.3049 | 0.8174 | 0.7042 | 0.0001 |
+ | 74 | 0.1237 | 0.2963 | 0.8125 | 0.7009 | 0.0001 |
+ | 75 | 0.1225 | 0.3046 | 0.8152 | 0.7045 | 0.0001 |
+ | 76 | 0.1247 | 0.3008 | 0.8160 | 0.7099 | 0.0001 |
+ | 77 | 0.1225 | 0.2990 | 0.8179 | 0.7139 | 0.0001 |
+ | 78 | 0.1222 | 0.3046 | 0.8188 | 0.7061 | 0.0001 |
+ | 79 | 0.1246 | 0.3018 | 0.8152 | 0.7101 | 0.0001 |
+ | 80 | 0.1221 | 0.3039 | 0.8180 | 0.7103 | 0.0001 |
+ | 81 | 0.1212 | 0.3018 | 0.8185 | 0.7157 | 0.0001 |
+ | 82 | 0.1216 | 0.3080 | 0.8152 | 0.7089 | 0.0001 |
+ | 83 | 0.1214 | 0.3080 | 0.8165 | 0.7090 | 0.0001 |
+ | 84 | 0.1216 | 0.3070 | 0.8169 | 0.7100 | 0.0001 |
+ | 85 | 0.1216 | 0.3053 | 0.8188 | 0.7109 | 0.0001 |
+ | 86 | 0.1219 | 0.3070 | 0.8191 | 0.7176 | 0.0001 |
+ | 87 | 0.1220 | 0.3063 | 0.8177 | 0.7080 | 0.0001 |
+ | 88 | 0.1210 | 0.3049 | 0.8211 | 0.7158 | 1e-05 |
+ | 89 | 0.1210 | 0.3073 | 0.8241 | 0.7312 | 1e-05 |
+ | 90 | 0.1206 | 0.3070 | 0.8235 | 0.7232 | 1e-05 |
+ | 91 | 0.1204 | 0.3087 | 0.8191 | 0.7148 | 1e-05 |
+ | 92 | 0.1203 | 0.3087 | 0.8194 | 0.7132 | 1e-05 |
+ | 93 | 0.1204 | 0.3084 | 0.8215 | 0.7184 | 1e-05 |
+ | 94 | 0.1201 | 0.3067 | 0.8207 | 0.7195 | 1e-05 |
+ | 95 | 0.1201 | 0.3084 | 0.8197 | 0.7158 | 1e-05 |
+ | 96 | 0.1198 | 0.3053 | 0.8217 | 0.7193 | 1e-05 |
+ | 97 | 0.1201 | 0.3063 | 0.8205 | 0.7211 | 1e-05 |
+ | 98 | 0.1201 | 0.3080 | 0.8227 | 0.7250 | 1e-05 |
+ | 99 | 0.1200 | 0.3073 | 0.8206 | 0.7226 | 1e-05 |
+ | 100 | 0.1200 | 0.3080 | 0.8216 | 0.7191 | 1e-05 |
+ | 101 | 0.1199 | 0.3098 | 0.8228 | 0.7242 | 1e-05 |
+ | 102 | 0.1204 | 0.3073 | 0.8192 | 0.7216 | 1e-05 |
+ | 103 | 0.1200 | 0.3067 | 0.8222 | 0.7232 | 1e-06 |
+ | 104 | 0.1204 | 0.3101 | 0.8218 | 0.7235 | 1e-06 |
+ | 105 | 0.1199 | 0.3077 | 0.8219 | 0.7206 | 1e-06 |
+ | 106 | 0.1198 | 0.3105 | 0.8228 | 0.7270 | 1e-06 |
+
+ ---
+
+ # CO2 Emissions
+
+ The estimated CO2 emissions for training this model are documented below:
+
+ - **Emissions**: 1.1531 grams of CO2
+ - **Source**: Code Carbon
+ - **Training Type**: fine-tuning
+ - **Geographical Location**: Brest, France
+ - **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
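+
+ Such figures typically come from wrapping the training loop in a Code Carbon tracker, roughly as sketched here (a generic illustration, not the project's actual instrumentation):
+
+ ```python
+ from codecarbon import EmissionsTracker
+
+ tracker = EmissionsTracker()
+ tracker.start()
+ # ... run training here ...
+ emissions_kg = tracker.stop()  # estimated emissions in kg of CO2eq
+ print(f"{emissions_kg * 1000:.4f} g CO2eq")
+ ```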
+
+ ---
+
+ # Framework Versions
+
+ - **Transformers**: 4.44.2
+ - **Pytorch**: 2.4.1+cu121
+ - **Datasets**: 3.0.0
+ - **Tokenizers**: 0.19.1