---
language:
- eng
license: cc0-1.0
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel
model-index:
- name: Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel
  results: []
---

DinoVdeau is a fine-tuned version of [Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel](https://huggingface.co/Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel). It achieves the following results on the test set:

- Loss: 0.0494
- F1 Micro: 0.7640
- F1 Macro: 0.3461
- Accuracy: 0.7130

| Class | F1 per class |
|:------|-------------:|
| ALGAE | 0.7961 |
| Acr | 0.7462 |
| Acr_Br | 0.3797 |
| Anem | 0.6767 |
| CCA | 0.2710 |
| Ech | 0.3610 |
| Fts | 0.3889 |
| Gal | 0.4667 |
| Gon | 0.2222 |
| Mtp | 0.5521 |
| P | 0.3615 |
| Poc | 0.4367 |
| Por | 0.5018 |
| R | 0.7153 |
| RDC | 0.1781 |
| S | 0.8252 |
| SG | 0.8504 |
| Sarg | 0.6303 |
| Ser | 0.3252 |
| Slt | 0.4188 |
| Sp | 0.4198 |
| Turf | 0.6045 |
| UNK | 0.3763 |

---

# Model description

DinoVdeau is a model built on top of the Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel model for underwater multilabel image classification. The classification head combines linear, ReLU, batch normalization, and dropout layers.
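
The card does not list the exact layer sizes of this head. As a rough illustration only, a head of that shape could be sketched in PyTorch as follows; the 1024-dimensional input (a ViT-large feature size), the 512-unit hidden layer, and the dropout rate are assumptions rather than card values:

```python
import torch
import torch.nn as nn

# Hypothetical head: linear, ReLU, batch normalization, and dropout,
# ending in one logit per class. Sizes below are assumptions.
num_classes = 23  # the classes listed in the tables on this card

head = nn.Sequential(
    nn.Linear(1024, 512),   # 1024-dim backbone features (assumed)
    nn.ReLU(),
    nn.BatchNorm1d(512),
    nn.Dropout(p=0.1),      # dropout rate not given on the card
    nn.Linear(512, num_classes),
)

features = torch.randn(8, 1024)  # dummy batch of pooled backbone features
logits = head(features)          # shape (8, 23)
```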

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)

---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, including coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
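
As a rough guide to multilabel inference, the sketch below assumes the checkpoint loads through the standard `transformers` image-classification classes and uses a hypothetical repo id and decision threshold; neither is confirmed by this card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id; replace with the actual location of this checkpoint.
repo_id = "lombardata/Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("reef_quadrat.jpg")  # any underwater image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel readout: an independent sigmoid per class; the 0.5 threshold
# is an assumption, not a documented operating point.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```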

---

# Training and evaluation data

Details on the number of images for each class are given in the following table:
| Class   |   train |   test |   val |   Total |
|:--------|--------:|-------:|------:|--------:|
| ALGAE   |   36874 |  12292 | 12292 |   61458 |
| Acr     |    5358 |   1787 |  1786 |    8931 |
| Acr_Br  |     123 |     42 |    42 |     207 |
| Anem    |     235 |     79 |    79 |     393 |
| CCA     |     918 |    306 |   306 |    1530 |
| Ech     |     618 |    206 |   206 |    1030 |
| Fts     |     168 |     57 |    57 |     282 |
| Gal     |     465 |    155 |   155 |     775 |
| Gon     |     158 |     53 |    53 |     264 |
| Mtp     |    2370 |    791 |   790 |    3951 |
| P       |    2658 |    887 |   886 |    4431 |
| Poc     |     549 |    184 |   183 |     916 |
| Por     |    1059 |    354 |   353 |    1766 |
| R       |   31437 |  10480 | 10479 |   52396 |
| RDC     |     930 |    310 |   310 |    1550 |
| S       |   57624 |  19209 | 19209 |   96042 |
| SG      |   25539 |   8513 |  8513 |   42565 |
| Sarg    |     285 |     96 |    96 |     477 |
| Ser     |     261 |     87 |    87 |     435 |
| Slt     |    2730 |    911 |   911 |    4552 |
| Sp      |     132 |     44 |    44 |     220 |
| Turf    |    1395 |    466 |   466 |    2327 |
| UNK     |     292 |     98 |    98 |     488 |

---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 150.0
- **Learning Rate**: 0.001
- **Train Batch Size**: 64
- **Eval Batch Size**: 64
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (see the sketch after this list)
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
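
A minimal sketch of this optimizer and scheduler setup in PyTorch is shown below; the stand-in model and the choice to monitor validation loss in `min` mode are assumptions:

```python
import torch
import torch.nn as nn

# Stand-in model so the snippet runs on its own; in the real setup this is
# the DinoVdeau network with its encoder frozen.
model = nn.Linear(1024, 23)

# Adam at the initial learning rate of 0.001 listed above.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# ReduceLROnPlateau with patience 5 and factor 0.1; monitoring validation
# loss in "min" mode is an assumption.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

# After each epoch, step the scheduler with the validation loss.
val_loss = 0.8  # placeholder value from a validation pass
scheduler.step(val_loss)
```

Enough epochs without improvement trigger a 10x cut, which matches the stepwise learning-rate column in the results table below (0.001, then 0.0001, 1e-05, and finally about 1e-06).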

## Data Augmentation

Data were augmented using the following transformations (a sketch of a comparable pipeline follows the lists):

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
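
`ColorJiggle` and `RandomPerspective` match transform names from `kornia.augmentation`, so a comparable train pipeline might look like the sketch below; the image size, jitter magnitudes, distortion scale, and normalization statistics are assumptions, and only the transform names and probabilities come from the lists above:

```python
import torch
import kornia.augmentation as K

# Sketch of the train transforms above with kornia.augmentation.
# Size, jitter strengths, distortion scale, and ImageNet statistics are
# assumptions; names and probabilities follow the lists on this card.
train_transforms = torch.nn.Sequential(
    K.Resize((518, 518)),
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(brightness=0.1, contrast=0.1, saturation=0.1, hue=0.1, p=0.25),
    K.RandomPerspective(distortion_scale=0.5, p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
                std=torch.tensor([0.229, 0.224, 0.225])),
)

images = torch.rand(4, 3, 640, 640)   # dummy batch of images in [0, 1]
augmented = train_transforms(images)  # shape (4, 3, 518, 518)
```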

## Training results

Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate
--- | --- | --- | --- | --- | ---
0 | N/A | N/A | N/A | N/A | 0.001
1 | 0.8028618097305298 | 0.7326527412414418 | 0.7326527412414418 | 0.2723318326456157 | 0.001
2 | 0.8038854002952576 | 0.7288723192975732 | 0.7288723192975732 | 0.290650504066453 | 0.001
3 | 0.7705450654029846 | 0.7408581732025574 | 0.7408581732025574 | 0.326985383349658 | 0.001
4 | 0.7623223066329956 | 0.7417118168673019 | 0.7417118168673019 | 0.310238611675961 | 0.001
5 | 0.7626621127128601 | 0.7383495061061653 | 0.7383495061061653 | 0.3107947240093857 | 0.001
6 | 0.7451828122138977 | 0.7447257016428285 | 0.7447257016428285 | 0.34331624846537584 | 0.001
7 | 0.7458378672599792 | 0.7467291510600861 | 0.7467291510600861 | 0.3283260141196405 | 0.001
8 | 0.7398682832717896 | 0.7458929286946221 | 0.7458929286946221 | 0.3352756381207012 | 0.001
9 | 0.7424591779708862 | 0.7455270814097316 | 0.7455270814097316 | 0.329402464136349 | 0.001
10 | 0.7364382147789001 | 0.7474782669291475 | 0.7474782669291475 | 0.31573051576045374 | 0.001
11 | 0.7368418574333191 | 0.7465375167680005 | 0.7465375167680005 | 0.34419509138343996 | 0.001
12 | 0.7442134022712708 | 0.7428093587219735 | 0.7428093587219735 | 0.3321199292959056 | 0.001
13 | 0.7384127378463745 | 0.7479312207104406 | 0.7479312207104406 | 0.35283143267744377 | 0.001
14 | 0.7464041113853455 | 0.7463633037751956 | 0.7463633037751956 | 0.33455884660313173 | 0.001
15 | 0.7394037842750549 | 0.7446734377449871 | 0.7446734377449871 | 0.34277495445998946 | 0.001
16 | 0.7397111058235168 | 0.7478789568125991 | 0.7478789568125991 | 0.3506456767847629 | 0.001
17 | 0.7110718488693237 | 0.7554398007003362 | 0.7554398007003362 | 0.3747287292827094 | 0.0001
18 | 0.7041681408882141 | 0.7567463981463738 | 0.7567463981463738 | 0.3792648315920964 | 0.0001
19 | 0.7004917860031128 | 0.7582446298844968 | 0.7582446298844968 | 0.38576725361345504 | 0.0001
20 | 0.6942671537399292 | 0.7602829219003153 | 0.7602829219003153 | 0.39339544323315423 | 0.0001
21 | 0.6919424533843994 | 0.7592899078413268 | 0.7592899078413268 | 0.3942710537489151 | 0.0001
22 | 0.6903713941574097 | 0.7606313478859253 | 0.7606313478859253 | 0.3925331634038099 | 0.0001
23 | 0.6874070167541504 | 0.7607010330830474 | 0.7607010330830474 | 0.39534246826429575 | 0.0001
24 | 0.6864963173866272 | 0.7612236720614624 | 0.7612236720614624 | 0.3933203620961568 | 0.0001
25 | 0.684335470199585 | 0.7614153063535478 | 0.7614153063535478 | 0.402343965759826 | 0.0001
26 | 0.6830293536186218 | 0.7629309593909513 | 0.7629309593909513 | 0.4055175650763901 | 0.0001
27 | 0.6827249526977539 | 0.7630703297851954 | 0.7630703297851954 | 0.40742696523740624 | 0.0001
28 | 0.6805527210235596 | 0.7629309593909513 | 0.7629309593909513 | 0.413578373717749 | 0.0001
29 | 0.6796479225158691 | 0.7625825334053413 | 0.7625825334053413 | 0.4138447052049864 | 0.0001
30 | 0.6774595379829407 | 0.7635929687636104 | 0.7635929687636104 | 0.41283784117218203 | 0.0001
31 | 0.677918553352356 | 0.7643769272312328 | 0.7643769272312328 | 0.4100283180344899 | 0.0001
32 | 0.6754601001739502 | 0.7641504503405864 | 0.7641504503405864 | 0.41093585032897456 | 0.0001
33 | 0.6749601364135742 | 0.7645162976254769 | 0.7645162976254769 | 0.4185852176848548 | 0.0001
34 | 0.6746455430984497 | 0.7650040940053309 | 0.7650040940053309 | 0.41458520697205553 | 0.0001
35 | 0.6740487813949585 | 0.7641852929391474 | 0.7641852929391474 | 0.42549626405712787 | 0.0001
36 | 0.6740365624427795 | 0.7646905106182819 | 0.7646905106182819 | 0.42042185334833837 | 0.0001
37 | 0.6731483936309814 | 0.7641678716398669 | 0.7641678716398669 | 0.4194338951085209 | 0.0001
38 | 0.671998918056488 | 0.7648821449103674 | 0.7648821449103674 | 0.42433192329853336 | 0.0001
39 | 0.6694707870483398 | 0.7659274228671974 | 0.7659274228671974 | 0.42040362773805245 | 0.0001
40 | 0.6693674325942993 | 0.7665023257434539 | 0.7665023257434539 | 0.42052839725843943 | 0.0001
41 | 0.6682748198509216 | 0.7658228950715145 | 0.7658228950715145 | 0.4176095837463332 | 0.0001
42 | 0.6682831645011902 | 0.7665371683420149 | 0.7665371683420149 | 0.4307432846853069 | 0.0001
43 | 0.6695142984390259 | 0.7664152192470515 | 0.7664152192470515 | 0.42637174689527435 | 0.0001
44 | 0.669391393661499 | 0.765840316370795 | 0.765840316370795 | 0.42609728385101026 | 0.0001
45 | 0.6695447564125061 | 0.7649518301074895 | 0.7649518301074895 | 0.4307801257532618 | 0.0001
46 | 0.6653340458869934 | 0.7673908120067595 | 0.7673908120067595 | 0.4362286100546047 | 0.0001
47 | 0.6659862995147705 | 0.7674430759046009 | 0.7674430759046009 | 0.43279477858934173 | 0.0001
48 | 0.665557861328125 | 0.7668855943276249 | 0.7668855943276249 | 0.43077383038275147 | 0.0001
49 | 0.6672787666320801 | 0.7666242748384174 | 0.7666242748384174 | 0.4241681801855841 | 0.0001
50 | 0.6661437749862671 | 0.7662584275535269 | 0.7662584275535269 | 0.43151276516162357 | 0.0001
51 | 0.6638755798339844 | 0.7667288026341005 | 0.7667288026341005 | 0.4307690989303114 | 0.0001
52 | 0.6654694676399231 | 0.7679134509851745 | 0.7679134509851745 | 0.4427799679621408 | 0.0001
53 | 0.6643231511116028 | 0.767234020313235 | 0.767234020313235 | 0.4341825403324277 | 0.0001
54 | 0.667382001876831 | 0.7663455340499294 | 0.7663455340499294 | 0.4459616186399035 | 0.0001
55 | 0.6627440452575684 | 0.7684709325621505 | 0.7684709325621505 | 0.4389385481332223 | 0.0001
56 | 0.6627209186553955 | 0.767094649918991 | 0.767094649918991 | 0.43857797557707 | 0.0001
57 | 0.6640397310256958 | 0.7669204369261859 | 0.7669204369261859 | 0.43847119749006624 | 0.0001
58 | 0.6627684235572815 | 0.7672862842110765 | 0.7672862842110765 | 0.43760916892251167 | 0.0001
59 | 0.6614954471588135 | 0.7679134509851745 | 0.7679134509851745 | 0.439932052501894 | 0.0001
60 | 0.6633245944976807 | 0.766990122123308 | 0.766990122123308 | 0.44188579219640917 | 0.0001
61 | 0.6611309051513672 | 0.7685754603578335 | 0.7685754603578335 | 0.4370683373238057 | 0.0001
62 | 0.660831093788147 | 0.7684360899635895 | 0.7684360899635895 | 0.45352172583851785 | 0.0001
63 | 0.6621896028518677 | 0.767791501890211 | 0.767791501890211 | 0.44611945476580944 | 0.0001
64 | 0.6610415577888489 | 0.767547603700284 | 0.767547603700284 | 0.4439334258360575 | 0.0001
65 | 0.6589834690093994 | 0.7680354000801379 | 0.7680354000801379 | 0.434573899819693 | 0.0001
66 | 0.6599727272987366 | 0.76845351126287 | 0.76845351126287 | 0.4397010901020239 | 0.0001
67 | 0.6572328209877014 | 0.769045835438407 | 0.769045835438407 | 0.44838573955584105 | 0.0001
68 | 0.658860445022583 | 0.7686799881535165 | 0.7686799881535165 | 0.4442341389313245 | 0.0001
69 | 0.659292995929718 | 0.7688542011463215 | 0.7688542011463215 | 0.43926108990057783 | 0.0001
70 | 0.658970832824707 | 0.7679134509851745 | 0.7679134509851745 | 0.4357439201261406 | 0.0001
71 | 0.6567061543464661 | 0.768244455671504 | 0.768244455671504 | 0.4432082950234043 | 0.0001
72 | 0.65887850522995 | 0.7681225065765405 | 0.7681225065765405 | 0.4369170898714745 | 0.0001
73 | 0.6611541509628296 | 0.7675998675981255 | 0.7675998675981255 | 0.4436833314710758 | 0.0001
74 | 0.6570971012115479 | 0.768488353861431 | 0.768488353861431 | 0.44906341810873124 | 0.0001
75 | 0.6557245254516602 | 0.768174770474382 | 0.768174770474382 | 0.44439364748025234 | 0.0001
76 | 0.658838152885437 | 0.7683489834671869 | 0.7683489834671869 | 0.4478607285233603 | 0.0001
77 | 0.6572225093841553 | 0.7686277242556749 | 0.7686277242556749 | 0.44887316801028765 | 0.0001
78 | 0.6562930941581726 | 0.768767094649919 | 0.768767094649919 | 0.4439839848601264 | 0.0001
79 | 0.6564787030220032 | 0.76810508527726 | 0.76810508527726 | 0.4379298766193662 | 0.0001
80 | 0.661143958568573 | 0.768383826065748 | 0.768383826065748 | 0.4460529321244195 | 0.0001
81 | 0.660437285900116 | 0.768941307642724 | 0.768941307642724 | 0.44750591322776384 | 0.0001
82 | 0.6531779766082764 | 0.7705614884758105 | 0.7705614884758105 | 0.4526720456188751 | 1e-05
83 | 0.6532895565032959 | 0.77019564119092 | 0.77019564119092 | 0.4489367718812771 | 1e-05
84 | 0.6505005359649658 | 0.7705266458772495 | 0.7705266458772495 | 0.45139096558153424 | 1e-05
85 | 0.6501905918121338 | 0.7708053866657375 | 0.7708053866657375 | 0.4565625671001629 | 1e-05
86 | 0.6507149338722229 | 0.7707182801693351 | 0.7707182801693351 | 0.4559124472677571 | 1e-05
87 | 0.6484472751617432 | 0.770927335760701 | 0.770927335760701 | 0.45838476700319497 | 1e-05
88 | 0.6496042013168335 | 0.77054406717653 | 0.77054406717653 | 0.4569340367642959 | 1e-05
89 | 0.6486304402351379 | 0.7713977108412745 | 0.7713977108412745 | 0.45601319436503734 | 1e-05
90 | 0.6490767598152161 | 0.7711886552499085 | 0.7711886552499085 | 0.45742417795749246 | 1e-05
91 | 0.6481940746307373 | 0.7704221180815666 | 0.7704221180815666 | 0.45182909085509243 | 1e-05
92 | 0.6477252244949341 | 0.7715545025347991 | 0.7715545025347991 | 0.45503456574154166 | 1e-05
93 | 0.6489835381507874 | 0.771206076549189 | 0.771206076549189 | 0.4518083431267937 | 1e-05
94 | 0.6485304832458496 | 0.770753122767896 | 0.770753122767896 | 0.45105746851856937 | 1e-05
95 | 0.647895336151123 | 0.771659030330482 | 0.771659030330482 | 0.45671492126995916 | 1e-05
96 | 0.6472702622413635 | 0.7715022386369575 | 0.7715022386369575 | 0.4596959338122155 | 1e-05
97 | 0.6460831165313721 | 0.7713977108412745 | 0.7713977108412745 | 0.4625401473178366 | 1e-05
98 | 0.6463102698326111 | 0.7722165119074581 | 0.7722165119074581 | 0.45892155771298887 | 1e-05
99 | 0.646852433681488 | 0.77089249316214 | 0.77089249316214 | 0.4549241891767946 | 1e-05
100 | 0.6456441879272461 | 0.7723384610024215 | 0.7723384610024215 | 0.45970146016594643 | 1e-05
101 | 0.6469387412071228 | 0.7717461368268845 | 0.7717461368268845 | 0.4573593202819609 | 1e-05
102 | 0.646738588809967 | 0.7717809794254455 | 0.7717809794254455 | 0.4593480600391769 | 1e-05
103 | 0.6467755436897278 | 0.7723036184038605 | 0.7723036184038605 | 0.4576617106244536 | 1e-05
104 | 0.6456966400146484 | 0.7725475165937876 | 0.7725475165937876 | 0.4578893476938316 | 1e-05
105 | 0.6455578804016113 | 0.7719377711189701 | 0.7719377711189701 | 0.45556962818046953 | 1e-05
106 | 0.6444206237792969 | 0.7723384610024215 | 0.7723384610024215 | 0.4644160307431161 | 1e-05
107 | 0.26552170515060425 | 0.04797825821849794 | 0.4910938804941607 | 0.360721302464235 | 1e-05
108 | 0.1419014185667038 | 0.44983536872179924 | 0.6693680656054029 | 0.22462065038139475 | 1e-05
109 | 0.07755623757839203 | 0.6714691381683245 | 0.7449736568518617 | 0.2136927959803109 | 1e-05
110 | 0.05802077427506447 | 0.6837163115625163 | 0.7489470111853911 | 0.26414762709864326 | 1e-05
111 | 0.053473543375730515 | 0.6935245030574381 | 0.7547299175391458 | 0.331603552491686 | 1e-05
112 | 0.05167479068040848 | 0.6998135920976987 | 0.7585280588776449 | 0.3483537603081725 | 1e-05
113 | 0.05106380954384804 | 0.7042734447135067 | 0.7611001027447527 | 0.3378893620669972 | 1e-05
114 | 0.05065497010946274 | 0.7053535652688978 | 0.7622047244094489 | 0.35703913789456687 | 1e-05
115 | 0.05039990693330765 | 0.7117820247034023 | 0.7647240545893983 | 0.36428903036337407 | 1e-05
116 | 0.05015714839100838 | 0.7101966864688769 | 0.7647289615591668 | 0.3622993891059384 | 1e-05
117 | 0.050175271928310394 | 0.712914409156635 | 0.7657223847509677 | 0.36544151863175506 | 1e-05
118 | 0.050468478351831436 | 0.7141687427048309 | 0.7654782537680462 | 0.3524192831401073 | 1e-05
119 | 0.049900032579898834 | 0.7127053535652689 | 0.7658673932788375 | 0.34416444697858145 | 1e-05
120 | 0.049903545528650284 | 0.7130886221494399 | 0.7657258505633957 | 0.35077501544817247 | 1e-05
121 | 0.04957958310842514 | 0.7140816362084285 | 0.7665756914119359 | 0.3627670615797559 | 1e-05
122 | 0.04973344877362251 | 0.7163812477134545 | 0.7672263726699065 | 0.35293584502475456 | 1e-05
123 | 0.04949206858873367 | 0.7153533910559049 | 0.7661930650098223 | 0.36741171960145996 | 1e-05
124 | 0.049613192677497864 | 0.7160676643264055 | 0.7673595994775795 | 0.36413807211107735 | 1e-05
125 | 0.04959910735487938 | 0.7124440340760614 | 0.7658070643240676 | 0.3509397162428964 | 1e-05
126 | 0.0494619682431221 | 0.7152662845595025 | 0.7660751240774316 | 0.37424111866414195 | 1e-05
127 | 0.049399666488170624 | 0.7149178585738925 | 0.766314294299216 | 0.35735030768195364 | 1e-05
128 | 0.04938925430178642 | 0.714412640894758 | 0.7664776721721585 | 0.36010176970077795 | 1e-05
129 | 0.049368634819984436 | 0.717931743349419 | 0.7674323253122921 | 0.3641550142658243 | 1e-05
130 | 0.049409620463848114 | 0.717687845159492 | 0.7667705923765463 | 0.3602711408206009 | 1e-05
131 | 0.04939533770084381 | 0.718210484137907 | 0.7665082507046622 | 0.3664294602272974 | 1e-05
132 | 0.04943186417222023 | 0.717583317363809 | 0.7664564319910658 | 0.3651176446191739 | 1e-05
133 | 0.049288176000118256 | 0.714621696486124 | 0.7658437005098911 | 0.36115748131858555 | 1e-05
134 | 0.04927274212241173 | 0.7154753401508684 | 0.7659699195779215 | 0.3677607284274943 | 1e-05
135 | 0.04929700121283531 | 0.7189944426055295 | 0.7674080308866179 | 0.37226237632641596 | 1e-05
136 | 0.049187980592250824 | 0.7151094928659779 | 0.766711291239524 | 0.35969130867283144 | 1e-05
137 | 0.0491538941860199 | 0.7157889235379175 | 0.7664713487937058 | 0.3631043131303743 | 1e-05
138 | 0.04930136725306511 | 0.7178446368530165 | 0.7665450277813434 | 0.3687935119842292 | 1e-05
139 | 0.04927237331867218 | 0.7182279054371875 | 0.766155421092079 | 0.35626916649899915 | 1e-05
140 | 0.04918988421559334 | 0.7197958223724326 | 0.767376184687937 | 0.3699733355385332 | 1e-05
141 | 0.04920462518930435 | 0.716468354209857 | 0.7666041104041745 | 0.35072596287625124 | 1e-05
142 | 0.04919710010290146 | 0.7194473963868225 | 0.7669340748803981 | 0.36600085102264546 | 1e-05
143 | 0.04930509999394417 | 0.7168342014947475 | 0.765517685242224 | 0.3673237139632794 | 1e-05
144 | 0.0490318201482296 | 0.7171477848817965 | 0.7667940015206897 | 0.3554021435508309 | 1.0000000000000002e-06
145 | 0.04918621480464935 | 0.7201616696573231 | 0.7677822164123848 | 0.3711029550898432 | 1.0000000000000002e-06
146 | 0.04903709515929222 | 0.717130363582516 | 0.7665065530257804 | 0.368326447075977 | 1.0000000000000002e-06
147 | 0.049094948917627335 | 0.720823679029982 | 0.768544776459646 | 0.37476196073915324 | 1.0000000000000002e-06
148 | 0.04907181113958359 | 0.7167296736990645 | 0.7667018106807243 | 0.3649988602534311 | 1.0000000000000002e-06
149 | 0.04904184117913246 | 0.718210484137907 | 0.7671139893046166 | 0.3787860055208151 | 1.0000000000000002e-06
150 | 0.04912904277443886 | 0.7154404975523074 | 0.7667920374277589 | 0.3726446424747262 | 1.0000000000000002e-06

---

# Framework Versions

- **Transformers**: 4.44.2
- **Pytorch**: 2.4.1+cu121
- **Datasets**: 3.0.0
- **Tokenizers**: 0.19.1