 
---
language:
- eng
license: cc0-1.0
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel
model-index:
- name: Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel
  results: []
---

DinoVdeau is a fine-tuned version of [Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel](https://huggingface.co/Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel). It achieves the following results on the test set:

- Loss: 0.0494
- F1 Micro: 0.7640
- F1 Macro: 0.3461
- Accuracy: 0.7130
| Class | F1 per class |
|----------|-------|
| ALGAE | 0.7961 |
| Acr | 0.7462 |
| Acr_Br | 0.3797 |
| Anem | 0.6767 |
| CCA | 0.2710 |
| Ech | 0.3610 |
| Fts | 0.3889 |
| Gal | 0.4667 |
| Gon | 0.2222 |
| Mtp | 0.5521 |
| P | 0.3615 |
| Poc | 0.4367 |
| Por | 0.5018 |
| R | 0.7153 |
| RDC | 0.1781 |
| S | 0.8252 |
| SG | 0.8504 |
| Sarg | 0.6303 |
| Ser | 0.3252 |
| Slt | 0.4188 |
| Sp | 0.4198 |
| Turf | 0.6045 |
| UNK | 0.3763 |
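F1 Macro (0.3461) sits far below F1 Micro (0.7640) because macro averaging gives every class equal weight, so rare classes with low F1 (e.g. RDC at 0.1781) drag it down, while micro averaging pools all predictions before computing F1. A minimal pure-Python illustration of the two averaging schemes, on toy labels rather than model output:

```python
from collections import Counter

def f1_scores(y_true, y_pred, labels):
    """Per-class, micro- and macro-averaged F1 from two label lists."""
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1
            fn[t] += 1
    per_class = {}
    for c in labels:
        denom = 2 * tp[c] + fp[c] + fn[c]
        per_class[c] = 2 * tp[c] / denom if denom else 0.0
    pooled = 2 * sum(tp.values()) + sum(fp.values()) + sum(fn.values())
    micro = 2 * sum(tp.values()) / pooled if pooled else 0.0
    macro = sum(per_class.values()) / len(labels)  # rare classes weigh equally
    return per_class, micro, macro

# A frequent class predicted well, a rare class predicted poorly:
y_true = ["S"] * 8 + ["Gon", "Gon"]
y_pred = ["S"] * 8 + ["S", "Gon"]
per_class, micro, macro = f1_scores(y_true, y_pred, ["S", "Gon"])
print(micro, macro)  # micro (0.9) exceeds macro (~0.804)
```

Here a single mistake on the rare class barely moves micro F1 but costs macro F1 heavily, mirroring the per-class table above.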
---

# Model description

DinoVdeau is a model built on top of the Kamoulox-large-2024_10_31-batch-size64_freeze_monolabel model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
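A minimal PyTorch sketch of such a head; only the ingredients (linear, ReLU, batch normalization, dropout) come from the description above, while the hidden width, dropout rate, and exact layer ordering are assumptions for illustration (dinov2-large embeddings are 1024-dimensional, and this card lists 23 classes):

```python
import torch
import torch.nn as nn

NUM_CLASSES = 23   # one output per class in the tables of this card
EMBED_DIM = 1024   # dinov2-large embedding size
HIDDEN = 512       # hypothetical hidden width

head = nn.Sequential(
    nn.Linear(EMBED_DIM, HIDDEN),
    nn.BatchNorm1d(HIDDEN),
    nn.ReLU(),
    nn.Dropout(p=0.5),             # hypothetical dropout rate
    nn.Linear(HIDDEN, NUM_CLASSES),
)

features = torch.randn(4, EMBED_DIM)  # a batch of 4 pooled backbone embeddings
logits = head(features)               # shape: (4, 23), one logit per class
```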
 
---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
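Since the model is multilabel, each class is decided independently with a sigmoid and a threshold, rather than by a single softmax/argmax as in monolabel classification. A minimal sketch of that post-processing, with made-up logits and a hypothetical 0.5 threshold:

```python
import math

def multilabel_decisions(logits, class_names, threshold=0.5):
    """Sigmoid each logit independently; keep every class above threshold."""
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    return [name for name, p in zip(class_names, probs) if p >= threshold]

# Made-up logits for three of the classes in this card:
print(multilabel_decisions([2.0, -1.0, 0.3], ["ALGAE", "S", "Turf"]))
# -> ['ALGAE', 'Turf']: several classes can be active in one image
```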
 
---

# Training and evaluation data

Details on the number of images for each class are given in the following table:

| Class | train | test | val | Total |
|:--------|--------:|-------:|------:|--------:|
| ALGAE | 36874 | 12292 | 12292 | 61458 |
| Acr | 5358 | 1787 | 1786 | 8931 |
| Acr_Br | 123 | 42 | 42 | 207 |
| Anem | 235 | 79 | 79 | 393 |
| CCA | 918 | 306 | 306 | 1530 |
| Ech | 618 | 206 | 206 | 1030 |
| Fts | 168 | 57 | 57 | 282 |
| Gal | 465 | 155 | 155 | 775 |
| Gon | 158 | 53 | 53 | 264 |
| Mtp | 2370 | 791 | 790 | 3951 |
| P | 2658 | 887 | 886 | 4431 |
| Poc | 549 | 184 | 183 | 916 |
| Por | 1059 | 354 | 353 | 1766 |
| R | 31437 | 10480 | 10479 | 52396 |
| RDC | 930 | 310 | 310 | 1550 |
| S | 57624 | 19209 | 19209 | 96042 |
| SG | 25539 | 8513 | 8513 | 42565 |
| Sarg | 285 | 96 | 96 | 477 |
| Ser | 261 | 87 | 87 | 435 |
| Slt | 2730 | 911 | 911 | 4552 |
| Sp | 132 | 44 | 44 | 220 |
| Turf | 1395 | 466 | 466 | 2327 |
| UNK | 292 | 98 | 98 | 488 |
 
---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 150
- **Learning Rate**: 0.001
- **Train Batch Size**: 64
- **Eval Batch Size**: 64
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
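The learning-rate drops in the training results table (0.001 to 0.0001 at epoch 17, then 1e-05 and 1e-06) are exactly what this scheduler produces. A minimal pure-Python sketch of the ReduceLROnPlateau policy with the stated patience of 5 and factor of 0.1 (PyTorch's implementation adds further options such as `threshold` and `min_lr`):

```python
class PlateauScheduler:
    """Cut the LR by `factor` once the monitored loss has not improved
    for more than `patience` consecutive epochs."""

    def __init__(self, lr, factor=0.1, patience=5):
        self.lr, self.factor, self.patience = lr, factor, patience
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        if val_loss < self.best:      # improvement: reset the counter
            self.best = val_loss
            self.bad_epochs = 0
        else:                         # plateau: count, then cut the LR
            self.bad_epochs += 1
            if self.bad_epochs > self.patience:
                self.lr *= self.factor
                self.bad_epochs = 0
        return self.lr

sched = PlateauScheduler(lr=0.001)
# One improvement, then six epochs stuck above the best loss:
lrs = [sched.step(loss) for loss in [0.80, 0.77] + [0.78] * 6]
# lrs ends with the LR reduced tenfold, as at epoch 17 in the table
```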
## Data Augmentation

Data were augmented using the following transformations:

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
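Each train-time transform above fires independently with its own probability. A minimal stdlib sketch of such a gated pipeline; the nested-list "image", the fixed seed, and the two flips are stand-ins for the real image-library transforms:

```python
import random

def gated_pipeline(steps, rng):
    """Each (name, probability, fn) step is applied independently
    with its own probability, as in the train transforms above."""
    def run(image):
        applied = []
        for name, p, fn in steps:
            if rng.random() < p:
                image = fn(image)
                applied.append(name)
        return image, applied
    return run

hflip = lambda img: [row[::-1] for row in img]  # mirror each row
vflip = lambda img: img[::-1]                   # reverse row order

pipeline = gated_pipeline(
    [("RandomHorizontalFlip", 0.25, hflip), ("RandomVerticalFlip", 0.25, vflip)],
    rng=random.Random(1),  # fixed seed so the example is reproducible
)
image, applied = pipeline([[1, 2], [3, 4]])
# With this seed only the horizontal flip fires: image == [[2, 1], [4, 3]]
```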
## Training results

| Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
|:-----:|:---------------:|:--------:|:--------:|:--------:|:-------------:|
| 0 | N/A | N/A | N/A | N/A | 0.001 |
| 1 | 0.8029 | 0.7327 | 0.7327 | 0.2723 | 0.001 |
| 2 | 0.8039 | 0.7289 | 0.7289 | 0.2907 | 0.001 |
| 3 | 0.7705 | 0.7409 | 0.7409 | 0.3270 | 0.001 |
| 4 | 0.7623 | 0.7417 | 0.7417 | 0.3102 | 0.001 |
| 5 | 0.7627 | 0.7383 | 0.7383 | 0.3108 | 0.001 |
| 6 | 0.7452 | 0.7447 | 0.7447 | 0.3433 | 0.001 |
| 7 | 0.7458 | 0.7467 | 0.7467 | 0.3283 | 0.001 |
| 8 | 0.7399 | 0.7459 | 0.7459 | 0.3353 | 0.001 |
| 9 | 0.7425 | 0.7455 | 0.7455 | 0.3294 | 0.001 |
| 10 | 0.7364 | 0.7475 | 0.7475 | 0.3157 | 0.001 |
| 11 | 0.7368 | 0.7465 | 0.7465 | 0.3442 | 0.001 |
| 12 | 0.7442 | 0.7428 | 0.7428 | 0.3321 | 0.001 |
| 13 | 0.7384 | 0.7479 | 0.7479 | 0.3528 | 0.001 |
| 14 | 0.7464 | 0.7464 | 0.7464 | 0.3346 | 0.001 |
| 15 | 0.7394 | 0.7447 | 0.7447 | 0.3428 | 0.001 |
| 16 | 0.7397 | 0.7479 | 0.7479 | 0.3506 | 0.001 |
| 17 | 0.7111 | 0.7554 | 0.7554 | 0.3747 | 0.0001 |
| 18 | 0.7042 | 0.7567 | 0.7567 | 0.3793 | 0.0001 |
| 19 | 0.7005 | 0.7582 | 0.7582 | 0.3858 | 0.0001 |
| 20 | 0.6943 | 0.7603 | 0.7603 | 0.3934 | 0.0001 |
| 21 | 0.6919 | 0.7593 | 0.7593 | 0.3943 | 0.0001 |
| 22 | 0.6904 | 0.7606 | 0.7606 | 0.3925 | 0.0001 |
| 23 | 0.6874 | 0.7607 | 0.7607 | 0.3953 | 0.0001 |
| 24 | 0.6865 | 0.7612 | 0.7612 | 0.3933 | 0.0001 |
| 25 | 0.6843 | 0.7614 | 0.7614 | 0.4023 | 0.0001 |
| 26 | 0.6830 | 0.7629 | 0.7629 | 0.4055 | 0.0001 |
| 27 | 0.6827 | 0.7631 | 0.7631 | 0.4074 | 0.0001 |
| 28 | 0.6806 | 0.7629 | 0.7629 | 0.4136 | 0.0001 |
| 29 | 0.6796 | 0.7626 | 0.7626 | 0.4138 | 0.0001 |
| 30 | 0.6775 | 0.7636 | 0.7636 | 0.4128 | 0.0001 |
| 31 | 0.6779 | 0.7644 | 0.7644 | 0.4100 | 0.0001 |
| 32 | 0.6755 | 0.7642 | 0.7642 | 0.4109 | 0.0001 |
| 33 | 0.6750 | 0.7645 | 0.7645 | 0.4186 | 0.0001 |
| 34 | 0.6746 | 0.7650 | 0.7650 | 0.4146 | 0.0001 |
| 35 | 0.6740 | 0.7642 | 0.7642 | 0.4255 | 0.0001 |
| 36 | 0.6740 | 0.7647 | 0.7647 | 0.4204 | 0.0001 |
| 37 | 0.6731 | 0.7642 | 0.7642 | 0.4194 | 0.0001 |
| 38 | 0.6720 | 0.7649 | 0.7649 | 0.4243 | 0.0001 |
| 39 | 0.6695 | 0.7659 | 0.7659 | 0.4204 | 0.0001 |
| 40 | 0.6694 | 0.7665 | 0.7665 | 0.4205 | 0.0001 |
| 41 | 0.6683 | 0.7658 | 0.7658 | 0.4176 | 0.0001 |
| 42 | 0.6683 | 0.7665 | 0.7665 | 0.4307 | 0.0001 |
| 43 | 0.6695 | 0.7664 | 0.7664 | 0.4264 | 0.0001 |
| 44 | 0.6694 | 0.7658 | 0.7658 | 0.4261 | 0.0001 |
| 45 | 0.6695 | 0.7650 | 0.7650 | 0.4308 | 0.0001 |
| 46 | 0.6653 | 0.7674 | 0.7674 | 0.4362 | 0.0001 |
| 47 | 0.6660 | 0.7674 | 0.7674 | 0.4328 | 0.0001 |
| 48 | 0.6656 | 0.7669 | 0.7669 | 0.4308 | 0.0001 |
| 49 | 0.6673 | 0.7666 | 0.7666 | 0.4242 | 0.0001 |
| 50 | 0.6661 | 0.7663 | 0.7663 | 0.4315 | 0.0001 |
| 51 | 0.6639 | 0.7667 | 0.7667 | 0.4308 | 0.0001 |
| 52 | 0.6655 | 0.7679 | 0.7679 | 0.4428 | 0.0001 |
| 53 | 0.6643 | 0.7672 | 0.7672 | 0.4342 | 0.0001 |
| 54 | 0.6674 | 0.7663 | 0.7663 | 0.4460 | 0.0001 |
| 55 | 0.6627 | 0.7685 | 0.7685 | 0.4389 | 0.0001 |
| 56 | 0.6627 | 0.7671 | 0.7671 | 0.4386 | 0.0001 |
| 57 | 0.6640 | 0.7669 | 0.7669 | 0.4385 | 0.0001 |
| 58 | 0.6628 | 0.7673 | 0.7673 | 0.4376 | 0.0001 |
| 59 | 0.6615 | 0.7679 | 0.7679 | 0.4399 | 0.0001 |
| 60 | 0.6633 | 0.7670 | 0.7670 | 0.4419 | 0.0001 |
| 61 | 0.6611 | 0.7686 | 0.7686 | 0.4371 | 0.0001 |
| 62 | 0.6608 | 0.7684 | 0.7684 | 0.4535 | 0.0001 |
| 63 | 0.6622 | 0.7678 | 0.7678 | 0.4461 | 0.0001 |
| 64 | 0.6610 | 0.7675 | 0.7675 | 0.4439 | 0.0001 |
| 65 | 0.6590 | 0.7680 | 0.7680 | 0.4346 | 0.0001 |
| 66 | 0.6600 | 0.7685 | 0.7685 | 0.4397 | 0.0001 |
| 67 | 0.6572 | 0.7690 | 0.7690 | 0.4484 | 0.0001 |
| 68 | 0.6589 | 0.7687 | 0.7687 | 0.4442 | 0.0001 |
| 69 | 0.6593 | 0.7689 | 0.7689 | 0.4393 | 0.0001 |
| 70 | 0.6590 | 0.7679 | 0.7679 | 0.4357 | 0.0001 |
| 71 | 0.6567 | 0.7682 | 0.7682 | 0.4432 | 0.0001 |
| 72 | 0.6589 | 0.7681 | 0.7681 | 0.4369 | 0.0001 |
| 73 | 0.6612 | 0.7676 | 0.7676 | 0.4437 | 0.0001 |
| 74 | 0.6571 | 0.7685 | 0.7685 | 0.4491 | 0.0001 |
| 75 | 0.6557 | 0.7682 | 0.7682 | 0.4444 | 0.0001 |
| 76 | 0.6588 | 0.7683 | 0.7683 | 0.4479 | 0.0001 |
| 77 | 0.6572 | 0.7686 | 0.7686 | 0.4489 | 0.0001 |
| 78 | 0.6563 | 0.7688 | 0.7688 | 0.4440 | 0.0001 |
| 79 | 0.6565 | 0.7681 | 0.7681 | 0.4379 | 0.0001 |
| 80 | 0.6611 | 0.7684 | 0.7684 | 0.4461 | 0.0001 |
| 81 | 0.6604 | 0.7689 | 0.7689 | 0.4475 | 0.0001 |
| 82 | 0.6532 | 0.7706 | 0.7706 | 0.4527 | 1e-05 |
| 83 | 0.6533 | 0.7702 | 0.7702 | 0.4489 | 1e-05 |
| 84 | 0.6505 | 0.7705 | 0.7705 | 0.4514 | 1e-05 |
| 85 | 0.6502 | 0.7708 | 0.7708 | 0.4566 | 1e-05 |
| 86 | 0.6507 | 0.7707 | 0.7707 | 0.4559 | 1e-05 |
| 87 | 0.6484 | 0.7709 | 0.7709 | 0.4584 | 1e-05 |
| 88 | 0.6496 | 0.7705 | 0.7705 | 0.4569 | 1e-05 |
| 89 | 0.6486 | 0.7714 | 0.7714 | 0.4560 | 1e-05 |
| 90 | 0.6491 | 0.7712 | 0.7712 | 0.4574 | 1e-05 |
| 91 | 0.6482 | 0.7704 | 0.7704 | 0.4518 | 1e-05 |
| 92 | 0.6477 | 0.7716 | 0.7716 | 0.4550 | 1e-05 |
| 93 | 0.6490 | 0.7712 | 0.7712 | 0.4518 | 1e-05 |
| 94 | 0.6485 | 0.7708 | 0.7708 | 0.4511 | 1e-05 |
| 95 | 0.6479 | 0.7717 | 0.7717 | 0.4567 | 1e-05 |
| 96 | 0.6473 | 0.7715 | 0.7715 | 0.4597 | 1e-05 |
| 97 | 0.6461 | 0.7714 | 0.7714 | 0.4625 | 1e-05 |
| 98 | 0.6463 | 0.7722 | 0.7722 | 0.4589 | 1e-05 |
| 99 | 0.6469 | 0.7709 | 0.7709 | 0.4549 | 1e-05 |
| 100 | 0.6456 | 0.7723 | 0.7723 | 0.4597 | 1e-05 |
| 101 | 0.6469 | 0.7717 | 0.7717 | 0.4574 | 1e-05 |
| 102 | 0.6467 | 0.7718 | 0.7718 | 0.4593 | 1e-05 |
| 103 | 0.6468 | 0.7723 | 0.7723 | 0.4577 | 1e-05 |
| 104 | 0.6457 | 0.7725 | 0.7725 | 0.4579 | 1e-05 |
| 105 | 0.6456 | 0.7719 | 0.7719 | 0.4556 | 1e-05 |
| 106 | 0.6444 | 0.7723 | 0.7723 | 0.4644 | 1e-05 |
| 107 | 0.2655 | 0.0480 | 0.4911 | 0.3607 | 1e-05 |
| 108 | 0.1419 | 0.4498 | 0.6694 | 0.2246 | 1e-05 |
| 109 | 0.0776 | 0.6715 | 0.7450 | 0.2137 | 1e-05 |
| 110 | 0.0580 | 0.6837 | 0.7489 | 0.2641 | 1e-05 |
| 111 | 0.0535 | 0.6935 | 0.7547 | 0.3316 | 1e-05 |
| 112 | 0.0517 | 0.6998 | 0.7585 | 0.3484 | 1e-05 |
| 113 | 0.0511 | 0.7043 | 0.7611 | 0.3379 | 1e-05 |
| 114 | 0.0507 | 0.7054 | 0.7622 | 0.3570 | 1e-05 |
| 115 | 0.0504 | 0.7118 | 0.7647 | 0.3643 | 1e-05 |
| 116 | 0.0502 | 0.7102 | 0.7647 | 0.3623 | 1e-05 |
| 117 | 0.0502 | 0.7129 | 0.7657 | 0.3654 | 1e-05 |
| 118 | 0.0505 | 0.7142 | 0.7655 | 0.3524 | 1e-05 |
| 119 | 0.0499 | 0.7127 | 0.7659 | 0.3442 | 1e-05 |
| 120 | 0.0499 | 0.7131 | 0.7657 | 0.3508 | 1e-05 |
| 121 | 0.0496 | 0.7141 | 0.7666 | 0.3628 | 1e-05 |
| 122 | 0.0497 | 0.7164 | 0.7672 | 0.3529 | 1e-05 |
| 123 | 0.0495 | 0.7154 | 0.7662 | 0.3674 | 1e-05 |
| 124 | 0.0496 | 0.7161 | 0.7674 | 0.3641 | 1e-05 |
| 125 | 0.0496 | 0.7124 | 0.7658 | 0.3509 | 1e-05 |
| 126 | 0.0495 | 0.7153 | 0.7661 | 0.3742 | 1e-05 |
| 127 | 0.0494 | 0.7149 | 0.7663 | 0.3574 | 1e-05 |
| 128 | 0.0494 | 0.7144 | 0.7665 | 0.3601 | 1e-05 |
| 129 | 0.0494 | 0.7179 | 0.7674 | 0.3642 | 1e-05 |
| 130 | 0.0494 | 0.7177 | 0.7668 | 0.3603 | 1e-05 |
| 131 | 0.0494 | 0.7182 | 0.7665 | 0.3664 | 1e-05 |
| 132 | 0.0494 | 0.7176 | 0.7665 | 0.3651 | 1e-05 |
| 133 | 0.0493 | 0.7146 | 0.7658 | 0.3612 | 1e-05 |
| 134 | 0.0493 | 0.7155 | 0.7660 | 0.3678 | 1e-05 |
| 135 | 0.0493 | 0.7190 | 0.7674 | 0.3723 | 1e-05 |
| 136 | 0.0492 | 0.7151 | 0.7667 | 0.3597 | 1e-05 |
| 137 | 0.0492 | 0.7158 | 0.7665 | 0.3631 | 1e-05 |
| 138 | 0.0493 | 0.7178 | 0.7665 | 0.3688 | 1e-05 |
| 139 | 0.0493 | 0.7182 | 0.7662 | 0.3563 | 1e-05 |
| 140 | 0.0492 | 0.7198 | 0.7674 | 0.3700 | 1e-05 |
| 141 | 0.0492 | 0.7165 | 0.7666 | 0.3507 | 1e-05 |
| 142 | 0.0492 | 0.7194 | 0.7669 | 0.3660 | 1e-05 |
| 143 | 0.0493 | 0.7168 | 0.7655 | 0.3673 | 1e-05 |
| 144 | 0.0490 | 0.7171 | 0.7668 | 0.3554 | 1e-06 |
| 145 | 0.0492 | 0.7202 | 0.7678 | 0.3711 | 1e-06 |
| 146 | 0.0490 | 0.7171 | 0.7665 | 0.3683 | 1e-06 |
| 147 | 0.0491 | 0.7208 | 0.7685 | 0.3748 | 1e-06 |
| 148 | 0.0491 | 0.7167 | 0.7667 | 0.3650 | 1e-06 |
| 149 | 0.0490 | 0.7182 | 0.7671 | 0.3788 | 1e-06 |
| 150 | 0.0491 | 0.7154 | 0.7668 | 0.3726 | 1e-06 |
---

# Framework Versions

- **Transformers**: 4.44.2
- **Pytorch**: 2.4.1+cu121
- **Datasets**: 3.0.0
- **Tokenizers**: 0.19.1