Commit 0da2bb9 (parent: deca514) by lombardata: Upload README.md
Files changed (1): README.md (+189, -127)

---
language:
- eng
license: cc0-1.0
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-large
model-index:
- name: drone-DinoVdeau-produttoria_binary-binary-large-2024_11_03-batch-size64_freeze
  results: []
---

drone-DinoVdeau-produttoria_binary-binary is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

- Loss: 0.2854
- F1 Micro: 0.8468
- F1 Macro: 0.6351
- Accuracy: 0.2786

| Class | F1 per class |
|----------|-------|
| Acropore_branched | 0.8084 |
| Acropore_digitised | 0.5125 |
| Acropore_tabular | 0.3951 |
| Algae | 0.9562 |
| Dead_coral | 0.7470 |
| Fish | 0.6639 |
| Millepore | 0.3021 |
| No_acropore_encrusting | 0.5923 |
| No_acropore_massive | 0.7651 |
| No_acropore_sub_massive | 0.6345 |
| Rock | 0.9536 |
| Rubble | 0.9042 |
| Sand | 0.9008 |
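
F1 Micro pools every label decision across all classes before computing F1, while F1 Macro is the unweighted mean of the per-class scores above, which is why rare, harder classes such as Millepore and Acropore_tabular pull the macro score well below the micro score. A toy scikit-learn sketch of how the three reported F1 views relate (the arrays are placeholders, not the real evaluation data):

```python
import numpy as np
from sklearn.metrics import f1_score

# Toy multi-hot labels: rows are images, columns are classes
# (the real arrays would have 13 columns, one per class above).
y_true = np.array([[1, 0, 1],
                   [0, 1, 1],
                   [1, 1, 0]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 1],
                   [1, 0, 0]])

print(f1_score(y_true, y_pred, average=None))     # F1 per class
print(f1_score(y_true, y_pred, average="micro"))  # pools all label decisions
print(f1_score(y_true, y_pred, average="macro"))  # unweighted mean over classes
```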

---

# Model description

drone-DinoVdeau-produttoria_binary-binary is a model built on top of the [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) backbone for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
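
The card does not state the exact layer sizes; the sketch below is one plausible shape for such a head, where the hidden width (the DINOv2-large embedding size) and the dropout rate are assumptions:

```python
import torch.nn as nn

embed_dim = 1024   # DINOv2-large embedding size
num_labels = 13    # one output per class listed above

# Hypothetical head matching the description (linear, ReLU,
# batch norm, dropout); widths and dropout rate are assumptions.
head = nn.Sequential(
    nn.Linear(embed_dim, embed_dim),
    nn.ReLU(),
    nn.BatchNorm1d(embed_dim),
    nn.Dropout(p=0.5),                 # assumed rate
    nn.Linear(embed_dim, num_labels),  # raw logits; sigmoid at inference
)
```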

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)

---

# Intended uses & limitations

You can use the raw model to classify diverse marine species and habitats, including coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species. A minimal inference example follows.
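
A minimal sketch, assuming the checkpoint loads through the standard transformers image-classification classes and that the repository lives under the card author's namespace (both assumptions, as is the 0.5 decision threshold):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repo id: the namespace is assumed from the card author.
repo = "lombardata/drone-DinoVdeau-produttoria_binary-binary-large-2024_11_03-batch-size64_freeze"

processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

image = Image.open("drone_tile.jpg")  # any RGB image tile
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel: one sigmoid score per class; 0.5 is an assumed threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```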

---

# Training and evaluation data

Details on the number of images for each class are given in the following table:

| Class | train | test | val | Total |
|:------------------------|--------:|-------:|------:|--------:|
| Acropore_branched | 1483 | 522 | 529 | 2534 |
| Acropore_digitised | 1085 | 371 | 362 | 1818 |
| Acropore_tabular | 486 | 176 | 178 | 840 |
| Algae | 10340 | 3441 | 3461 | 17242 |
| Dead_coral | 3710 | 1252 | 1267 | 6229 |
| Fish | 1462 | 517 | 515 | 2494 |
| Millepore | 746 | 282 | 273 | 1301 |
| No_acropore_encrusting | 1993 | 751 | 728 | 3472 |
| No_acropore_massive | 4450 | 1581 | 1649 | 7680 |
| No_acropore_sub_massive | 3034 | 1102 | 1113 | 5249 |
| Rock | 10225 | 3429 | 3445 | 17099 |
| Rubble | 9353 | 3100 | 3105 | 15558 |
| Sand | 9271 | 3101 | 3132 | 15504 |

---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training (a sketch of this setup follows the list):

- **Number of Epochs**: 88
- **Learning Rate**: 0.001
- **Train Batch Size**: 64
- **Eval Batch Size**: 64
- **Optimizer**: Adam (betas=(0.9, 0.999), epsilon=1e-08)
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
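
A minimal sketch of this setup in a plain PyTorch loop; `model` is the network from this card, the `train_one_epoch` and `validate` helpers are hypothetical stand-ins, and the `dinov2` attribute name for the frozen backbone is an assumption:

```python
import torch

# Freeze the DINOv2 encoder; only the classification head trains.
for p in model.dinov2.parameters():  # attribute name is an assumption
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-3, betas=(0.9, 0.999), eps=1e-08,
)
# Drop the LR tenfold when validation loss stalls for 5 epochs, matching
# the 0.001 -> 0.0001 -> 1e-05 trace in the results table below.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5,
)

for epoch in range(88):
    train_one_epoch(model, optimizer)  # hypothetical helper
    val_loss = validate(model)         # hypothetical helper
    scheduler.step(val_loss)
```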

## Data Augmentation

Data were augmented using the following transformations (a sketch of this pipeline follows the lists):

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
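
ColorJiggle is the name of a Kornia augmentation, which suggests the pipeline was built with Kornia; the sketch below re-creates the train-time list under that assumption. The resize target, jitter strengths, and normalization statistics are placeholders, and the custom PreProcess step from the training repository is not reproduced.

```python
import torch
import kornia.augmentation as K

# Assumed Kornia pipeline mirroring the train-time list above.
train_transforms = K.AugmentationSequential(
    K.Resize((224, 224)),                       # Resize, p=1.00; size assumed
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.25),  # jitter strengths assumed
    K.RandomPerspective(p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),   # stats assumed
                std=torch.tensor([0.229, 0.224, 0.225])),   # Normalize, p=1.00
)

batch = torch.rand(4, 3, 256, 256)  # dummy image batch
augmented = train_transforms(batch)
```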

## Training results

| Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
|:-----:|:---------------:|:--------:|:--------:|:--------:|:-------------:|
| 1 | 0.3236 | 0.2630 | 0.8262 | 0.5774 | 0.001 |
| 2 | 0.3146 | 0.2412 | 0.8379 | 0.6199 | 0.001 |
| 3 | 0.3090 | 0.2555 | 0.8398 | 0.6044 | 0.001 |
| 4 | 0.3074 | 0.2562 | 0.8349 | 0.6003 | 0.001 |
| 5 | 0.3039 | 0.2516 | 0.8406 | 0.6248 | 0.001 |
| 6 | 0.3060 | 0.2596 | 0.8420 | 0.6225 | 0.001 |
| 7 | 0.3014 | 0.2820 | 0.8387 | 0.5955 | 0.001 |
| 8 | 0.3013 | 0.2703 | 0.8391 | 0.5975 | 0.001 |
| 9 | 0.3010 | 0.2841 | 0.8407 | 0.5974 | 0.001 |
| 10 | 0.3007 | 0.2711 | 0.8376 | 0.5938 | 0.001 |
| 11 | 0.3036 | 0.2773 | 0.8349 | 0.5762 | 0.001 |
| 12 | 0.3013 | 0.2674 | 0.8385 | 0.6115 | 0.001 |
| 13 | 0.2978 | 0.2648 | 0.8421 | 0.6146 | 0.001 |
| 14 | 0.2977 | 0.2734 | 0.8400 | 0.6059 | 0.001 |
| 15 | 0.2981 | 0.2666 | 0.8434 | 0.6075 | 0.001 |
| 16 | 0.2974 | 0.2747 | 0.8394 | 0.5933 | 0.001 |
| 17 | 0.2984 | 0.2664 | 0.8438 | 0.6147 | 0.001 |
| 18 | 0.3023 | 0.2763 | 0.8356 | 0.5804 | 0.001 |
| 19 | 0.2985 | 0.2739 | 0.8424 | 0.6159 | 0.001 |
| 20 | 0.2968 | 0.2807 | 0.8412 | 0.5984 | 0.001 |
| 21 | 0.3005 | 0.2703 | 0.8419 | 0.6060 | 0.001 |
| 22 | 0.2982 | 0.2747 | 0.8375 | 0.5804 | 0.001 |
| 23 | 0.2939 | 0.2781 | 0.8436 | 0.6152 | 0.001 |
| 24 | 0.2948 | 0.2760 | 0.8453 | 0.6229 | 0.001 |
| 25 | 0.2968 | 0.2737 | 0.8427 | 0.6103 | 0.001 |
| 26 | 0.2956 | 0.2755 | 0.8421 | 0.6045 | 0.001 |
| 27 | 0.2959 | 0.2765 | 0.8438 | 0.6115 | 0.001 |
| 28 | 0.2955 | 0.2693 | 0.8447 | 0.6191 | 0.001 |
| 29 | 0.3011 | 0.2664 | 0.8438 | 0.6216 | 0.001 |
| 30 | 0.2921 | 0.2810 | 0.8437 | 0.6025 | 0.0001 |
| 31 | 0.2904 | 0.2812 | 0.8439 | 0.6072 | 0.0001 |
| 32 | 0.2903 | 0.2810 | 0.8437 | 0.6112 | 0.0001 |
| 33 | 0.2889 | 0.2854 | 0.8462 | 0.6202 | 0.0001 |
| 34 | 0.2896 | 0.2862 | 0.8446 | 0.6151 | 0.0001 |
| 35 | 0.2887 | 0.2867 | 0.8449 | 0.6112 | 0.0001 |
| 36 | 0.2889 | 0.2836 | 0.8447 | 0.6120 | 0.0001 |
| 37 | 0.2883 | 0.2867 | 0.8476 | 0.6256 | 0.0001 |
| 38 | 0.2905 | 0.2825 | 0.8453 | 0.6057 | 0.0001 |
| 39 | 0.2878 | 0.2854 | 0.8471 | 0.6254 | 0.0001 |
| 40 | 0.2886 | 0.2810 | 0.8468 | 0.6223 | 0.0001 |
| 41 | 0.2877 | 0.2843 | 0.8473 | 0.6261 | 0.0001 |
| 42 | 0.2878 | 0.2856 | 0.8477 | 0.6199 | 0.0001 |
| 43 | 0.2872 | 0.2830 | 0.8479 | 0.6288 | 0.0001 |
| 44 | 0.2868 | 0.2841 | 0.8464 | 0.6190 | 0.0001 |
| 45 | 0.2870 | 0.2838 | 0.8463 | 0.6236 | 0.0001 |
| 46 | 0.2868 | 0.2825 | 0.8460 | 0.6151 | 0.0001 |
| 47 | 0.2872 | 0.2846 | 0.8462 | 0.6211 | 0.0001 |
| 48 | 0.2866 | 0.2836 | 0.8467 | 0.6231 | 0.0001 |
| 49 | 0.2863 | 0.2859 | 0.8460 | 0.6161 | 0.0001 |
| 50 | 0.2864 | 0.2846 | 0.8483 | 0.6255 | 0.0001 |
| 51 | 0.2891 | 0.2849 | 0.8486 | 0.6278 | 0.0001 |
| 52 | 0.2856 | 0.2851 | 0.8464 | 0.6255 | 0.0001 |
| 53 | 0.2872 | 0.2789 | 0.8490 | 0.6458 | 0.0001 |
| 54 | 0.2856 | 0.2903 | 0.8477 | 0.6244 | 0.0001 |
| 55 | 0.2857 | 0.2846 | 0.8475 | 0.6340 | 0.0001 |
| 56 | 0.2862 | 0.2867 | 0.8466 | 0.6241 | 0.0001 |
| 57 | 0.2871 | 0.2862 | 0.8454 | 0.6249 | 0.0001 |
| 58 | 0.2858 | 0.2812 | 0.8492 | 0.6334 | 0.0001 |
| 59 | 0.2862 | 0.2888 | 0.8468 | 0.6178 | 1e-05 |
| 60 | 0.2847 | 0.2854 | 0.8485 | 0.6276 | 1e-05 |
| 61 | 0.2849 | 0.2830 | 0.8480 | 0.6224 | 1e-05 |
| 62 | 0.2855 | 0.2843 | 0.8469 | 0.6248 | 1e-05 |
| 63 | 0.2849 | 0.2828 | 0.8489 | 0.6275 | 1e-05 |
| 64 | 0.2846 | 0.2823 | 0.8475 | 0.6371 | 1e-05 |
| 65 | 0.2860 | 0.2869 | 0.8468 | 0.6241 | 1e-05 |
| 66 | 0.2847 | 0.2841 | 0.8481 | 0.6347 | 1e-05 |
| 67 | 0.2853 | 0.2854 | 0.8488 | 0.6287 | 1e-05 |
| 68 | 0.2853 | 0.2867 | 0.8480 | 0.6321 | 1e-05 |
| 69 | 0.2848 | 0.2836 | 0.8477 | 0.6397 | 1e-05 |
| 70 | 0.2853 | 0.2823 | 0.8492 | 0.6381 | 1e-05 |
| 71 | 0.2851 | 0.2882 | 0.8476 | 0.6325 | 1e-06 |
| 72 | 0.2845 | 0.2849 | 0.8474 | 0.6236 | 1e-06 |
| 73 | 0.2845 | 0.2812 | 0.8476 | 0.6333 | 1e-06 |
| 74 | 0.2845 | 0.2828 | 0.8466 | 0.6300 | 1e-06 |
| 75 | 0.2851 | 0.2820 | 0.8474 | 0.6235 | 1e-06 |
| 76 | 0.2860 | 0.2880 | 0.8473 | 0.6186 | 1e-06 |
| 77 | 0.2858 | 0.2856 | 0.8459 | 0.6173 | 1e-06 |
| 78 | 0.2844 | 0.2843 | 0.8481 | 0.6326 | 1e-06 |
| 79 | 0.2871 | 0.2875 | 0.8472 | 0.6179 | 1e-06 |
| 80 | 0.2848 | 0.2838 | 0.8477 | 0.6287 | 1e-06 |
| 81 | 0.2848 | 0.2854 | 0.8490 | 0.6305 | 1e-06 |
| 82 | 0.2851 | 0.2859 | 0.8480 | 0.6394 | 1e-06 |
| 83 | 0.2846 | 0.2856 | 0.8488 | 0.6255 | 1e-06 |
| 84 | 0.2857 | 0.2833 | 0.8482 | 0.6458 | 1e-06 |
| 85 | 0.2855 | 0.2812 | 0.8488 | 0.6340 | 1e-07 |
| 86 | 0.2849 | 0.2859 | 0.8480 | 0.6363 | 1e-07 |
| 87 | 0.2845 | 0.2851 | 0.8474 | 0.6328 | 1e-07 |
| 88 | 0.2854 | 0.2812 | 0.8478 | 0.6371 | 1e-07 |

---

# Framework Versions

- **Transformers**: 4.41.0
- **PyTorch**: 2.5.0+cu124
- **Datasets**: 3.0.2
- **Tokenizers**: 0.19.1