---
license: apache-2.0
base_model: facebook/dinov2-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: DinoVdeau-base-2024_09_03-batch-size32_epochs150_freeze
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# DinoVdeau-base-2024_09_03-batch-size32_epochs150_freeze

This model is a fine-tuned version of [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1260
- F1 Micro: 0.8131
- F1 Macro: 0.6976
- ROC AUC: 0.8760
- Accuracy: 0.3014
- Learning Rate: 0.0000
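
The combination of micro/macro F1 with a much lower exact-match accuracy suggests a multi-label classification head. The snippet below is a minimal inference sketch under that assumption; the repository id is a placeholder for wherever this checkpoint is hosted, and the 0.5 decision threshold is not specified in this card.

```python
# Minimal inference sketch (assumptions: multi-label head, placeholder repo id).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "<org>/DinoVdeau-base-2024_09_03-batch-size32_epochs150_freeze"  # placeholder
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assuming a multi-label head: sigmoid per class, threshold at 0.5.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```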

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
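
For orientation, the sketch below shows how these hyperparameters could be expressed as `transformers.TrainingArguments`. It is a reconstruction from the list above, not the original training script; the output directory name is assumed, and the Adam settings match the library defaults.

```python
# Hedged reconstruction of the configuration listed above; data loading,
# the classification head, and metric computation are intentionally omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="DinoVdeau-base-2024_09_03-batch-size32_epochs150_freeze",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=150,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # Native AMP mixed-precision training
)
```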

### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1 Micro | F1 Macro | ROC AUC | Accuracy | Learning Rate |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| No log        | 1.0   | 273   | 0.1752          | 0.7311   | 0.5105   | 0.8187  | 0.2079   | 0.001  |
| 0.2857        | 2.0   | 546   | 0.1578          | 0.7583   | 0.5498   | 0.8363  | 0.2349   | 0.001  |
| 0.2857        | 3.0   | 819   | 0.1516          | 0.7722   | 0.6037   | 0.8505  | 0.2315   | 0.001  |
| 0.1764        | 4.0   | 1092  | 0.1522          | 0.7650   | 0.6140   | 0.8387  | 0.2422   | 0.001  |
| 0.1764        | 5.0   | 1365  | 0.1484          | 0.7720   | 0.6162   | 0.8403  | 0.2422   | 0.001  |
| 0.1677        | 6.0   | 1638  | 0.1482          | 0.7750   | 0.6052   | 0.8435  | 0.2561   | 0.001  |
| 0.1677        | 7.0   | 1911  | 0.1486          | 0.7729   | 0.6177   | 0.8431  | 0.2419   | 0.001  |
| 0.1652        | 8.0   | 2184  | 0.1486          | 0.7767   | 0.6172   | 0.8485  | 0.2512   | 0.001  |
| 0.1652        | 9.0   | 2457  | 0.1483          | 0.7805   | 0.6366   | 0.8570  | 0.2512   | 0.001  |
| 0.1617        | 10.0  | 2730  | 0.1503          | 0.7683   | 0.6081   | 0.8352  | 0.2453   | 0.001  |
| 0.1615        | 11.0  | 3003  | 0.1441          | 0.7757   | 0.6200   | 0.8409  | 0.2609   | 0.001  |
| 0.1615        | 12.0  | 3276  | 0.1487          | 0.7815   | 0.6299   | 0.8543  | 0.2495   | 0.001  |
| 0.1614        | 13.0  | 3549  | 0.1490          | 0.7779   | 0.6242   | 0.8446  | 0.2519   | 0.001  |
| 0.1614        | 14.0  | 3822  | 0.1434          | 0.7826   | 0.6379   | 0.8475  | 0.2606   | 0.001  |
| 0.1599        | 15.0  | 4095  | 0.1435          | 0.7874   | 0.6397   | 0.8552  | 0.2554   | 0.001  |
| 0.1599        | 16.0  | 4368  | 0.1439          | 0.7793   | 0.6344   | 0.8464  | 0.2568   | 0.001  |
| 0.1589        | 17.0  | 4641  | 0.1448          | 0.7878   | 0.6422   | 0.8596  | 0.2543   | 0.001  |
| 0.1589        | 18.0  | 4914  | 0.1440          | 0.7865   | 0.6417   | 0.8552  | 0.2568   | 0.001  |
| 0.1604        | 19.0  | 5187  | 0.1420          | 0.7864   | 0.6318   | 0.8550  | 0.2540   | 0.001  |
| 0.1604        | 20.0  | 5460  | 0.1409          | 0.7869   | 0.6409   | 0.8522  | 0.2588   | 0.001  |
| 0.1586        | 21.0  | 5733  | 0.1425          | 0.7865   | 0.6413   | 0.8561  | 0.2620   | 0.001  |
| 0.1587        | 22.0  | 6006  | 0.1538          | 0.7854   | 0.6371   | 0.8608  | 0.2370   | 0.001  |
| 0.1587        | 23.0  | 6279  | 0.1419          | 0.7842   | 0.6390   | 0.8497  | 0.2557   | 0.001  |
| 0.1592        | 24.0  | 6552  | 0.1414          | 0.7870   | 0.6459   | 0.8561  | 0.2599   | 0.001  |
| 0.1592        | 25.0  | 6825  | 0.1399          | 0.7868   | 0.6263   | 0.8523  | 0.2685   | 0.001  |
| 0.1586        | 26.0  | 7098  | 0.1465          | 0.7847   | 0.6238   | 0.8561  | 0.2592   | 0.001  |
| 0.1586        | 27.0  | 7371  | 0.1551          | 0.7720   | 0.6344   | 0.8433  | 0.2380   | 0.001  |
| 0.16          | 28.0  | 7644  | 0.1443          | 0.7891   | 0.6430   | 0.8550  | 0.2616   | 0.001  |
| 0.16          | 29.0  | 7917  | 0.1428          | 0.7874   | 0.6416   | 0.8565  | 0.2568   | 0.001  |
| 0.1589        | 30.0  | 8190  | 0.1416          | 0.7799   | 0.6308   | 0.8425  | 0.2526   | 0.001  |
| 0.1589        | 31.0  | 8463  | 0.1398          | 0.7895   | 0.6431   | 0.8566  | 0.2689   | 0.001  |
| 0.1588        | 32.0  | 8736  | 0.1448          | 0.7891   | 0.6521   | 0.8601  | 0.2568   | 0.001  |
| 0.1581        | 33.0  | 9009  | 0.1404          | 0.7896   | 0.6497   | 0.8582  | 0.2640   | 0.001  |
| 0.1581        | 34.0  | 9282  | 0.1426          | 0.7871   | 0.6449   | 0.8537  | 0.2557   | 0.001  |
| 0.1578        | 35.0  | 9555  | 0.1414          | 0.7846   | 0.6428   | 0.8487  | 0.2630   | 0.001  |
| 0.1578        | 36.0  | 9828  | 0.1465          | 0.7834   | 0.6434   | 0.8484  | 0.2678   | 0.001  |
| 0.1576        | 37.0  | 10101 | 0.1380          | 0.7924   | 0.6438   | 0.8577  | 0.2668   | 0.001  |
| 0.1576        | 38.0  | 10374 | 0.1392          | 0.7892   | 0.6475   | 0.8555  | 0.2637   | 0.001  |
| 0.1556        | 39.0  | 10647 | 0.1458          | 0.7872   | 0.6592   | 0.8680  | 0.2460   | 0.001  |
| 0.1556        | 40.0  | 10920 | 0.1389          | 0.7946   | 0.6469   | 0.8660  | 0.2699   | 0.001  |
| 0.1577        | 41.0  | 11193 | 0.1402          | 0.7848   | 0.6510   | 0.8491  | 0.2616   | 0.001  |
| 0.1577        | 42.0  | 11466 | 0.1404          | 0.7928   | 0.6609   | 0.8625  | 0.2717   | 0.001  |
| 0.1576        | 43.0  | 11739 | 0.1394          | 0.7931   | 0.6427   | 0.8593  | 0.2696   | 0.001  |
| 0.1543        | 44.0  | 12012 | 0.1367          | 0.7989   | 0.6568   | 0.8632  | 0.2755   | 0.0001 |
| 0.1543        | 45.0  | 12285 | 0.1362          | 0.8018   | 0.6686   | 0.8652  | 0.2827   | 0.0001 |
| 0.1481        | 46.0  | 12558 | 0.1338          | 0.8022   | 0.6640   | 0.8656  | 0.2852   | 0.0001 |
| 0.1481        | 47.0  | 12831 | 0.1410          | 0.7999   | 0.6573   | 0.8621  | 0.2786   | 0.0001 |
| 0.1472        | 48.0  | 13104 | 0.1338          | 0.8044   | 0.6728   | 0.8675  | 0.2848   | 0.0001 |
| 0.1472        | 49.0  | 13377 | 0.1322          | 0.8058   | 0.6742   | 0.8724  | 0.2855   | 0.0001 |
| 0.1448        | 50.0  | 13650 | 0.1332          | 0.8063   | 0.6739   | 0.8703  | 0.2897   | 0.0001 |
| 0.1448        | 51.0  | 13923 | 0.1306          | 0.8063   | 0.6771   | 0.8702  | 0.2897   | 0.0001 |
| 0.1432        | 52.0  | 14196 | 0.1311          | 0.8044   | 0.6727   | 0.8654  | 0.2872   | 0.0001 |
| 0.1432        | 53.0  | 14469 | 0.1316          | 0.8071   | 0.6703   | 0.8713  | 0.2872   | 0.0001 |
| 0.1438        | 54.0  | 14742 | 0.1316          | 0.8064   | 0.6788   | 0.8688  | 0.2883   | 0.0001 |
| 0.1417        | 55.0  | 15015 | 0.1308          | 0.8061   | 0.6699   | 0.8686  | 0.2876   | 0.0001 |
| 0.1417        | 56.0  | 15288 | 0.1297          | 0.8094   | 0.6800   | 0.8744  | 0.2942   | 0.0001 |
| 0.1415        | 57.0  | 15561 | 0.1296          | 0.8087   | 0.6717   | 0.8711  | 0.2935   | 0.0001 |
| 0.1415        | 58.0  | 15834 | 0.1297          | 0.8069   | 0.6785   | 0.8708  | 0.2924   | 0.0001 |
| 0.1413        | 59.0  | 16107 | 0.1300          | 0.8087   | 0.6811   | 0.8707  | 0.2911   | 0.0001 |
| 0.1413        | 60.0  | 16380 | 0.1302          | 0.8056   | 0.6726   | 0.8658  | 0.2879   | 0.0001 |
| 0.1404        | 61.0  | 16653 | 0.1287          | 0.8096   | 0.6843   | 0.8721  | 0.2949   | 0.0001 |
| 0.1404        | 62.0  | 16926 | 0.1291          | 0.8080   | 0.6822   | 0.8690  | 0.2900   | 0.0001 |
| 0.1393        | 63.0  | 17199 | 0.1287          | 0.8076   | 0.6813   | 0.8685  | 0.2980   | 0.0001 |
| 0.1393        | 64.0  | 17472 | 0.1286          | 0.8091   | 0.6806   | 0.8722  | 0.2959   | 0.0001 |
| 0.1395        | 65.0  | 17745 | 0.1280          | 0.8093   | 0.6838   | 0.8704  | 0.2931   | 0.0001 |
| 0.1389        | 66.0  | 18018 | 0.1278          | 0.8108   | 0.6855   | 0.8744  | 0.2959   | 0.0001 |
| 0.1389        | 67.0  | 18291 | 0.1282          | 0.8098   | 0.6849   | 0.8746  | 0.2949   | 0.0001 |
| 0.1376        | 68.0  | 18564 | 0.1280          | 0.8123   | 0.6903   | 0.8771  | 0.2980   | 0.0001 |
| 0.1376        | 69.0  | 18837 | 0.1280          | 0.8105   | 0.6800   | 0.8711  | 0.2952   | 0.0001 |
| 0.1375        | 70.0  | 19110 | 0.1276          | 0.8096   | 0.6848   | 0.8709  | 0.2931   | 0.0001 |
| 0.1375        | 71.0  | 19383 | 0.1279          | 0.8073   | 0.6797   | 0.8675  | 0.2904   | 0.0001 |
| 0.1368        | 72.0  | 19656 | 0.1278          | 0.8103   | 0.6802   | 0.8719  | 0.2938   | 0.0001 |
| 0.1368        | 73.0  | 19929 | 0.1272          | 0.8091   | 0.6806   | 0.8683  | 0.2976   | 0.0001 |
| 0.137         | 74.0  | 20202 | 0.1280          | 0.8064   | 0.6777   | 0.8648  | 0.2935   | 0.0001 |
| 0.137         | 75.0  | 20475 | 0.1273          | 0.8110   | 0.6885   | 0.8731  | 0.2924   | 0.0001 |
| 0.1367        | 76.0  | 20748 | 0.1273          | 0.8089   | 0.6811   | 0.8696  | 0.2973   | 0.0001 |
| 0.1358        | 77.0  | 21021 | 0.1275          | 0.8102   | 0.6863   | 0.8739  | 0.2924   | 0.0001 |
| 0.1358        | 78.0  | 21294 | 0.1271          | 0.8122   | 0.6897   | 0.8765  | 0.2945   | 0.0001 |
| 0.1352        | 79.0  | 21567 | 0.1271          | 0.8098   | 0.6882   | 0.8697  | 0.2935   | 0.0001 |
| 0.1352        | 80.0  | 21840 | 0.1272          | 0.8124   | 0.6914   | 0.8773  | 0.2983   | 0.0001 |
| 0.1353        | 81.0  | 22113 | 0.1265          | 0.8104   | 0.6899   | 0.8716  | 0.2966   | 0.0001 |
| 0.1353        | 82.0  | 22386 | 0.1264          | 0.8105   | 0.6845   | 0.8694  | 0.2914   | 0.0001 |
| 0.1337        | 83.0  | 22659 | 0.1273          | 0.8100   | 0.6832   | 0.8701  | 0.2935   | 0.0001 |
| 0.1337        | 84.0  | 22932 | 0.1264          | 0.8124   | 0.6944   | 0.8756  | 0.2959   | 0.0001 |
| 0.1354        | 85.0  | 23205 | 0.1265          | 0.8127   | 0.6880   | 0.8750  | 0.2973   | 0.0001 |
| 0.1354        | 86.0  | 23478 | 0.1259          | 0.8136   | 0.6933   | 0.8746  | 0.2952   | 0.0001 |
| 0.1334        | 87.0  | 23751 | 0.1264          | 0.8111   | 0.6882   | 0.8738  | 0.2966   | 0.0001 |
| 0.1335        | 88.0  | 24024 | 0.1264          | 0.8127   | 0.6860   | 0.8754  | 0.2990   | 0.0001 |
| 0.1335        | 89.0  | 24297 | 0.1269          | 0.8140   | 0.6990   | 0.8792  | 0.2983   | 0.0001 |
| 0.1332        | 90.0  | 24570 | 0.1261          | 0.8155   | 0.6994   | 0.8798  | 0.2980   | 0.0001 |
| 0.1332        | 91.0  | 24843 | 0.1268          | 0.8109   | 0.6828   | 0.8728  | 0.2893   | 0.0001 |
| 0.1326        | 92.0  | 25116 | 0.1261          | 0.8124   | 0.6858   | 0.8724  | 0.2952   | 0.0001 |
| 0.1326        | 93.0  | 25389 | 0.1258          | 0.8138   | 0.6897   | 0.8759  | 0.2966   | 1e-05  |
| 0.132         | 94.0  | 25662 | 0.1268          | 0.8138   | 0.6941   | 0.8755  | 0.2976   | 1e-05  |
| 0.132         | 95.0  | 25935 | 0.1257          | 0.8134   | 0.6913   | 0.8750  | 0.2949   | 1e-05  |
| 0.1294        | 96.0  | 26208 | 0.1259          | 0.8147   | 0.6957   | 0.8763  | 0.2976   | 1e-05  |
| 0.1294        | 97.0  | 26481 | 0.1256          | 0.8126   | 0.6941   | 0.8720  | 0.2945   | 1e-05  |
| 0.1302        | 98.0  | 26754 | 0.1253          | 0.8159   | 0.6951   | 0.8785  | 0.2994   | 1e-05  |
| 0.1298        | 99.0  | 27027 | 0.1249          | 0.8142   | 0.6968   | 0.8752  | 0.2994   | 1e-05  |
| 0.1298        | 100.0 | 27300 | 0.1252          | 0.8135   | 0.6936   | 0.8732  | 0.2973   | 1e-05  |
| 0.1304        | 101.0 | 27573 | 0.1248          | 0.8149   | 0.6961   | 0.8765  | 0.2990   | 1e-05  |
| 0.1304        | 102.0 | 27846 | 0.1266          | 0.8137   | 0.6927   | 0.8738  | 0.2963   | 1e-05  |
| 0.1287        | 103.0 | 28119 | 0.1249          | 0.8146   | 0.6954   | 0.8754  | 0.2990   | 1e-05  |
| 0.1287        | 104.0 | 28392 | 0.1252          | 0.8149   | 0.6927   | 0.8770  | 0.2976   | 1e-05  |
| 0.1282        | 105.0 | 28665 | 0.1251          | 0.8152   | 0.6962   | 0.8773  | 0.2990   | 1e-05  |
| 0.1282        | 106.0 | 28938 | 0.1251          | 0.8147   | 0.6964   | 0.8770  | 0.2997   | 1e-05  |
| 0.1293        | 107.0 | 29211 | 0.1250          | 0.8145   | 0.6946   | 0.8759  | 0.2980   | 1e-05  |
| 0.1293        | 108.0 | 29484 | 0.1249          | 0.8145   | 0.6935   | 0.8751  | 0.2997   | 0.0000 |
| 0.129         | 109.0 | 29757 | 0.1253          | 0.8116   | 0.6901   | 0.8713  | 0.2952   | 0.0000 |
| 0.1293        | 110.0 | 30030 | 0.1252          | 0.8144   | 0.6949   | 0.8768  | 0.2980   | 0.0000 |
| 0.1293        | 111.0 | 30303 | 0.1250          | 0.8137   | 0.6932   | 0.8755  | 0.2983   | 0.0000 |
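
The metrics in this table appear to follow standard multi-label conventions. The sketch below shows one way such metrics could be computed with scikit-learn; the 0.5 threshold and the interpretation of accuracy as exact-match (subset) accuracy are assumptions, not details stated in this card.

```python
# Illustrative multi-label metric computation with scikit-learn (toy data).
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

y_true = np.array([[1, 0, 1], [0, 1, 0]])               # ground-truth label vectors
y_prob = np.array([[0.9, 0.2, 0.7], [0.1, 0.8, 0.4]])   # predicted probabilities
y_pred = (y_prob >= 0.5).astype(int)                     # assumed 0.5 threshold

print("F1 micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 macro:", f1_score(y_true, y_pred, average="macro"))
print("ROC AUC :", roc_auc_score(y_true, y_prob, average="micro"))
print("Accuracy:", accuracy_score(y_true, y_pred))       # exact-match (subset) accuracy
```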


### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
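
To check that a local environment matches these versions before loading the checkpoint, a small verification sketch:

```python
# Print installed library versions to compare against the list above.
import datasets, tokenizers, torch, transformers

print("Transformers:", transformers.__version__)
print("PyTorch     :", torch.__version__)
print("Datasets    :", datasets.__version__)
print("Tokenizers  :", tokenizers.__version__)
```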