---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
model-index:
- name: drone-DinoVdeau-produttoria_binary-probabilities-large-2024_11_03-batch-size64_freeze_probs
  results: []
---

# drone-DinoVdeau-produttoria_binary-probabilities-large-2024_11_03-batch-size64_freeze_probs

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). The training dataset is not specified in this card; the model name suggests drone imagery annotated with per-class binary probabilities. A minimal usage sketch follows the evaluation results below.
The model achieves the following results on the evaluation set:
- Loss: 0.3499
- RMSE: 0.1848
- MAE: 0.1248
- R²: 0.4361
- Explained variance: 0.4376
- Learning rate: 1e-05
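
The sketch below shows one way to load this checkpoint for inference. It is a minimal sketch, not documented usage from the authors: the auto classes, the hub repository id (shown here without its owner namespace), and the sigmoid over the logits are all assumptions based on the model name and the metrics above.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumption: the checkpoint loads with the standard auto classes; the hub id
# below omits the owner namespace, which the actual repository would include.
repo = "drone-DinoVdeau-produttoria_binary-probabilities-large-2024_11_03-batch-size64_freeze_probs"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

image = Image.open("drone_tile.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The model name suggests independent per-class probabilities, so a sigmoid
# (not a softmax) is applied; this is an assumption, not confirmed by the card.
probs = torch.sigmoid(logits)
print(probs)
```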

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a code sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
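
For reproducibility, the sketch below maps the listed hyperparameters onto `transformers.TrainingArguments`. It assumes the standard `Trainer` was used; `output_dir` is hypothetical, and the listed Adam betas and epsilon match the `TrainingArguments` defaults.

```python
from transformers import TrainingArguments

# Hedged sketch: assumes the standard transformers Trainer; output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="drone-DinoVdeau-produttoria_binary-probabilities",
    learning_rate=1e-3,              # listed learning_rate: 0.001
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,                       # mixed_precision_training: Native AMP
)
```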

### Training results

| Training Loss | Epoch | Step  | Validation Loss | RMSE   | MAE    | R²     | Explained Variance | Learning Rate |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:------------------:|:-------------:|
| No log        | 1.0   | 181   | 0.3795          | 0.2067 | 0.1489 | 0.2894 | 0.3009             | 0.001  |
| No log        | 2.0   | 362   | 0.3674          | 0.1983 | 0.1374 | 0.3517 | 0.3548             | 0.001  |
| 0.4416        | 3.0   | 543   | 0.3671          | 0.1981 | 0.1414 | 0.3521 | 0.3569             | 0.001  |
| 0.4416        | 4.0   | 724   | 0.3632          | 0.1952 | 0.1391 | 0.3708 | 0.3749             | 0.001  |
| 0.4416        | 5.0   | 905   | 0.3679          | 0.1993 | 0.1418 | 0.3453 | 0.3614             | 0.001  |
| 0.3813        | 6.0   | 1086  | 0.3625          | 0.1951 | 0.1380 | 0.3718 | 0.3743             | 0.001  |
| 0.3813        | 7.0   | 1267  | 0.3619          | 0.1941 | 0.1348 | 0.3771 | 0.3837             | 0.001  |
| 0.3813        | 8.0   | 1448  | 0.3613          | 0.1935 | 0.1368 | 0.3788 | 0.3809             | 0.001  |
| 0.3785        | 9.0   | 1629  | 0.3604          | 0.1934 | 0.1354 | 0.3812 | 0.3833             | 0.001  |
| 0.3785        | 10.0  | 1810  | 0.3613          | 0.1932 | 0.1338 | 0.3812 | 0.3844             | 0.001  |
| 0.3785        | 11.0  | 1991  | 0.3604          | 0.1931 | 0.1323 | 0.3845 | 0.3857             | 0.001  |
| 0.3743        | 12.0  | 2172  | 0.3618          | 0.1942 | 0.1386 | 0.3774 | 0.3844             | 0.001  |
| 0.3743        | 13.0  | 2353  | 0.3593          | 0.1925 | 0.1343 | 0.3875 | 0.3894             | 0.001  |
| 0.3732        | 14.0  | 2534  | 0.3605          | 0.1932 | 0.1352 | 0.3831 | 0.3863             | 0.001  |
| 0.3732        | 15.0  | 2715  | 0.3605          | 0.1935 | 0.1366 | 0.3817 | 0.3836             | 0.001  |
| 0.3732        | 16.0  | 2896  | 0.3600          | 0.1922 | 0.1312 | 0.3882 | 0.3910             | 0.001  |
| 0.3733        | 17.0  | 3077  | 0.3629          | 0.1932 | 0.1378 | 0.3843 | 0.3882             | 0.001  |
| 0.3733        | 18.0  | 3258  | 0.3615          | 0.1943 | 0.1323 | 0.3768 | 0.3840             | 0.001  |
| 0.3733        | 19.0  | 3439  | 0.3595          | 0.1922 | 0.1330 | 0.3895 | 0.3911             | 0.001  |
| 0.3723        | 20.0  | 3620  | 0.3566          | 0.1902 | 0.1330 | 0.4006 | 0.4041             | 0.0001 |
| 0.3723        | 21.0  | 3801  | 0.3549          | 0.1890 | 0.1306 | 0.4076 | 0.4089             | 0.0001 |
| 0.3723        | 22.0  | 3982  | 0.3545          | 0.1886 | 0.1308 | 0.4096 | 0.4108             | 0.0001 |
| 0.3683        | 23.0  | 4163  | 0.3545          | 0.1882 | 0.1303 | 0.4116 | 0.4124             | 0.0001 |
| 0.3683        | 24.0  | 4344  | 0.3540          | 0.1882 | 0.1317 | 0.4121 | 0.4131             | 0.0001 |
| 0.3654        | 25.0  | 4525  | 0.3546          | 0.1883 | 0.1284 | 0.4113 | 0.4126             | 0.0001 |
| 0.3654        | 26.0  | 4706  | 0.3529          | 0.1876 | 0.1264 | 0.4154 | 0.4165             | 0.0001 |
| 0.3654        | 27.0  | 4887  | 0.3533          | 0.1874 | 0.1294 | 0.4166 | 0.4177             | 0.0001 |
| 0.3652        | 28.0  | 5068  | 0.3532          | 0.1876 | 0.1294 | 0.4160 | 0.4169             | 0.0001 |
| 0.3652        | 29.0  | 5249  | 0.3531          | 0.1871 | 0.1302 | 0.4184 | 0.4192             | 0.0001 |
| 0.3652        | 30.0  | 5430  | 0.3536          | 0.1878 | 0.1292 | 0.4148 | 0.4160             | 0.0001 |
| 0.3628        | 31.0  | 5611  | 0.3531          | 0.1877 | 0.1267 | 0.4152 | 0.4175             | 0.0001 |
| 0.3628        | 32.0  | 5792  | 0.3528          | 0.1876 | 0.1288 | 0.4162 | 0.4168             | 0.0001 |
| 0.3628        | 33.0  | 5973  | 0.3515          | 0.1864 | 0.1273 | 0.4225 | 0.4230             | 0.0001 |
| 0.3638        | 34.0  | 6154  | 0.3520          | 0.1868 | 0.1263 | 0.4202 | 0.4216             | 0.0001 |
| 0.3638        | 35.0  | 6335  | 0.3518          | 0.1866 | 0.1278 | 0.4215 | 0.4220             | 0.0001 |
| 0.3618        | 36.0  | 6516  | 0.3523          | 0.1871 | 0.1285 | 0.4193 | 0.4196             | 0.0001 |
| 0.3618        | 37.0  | 6697  | 0.3516          | 0.1866 | 0.1273 | 0.4217 | 0.4225             | 0.0001 |
| 0.3618        | 38.0  | 6878  | 0.3527          | 0.1878 | 0.1274 | 0.4157 | 0.4184             | 0.0001 |
| 0.3611        | 39.0  | 7059  | 0.3512          | 0.1862 | 0.1266 | 0.4242 | 0.4249             | 0.0001 |
| 0.3611        | 40.0  | 7240  | 0.3521          | 0.1866 | 0.1302 | 0.4224 | 0.4237             | 0.0001 |
| 0.3611        | 41.0  | 7421  | 0.3507          | 0.1858 | 0.1266 | 0.4264 | 0.4275             | 0.0001 |
| 0.3613        | 42.0  | 7602  | 0.3513          | 0.1860 | 0.1278 | 0.4263 | 0.4272             | 0.0001 |
| 0.3613        | 43.0  | 7783  | 0.3511          | 0.1860 | 0.1274 | 0.4262 | 0.4273             | 0.0001 |
| 0.3613        | 44.0  | 7964  | 0.3514          | 0.1859 | 0.1244 | 0.4266 | 0.4282             | 0.0001 |
| 0.3603        | 45.0  | 8145  | 0.3525          | 0.1863 | 0.1273 | 0.4249 | 0.4276             | 0.0001 |
| 0.3603        | 46.0  | 8326  | 0.3505          | 0.1856 | 0.1258 | 0.4275 | 0.4286             | 0.0001 |
| 0.3603        | 47.0  | 8507  | 0.3517          | 0.1866 | 0.1250 | 0.4231 | 0.4258             | 0.0001 |
| 0.3603        | 48.0  | 8688  | 0.3504          | 0.1856 | 0.1259 | 0.4286 | 0.4292             | 0.0001 |
| 0.3603        | 49.0  | 8869  | 0.3507          | 0.1857 | 0.1272 | 0.4274 | 0.4284             | 0.0001 |
| 0.3604        | 50.0  | 9050  | 0.3516          | 0.1857 | 0.1283 | 0.4280 | 0.4289             | 0.0001 |
| 0.3604        | 51.0  | 9231  | 0.3529          | 0.1867 | 0.1288 | 0.4227 | 0.4282             | 0.0001 |
| 0.3604        | 52.0  | 9412  | 0.3506          | 0.1857 | 0.1268 | 0.4282 | 0.4295             | 0.0001 |
| 0.3592        | 53.0  | 9593  | 0.3505          | 0.1856 | 0.1273 | 0.4286 | 0.4302             | 0.0001 |
| 0.3592        | 54.0  | 9774  | 0.3502          | 0.1854 | 0.1266 | 0.4300 | 0.4304             | 0.0001 |
| 0.3592        | 55.0  | 9955  | 0.3501          | 0.1854 | 0.1251 | 0.4299 | 0.4319             | 0.0001 |
| 0.3601        | 56.0  | 10136 | 0.3507          | 0.1858 | 0.1243 | 0.4273 | 0.4294             | 0.0001 |
| 0.3601        | 57.0  | 10317 | 0.3509          | 0.1860 | 0.1253 | 0.4274 | 0.4297             | 0.0001 |
| 0.3601        | 58.0  | 10498 | 0.3493          | 0.1846 | 0.1251 | 0.4338 | 0.4354             | 0.0001 |
| 0.3601        | 59.0  | 10679 | 0.3501          | 0.1855 | 0.1241 | 0.4282 | 0.4299             | 0.0001 |
| 0.3601        | 60.0  | 10860 | 0.3501          | 0.1852 | 0.1259 | 0.4303 | 0.4325             | 0.0001 |
| 0.3588        | 61.0  | 11041 | 0.3498          | 0.1850 | 0.1264 | 0.4305 | 0.4310             | 0.0001 |
| 0.3588        | 62.0  | 11222 | 0.3498          | 0.1850 | 0.1265 | 0.4323 | 0.4333             | 0.0001 |
| 0.3588        | 63.0  | 11403 | 0.3502          | 0.1851 | 0.1270 | 0.4321 | 0.4339             | 0.0001 |
| 0.3579        | 64.0  | 11584 | 0.3500          | 0.1853 | 0.1256 | 0.4300 | 0.4312             | 0.0001 |
| 0.3579        | 65.0  | 11765 | 0.3501          | 0.1854 | 0.1280 | 0.4299 | 0.4304             | 1e-05  |
| 0.3579        | 66.0  | 11946 | 0.3493          | 0.1847 | 0.1253 | 0.4336 | 0.4342             | 1e-05  |
| 0.3564        | 67.0  | 12127 | 0.3494          | 0.1847 | 0.1261 | 0.4334 | 0.4340             | 1e-05  |
| 0.3564        | 68.0  | 12308 | 0.3500          | 0.1856 | 0.1261 | 0.4291 | 0.4307             | 1e-05  |
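
The validation columns above are standard regression metrics over the predicted probabilities. A minimal sketch of computing them with scikit-learn, on hypothetical prediction and target arrays:

```python
import numpy as np
from sklearn.metrics import (
    explained_variance_score,
    mean_absolute_error,
    mean_squared_error,
    r2_score,
)

# Hypothetical arrays: rows are samples, columns are per-class probabilities.
y_true = np.array([[0.0, 1.0, 0.5], [1.0, 0.0, 0.2]])
y_pred = np.array([[0.1, 0.8, 0.4], [0.9, 0.1, 0.3]])

rmse = mean_squared_error(y_true, y_pred) ** 0.5  # RMSE = sqrt(MSE)
mae = mean_absolute_error(y_true, y_pred)
r2 = r2_score(y_true, y_pred)
evs = explained_variance_score(y_true, y_pred)
print(f"RMSE={rmse:.4f} MAE={mae:.4f} R2={r2:.4f} ExplainedVar={evs:.4f}")
```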


### Framework versions

- Transformers 4.41.0
- PyTorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.19.1