---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
model-index:
- name: drone-DinoVdeau-produttoria-probabilities-large-2024_11_04-batch-size64_freeze_probs
  results: []
---
|
|
|
|
|
|
# drone-DinoVdeau-produttoria-probabilities-large-2024_11_04-batch-size64_freeze_probs |
|
|
|
This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset.
It achieves the following results on the evaluation set (a hedged usage sketch follows the list):

- Loss: 0.3194
- RMSE: 0.2405
- MAE: 0.1536
- R²: 0.4281
- Explained Variance: 0.4294
- Learning Rate: 0.0000 (final value as logged, rounded to four decimal places)
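
The checkpoint should be loadable with the standard transformers image-classification API. Below is a minimal sketch, assuming the repository follows that layout and using the model-index name as a stand-in Hub id (prefix with the actual organization); if the repo ships a custom regression head, it may instead require the project's own loading code.

```python
# Minimal inference sketch (assumptions noted in the lead-in above).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Stand-in Hub id taken from the model-index name; adjust to the real repo path.
model_id = "drone-DinoVdeau-produttoria-probabilities-large-2024_11_04-batch-size64_freeze_probs"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("drone_image.jpg").convert("RGB")  # any RGB drone image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The "probabilities" variant appears to regress per-label probabilities, so a
# sigmoid readout (rather than softmax) is the likely choice -- an assumption.
probs = torch.sigmoid(logits)
print(probs)
```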
|
|
|
## Model description |
|
|
|
More information needed |
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
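
For reference, here is a minimal sketch of how these hyperparameters might be expressed as transformers `TrainingArguments`. The actual training script is not included in this card, so the extra fields (`output_dir`, evaluation cadence) are illustrative assumptions rather than the recorded configuration.

```python
# Hedged sketch only: maps the hyperparameters listed above onto
# transformers.TrainingArguments. Fields not listed in the card are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dinov2-large-drone-finetune",  # hypothetical output path
    learning_rate=1e-3,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",  # as declared; note the logged rate column
                                 # drops step-wise, which looks more like a
                                 # plateau-based schedule in practice
    num_train_epochs=150,
    fp16=True,                   # "Native AMP" mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",       # assumption: validation metrics are per-epoch
)
```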
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step  | Validation Loss | RMSE   | MAE    | R²     | Explained Variance | Learning Rate |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:------------------:|:-------------:|
| No log        | 1.0   | 181   | 0.3610          | 0.2645 | 0.1878 | 0.2818 | 0.3037             | 0.001         |
| No log        | 2.0   | 362   | 0.3465          | 0.2566 | 0.1778 | 0.3349 | 0.3479             | 0.001         |
| 0.4149        | 3.0   | 543   | 0.3413          | 0.2532 | 0.1731 | 0.3536 | 0.3600             | 0.001         |
| 0.4149        | 4.0   | 724   | 0.3406          | 0.2532 | 0.1743 | 0.3519 | 0.3591             | 0.001         |
| 0.4149        | 5.0   | 905   | 0.3342          | 0.2496 | 0.1661 | 0.3702 | 0.3731             | 0.001         |
| 0.3486        | 6.0   | 1086  | 0.3385          | 0.2512 | 0.1739 | 0.3651 | 0.3724             | 0.001         |
| 0.3486        | 7.0   | 1267  | 0.3321          | 0.2476 | 0.1650 | 0.3836 | 0.3846             | 0.001         |
| 0.3486        | 8.0   | 1448  | 0.3332          | 0.2484 | 0.1629 | 0.3802 | 0.3811             | 0.001         |
| 0.3462        | 9.0   | 1629  | 0.3305          | 0.2468 | 0.1652 | 0.3859 | 0.3872             | 0.001         |
| 0.3462        | 10.0  | 1810  | 0.3314          | 0.2476 | 0.1655 | 0.3827 | 0.3850             | 0.001         |
| 0.3462        | 11.0  | 1991  | 0.3320          | 0.2474 | 0.1602 | 0.3840 | 0.3866             | 0.001         |
| 0.3391        | 12.0  | 2172  | 0.3342          | 0.2494 | 0.1683 | 0.3761 | 0.3843             | 0.001         |
| 0.3391        | 13.0  | 2353  | 0.3325          | 0.2480 | 0.1649 | 0.3821 | 0.3836             | 0.001         |
| 0.3372        | 14.0  | 2534  | 0.3323          | 0.2472 | 0.1700 | 0.3878 | 0.3930             | 0.001         |
| 0.3372        | 15.0  | 2715  | 0.3349          | 0.2493 | 0.1703 | 0.3749 | 0.3828             | 0.001         |
| 0.3372        | 16.0  | 2896  | 0.3279          | 0.2448 | 0.1649 | 0.3983 | 0.4019             | 0.0001        |
| 0.3343        | 17.0  | 3077  | 0.3279          | 0.2448 | 0.1648 | 0.3984 | 0.4041             | 0.0001        |
| 0.3343        | 18.0  | 3258  | 0.3262          | 0.2440 | 0.1622 | 0.4025 | 0.4032             | 0.0001        |
| 0.3343        | 19.0  | 3439  | 0.3247          | 0.2432 | 0.1588 | 0.4046 | 0.4051             | 0.0001        |
| 0.3271        | 20.0  | 3620  | 0.3261          | 0.2433 | 0.1625 | 0.4059 | 0.4106             | 0.0001        |
| 0.3271        | 21.0  | 3801  | 0.3241          | 0.2424 | 0.1606 | 0.4095 | 0.4119             | 0.0001        |
| 0.3271        | 22.0  | 3982  | 0.3236          | 0.2422 | 0.1587 | 0.4111 | 0.4132             | 0.0001        |
| 0.3275        | 23.0  | 4163  | 0.3242          | 0.2423 | 0.1601 | 0.4107 | 0.4134             | 0.0001        |
| 0.3275        | 24.0  | 4344  | 0.3227          | 0.2414 | 0.1586 | 0.4150 | 0.4161             | 0.0001        |
| 0.3247        | 25.0  | 4525  | 0.3224          | 0.2413 | 0.1587 | 0.4148 | 0.4162             | 0.0001        |
| 0.3247        | 26.0  | 4706  | 0.3218          | 0.2413 | 0.1557 | 0.4143 | 0.4155             | 0.0001        |
| 0.3247        | 27.0  | 4887  | 0.3227          | 0.2416 | 0.1603 | 0.4138 | 0.4154             | 0.0001        |
| 0.3231        | 28.0  | 5068  | 0.3207          | 0.2405 | 0.1562 | 0.4186 | 0.4197             | 0.0001        |
| 0.3231        | 29.0  | 5249  | 0.3221          | 0.2411 | 0.1597 | 0.4163 | 0.4175             | 0.0001        |
| 0.3231        | 30.0  | 5430  | 0.3225          | 0.2413 | 0.1608 | 0.4164 | 0.4190             | 0.0001        |
| 0.3215        | 31.0  | 5611  | 0.3224          | 0.2416 | 0.1535 | 0.4134 | 0.4164             | 0.0001        |
| 0.3215        | 32.0  | 5792  | 0.3213          | 0.2408 | 0.1553 | 0.4180 | 0.4185             | 0.0001        |
| 0.3215        | 33.0  | 5973  | 0.3216          | 0.2414 | 0.1583 | 0.4123 | 0.4142             | 0.0001        |
| 0.3227        | 34.0  | 6154  | 0.3205          | 0.2406 | 0.1562 | 0.4172 | 0.4181             | 0.0001        |
| 0.3227        | 35.0  | 6335  | 0.3198          | 0.2399 | 0.1535 | 0.4215 | 0.4224             | 0.0001        |
| 0.3202        | 36.0  | 6516  | 0.3211          | 0.2406 | 0.1577 | 0.4187 | 0.4194             | 0.0001        |
| 0.3202        | 37.0  | 6697  | 0.3204          | 0.2403 | 0.1520 | 0.4188 | 0.4203             | 0.0001        |
| 0.3202        | 38.0  | 6878  | 0.3214          | 0.2409 | 0.1560 | 0.4170 | 0.4185             | 0.0001        |
| 0.3195        | 39.0  | 7059  | 0.3195          | 0.2397 | 0.1520 | 0.4226 | 0.4232             | 0.0001        |
| 0.3195        | 40.0  | 7240  | 0.3208          | 0.2404 | 0.1577 | 0.4204 | 0.4231             | 0.0001        |
| 0.3195        | 41.0  | 7421  | 0.3198          | 0.2398 | 0.1547 | 0.4217 | 0.4233             | 0.0001        |
| 0.3192        | 42.0  | 7602  | 0.3218          | 0.2410 | 0.1589 | 0.4174 | 0.4218             | 0.0001        |
| 0.3192        | 43.0  | 7783  | 0.3190          | 0.2396 | 0.1544 | 0.4235 | 0.4254             | 0.0001        |
| 0.3192        | 44.0  | 7964  | 0.3190          | 0.2396 | 0.1534 | 0.4230 | 0.4239             | 0.0001        |
| 0.3178        | 45.0  | 8145  | 0.3198          | 0.2397 | 0.1566 | 0.4239 | 0.4260             | 0.0001        |
| 0.3178        | 46.0  | 8326  | 0.3193          | 0.2398 | 0.1556 | 0.4213 | 0.4231             | 0.0001        |
| 0.3175        | 47.0  | 8507  | 0.3190          | 0.2393 | 0.1524 | 0.4245 | 0.4257             | 0.0001        |
| 0.3175        | 48.0  | 8688  | 0.3193          | 0.2398 | 0.1525 | 0.4215 | 0.4230             | 0.0001        |
| 0.3175        | 49.0  | 8869  | 0.3207          | 0.2405 | 0.1558 | 0.4187 | 0.4196             | 0.0001        |
| 0.3174        | 50.0  | 9050  | 0.3198          | 0.2400 | 0.1572 | 0.4218 | 0.4237             | 1e-05         |
| 0.3174        | 51.0  | 9231  | 0.3244          | 0.2426 | 0.1602 | 0.4092 | 0.4173             | 1e-05         |
| 0.3174        | 52.0  | 9412  | 0.3190          | 0.2396 | 0.1550 | 0.4227 | 0.4235             | 1e-05         |
| 0.3152        | 53.0  | 9593  | 0.3189          | 0.2394 | 0.1552 | 0.4249 | 0.4270             | 1e-05         |
| 0.3152        | 54.0  | 9774  | 0.3194          | 0.2396 | 0.1540 | 0.4227 | 0.4239             | 1e-05         |
| 0.3152        | 55.0  | 9955  | 0.3185          | 0.2391 | 0.1539 | 0.4250 | 0.4258             | 1e-05         |
| 0.317         | 56.0  | 10136 | 0.3181          | 0.2388 | 0.1527 | 0.4273 | 0.4281             | 1e-05         |
| 0.317         | 57.0  | 10317 | 0.3187          | 0.2392 | 0.1532 | 0.4259 | 0.4274             | 1e-05         |
| 0.317         | 58.0  | 10498 | 0.3201          | 0.2401 | 0.1567 | 0.4217 | 0.4259             | 1e-05         |
| 0.314         | 59.0  | 10679 | 0.3181          | 0.2388 | 0.1528 | 0.4270 | 0.4282             | 1e-05         |
| 0.314         | 60.0  | 10860 | 0.3182          | 0.2389 | 0.1534 | 0.4256 | 0.4268             | 1e-05         |
| 0.314         | 61.0  | 11041 | 0.3186          | 0.2391 | 0.1510 | 0.4255 | 0.4266             | 1e-05         |
| 0.314         | 62.0  | 11222 | 0.3203          | 0.2398 | 0.1596 | 0.4240 | 0.4262             | 1e-05         |
| 0.314         | 63.0  | 11403 | 0.3196          | 0.2397 | 0.1570 | 0.4242 | 0.4276             | 1e-05         |
| 0.3142        | 64.0  | 11584 | 0.3181          | 0.2391 | 0.1527 | 0.4244 | 0.4253             | 1e-05         |
| 0.3142        | 65.0  | 11765 | 0.3185          | 0.2390 | 0.1550 | 0.4259 | 0.4264             | 1e-05         |
| 0.3142        | 66.0  | 11946 | 0.3186          | 0.2389 | 0.1562 | 0.4278 | 0.4291             | 0.0000        |
| 0.3131        | 67.0  | 12127 | 0.3181          | 0.2387 | 0.1526 | 0.4270 | 0.4279             | 0.0000        |
| 0.3131        | 68.0  | 12308 | 0.3195          | 0.2397 | 0.1549 | 0.4221 | 0.4257             | 0.0000        |
| 0.3131        | 69.0  | 12489 | 0.3183          | 0.2390 | 0.1540 | 0.4259 | 0.4275             | 0.0000        |
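
The regression metrics reported above (RMSE, MAE, R², explained variance) match the standard scikit-learn definitions, so a `compute_metrics` callback along the following lines would reproduce them. This is a sketch under the assumption that predictions and labels arrive as float arrays of per-label probabilities; the callback actually used for this run is not published in the card.

```python
# Hedged sketch of a compute_metrics function producing the table's columns.
import numpy as np
from sklearn.metrics import (
    explained_variance_score,
    mean_absolute_error,
    mean_squared_error,
    r2_score,
)

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    predictions = np.asarray(predictions, dtype=np.float64)
    labels = np.asarray(labels, dtype=np.float64)
    return {
        "rmse": float(np.sqrt(mean_squared_error(labels, predictions))),
        "mae": float(mean_absolute_error(labels, predictions)),
        "r2": float(r2_score(labels, predictions)),
        "explained_variance": float(explained_variance_score(labels, predictions)),
    }
```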
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.41.0 |
|
- Pytorch 2.5.0+cu124 |
|
- Datasets 3.0.2 |
|
- Tokenizers 0.19.1 |
|
|