---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: drone-DinoVdeau-produttoria_binary-binary-large-2024_11_03-batch-size64_freeze
  results: []
---

# drone-DinoVdeau-produttoria_binary-binary-large-2024_11_03-batch-size64_freeze

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2854
- F1 Micro: 0.8468
- F1 Macro: 0.6351
- Accuracy: 0.2786
- Learning Rate: 0.0000 (final value, below the four-decimal display precision)
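
Because the card reports micro- and macro-averaged F1 alongside a much lower exact-match accuracy, the checkpoint is most plausibly a multi-label classifier with per-label sigmoid outputs. The sketch below shows inference under that assumption; the repository id, input image file, and 0.5 threshold are illustrative, not documented by this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical repository id, taken from the model name above.
repo_id = "drone-DinoVdeau-produttoria_binary-binary-large-2024_11_03-batch-size64_freeze"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("drone_tile.jpg")  # illustrative input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label reading: per-label sigmoid probabilities, thresholded at 0.5 (assumed).
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs.tolist()) if p > 0.5]
print(predicted)
```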

## Model description

More information needed. (The `freeze` suffix in the model name suggests the DINOv2 backbone was frozen and only the classification head was trained, but the card does not document this.)
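
If the `freeze` suffix does mean a frozen feature extractor, the setup would look roughly like the sketch below. The attribute name `dinov2` matches the Transformers implementation of DINOv2 classification models; the label count and problem type are assumptions, not documented values.

```python
from transformers import AutoModelForImageClassification

model = AutoModelForImageClassification.from_pretrained(
    "facebook/dinov2-large",
    num_labels=10,  # placeholder: the card does not document the label count
    problem_type="multi_label_classification",  # assumed from the reported metrics
)

# Freeze the DINOv2 backbone so only the classification head is trained.
for param in model.dinov2.parameters():
    param.requires_grad = False
```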

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged code sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
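
Expressed through the Transformers `Trainer`, these settings would correspond roughly to the `TrainingArguments` below. This is a sketch only: the output directory is a placeholder, the per-epoch evaluation is assumed from the results table, and the stepwise learning-rate drops visible there (0.001 → 0.0001 → 1e-05 → below display precision) hint at an additional plateau-based reduction not captured by the recorded `linear` scheduler type.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters (Transformers 4.41 argument names).
training_args = TrainingArguments(
    output_dir="drone-DinoVdeau-produttoria_binary",  # placeholder path
    learning_rate=1e-3,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,  # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumed: the table logs validation metrics once per epoch
)
```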

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Accuracy | Learning Rate |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:--------:|:-------------:|
| No log | 1.0 | 181 | 0.3236 | 0.8262 | 0.5774 | 0.2630 | 0.001 |
| No log | 2.0 | 362 | 0.3146 | 0.8379 | 0.6199 | 0.2412 | 0.001 |
| 0.3995 | 3.0 | 543 | 0.3090 | 0.8398 | 0.6044 | 0.2555 | 0.001 |
| 0.3995 | 4.0 | 724 | 0.3074 | 0.8349 | 0.6003 | 0.2562 | 0.001 |
| 0.3995 | 5.0 | 905 | 0.3039 | 0.8406 | 0.6248 | 0.2516 | 0.001 |
| 0.3299 | 6.0 | 1086 | 0.3060 | 0.8420 | 0.6225 | 0.2596 | 0.001 |
| 0.3299 | 7.0 | 1267 | 0.3014 | 0.8387 | 0.5955 | 0.2820 | 0.001 |
| 0.3299 | 8.0 | 1448 | 0.3013 | 0.8391 | 0.5975 | 0.2703 | 0.001 |
| 0.3216 | 9.0 | 1629 | 0.3010 | 0.8407 | 0.5974 | 0.2841 | 0.001 |
| 0.3216 | 10.0 | 1810 | 0.3007 | 0.8376 | 0.5938 | 0.2711 | 0.001 |
| 0.3216 | 11.0 | 1991 | 0.3036 | 0.8349 | 0.5762 | 0.2773 | 0.001 |
| 0.3167 | 12.0 | 2172 | 0.3013 | 0.8385 | 0.6115 | 0.2674 | 0.001 |
| 0.3167 | 13.0 | 2353 | 0.2978 | 0.8421 | 0.6146 | 0.2648 | 0.001 |
| 0.315 | 14.0 | 2534 | 0.2977 | 0.8400 | 0.6059 | 0.2734 | 0.001 |
| 0.315 | 15.0 | 2715 | 0.2981 | 0.8434 | 0.6075 | 0.2666 | 0.001 |
| 0.315 | 16.0 | 2896 | 0.2974 | 0.8394 | 0.5933 | 0.2747 | 0.001 |
| 0.3147 | 17.0 | 3077 | 0.2984 | 0.8438 | 0.6147 | 0.2664 | 0.001 |
| 0.3147 | 18.0 | 3258 | 0.3023 | 0.8356 | 0.5804 | 0.2763 | 0.001 |
| 0.3147 | 19.0 | 3439 | 0.2985 | 0.8424 | 0.6159 | 0.2739 | 0.001 |
| 0.3122 | 20.0 | 3620 | 0.2968 | 0.8412 | 0.5984 | 0.2807 | 0.001 |
| 0.3122 | 21.0 | 3801 | 0.3005 | 0.8419 | 0.6060 | 0.2703 | 0.001 |
| 0.3122 | 22.0 | 3982 | 0.2982 | 0.8375 | 0.5804 | 0.2747 | 0.001 |
| 0.3149 | 23.0 | 4163 | 0.2939 | 0.8436 | 0.6152 | 0.2781 | 0.001 |
| 0.3149 | 24.0 | 4344 | 0.2948 | 0.8453 | 0.6229 | 0.2760 | 0.001 |
| 0.3118 | 25.0 | 4525 | 0.2968 | 0.8427 | 0.6103 | 0.2737 | 0.001 |
| 0.3118 | 26.0 | 4706 | 0.2956 | 0.8421 | 0.6045 | 0.2755 | 0.001 |
| 0.3118 | 27.0 | 4887 | 0.2959 | 0.8438 | 0.6115 | 0.2765 | 0.001 |
| 0.3126 | 28.0 | 5068 | 0.2955 | 0.8447 | 0.6191 | 0.2693 | 0.001 |
| 0.3126 | 29.0 | 5249 | 0.3011 | 0.8438 | 0.6216 | 0.2664 | 0.001 |
| 0.3126 | 30.0 | 5430 | 0.2921 | 0.8437 | 0.6025 | 0.2810 | 0.0001 |
| 0.3093 | 31.0 | 5611 | 0.2904 | 0.8439 | 0.6072 | 0.2812 | 0.0001 |
| 0.3093 | 32.0 | 5792 | 0.2903 | 0.8437 | 0.6112 | 0.2810 | 0.0001 |
| 0.3093 | 33.0 | 5973 | 0.2889 | 0.8462 | 0.6202 | 0.2854 | 0.0001 |
| 0.3049 | 34.0 | 6154 | 0.2896 | 0.8446 | 0.6151 | 0.2862 | 0.0001 |
| 0.3049 | 35.0 | 6335 | 0.2887 | 0.8449 | 0.6112 | 0.2867 | 0.0001 |
| 0.3012 | 36.0 | 6516 | 0.2889 | 0.8447 | 0.6120 | 0.2836 | 0.0001 |
| 0.3012 | 37.0 | 6697 | 0.2883 | 0.8476 | 0.6256 | 0.2867 | 0.0001 |
| 0.3012 | 38.0 | 6878 | 0.2905 | 0.8453 | 0.6057 | 0.2825 | 0.0001 |
| 0.299 | 39.0 | 7059 | 0.2878 | 0.8471 | 0.6254 | 0.2854 | 0.0001 |
| 0.299 | 40.0 | 7240 | 0.2886 | 0.8468 | 0.6223 | 0.2810 | 0.0001 |
| 0.299 | 41.0 | 7421 | 0.2877 | 0.8473 | 0.6261 | 0.2843 | 0.0001 |
| 0.2989 | 42.0 | 7602 | 0.2878 | 0.8477 | 0.6199 | 0.2856 | 0.0001 |
| 0.2989 | 43.0 | 7783 | 0.2872 | 0.8479 | 0.6288 | 0.2830 | 0.0001 |
| 0.2989 | 44.0 | 7964 | 0.2868 | 0.8464 | 0.6190 | 0.2841 | 0.0001 |
| 0.2983 | 45.0 | 8145 | 0.2870 | 0.8463 | 0.6236 | 0.2838 | 0.0001 |
| 0.2983 | 46.0 | 8326 | 0.2868 | 0.8460 | 0.6151 | 0.2825 | 0.0001 |
| 0.298 | 47.0 | 8507 | 0.2872 | 0.8462 | 0.6211 | 0.2846 | 0.0001 |
| 0.298 | 48.0 | 8688 | 0.2866 | 0.8467 | 0.6231 | 0.2836 | 0.0001 |
| 0.298 | 49.0 | 8869 | 0.2863 | 0.8460 | 0.6161 | 0.2859 | 0.0001 |
| 0.2965 | 50.0 | 9050 | 0.2864 | 0.8483 | 0.6255 | 0.2846 | 0.0001 |
| 0.2965 | 51.0 | 9231 | 0.2891 | 0.8486 | 0.6278 | 0.2849 | 0.0001 |
| 0.2965 | 52.0 | 9412 | 0.2856 | 0.8464 | 0.6255 | 0.2851 | 0.0001 |
| 0.2956 | 53.0 | 9593 | 0.2872 | 0.8490 | 0.6458 | 0.2789 | 0.0001 |
| 0.2956 | 54.0 | 9774 | 0.2856 | 0.8477 | 0.6244 | 0.2903 | 0.0001 |
| 0.2956 | 55.0 | 9955 | 0.2857 | 0.8475 | 0.6340 | 0.2846 | 0.0001 |
| 0.2958 | 56.0 | 10136 | 0.2862 | 0.8466 | 0.6241 | 0.2867 | 0.0001 |
| 0.2958 | 57.0 | 10317 | 0.2871 | 0.8454 | 0.6249 | 0.2862 | 0.0001 |
| 0.2958 | 58.0 | 10498 | 0.2858 | 0.8492 | 0.6334 | 0.2812 | 0.0001 |
| 0.2954 | 59.0 | 10679 | 0.2862 | 0.8468 | 0.6178 | 0.2888 | 1e-05 |
| 0.2954 | 60.0 | 10860 | 0.2847 | 0.8485 | 0.6276 | 0.2854 | 1e-05 |
| 0.2923 | 61.0 | 11041 | 0.2849 | 0.8480 | 0.6224 | 0.2830 | 1e-05 |
| 0.2923 | 62.0 | 11222 | 0.2855 | 0.8469 | 0.6248 | 0.2843 | 1e-05 |
| 0.2923 | 63.0 | 11403 | 0.2849 | 0.8489 | 0.6275 | 0.2828 | 1e-05 |
| 0.2918 | 64.0 | 11584 | 0.2846 | 0.8475 | 0.6371 | 0.2823 | 1e-05 |
| 0.2918 | 65.0 | 11765 | 0.2860 | 0.8468 | 0.6241 | 0.2869 | 1e-05 |
| 0.2918 | 66.0 | 11946 | 0.2847 | 0.8481 | 0.6347 | 0.2841 | 1e-05 |
| 0.2906 | 67.0 | 12127 | 0.2853 | 0.8488 | 0.6287 | 0.2854 | 1e-05 |
| 0.2906 | 68.0 | 12308 | 0.2853 | 0.8480 | 0.6321 | 0.2867 | 1e-05 |
| 0.2906 | 69.0 | 12489 | 0.2848 | 0.8477 | 0.6397 | 0.2836 | 1e-05 |
| 0.2918 | 70.0 | 12670 | 0.2853 | 0.8492 | 0.6381 | 0.2823 | 1e-05 |
| 0.2918 | 71.0 | 12851 | 0.2851 | 0.8476 | 0.6325 | 0.2882 | 0.0000 |
| 0.2918 | 72.0 | 13032 | 0.2845 | 0.8474 | 0.6236 | 0.2849 | 0.0000 |
| 0.2918 | 73.0 | 13213 | 0.2845 | 0.8476 | 0.6333 | 0.2812 | 0.0000 |
| 0.2918 | 74.0 | 13394 | 0.2845 | 0.8466 | 0.6300 | 0.2828 | 0.0000 |
| 0.2913 | 75.0 | 13575 | 0.2851 | 0.8474 | 0.6235 | 0.2820 | 0.0000 |
| 0.2913 | 76.0 | 13756 | 0.2860 | 0.8473 | 0.6186 | 0.2880 | 0.0000 |
| 0.2913 | 77.0 | 13937 | 0.2858 | 0.8459 | 0.6173 | 0.2856 | 0.0000 |
| 0.2913 | 78.0 | 14118 | 0.2844 | 0.8481 | 0.6326 | 0.2843 | 0.0000 |
| 0.2913 | 79.0 | 14299 | 0.2871 | 0.8472 | 0.6179 | 0.2875 | 0.0000 |
| 0.2913 | 80.0 | 14480 | 0.2848 | 0.8477 | 0.6287 | 0.2838 | 0.0000 |
| 0.2915 | 81.0 | 14661 | 0.2848 | 0.8490 | 0.6305 | 0.2854 | 0.0000 |
| 0.2915 | 82.0 | 14842 | 0.2851 | 0.8480 | 0.6394 | 0.2859 | 0.0000 |
| 0.2913 | 83.0 | 15023 | 0.2846 | 0.8488 | 0.6255 | 0.2856 | 0.0000 |
| 0.2913 | 84.0 | 15204 | 0.2857 | 0.8482 | 0.6458 | 0.2833 | 0.0000 |
| 0.2913 | 85.0 | 15385 | 0.2855 | 0.8488 | 0.6340 | 0.2812 | 0.0000 |
| 0.2922 | 86.0 | 15566 | 0.2849 | 0.8480 | 0.6363 | 0.2859 | 0.0000 |
| 0.2922 | 87.0 | 15747 | 0.2845 | 0.8474 | 0.6328 | 0.2851 | 0.0000 |
| 0.2922 | 88.0 | 15928 | 0.2854 | 0.8478 | 0.6371 | 0.2812 | 0.0000 |
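
Learning rates shown as `0.0000` in the later rows fall below the table's four-decimal display precision; continuing the tenfold reductions, they would be about 1e-06. To make the three reported metrics concrete under the multi-label reading of this card: micro F1 pools true/false positives across all labels, macro F1 averages per-label F1 scores, and accuracy is exact-match (subset) accuracy, which is why it sits far below micro F1. A small illustrative sketch with made-up arrays:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Illustrative multi-label predictions: rows are samples, columns are labels.
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 1, 1]])

# Micro F1 pools TP/FP/FN over all labels; macro F1 averages per-label F1.
print("F1 micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 macro:", f1_score(y_true, y_pred, average="macro"))

# On multi-label arrays, sklearn's accuracy_score is exact-match (subset)
# accuracy: a sample counts only if every label is predicted correctly.
print("Accuracy:", accuracy_score(y_true, y_pred))
```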

### Framework versions

- Transformers 4.41.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.19.1