
# drone_DinoVdeau-large-2024_09_18-batch-size64_epochs100_freeze

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.2981
- F1 Micro: 0.8388
- F1 Macro: 0.6139
- Roc Auc: 0.8667
- Accuracy: 0.2677
- Learning Rate: 1e-05
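
The combination of F1 Micro/Macro and ROC AUC metrics suggests a multi-label classification head. Under that assumption, the following is a minimal inference sketch; the 0.5 decision threshold and the input image path are illustrative, not taken from the card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "lombardata/drone_DinoVdeau-large-2024_09_18-batch-size64_epochs100_freeze"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("drone_image.jpg")  # hypothetical input path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label decoding: sigmoid per class, then an assumed 0.5 threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```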

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
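
Assuming the standard Hugging Face `Trainer` was used (the card does not include the training script), these values map onto `TrainingArguments` roughly as follows; `output_dir` is illustrative:

```python
from transformers import TrainingArguments

# A sketch reconstructing the hyperparameters listed above; this is
# not the original training code.
training_args = TrainingArguments(
    output_dir="drone_DinoVdeau-large-2024_09_18-batch-size64_epochs100_freeze",
    learning_rate=1e-3,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # Native AMP mixed-precision training
)
```

One caveat: although the scheduler is reported as `linear`, the learning-rate column in the results table below drops in plateaus (1e-03 until epoch 37, 1e-04 until epoch 70, then 1e-05), which looks more like a reduce-on-plateau schedule than a linear decay.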

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Learning Rate |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| No log | 1.0 | 181 | 0.3341 | 0.8203 | 0.5745 | 0.8508 | 0.2409 | 0.001 |
| No log | 2.0 | 362 | 0.3186 | 0.8327 | 0.5964 | 0.8618 | 0.2401 | 0.001 |
| 0.41 | 3.0 | 543 | 0.3190 | 0.8323 | 0.5896 | 0.8611 | 0.2516 | 0.001 |
| 0.41 | 4.0 | 724 | 0.3149 | 0.8322 | 0.5800 | 0.8608 | 0.2573 | 0.001 |
| 0.41 | 5.0 | 905 | 0.3153 | 0.8314 | 0.5992 | 0.8601 | 0.2526 | 0.001 |
| 0.3412 | 6.0 | 1086 | 0.3147 | 0.8368 | 0.6059 | 0.8655 | 0.2505 | 0.001 |
| 0.3412 | 7.0 | 1267 | 0.3130 | 0.8297 | 0.5672 | 0.8583 | 0.2643 | 0.001 |
| 0.3412 | 8.0 | 1448 | 0.3127 | 0.8305 | 0.5794 | 0.8590 | 0.2643 | 0.001 |
| 0.3338 | 9.0 | 1629 | 0.3131 | 0.8327 | 0.5719 | 0.8608 | 0.2700 | 0.001 |
| 0.3338 | 10.0 | 1810 | 0.3097 | 0.8355 | 0.5897 | 0.8638 | 0.2518 | 0.001 |
| 0.3338 | 11.0 | 1991 | 0.3123 | 0.8332 | 0.5735 | 0.8612 | 0.2742 | 0.001 |
| 0.3303 | 12.0 | 2172 | 0.3085 | 0.8331 | 0.5806 | 0.8612 | 0.2706 | 0.001 |
| 0.3303 | 13.0 | 2353 | 0.3079 | 0.8348 | 0.5952 | 0.8628 | 0.2646 | 0.001 |
| 0.3278 | 14.0 | 2534 | 0.3165 | 0.8340 | 0.5969 | 0.8626 | 0.2534 | 0.001 |
| 0.3278 | 15.0 | 2715 | 0.3074 | 0.8351 | 0.5790 | 0.8631 | 0.2760 | 0.001 |
| 0.3278 | 16.0 | 2896 | 0.3095 | 0.8356 | 0.5889 | 0.8637 | 0.2635 | 0.001 |
| 0.3273 | 17.0 | 3077 | 0.3103 | 0.8395 | 0.6138 | 0.8680 | 0.2474 | 0.001 |
| 0.3273 | 18.0 | 3258 | 0.3063 | 0.8334 | 0.5715 | 0.8611 | 0.2763 | 0.001 |
| 0.3273 | 19.0 | 3439 | 0.3110 | 0.8337 | 0.5920 | 0.8617 | 0.2656 | 0.001 |
| 0.324 | 20.0 | 3620 | 0.3072 | 0.8375 | 0.5984 | 0.8655 | 0.2596 | 0.001 |
| 0.324 | 21.0 | 3801 | 0.3074 | 0.8389 | 0.6090 | 0.8672 | 0.2560 | 0.001 |
| 0.324 | 22.0 | 3982 | 0.3070 | 0.8355 | 0.5808 | 0.8634 | 0.2656 | 0.001 |
| 0.3263 | 23.0 | 4163 | 0.3077 | 0.8389 | 0.6160 | 0.8669 | 0.2627 | 0.001 |
| 0.3263 | 24.0 | 4344 | 0.3061 | 0.8363 | 0.5881 | 0.8640 | 0.2656 | 0.001 |
| 0.3244 | 25.0 | 4525 | 0.3043 | 0.8402 | 0.6102 | 0.8679 | 0.2664 | 0.001 |
| 0.3244 | 26.0 | 4706 | 0.3110 | 0.8327 | 0.5806 | 0.8610 | 0.2659 | 0.001 |
| 0.3244 | 27.0 | 4887 | 0.3052 | 0.8380 | 0.5850 | 0.8656 | 0.2713 | 0.001 |
| 0.3257 | 28.0 | 5068 | 0.3030 | 0.8397 | 0.5974 | 0.8674 | 0.2664 | 0.001 |
| 0.3257 | 29.0 | 5249 | 0.3067 | 0.8362 | 0.5902 | 0.8642 | 0.2666 | 0.001 |
| 0.3257 | 30.0 | 5430 | 0.3061 | 0.8363 | 0.5924 | 0.8644 | 0.2635 | 0.001 |
| 0.3243 | 31.0 | 5611 | 0.3028 | 0.8373 | 0.5867 | 0.8649 | 0.2708 | 0.001 |
| 0.3243 | 32.0 | 5792 | 0.3060 | 0.8388 | 0.6094 | 0.8667 | 0.2568 | 0.001 |
| 0.3243 | 33.0 | 5973 | 0.3069 | 0.8342 | 0.5866 | 0.8625 | 0.2651 | 0.001 |
| 0.3257 | 34.0 | 6154 | 0.3069 | 0.8363 | 0.5901 | 0.8641 | 0.2664 | 0.001 |
| 0.3257 | 35.0 | 6335 | 0.3041 | 0.8380 | 0.6009 | 0.8657 | 0.2627 | 0.001 |
| 0.324 | 36.0 | 6516 | 0.3045 | 0.8363 | 0.5947 | 0.8640 | 0.2661 | 0.001 |
| 0.324 | 37.0 | 6697 | 0.3037 | 0.8396 | 0.5995 | 0.8672 | 0.2760 | 0.001 |
| 0.324 | 38.0 | 6878 | 0.3015 | 0.8388 | 0.5860 | 0.8662 | 0.2737 | 0.0001 |
| 0.3203 | 39.0 | 7059 | 0.3005 | 0.8381 | 0.5995 | 0.8656 | 0.2737 | 0.0001 |
| 0.3203 | 40.0 | 7240 | 0.3010 | 0.8417 | 0.6126 | 0.8692 | 0.2695 | 0.0001 |
| 0.3203 | 41.0 | 7421 | 0.2990 | 0.8403 | 0.6073 | 0.8677 | 0.2742 | 0.0001 |
| 0.3165 | 42.0 | 7602 | 0.2996 | 0.8409 | 0.5992 | 0.8681 | 0.2713 | 0.0001 |
| 0.3165 | 43.0 | 7783 | 0.2986 | 0.8414 | 0.6092 | 0.8688 | 0.2695 | 0.0001 |
| 0.3165 | 44.0 | 7964 | 0.2982 | 0.8396 | 0.5954 | 0.8668 | 0.2750 | 0.0001 |
| 0.3138 | 45.0 | 8145 | 0.2977 | 0.8401 | 0.6028 | 0.8674 | 0.2758 | 0.0001 |
| 0.3138 | 46.0 | 8326 | 0.2982 | 0.8406 | 0.5966 | 0.8677 | 0.2755 | 0.0001 |
| 0.3125 | 47.0 | 8507 | 0.2997 | 0.8378 | 0.5893 | 0.8650 | 0.2768 | 0.0001 |
| 0.3125 | 48.0 | 8688 | 0.2978 | 0.8420 | 0.6135 | 0.8694 | 0.2747 | 0.0001 |
| 0.3125 | 49.0 | 8869 | 0.2981 | 0.8399 | 0.6017 | 0.8671 | 0.2747 | 0.0001 |
| 0.312 | 50.0 | 9050 | 0.2977 | 0.8410 | 0.6113 | 0.8684 | 0.2703 | 0.0001 |
| 0.312 | 51.0 | 9231 | 0.3001 | 0.8419 | 0.6110 | 0.8696 | 0.2716 | 0.0001 |
| 0.312 | 52.0 | 9412 | 0.2977 | 0.8380 | 0.6011 | 0.8653 | 0.2789 | 0.0001 |
| 0.3115 | 53.0 | 9593 | 0.2966 | 0.8425 | 0.6151 | 0.8699 | 0.2729 | 0.0001 |
| 0.3115 | 54.0 | 9774 | 0.2977 | 0.8399 | 0.5974 | 0.8669 | 0.2810 | 0.0001 |
| 0.3115 | 55.0 | 9955 | 0.2965 | 0.8407 | 0.6119 | 0.8680 | 0.2724 | 0.0001 |
| 0.3105 | 56.0 | 10136 | 0.2966 | 0.8408 | 0.6058 | 0.8679 | 0.2786 | 0.0001 |
| 0.3105 | 57.0 | 10317 | 0.2978 | 0.8399 | 0.6068 | 0.8672 | 0.2747 | 0.0001 |
| 0.3105 | 58.0 | 10498 | 0.2965 | 0.8427 | 0.6146 | 0.8699 | 0.2721 | 0.0001 |
| 0.3105 | 59.0 | 10679 | 0.2961 | 0.8406 | 0.6006 | 0.8676 | 0.2794 | 0.0001 |
| 0.3105 | 60.0 | 10860 | 0.2961 | 0.8429 | 0.6113 | 0.8700 | 0.2797 | 0.0001 |
| 0.308 | 61.0 | 11041 | 0.2963 | 0.8415 | 0.5999 | 0.8684 | 0.2797 | 0.0001 |
| 0.308 | 62.0 | 11222 | 0.2961 | 0.8405 | 0.6017 | 0.8676 | 0.2789 | 0.0001 |
| 0.308 | 63.0 | 11403 | 0.2955 | 0.8421 | 0.6108 | 0.8693 | 0.2737 | 0.0001 |
| 0.3083 | 64.0 | 11584 | 0.2952 | 0.8407 | 0.6127 | 0.8679 | 0.2791 | 0.0001 |
| 0.3083 | 65.0 | 11765 | 0.2984 | 0.8391 | 0.6022 | 0.8664 | 0.2781 | 0.0001 |
| 0.3083 | 66.0 | 11946 | 0.2957 | 0.8415 | 0.6104 | 0.8687 | 0.2739 | 0.0001 |
| 0.3051 | 67.0 | 12127 | 0.2962 | 0.8416 | 0.6142 | 0.8690 | 0.2760 | 0.0001 |
| 0.3051 | 68.0 | 12308 | 0.2967 | 0.8413 | 0.6084 | 0.8686 | 0.2773 | 0.0001 |
| 0.3051 | 69.0 | 12489 | 0.2960 | 0.8406 | 0.6161 | 0.8679 | 0.2729 | 0.0001 |
| 0.3066 | 70.0 | 12670 | 0.2972 | 0.8434 | 0.6248 | 0.8708 | 0.2682 | 0.0001 |
| 0.3066 | 71.0 | 12851 | 0.2965 | 0.8396 | 0.6081 | 0.8668 | 0.2802 | 1e-05 |
| 0.3061 | 72.0 | 13032 | 0.2961 | 0.8422 | 0.6034 | 0.8691 | 0.2807 | 1e-05 |
| 0.3061 | 73.0 | 13213 | 0.2954 | 0.8409 | 0.6080 | 0.8679 | 0.2810 | 1e-05 |
| 0.3061 | 74.0 | 13394 | 0.2954 | 0.8411 | 0.6093 | 0.8682 | 0.2810 | 1e-05 |
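
The Accuracy column sits far below F1 Micro throughout training. This pattern is consistent with exact-match (subset) accuracy on a multi-label task: a sample counts as correct only when every one of its labels is predicted correctly, while F1 Micro credits each correctly predicted label individually. The card does not state how the metrics were computed; the scikit-learn sketch below is an assumed reconstruction that shows the gap on toy data:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

# Toy multi-label arrays of shape (n_samples, n_labels); the 0.5
# threshold and the micro ROC AUC averaging are assumptions.
y_true = np.array([[1, 0, 1],
                   [0, 1, 0]])
y_prob = np.array([[0.9, 0.2, 0.4],
                   [0.1, 0.8, 0.4]])
y_pred = (y_prob > 0.5).astype(int)

print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))  # 0.8: one missed label
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
print("Roc Auc: ", roc_auc_score(y_true, y_prob, average="micro"))
print("Accuracy:", accuracy_score(y_true, y_pred))  # 0.5: exact-match (subset) accuracy
```

A single missed label halves the subset accuracy here while F1 Micro stays at 0.8, mirroring the ~0.27 vs ~0.84 gap in the table above.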

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
