---
license: other
base_model: nvidia/segformer-b1-finetuned-cityscapes-1024-1024
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b1-finetuned-cityscapes-1024-1024-full-ds
  results: []
---

# segformer-b1-finetuned-cityscapes-1024-1024-full-ds

This model is a fine-tuned version of [nvidia/segformer-b1-finetuned-cityscapes-1024-1024](https://huggingface.co/nvidia/segformer-b1-finetuned-cityscapes-1024-1024) on the selvaa/final_iteration dataset. It achieves the following results on the evaluation set (a hedged inference sketch follows the list):

- Loss: 0.0537
- Mean Iou: 0.9119
- Mean Accuracy: 0.9519
- Overall Accuracy: 0.9830
- Accuracy Default: 1e-06
- Accuracy Pipe: 0.8919
- Accuracy Floor: 0.9695
- Accuracy Background: 0.9942
- Iou Default: 1e-06
- Iou Pipe: 0.7943
- Iou Floor: 0.9593
- Iou Background: 0.9822

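The snippet below is a minimal inference sketch, not verified usage of this exact checkpoint: it assumes the model lives at the hypothetical repo id `selvaa/segformer-b1-finetuned-cityscapes-1024-1024-full-ds` (inferred from the name and author above) and that the standard `transformers` SegFormer classes apply.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Hypothetical repo id, inferred from the model name and author above.
checkpoint = "selvaa/segformer-b1-finetuned-cityscapes-1024-1024-full-ds"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("scene.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, h/4, w/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
```

The metrics above imply four classes (Default, Pipe, Floor, Background); the checkpoint's `model.config.id2label` should give the authoritative index-to-name mapping.
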
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 6e-05
- train_batch_size: 3
- eval_batch_size: 3
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
- mixed_precision_training: Native AMP

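As a hedged sketch of how these values map onto the Hugging Face `Trainer` API (the `output_dir` name and any setting not listed above are assumptions, not the author's actual script):

```python
from transformers import TrainingArguments

# Minimal sketch mapping the listed hyperparameters onto TrainingArguments;
# output_dir is a placeholder, and unlisted settings keep library defaults.
training_args = TrainingArguments(
    output_dir="segformer-b1-finetuned-cityscapes-1024-1024-full-ds",
    learning_rate=6e-5,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,
    seed=42,
    adam_beta1=0.9,           # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,        # epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=60,
    fp16=True,                # Native AMP mixed-precision training
)
```
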
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Default | Accuracy Pipe | Accuracy Floor | Accuracy Background | Iou Default | Iou Pipe | Iou Floor | Iou Background |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.6857 | 1.0 | 52 | 0.3047 | 0.7456 | 0.8104 | 0.9494 | 1e-06 | 0.4976 | 0.9560 | 0.9776 | 1e-06 | 0.3799 | 0.9109 | 0.9460 |
| 0.2657 | 2.0 | 104 | 0.1869 | 0.8168 | 0.8664 | 0.9656 | 1e-06 | 0.6511 | 0.9583 | 0.9897 | 1e-06 | 0.5513 | 0.9373 | 0.9619 |
| 0.1674 | 3.0 | 156 | 0.1333 | 0.8510 | 0.9041 | 0.9717 | 1e-06 | 0.7620 | 0.9601 | 0.9903 | 1e-06 | 0.6405 | 0.9424 | 0.9699 |
| 0.127 | 4.0 | 208 | 0.1039 | 0.8678 | 0.9158 | 0.9743 | 1e-06 | 0.7938 | 0.9625 | 0.9910 | 1e-06 | 0.6861 | 0.9455 | 0.9719 |
| 0.1047 | 5.0 | 260 | 0.0968 | 0.8756 | 0.9343 | 0.9761 | 1e-06 | 0.8516 | 0.9609 | 0.9903 | 1e-06 | 0.7024 | 0.9490 | 0.9753 |
| 0.0924 | 6.0 | 312 | 0.0843 | 0.8839 | 0.9355 | 0.9775 | 1e-06 | 0.8512 | 0.9641 | 0.9912 | 1e-06 | 0.7244 | 0.9513 | 0.9760 |
| 0.083 | 7.0 | 364 | 0.0749 | 0.8879 | 0.9422 | 0.9786 | 1e-06 | 0.8713 | 0.9637 | 0.9914 | 1e-06 | 0.7320 | 0.9541 | 0.9775 |
| 0.0761 | 8.0 | 416 | 0.0717 | 0.8895 | 0.9412 | 0.9789 | 1e-06 | 0.8678 | 0.9634 | 0.9923 | 1e-06 | 0.7364 | 0.9541 | 0.9779 |
| 0.0709 | 9.0 | 468 | 0.0723 | 0.8891 | 0.9289 | 0.9789 | 1e-06 | 0.8282 | 0.9635 | 0.9949 | 1e-06 | 0.7361 | 0.9543 | 0.9769 |
| 0.0664 | 10.0 | 520 | 0.0653 | 0.8952 | 0.9385 | 0.9800 | 1e-06 | 0.8554 | 0.9663 | 0.9936 | 1e-06 | 0.7507 | 0.9568 | 0.9783 |
| 0.0628 | 11.0 | 572 | 0.0668 | 0.8934 | 0.9317 | 0.9797 | 1e-06 | 0.8345 | 0.9658 | 0.9948 | 1e-06 | 0.7460 | 0.9566 | 0.9776 |
| 0.0599 | 12.0 | 624 | 0.0612 | 0.9000 | 0.9526 | 0.9808 | 1e-06 | 0.8987 | 0.9675 | 0.9914 | 1e-06 | 0.7624 | 0.9574 | 0.9801 |
| 0.0578 | 13.0 | 676 | 0.0604 | 0.8982 | 0.9458 | 0.9803 | 1e-06 | 0.8770 | 0.9686 | 0.9918 | 1e-06 | 0.7602 | 0.9549 | 0.9795 |
| 0.0542 | 14.0 | 728 | 0.0609 | 0.9003 | 0.9435 | 0.9809 | 1e-06 | 0.8698 | 0.9673 | 0.9936 | 1e-06 | 0.7636 | 0.9578 | 0.9795 |
| 0.0528 | 15.0 | 780 | 0.0562 | 0.9054 | 0.9461 | 0.9818 | 1e-06 | 0.8767 | 0.9672 | 0.9945 | 1e-06 | 0.7771 | 0.9586 | 0.9806 |
| 0.0505 | 16.0 | 832 | 0.0550 | 0.9039 | 0.9546 | 0.9815 | 1e-06 | 0.9044 | 0.9672 | 0.9921 | 1e-06 | 0.7734 | 0.9576 | 0.9808 |
| 0.0496 | 17.0 | 884 | 0.0594 | 0.9016 | 0.9447 | 0.9811 | 1e-06 | 0.8753 | 0.9638 | 0.9949 | 1e-06 | 0.7673 | 0.9578 | 0.9799 |
| 0.0478 | 18.0 | 936 | 0.0543 | 0.9042 | 0.9554 | 0.9816 | 1e-06 | 0.9056 | 0.9695 | 0.9913 | 1e-06 | 0.7732 | 0.9586 | 0.9808 |
| 0.0472 | 19.0 | 988 | 0.0554 | 0.9046 | 0.9510 | 0.9816 | 1e-06 | 0.8921 | 0.9683 | 0.9927 | 1e-06 | 0.7751 | 0.9582 | 0.9806 |
| 0.0457 | 20.0 | 1040 | 0.0590 | 0.9010 | 0.9407 | 0.9810 | 1e-06 | 0.8585 | 0.9703 | 0.9933 | 1e-06 | 0.7661 | 0.9572 | 0.9795 |
| 0.0439 | 21.0 | 1092 | 0.0558 | 0.9045 | 0.9484 | 0.9817 | 1e-06 | 0.8841 | 0.9674 | 0.9937 | 1e-06 | 0.7747 | 0.9579 | 0.9807 |
| 0.0432 | 22.0 | 1144 | 0.0564 | 0.9060 | 0.9469 | 0.9818 | 1e-06 | 0.8783 | 0.9683 | 0.9940 | 1e-06 | 0.7791 | 0.9582 | 0.9806 |
| 0.0425 | 23.0 | 1196 | 0.0541 | 0.9064 | 0.9531 | 0.9820 | 1e-06 | 0.8980 | 0.9686 | 0.9927 | 1e-06 | 0.7798 | 0.9580 | 0.9813 |
| 0.0424 | 24.0 | 1248 | 0.0562 | 0.9059 | 0.9405 | 0.9819 | 1e-06 | 0.8584 | 0.9675 | 0.9957 | 1e-06 | 0.7784 | 0.9592 | 0.9800 |
| 0.0412 | 25.0 | 1300 | 0.0553 | 0.9032 | 0.9537 | 0.9816 | 1e-06 | 0.9002 | 0.9693 | 0.9918 | 1e-06 | 0.7700 | 0.9584 | 0.9811 |
| 0.0404 | 26.0 | 1352 | 0.0533 | 0.9075 | 0.9514 | 0.9822 | 1e-06 | 0.8927 | 0.9678 | 0.9937 | 1e-06 | 0.7823 | 0.9586 | 0.9815 |
| 0.0397 | 27.0 | 1404 | 0.0526 | 0.9073 | 0.9525 | 0.9821 | 1e-06 | 0.8950 | 0.9697 | 0.9928 | 1e-06 | 0.7819 | 0.9584 | 0.9814 |
| 0.0394 | 28.0 | 1456 | 0.0523 | 0.9082 | 0.9563 | 0.9825 | 1e-06 | 0.9078 | 0.9681 | 0.9930 | 1e-06 | 0.7835 | 0.9590 | 0.9822 |
| 0.0388 | 29.0 | 1508 | 0.0526 | 0.9078 | 0.9541 | 0.9823 | 1e-06 | 0.8999 | 0.9701 | 0.9924 | 1e-06 | 0.7834 | 0.9584 | 0.9817 |
| 0.0384 | 30.0 | 1560 | 0.0531 | 0.9087 | 0.9512 | 0.9825 | 1e-06 | 0.8903 | 0.9695 | 0.9936 | 1e-06 | 0.7852 | 0.9593 | 0.9817 |
| 0.0379 | 31.0 | 1612 | 0.0534 | 0.9084 | 0.9525 | 0.9825 | 1e-06 | 0.8962 | 0.9674 | 0.9940 | 1e-06 | 0.7846 | 0.9585 | 0.9820 |
| 0.0371 | 32.0 | 1664 | 0.0530 | 0.9104 | 0.9513 | 0.9827 | 1e-06 | 0.8919 | 0.9675 | 0.9945 | 1e-06 | 0.7904 | 0.9590 | 0.9818 |
| 0.0365 | 33.0 | 1716 | 0.0522 | 0.9096 | 0.9535 | 0.9826 | 1e-06 | 0.8980 | 0.9690 | 0.9935 | 1e-06 | 0.7877 | 0.9593 | 0.9819 |
| 0.0362 | 34.0 | 1768 | 0.0523 | 0.9106 | 0.9528 | 0.9827 | 1e-06 | 0.8948 | 0.9702 | 0.9934 | 1e-06 | 0.7909 | 0.9590 | 0.9819 |
| 0.0368 | 35.0 | 1820 | 0.0532 | 0.9099 | 0.9501 | 0.9826 | 1e-06 | 0.8858 | 0.9710 | 0.9935 | 1e-06 | 0.7892 | 0.9590 | 0.9816 |
| 0.0365 | 36.0 | 1872 | 0.0513 | 0.9106 | 0.9556 | 0.9828 | 1e-06 | 0.9043 | 0.9695 | 0.9932 | 1e-06 | 0.7901 | 0.9594 | 0.9823 |
| 0.0357 | 37.0 | 1924 | 0.0535 | 0.9093 | 0.9511 | 0.9826 | 1e-06 | 0.8907 | 0.9685 | 0.9941 | 1e-06 | 0.7867 | 0.9596 | 0.9817 |
| 0.0359 | 38.0 | 1976 | 0.0518 | 0.9091 | 0.9571 | 0.9826 | 1e-06 | 0.9105 | 0.9678 | 0.9930 | 1e-06 | 0.7861 | 0.9590 | 0.9822 |
| 0.0344 | 39.0 | 2028 | 0.0535 | 0.9102 | 0.9505 | 0.9827 | 1e-06 | 0.8882 | 0.9689 | 0.9943 | 1e-06 | 0.7894 | 0.9593 | 0.9818 |
| 0.034 | 40.0 | 2080 | 0.0519 | 0.9115 | 0.9547 | 0.9830 | 1e-06 | 0.9009 | 0.9697 | 0.9936 | 1e-06 | 0.7923 | 0.9597 | 0.9824 |
| 0.0339 | 41.0 | 2132 | 0.0528 | 0.9120 | 0.9525 | 0.9830 | 1e-06 | 0.8935 | 0.9698 | 0.9940 | 1e-06 | 0.7946 | 0.9592 | 0.9823 |
| 0.0338 | 42.0 | 2184 | 0.0531 | 0.9118 | 0.9529 | 0.9830 | 1e-06 | 0.8957 | 0.9687 | 0.9944 | 1e-06 | 0.7934 | 0.9595 | 0.9824 |
| 0.0339 | 43.0 | 2236 | 0.0542 | 0.9113 | 0.9518 | 0.9829 | 1e-06 | 0.8917 | 0.9697 | 0.9941 | 1e-06 | 0.7923 | 0.9594 | 0.9823 |
| 0.0332 | 44.0 | 2288 | 0.0541 | 0.9092 | 0.9535 | 0.9825 | 1e-06 | 0.8982 | 0.9688 | 0.9935 | 1e-06 | 0.7865 | 0.9594 | 0.9818 |
| 0.0332 | 45.0 | 2340 | 0.0535 | 0.9111 | 0.9533 | 0.9829 | 1e-06 | 0.8960 | 0.9702 | 0.9936 | 1e-06 | 0.7916 | 0.9595 | 0.9822 |
| 0.0328 | 46.0 | 2392 | 0.0535 | 0.9112 | 0.9519 | 0.9828 | 1e-06 | 0.8914 | 0.9704 | 0.9937 | 1e-06 | 0.7924 | 0.9590 | 0.9822 |
| 0.0342 | 47.0 | 2444 | 0.0533 | 0.9118 | 0.9533 | 0.9829 | 1e-06 | 0.8963 | 0.9696 | 0.9939 | 1e-06 | 0.7938 | 0.9594 | 0.9822 |
| 0.0331 | 48.0 | 2496 | 0.0552 | 0.9101 | 0.9509 | 0.9827 | 1e-06 | 0.8891 | 0.9696 | 0.9940 | 1e-06 | 0.7889 | 0.9593 | 0.9820 |
| 0.0325 | 49.0 | 2548 | 0.0537 | 0.9119 | 0.9519 | 0.9830 | 1e-06 | 0.8919 | 0.9695 | 0.9942 | 1e-06 | 0.7943 | 0.9593 | 0.9822 |

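The per-class columns above follow the naming scheme of the `mean_iou` metric from the `evaluate` library, so the scoring setup was presumably something like the sketch below; the four-class index order and `ignore_index` value are assumptions, and the constant 1e-06 for the Default class suggests that class is effectively absent from the validation labels.

```python
import numpy as np
import evaluate

# Hedged sketch of the presumed scoring setup: evaluate's mean_iou metric,
# which emits mean_iou / mean_accuracy / overall_accuracy plus the
# per-category values reported in the table above.
metric = evaluate.load("mean_iou")

# Toy 2x2 label maps; assumed class order: 0=default, 1=pipe, 2=floor, 3=background.
predictions = [np.array([[1, 2], [3, 3]])]
references = [np.array([[1, 2], [3, 2]])]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=4,
    ignore_index=255,  # assumption: 255 marks unlabeled pixels
)
print(results["mean_iou"], results["per_category_iou"])
```
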
### Framework versions

- Transformers 4.35.2
- PyTorch 2.0.1
- Datasets 2.15.0
- Tokenizers 0.15.0