|
--- |
|
license: other |
|
base_model: nvidia/mit-b0 |
|
tags: |
|
- vision |
|
- image-segmentation |
|
- generated_from_trainer |
|
model-index: |
|
- name: segformer-b0-finetuned-segments-stamp-verification2 |
|
results: [] |
|
--- |
|
|
|
|
|
|
# segformer-b0-finetuned-segments-stamp-verification2 |
|
|
|
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the AliShah07/stamp-verification dataset. |
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.0365 |
|
- Mean Iou: 0.1372 |
|
- Mean Accuracy: 0.2744 |
|
- Overall Accuracy: 0.2744 |
|
- Accuracy Unlabeled: nan |
|
- Accuracy Stamp: 0.2744 |
|
- Iou Unlabeled: 0.0 |
|
- Iou Stamp: 0.2744 |
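
Note that Mean IoU here is the average of the two per-class IoUs, `unlabeled` and `stamp`: (0.0 + 0.2744) / 2 = 0.1372. A minimal sketch of how per-class IoU and Mean IoU are computed from flattened masks (toy data, not from the actual evaluation set; label convention 0 = unlabeled, 1 = stamp is assumed):

```python
def iou_per_class(pred, target, num_classes):
    """Per-class intersection-over-union for flat lists of class ids."""
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        ious.append(inter / union if union else float("nan"))
    return ious

# Toy 6-pixel example: two classes, some pixels misclassified.
pred   = [1, 1, 0, 1, 0, 0]
target = [1, 0, 0, 1, 1, 0]
ious = iou_per_class(pred, target, num_classes=2)
mean_iou = sum(ious) / len(ious)  # -> 0.5 for this toy example
```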
|
|
|
## Model description |
|
|
|
This checkpoint fine-tunes SegFormer with the lightweight MiT-b0 encoder for binary semantic segmentation with two classes, `unlabeled` and `stamp`, i.e. locating stamp regions in input images.
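
A minimal inference sketch with the `transformers` SegFormer classes. The repo id is assumed from the model name above, the image path is a placeholder, and the label order (0 = unlabeled, 1 = stamp) is an assumption:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Repo id assumed from the model name above; adjust to the actual Hub path.
repo = "segformer-b0-finetuned-segments-stamp-verification2"
processor = SegformerImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)

image = Image.open("document.png").convert("RGB")  # placeholder input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # 0 = unlabeled, 1 = stamp (assumed)
```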
|
|
|
## Intended uses & limitations |
|
|
|
The model is intended for segmenting stamp regions in images, e.g. as one step in a stamp-verification pipeline. Evaluation performance is modest: the final Mean IoU is 0.1372, and the IoU for the `unlabeled` class stays at 0.0 throughout training, so predictions should be validated before any downstream use.
|
|
|
## Training and evaluation data |
|
|
|
The model was fine-tuned and evaluated on the AliShah07/stamp-verification dataset; per-checkpoint evaluation metrics are reported in the table below.
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 6e-05 |
|
- train_batch_size: 2 |
|
- eval_batch_size: 2 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- num_epochs: 20 |
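
The hyperparameters above correspond roughly to a `transformers` `TrainingArguments` setup like the following sketch (values taken from the list above; `output_dir` and the optimizer defaults implied by Adam with betas=(0.9, 0.999) and epsilon=1e-08 are assumptions about the original run):

```python
from transformers import TrainingArguments

# Sketch reconstructing the listed hyperparameters; not the original script.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-stamp-verification2",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```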
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Stamp | Iou Unlabeled | Iou Stamp | |
|
|:-------------:|:-------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:--------------:|:-------------:|:---------:| |
|
| 0.4566 | 0.8333 | 20 | 0.4738 | 0.1430 | 0.2860 | 0.2860 | nan | 0.2860 | 0.0 | 0.2860 | |
|
| 0.3076 | 1.6667 | 40 | 0.3046 | 0.1307 | 0.2614 | 0.2614 | nan | 0.2614 | 0.0 | 0.2614 | |
|
| 0.2373 | 2.5 | 60 | 0.2226 | 0.0604 | 0.1209 | 0.1209 | nan | 0.1209 | 0.0 | 0.1209 | |
|
| 0.2184 | 3.3333 | 80 | 0.2220 | 0.1942 | 0.3884 | 0.3884 | nan | 0.3884 | 0.0 | 0.3884 | |
|
| 0.1578 | 4.1667 | 100 | 0.1704 | 0.2468 | 0.4936 | 0.4936 | nan | 0.4936 | 0.0 | 0.4936 | |
|
| 0.1412 | 5.0 | 120 | 0.1269 | 0.0376 | 0.0751 | 0.0751 | nan | 0.0751 | 0.0 | 0.0751 | |
|
| 0.1109 | 5.8333 | 140 | 0.1076 | 0.2741 | 0.5483 | 0.5483 | nan | 0.5483 | 0.0 | 0.5483 | |
|
| 0.106 | 6.6667 | 160 | 0.0892 | 0.0583 | 0.1166 | 0.1166 | nan | 0.1166 | 0.0 | 0.1166 | |
|
| 0.0899 | 7.5 | 180 | 0.0747 | 0.0173 | 0.0346 | 0.0346 | nan | 0.0346 | 0.0 | 0.0346 | |
|
| 0.0794 | 8.3333 | 200 | 0.0683 | 0.0189 | 0.0378 | 0.0378 | nan | 0.0378 | 0.0 | 0.0378 | |
|
| 0.0741 | 9.1667 | 220 | 0.0639 | 0.0981 | 0.1963 | 0.1963 | nan | 0.1963 | 0.0 | 0.1963 | |
|
| 0.0832 | 10.0 | 240 | 0.0559 | 0.0599 | 0.1198 | 0.1198 | nan | 0.1198 | 0.0 | 0.1198 | |
|
| 0.0575 | 10.8333 | 260 | 0.0527 | 0.0769 | 0.1538 | 0.1538 | nan | 0.1538 | 0.0 | 0.1538 | |
|
| 0.05 | 11.6667 | 280 | 0.0502 | 0.0852 | 0.1704 | 0.1704 | nan | 0.1704 | 0.0 | 0.1704 | |
|
| 0.0523 | 12.5 | 300 | 0.0446 | 0.1038 | 0.2076 | 0.2076 | nan | 0.2076 | 0.0 | 0.2076 | |
|
| 0.0481 | 13.3333 | 320 | 0.0431 | 0.0956 | 0.1913 | 0.1913 | nan | 0.1913 | 0.0 | 0.1913 | |
|
| 0.0471 | 14.1667 | 340 | 0.0420 | 0.1330 | 0.2660 | 0.2660 | nan | 0.2660 | 0.0 | 0.2660 | |
|
| 0.042 | 15.0 | 360 | 0.0412 | 0.1124 | 0.2248 | 0.2248 | nan | 0.2248 | 0.0 | 0.2248 | |
|
| 0.041 | 15.8333 | 380 | 0.0400 | 0.1144 | 0.2288 | 0.2288 | nan | 0.2288 | 0.0 | 0.2288 | |
|
| 0.0444 | 16.6667 | 400 | 0.0383 | 0.1415 | 0.2830 | 0.2830 | nan | 0.2830 | 0.0 | 0.2830 | |
|
| 0.0514 | 17.5 | 420 | 0.0377 | 0.0779 | 0.1559 | 0.1559 | nan | 0.1559 | 0.0 | 0.1559 | |
|
| 0.0434 | 18.3333 | 440 | 0.0374 | 0.1482 | 0.2964 | 0.2964 | nan | 0.2964 | 0.0 | 0.2964 | |
|
| 0.0383 | 19.1667 | 460 | 0.0363 | 0.1843 | 0.3686 | 0.3686 | nan | 0.3686 | 0.0 | 0.3686 | |
|
| 0.0411 | 20.0 | 480 | 0.0365 | 0.1372 | 0.2744 | 0.2744 | nan | 0.2744 | 0.0 | 0.2744 | |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.40.2 |
|
- Pytorch 2.2.1+cu121 |
|
- Datasets 2.19.1 |
|
- Tokenizers 0.19.1 |
|
|