LayoutLMv3_1

This model is a fine-tuned version of microsoft/layoutlmv3-large on an unknown dataset. It achieves the following results on the evaluation set (a brief loading sketch follows the metrics):

  • Loss: 0.3772
  • Precision: 0.7355
  • Recall: 0.7550
  • F1: 0.7451
  • Accuracy: 0.9035
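
As a usage illustration, the following is a minimal, hedged sketch of how a LayoutLMv3 token-classification checkpoint such as this one could be loaded for inference with Transformers. The checkpoint path, image file, and use of the processor's built-in OCR (which requires pytesseract/Tesseract to be installed) are assumptions, not details taken from this card.

```python
from PIL import Image
from transformers import AutoModelForTokenClassification, AutoProcessor

# Hypothetical identifiers: substitute the actual Hub repo id or local path of this
# checkpoint and a real document image. apply_ocr=True runs Tesseract via pytesseract.
checkpoint = "LayoutLMv3_1"
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-large", apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

image = Image.open("document_page.png").convert("RGB")
encoding = processor(image, return_tensors="pt", truncation=True)

outputs = model(**encoding)
predicted_ids = outputs.logits.argmax(-1).squeeze().tolist()
predicted_tags = [model.config.id2label[i] for i in predicted_ids]
print(predicted_tags)
```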

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 1e-06
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 1500
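
These values map onto Transformers TrainingArguments roughly as sketched below; the output directory, evaluation cadence, and logging cadence are assumptions inferred from the results table rather than values stated on the card.

```python
from transformers import TrainingArguments

# Sketch only: the explicit fields mirror the hyperparameters listed above;
# the commented fields are assumptions, not values from the card.
training_args = TrainingArguments(
    output_dir="layoutlmv3_1",       # assumed name
    learning_rate=1e-6,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    max_steps=1500,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",     # assumed; the table below reports metrics every 10 steps
    eval_steps=10,
    logging_steps=500,               # assumed; training loss appears every 500 steps
)
```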

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:--:|:--------:|
| No log | 0.37 | 10 | 2.5352 | 0.0089 | 0.0199 | 0.0123 | 0.0503 |
| No log | 0.74 | 20 | 2.1493 | 0.0377 | 0.0397 | 0.0387 | 0.6028 |
| No log | 1.11 | 30 | 1.7234 | 0.0 | 0.0 | 0.0 | 0.7804 |
| No log | 1.48 | 40 | 1.2971 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 1.85 | 50 | 1.0544 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 2.22 | 60 | 1.0210 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 2.59 | 70 | 0.9842 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 2.96 | 80 | 0.9651 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 3.33 | 90 | 0.9402 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 3.7 | 100 | 0.9205 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 4.07 | 110 | 0.9035 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 4.44 | 120 | 0.8807 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 4.81 | 130 | 0.8596 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 5.19 | 140 | 0.8382 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 5.56 | 150 | 0.8151 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 5.93 | 160 | 0.8039 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 6.3 | 170 | 0.7864 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 6.67 | 180 | 0.7588 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 7.04 | 190 | 0.7352 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 7.41 | 200 | 0.7232 | 0.0 | 0.0 | 0.0 | 0.7818 |
| No log | 7.78 | 210 | 0.7246 | 1.0 | 0.0132 | 0.0261 | 0.7846 |
| No log | 8.15 | 220 | 0.6916 | 1.0 | 0.0132 | 0.0261 | 0.7846 |
| No log | 8.52 | 230 | 0.6755 | 1.0 | 0.0199 | 0.0390 | 0.7860 |
| No log | 8.89 | 240 | 0.6686 | 1.0 | 0.0331 | 0.0641 | 0.7888 |
| No log | 9.26 | 250 | 0.6514 | 1.0 | 0.0265 | 0.0516 | 0.7874 |
| No log | 9.63 | 260 | 0.6430 | 1.0 | 0.0596 | 0.1125 | 0.7944 |
| No log | 10.0 | 270 | 0.6280 | 0.9474 | 0.1192 | 0.2118 | 0.8070 |
| No log | 10.37 | 280 | 0.6127 | 0.9286 | 0.1722 | 0.2905 | 0.8196 |
| No log | 10.74 | 290 | 0.6136 | 0.9024 | 0.2450 | 0.3854 | 0.8350 |
| No log | 11.11 | 300 | 0.5886 | 0.8810 | 0.2450 | 0.3834 | 0.8350 |
| No log | 11.48 | 310 | 0.5896 | 0.8909 | 0.3245 | 0.4757 | 0.8517 |
| No log | 11.85 | 320 | 0.5732 | 0.9310 | 0.3576 | 0.5167 | 0.8587 |
| No log | 12.22 | 330 | 0.5770 | 0.8533 | 0.4238 | 0.5664 | 0.8671 |
| No log | 12.59 | 340 | 0.5557 | 0.8649 | 0.4238 | 0.5689 | 0.8671 |
| No log | 12.96 | 350 | 0.5469 | 0.8222 | 0.4901 | 0.6141 | 0.8797 |
| No log | 13.33 | 360 | 0.5412 | 0.8242 | 0.4967 | 0.6198 | 0.8825 |
| No log | 13.7 | 370 | 0.5313 | 0.8454 | 0.5430 | 0.6613 | 0.8923 |
| No log | 14.07 | 380 | 0.5188 | 0.8381 | 0.5828 | 0.6875 | 0.8993 |
| No log | 14.44 | 390 | 0.5190 | 0.8333 | 0.5960 | 0.6950 | 0.8993 |
| No log | 14.81 | 400 | 0.5172 | 0.8165 | 0.5894 | 0.6846 | 0.8993 |
| No log | 15.19 | 410 | 0.5066 | 0.7966 | 0.6225 | 0.6989 | 0.9021 |
| No log | 15.56 | 420 | 0.4879 | 0.8087 | 0.6159 | 0.6992 | 0.9035 |
| No log | 15.93 | 430 | 0.4943 | 0.7833 | 0.6225 | 0.6937 | 0.9021 |
| No log | 16.3 | 440 | 0.4716 | 0.8205 | 0.6358 | 0.7164 | 0.9091 |
| No log | 16.67 | 450 | 0.4594 | 0.8264 | 0.6623 | 0.7353 | 0.9119 |
| No log | 17.04 | 460 | 0.4758 | 0.7761 | 0.6887 | 0.7298 | 0.9105 |
| No log | 17.41 | 470 | 0.4520 | 0.8430 | 0.6755 | 0.75 | 0.9217 |
| No log | 17.78 | 480 | 0.4582 | 0.8244 | 0.7152 | 0.7660 | 0.9245 |
| No log | 18.15 | 490 | 0.4475 | 0.8189 | 0.6887 | 0.7482 | 0.9217 |
| 0.6982 | 18.52 | 500 | 0.4627 | 0.7431 | 0.7086 | 0.7254 | 0.9105 |
| 0.6982 | 18.89 | 510 | 0.4419 | 0.7826 | 0.7152 | 0.7474 | 0.9189 |
| 0.6982 | 19.26 | 520 | 0.4351 | 0.7730 | 0.7219 | 0.7466 | 0.9147 |
| 0.6982 | 19.63 | 530 | 0.4213 | 0.7857 | 0.7285 | 0.7560 | 0.9189 |
| 0.6982 | 20.0 | 540 | 0.4389 | 0.7273 | 0.7417 | 0.7344 | 0.9091 |
| 0.6982 | 20.37 | 550 | 0.4208 | 0.7762 | 0.7351 | 0.7551 | 0.9189 |
| 0.6982 | 20.74 | 560 | 0.4301 | 0.74 | 0.7351 | 0.7375 | 0.9119 |
| 0.6982 | 21.11 | 570 | 0.4199 | 0.7568 | 0.7417 | 0.7492 | 0.9161 |
| 0.6982 | 21.48 | 580 | 0.4283 | 0.7006 | 0.7285 | 0.7143 | 0.9021 |
| 0.6982 | 21.85 | 590 | 0.4068 | 0.7857 | 0.7285 | 0.7560 | 0.9203 |
| 0.6982 | 22.22 | 600 | 0.4241 | 0.7179 | 0.7417 | 0.7296 | 0.9077 |
| 0.6982 | 22.59 | 610 | 0.3988 | 0.8321 | 0.7550 | 0.7917 | 0.9329 |
| 0.6982 | 22.96 | 620 | 0.4005 | 0.7671 | 0.7417 | 0.7542 | 0.9189 |
| 0.6982 | 23.33 | 630 | 0.3939 | 0.7651 | 0.7550 | 0.76 | 0.9189 |
| 0.6982 | 23.7 | 640 | 0.4007 | 0.7278 | 0.7616 | 0.7443 | 0.9119 |
| 0.6982 | 24.07 | 650 | 0.3857 | 0.7973 | 0.7815 | 0.7893 | 0.9217 |
| 0.6982 | 24.44 | 660 | 0.3893 | 0.7682 | 0.7682 | 0.7682 | 0.9175 |
| 0.6982 | 24.81 | 670 | 0.3946 | 0.7516 | 0.7616 | 0.7566 | 0.9147 |
| 0.6982 | 25.19 | 680 | 0.3893 | 0.7516 | 0.7616 | 0.7566 | 0.9161 |
| 0.6982 | 25.56 | 690 | 0.3969 | 0.7419 | 0.7616 | 0.7516 | 0.9119 |
| 0.6982 | 25.93 | 700 | 0.3854 | 0.7852 | 0.7748 | 0.7800 | 0.9217 |
| 0.6982 | 26.3 | 710 | 0.3858 | 0.7973 | 0.7815 | 0.7893 | 0.9231 |
| 0.6982 | 26.67 | 720 | 0.3831 | 0.7867 | 0.7815 | 0.7841 | 0.9217 |
| 0.6982 | 27.04 | 730 | 0.3996 | 0.7267 | 0.7748 | 0.75 | 0.9049 |
| 0.6982 | 27.41 | 740 | 0.3907 | 0.7358 | 0.7748 | 0.7548 | 0.9077 |
| 0.6982 | 27.78 | 750 | 0.3720 | 0.8013 | 0.8013 | 0.8013 | 0.9245 |
| 0.6982 | 28.15 | 760 | 0.3799 | 0.7895 | 0.7947 | 0.7921 | 0.9189 |
| 0.6982 | 28.52 | 770 | 0.3938 | 0.7178 | 0.7748 | 0.7452 | 0.9035 |
| 0.6982 | 28.89 | 780 | 0.3761 | 0.7763 | 0.7815 | 0.7789 | 0.9189 |
| 0.6982 | 29.26 | 790 | 0.3906 | 0.7267 | 0.7748 | 0.75 | 0.9063 |
| 0.6982 | 29.63 | 800 | 0.3780 | 0.7436 | 0.7682 | 0.7557 | 0.9105 |
| 0.6982 | 30.0 | 810 | 0.3773 | 0.7548 | 0.7748 | 0.7647 | 0.9133 |
| 0.6982 | 30.37 | 820 | 0.3716 | 0.7727 | 0.7881 | 0.7803 | 0.9175 |
| 0.6982 | 30.74 | 830 | 0.3747 | 0.7452 | 0.7748 | 0.7597 | 0.9119 |
| 0.6982 | 31.11 | 840 | 0.3747 | 0.7405 | 0.7748 | 0.7573 | 0.9133 |
| 0.6982 | 31.48 | 850 | 0.3821 | 0.7239 | 0.7815 | 0.7516 | 0.9077 |
| 0.6982 | 31.85 | 860 | 0.3649 | 0.7697 | 0.7748 | 0.7723 | 0.9175 |
| 0.6982 | 32.22 | 870 | 0.3804 | 0.7152 | 0.7815 | 0.7468 | 0.9049 |
| 0.6982 | 32.59 | 880 | 0.3715 | 0.75 | 0.7748 | 0.7622 | 0.9105 |
| 0.6982 | 32.96 | 890 | 0.3663 | 0.7632 | 0.7682 | 0.7657 | 0.9161 |
| 0.6982 | 33.33 | 900 | 0.3713 | 0.7516 | 0.7815 | 0.7662 | 0.9133 |
| 0.6982 | 33.7 | 910 | 0.3684 | 0.7597 | 0.7748 | 0.7672 | 0.9133 |
| 0.6982 | 34.07 | 920 | 0.3708 | 0.75 | 0.7748 | 0.7622 | 0.9119 |
| 0.6982 | 34.44 | 930 | 0.3699 | 0.8146 | 0.8146 | 0.8146 | 0.9259 |
| 0.6982 | 34.81 | 940 | 0.3726 | 0.7778 | 0.7881 | 0.7829 | 0.9189 |
| 0.6982 | 35.19 | 950 | 0.3763 | 0.7405 | 0.7748 | 0.7573 | 0.9105 |
| 0.6982 | 35.56 | 960 | 0.3883 | 0.7267 | 0.7748 | 0.75 | 0.9035 |
| 0.6982 | 35.93 | 970 | 0.3729 | 0.7616 | 0.7616 | 0.7616 | 0.9119 |
| 0.6982 | 36.3 | 980 | 0.3654 | 0.8108 | 0.7947 | 0.8027 | 0.9217 |
| 0.6982 | 36.67 | 990 | 0.3795 | 0.7195 | 0.7815 | 0.7492 | 0.9049 |
| 0.2195 | 37.04 | 1000 | 0.3819 | 0.7267 | 0.7748 | 0.75 | 0.9035 |
| 0.2195 | 37.41 | 1010 | 0.3760 | 0.7233 | 0.7616 | 0.7419 | 0.9035 |
| 0.2195 | 37.78 | 1020 | 0.3664 | 0.7468 | 0.7616 | 0.7541 | 0.9105 |
| 0.2195 | 38.15 | 1030 | 0.3753 | 0.7312 | 0.7748 | 0.7524 | 0.9077 |
| 0.2195 | 38.52 | 1040 | 0.3791 | 0.7284 | 0.7815 | 0.7540 | 0.9035 |
| 0.2195 | 38.89 | 1050 | 0.3665 | 0.7933 | 0.7881 | 0.7907 | 0.9203 |
| 0.2195 | 39.26 | 1060 | 0.3655 | 0.7763 | 0.7815 | 0.7789 | 0.9161 |
| 0.2195 | 39.63 | 1070 | 0.3811 | 0.7312 | 0.7748 | 0.7524 | 0.9091 |
| 0.2195 | 40.0 | 1080 | 0.3725 | 0.7342 | 0.7682 | 0.7508 | 0.9063 |
| 0.2195 | 40.37 | 1090 | 0.3639 | 0.7692 | 0.7947 | 0.7818 | 0.9161 |
| 0.2195 | 40.74 | 1100 | 0.3721 | 0.7312 | 0.7748 | 0.7524 | 0.9077 |
| 0.2195 | 41.11 | 1110 | 0.3782 | 0.7143 | 0.7616 | 0.7372 | 0.9021 |
| 0.2195 | 41.48 | 1120 | 0.3654 | 0.7748 | 0.7748 | 0.7748 | 0.9189 |
| 0.2195 | 41.85 | 1130 | 0.3717 | 0.7278 | 0.7616 | 0.7443 | 0.9049 |
| 0.2195 | 42.22 | 1140 | 0.3868 | 0.7195 | 0.7815 | 0.7492 | 0.9021 |
| 0.2195 | 42.59 | 1150 | 0.3913 | 0.7066 | 0.7815 | 0.7421 | 0.8993 |
| 0.2195 | 42.96 | 1160 | 0.3797 | 0.7222 | 0.7748 | 0.7476 | 0.9035 |
| 0.2195 | 43.33 | 1170 | 0.3709 | 0.7405 | 0.7748 | 0.7573 | 0.9105 |
| 0.2195 | 43.7 | 1180 | 0.3736 | 0.7358 | 0.7748 | 0.7548 | 0.9077 |
| 0.2195 | 44.07 | 1190 | 0.3664 | 0.7389 | 0.7682 | 0.7532 | 0.9091 |
| 0.2195 | 44.44 | 1200 | 0.3677 | 0.7358 | 0.7748 | 0.7548 | 0.9063 |
| 0.2195 | 44.81 | 1210 | 0.3805 | 0.7329 | 0.7815 | 0.7564 | 0.9077 |
| 0.2195 | 45.19 | 1220 | 0.3806 | 0.7329 | 0.7815 | 0.7564 | 0.9077 |
| 0.2195 | 45.56 | 1230 | 0.3712 | 0.7372 | 0.7616 | 0.7492 | 0.9035 |
| 0.2195 | 45.93 | 1240 | 0.3746 | 0.7308 | 0.7550 | 0.7427 | 0.9035 |
| 0.2195 | 46.3 | 1250 | 0.3725 | 0.7261 | 0.7550 | 0.7403 | 0.9049 |
| 0.2195 | 46.67 | 1260 | 0.3719 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
| 0.2195 | 47.04 | 1270 | 0.3718 | 0.7355 | 0.7550 | 0.7451 | 0.9063 |
| 0.2195 | 47.41 | 1280 | 0.3728 | 0.7355 | 0.7550 | 0.7451 | 0.9063 |
| 0.2195 | 47.78 | 1290 | 0.3740 | 0.7261 | 0.7550 | 0.7403 | 0.9035 |
| 0.2195 | 48.15 | 1300 | 0.3780 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 48.52 | 1310 | 0.3796 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 48.89 | 1320 | 0.3816 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 49.26 | 1330 | 0.3816 | 0.7278 | 0.7616 | 0.7443 | 0.9021 |
| 0.2195 | 49.63 | 1340 | 0.3803 | 0.7278 | 0.7616 | 0.7443 | 0.9021 |
| 0.2195 | 50.0 | 1350 | 0.3777 | 0.7308 | 0.7550 | 0.7427 | 0.9049 |
| 0.2195 | 50.37 | 1360 | 0.3810 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 50.74 | 1370 | 0.3793 | 0.7325 | 0.7616 | 0.7468 | 0.9063 |
| 0.2195 | 51.11 | 1380 | 0.3773 | 0.7308 | 0.7550 | 0.7427 | 0.9049 |
| 0.2195 | 51.48 | 1390 | 0.3791 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 51.85 | 1400 | 0.3822 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 52.22 | 1410 | 0.3830 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 52.59 | 1420 | 0.3797 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 52.96 | 1430 | 0.3791 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 53.33 | 1440 | 0.3790 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 53.7 | 1450 | 0.3792 | 0.7342 | 0.7682 | 0.7508 | 0.9049 |
| 0.2195 | 54.07 | 1460 | 0.3786 | 0.7325 | 0.7616 | 0.7468 | 0.9035 |
| 0.2195 | 54.44 | 1470 | 0.3778 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
| 0.2195 | 54.81 | 1480 | 0.3776 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
| 0.2195 | 55.19 | 1490 | 0.3774 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
| 0.1305 | 55.56 | 1500 | 0.3772 | 0.7355 | 0.7550 | 0.7451 | 0.9035 |
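
The precision, recall, F1, and accuracy columns are the entity-level metrics typically produced by seqeval for token classification. The card does not state how they were computed, so the compute_metrics sketch below, using the evaluate library's seqeval wrapper and a caller-supplied id2label mapping, is an assumption about the evaluation setup rather than the actual code used.

```python
import numpy as np
import evaluate

# Assumption: metrics computed with seqeval via the evaluate library.
seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred, id2label):
    """Entity-level precision/recall/F1 plus token accuracy, ignoring -100 positions.

    `id2label` is the checkpoint's label mapping (e.g. model.config.id2label); when used
    with Trainer, bind it first, e.g. functools.partial(compute_metrics, id2label=...).
    """
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Keep only positions with a real label (-100 marks padding/subword tokens).
    true_labels = [
        [id2label[l] for l in label_row if l != -100]
        for label_row in labels
    ]
    true_predictions = [
        [id2label[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```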

Framework versions

  • Transformers 4.29.2
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.4
  • Tokenizers 0.13.3