---
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-large
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: nutrition-extractor
    results: []
---

# nutrition-extractor

This model is a fine-tuned version of [microsoft/layoutlmv3-large](https://huggingface.co/microsoft/layoutlmv3-large) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0532
- Precision: 0.9536
- Recall: 0.9633
- F1: 0.9584
- Accuracy: 0.9916
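
The card does not yet document usage, but the LayoutLMv3 base model and the token-level metrics above suggest a token-classification model over document images (for example, photos of nutrition tables). Below is a minimal, hedged loading sketch with the standard Transformers API; `MODEL_ID` and the image filename are placeholders rather than values taken from this card, and `apply_ocr=True` assumes the processor should run its built-in Tesseract OCR (which requires `pytesseract` and the Tesseract binary) to produce words and boxes.

```python
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForTokenClassification

# Placeholder: replace with the actual Hub id or a local path to this checkpoint.
MODEL_ID = "path/to/nutrition-extractor"

# The processor runs OCR (when apply_ocr=True) and builds the token + bounding-box
# inputs that LayoutLMv3 expects.
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-large", apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained(MODEL_ID)

image = Image.open("nutrition_table.jpg").convert("RGB")
encoding = processor(image, return_tensors="pt")

with torch.no_grad():
    logits = model(**encoding).logits

# Map predicted class ids back to the label names stored in the checkpoint config.
predictions = logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
for token, pred in zip(tokens, predictions):
    print(token, model.config.id2label[pred])
```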

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 3000
- mixed_precision_training: Native AMP
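
For reference, here is a hedged `TrainingArguments` sketch that mirrors the values listed above. The `output_dir` is an illustrative placeholder, the eval cadence is inferred from the 15-step intervals in the results table below, and the Adam betas/epsilon listed above are the Transformers defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="nutrition-extractor",
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=8,  # 4 x 8 = total train batch size of 32
    max_steps=3000,                 # training_steps: 3000
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=15,                  # evaluation every 15 steps, as in the results table
)
```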

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.9852 | 0.1664 | 15 | 1.1500 | 0.0 | 0.0 | 0.0 | 0.8101 |
| 1.0244 | 0.3329 | 30 | 0.8342 | 0.05 | 0.0074 | 0.0129 | 0.8123 |
| 0.7826 | 0.4993 | 45 | 0.6795 | 0.0789 | 0.1138 | 0.0932 | 0.8479 |
| 0.6767 | 0.6657 | 60 | 0.5963 | 0.1193 | 0.1644 | 0.1383 | 0.8578 |
| 0.6031 | 0.8322 | 75 | 0.5406 | 0.1671 | 0.2248 | 0.1917 | 0.8691 |
| 0.5756 | 0.9986 | 90 | 0.4935 | 0.2291 | 0.3112 | 0.2639 | 0.8778 |
| 0.5215 | 1.1650 | 105 | 0.4302 | 0.3267 | 0.3948 | 0.3575 | 0.8905 |
| 0.4782 | 1.3315 | 120 | 0.3782 | 0.3939 | 0.4654 | 0.4267 | 0.9020 |
| 0.4208 | 1.4979 | 135 | 0.3405 | 0.4027 | 0.5044 | 0.4478 | 0.9081 |
| 0.3532 | 1.6644 | 150 | 0.2930 | 0.4960 | 0.5820 | 0.5356 | 0.9252 |
| 0.3458 | 1.8308 | 165 | 0.2658 | 0.5155 | 0.6033 | 0.5560 | 0.9301 |
| 0.302 | 1.9972 | 180 | 0.2321 | 0.6112 | 0.7009 | 0.6530 | 0.9474 |
| 0.2655 | 2.1637 | 195 | 0.2093 | 0.6471 | 0.7264 | 0.6845 | 0.9520 |
| 0.2598 | 2.3301 | 210 | 0.1951 | 0.7013 | 0.7557 | 0.7275 | 0.9570 |
| 0.2364 | 2.4965 | 225 | 0.1794 | 0.7091 | 0.7743 | 0.7402 | 0.9590 |
| 0.2218 | 2.6630 | 240 | 0.1676 | 0.7216 | 0.7933 | 0.7558 | 0.9621 |
| 0.206 | 2.8294 | 255 | 0.1572 | 0.7436 | 0.8110 | 0.7758 | 0.9650 |
| 0.2053 | 2.9958 | 270 | 0.1580 | 0.7381 | 0.8114 | 0.7730 | 0.9640 |
| 0.1876 | 3.1623 | 285 | 0.1406 | 0.7738 | 0.8309 | 0.8013 | 0.9687 |
| 0.1602 | 3.3287 | 300 | 0.1420 | 0.7714 | 0.8277 | 0.7986 | 0.9671 |
| 0.1706 | 3.4951 | 315 | 0.1323 | 0.7933 | 0.8379 | 0.8150 | 0.9691 |
| 0.1585 | 3.6616 | 330 | 0.1313 | 0.8060 | 0.8551 | 0.8298 | 0.9700 |
| 0.1574 | 3.8280 | 345 | 0.1267 | 0.8129 | 0.8639 | 0.8376 | 0.9717 |
| 0.15 | 3.9945 | 360 | 0.1157 | 0.8336 | 0.8746 | 0.8536 | 0.9754 |
| 0.1192 | 4.1609 | 375 | 0.1120 | 0.8348 | 0.8709 | 0.8525 | 0.9741 |
| 0.1313 | 4.3273 | 390 | 0.1130 | 0.8395 | 0.8792 | 0.8589 | 0.9745 |
| 0.1179 | 4.4938 | 405 | 0.1093 | 0.8370 | 0.8871 | 0.8613 | 0.9755 |
| 0.1327 | 4.6602 | 420 | 0.1102 | 0.8400 | 0.8853 | 0.8621 | 0.9746 |
| 0.1323 | 4.8266 | 435 | 0.0997 | 0.8611 | 0.8987 | 0.8795 | 0.9782 |
| 0.1254 | 4.9931 | 450 | 0.0949 | 0.8499 | 0.8969 | 0.8728 | 0.9775 |
| 0.0999 | 5.1595 | 465 | 0.0847 | 0.8658 | 0.8992 | 0.8822 | 0.9797 |
| 0.1017 | 5.3259 | 480 | 0.0803 | 0.8747 | 0.9108 | 0.8924 | 0.9810 |
| 0.091 | 5.4924 | 495 | 0.0796 | 0.8784 | 0.9057 | 0.8918 | 0.9806 |
| 0.0979 | 5.6588 | 510 | 0.0943 | 0.8607 | 0.8950 | 0.8775 | 0.9773 |
| 0.1024 | 5.8252 | 525 | 0.0804 | 0.8710 | 0.9062 | 0.8882 | 0.9805 |
| 0.0952 | 5.9917 | 540 | 0.0787 | 0.8845 | 0.9178 | 0.9008 | 0.9816 |
| 0.0742 | 6.1581 | 555 | 0.0776 | 0.8918 | 0.9150 | 0.9033 | 0.9823 |
| 0.0764 | 6.3245 | 570 | 0.0721 | 0.9028 | 0.9187 | 0.9107 | 0.9837 |
| 0.0813 | 6.4910 | 585 | 0.0664 | 0.9065 | 0.9229 | 0.9146 | 0.9844 |
| 0.0791 | 6.6574 | 600 | 0.0642 | 0.9026 | 0.9252 | 0.9138 | 0.9848 |
| 0.0792 | 6.8239 | 615 | 0.0673 | 0.8964 | 0.9248 | 0.9104 | 0.9841 |
| 0.078 | 6.9903 | 630 | 0.0693 | 0.8938 | 0.9224 | 0.9079 | 0.9833 |
| 0.0678 | 7.1567 | 645 | 0.0672 | 0.9082 | 0.9327 | 0.9203 | 0.9852 |
| 0.0685 | 7.3232 | 660 | 0.0655 | 0.8926 | 0.9224 | 0.9073 | 0.9840 |
| 0.0555 | 7.4896 | 675 | 0.0615 | 0.9156 | 0.9271 | 0.9213 | 0.9856 |
| 0.07 | 7.6560 | 690 | 0.0587 | 0.9173 | 0.9373 | 0.9272 | 0.9868 |
| 0.065 | 7.8225 | 705 | 0.0558 | 0.9205 | 0.9405 | 0.9304 | 0.9875 |
| 0.0599 | 7.9889 | 720 | 0.0579 | 0.9253 | 0.9433 | 0.9342 | 0.9878 |
| 0.0571 | 8.1553 | 735 | 0.0593 | 0.9148 | 0.9331 | 0.9239 | 0.9866 |
| 0.0563 | 8.3218 | 750 | 0.0605 | 0.9152 | 0.9322 | 0.9236 | 0.9863 |
| 0.0602 | 8.4882 | 765 | 0.0581 | 0.9252 | 0.9308 | 0.9280 | 0.9863 |
| 0.0582 | 8.6546 | 780 | 0.0581 | 0.9206 | 0.9373 | 0.9289 | 0.9872 |
| 0.0514 | 8.8211 | 795 | 0.0557 | 0.9245 | 0.9382 | 0.9313 | 0.9873 |
| 0.0467 | 8.9875 | 810 | 0.0520 | 0.9291 | 0.9498 | 0.9394 | 0.9883 |
| 0.0435 | 9.1540 | 825 | 0.0526 | 0.9229 | 0.9447 | 0.9337 | 0.9880 |
| 0.0531 | 9.3204 | 840 | 0.0502 | 0.9249 | 0.9443 | 0.9345 | 0.9884 |
| 0.0502 | 9.4868 | 855 | 0.0545 | 0.9171 | 0.9452 | 0.9309 | 0.9874 |
| 0.0377 | 9.6533 | 870 | 0.0618 | 0.9077 | 0.9368 | 0.9221 | 0.9851 |
| 0.0416 | 9.8197 | 885 | 0.0549 | 0.9267 | 0.9392 | 0.9329 | 0.9881 |
| 0.044 | 9.9861 | 900 | 0.0529 | 0.9366 | 0.9475 | 0.9420 | 0.9884 |
| 0.0383 | 10.1526 | 915 | 0.0490 | 0.9332 | 0.9475 | 0.9403 | 0.9889 |
| 0.0454 | 10.3190 | 930 | 0.0507 | 0.9264 | 0.9471 | 0.9366 | 0.9885 |
| 0.0416 | 10.4854 | 945 | 0.0467 | 0.9364 | 0.9498 | 0.9430 | 0.9891 |
| 0.0403 | 10.6519 | 960 | 0.0499 | 0.9314 | 0.9457 | 0.9385 | 0.9886 |
| 0.0354 | 10.8183 | 975 | 0.0523 | 0.9258 | 0.9452 | 0.9354 | 0.9883 |
| 0.0338 | 10.9847 | 990 | 0.0521 | 0.9214 | 0.9424 | 0.9318 | 0.9880 |
| 0.0347 | 11.1512 | 1005 | 0.0539 | 0.9235 | 0.9475 | 0.9354 | 0.9880 |
| 0.0364 | 11.3176 | 1020 | 0.0560 | 0.9194 | 0.9480 | 0.9335 | 0.9871 |
| 0.0363 | 11.4840 | 1035 | 0.0509 | 0.9286 | 0.9480 | 0.9382 | 0.9889 |
| 0.0308 | 11.6505 | 1050 | 0.0498 | 0.9389 | 0.9484 | 0.9436 | 0.9893 |
| 0.032 | 11.8169 | 1065 | 0.0491 | 0.9364 | 0.9443 | 0.9403 | 0.9891 |
| 0.0331 | 11.9834 | 1080 | 0.0455 | 0.9373 | 0.9443 | 0.9408 | 0.9892 |
| 0.0301 | 12.1498 | 1095 | 0.0486 | 0.9359 | 0.9489 | 0.9423 | 0.9892 |
| 0.0308 | 12.3162 | 1110 | 0.0513 | 0.9325 | 0.9503 | 0.9413 | 0.9891 |
| 0.0253 | 12.4827 | 1125 | 0.0510 | 0.9296 | 0.9503 | 0.9398 | 0.9892 |
| 0.0301 | 12.6491 | 1140 | 0.0533 | 0.9308 | 0.9489 | 0.9397 | 0.9886 |
| 0.0328 | 12.8155 | 1155 | 0.0549 | 0.9287 | 0.9443 | 0.9364 | 0.9885 |
| 0.0298 | 12.9820 | 1170 | 0.0504 | 0.9402 | 0.9498 | 0.9450 | 0.9895 |
| 0.0256 | 13.1484 | 1185 | 0.0515 | 0.9354 | 0.9419 | 0.9387 | 0.9888 |
| 0.0313 | 13.3148 | 1200 | 0.0483 | 0.9418 | 0.9545 | 0.9481 | 0.9905 |
| 0.022 | 13.4813 | 1215 | 0.0463 | 0.9361 | 0.9531 | 0.9445 | 0.9899 |
| 0.0245 | 13.6477 | 1230 | 0.0494 | 0.9368 | 0.9494 | 0.9430 | 0.9893 |
| 0.0251 | 13.8141 | 1245 | 0.0493 | 0.9404 | 0.9531 | 0.9467 | 0.9898 |
| 0.0259 | 13.9806 | 1260 | 0.0511 | 0.9386 | 0.9522 | 0.9454 | 0.9895 |
| 0.03 | 14.1470 | 1275 | 0.0535 | 0.9344 | 0.9457 | 0.9400 | 0.9889 |
| 0.0192 | 14.3135 | 1290 | 0.0491 | 0.9428 | 0.9494 | 0.9461 | 0.9899 |
| 0.0267 | 14.4799 | 1305 | 0.0490 | 0.9457 | 0.9545 | 0.9501 | 0.9901 |
| 0.0241 | 14.6463 | 1320 | 0.0506 | 0.9435 | 0.9540 | 0.9487 | 0.9899 |
| 0.0211 | 14.8128 | 1335 | 0.0510 | 0.9444 | 0.9540 | 0.9492 | 0.9903 |
| 0.0171 | 14.9792 | 1350 | 0.0499 | 0.9405 | 0.9545 | 0.9474 | 0.9898 |
| 0.0226 | 15.1456 | 1365 | 0.0511 | 0.9366 | 0.9540 | 0.9452 | 0.9894 |
| 0.024 | 15.3121 | 1380 | 0.0484 | 0.9445 | 0.9559 | 0.9501 | 0.9899 |
| 0.018 | 15.4785 | 1395 | 0.0482 | 0.9469 | 0.9517 | 0.9493 | 0.9903 |
| 0.0191 | 15.6449 | 1410 | 0.0491 | 0.9442 | 0.9512 | 0.9477 | 0.9899 |
| 0.0203 | 15.8114 | 1425 | 0.0451 | 0.9510 | 0.9554 | 0.9532 | 0.9912 |
| 0.0198 | 15.9778 | 1440 | 0.0447 | 0.9497 | 0.9549 | 0.9523 | 0.9911 |
| 0.0167 | 16.1442 | 1455 | 0.0444 | 0.9487 | 0.9540 | 0.9514 | 0.9909 |
| 0.0178 | 16.3107 | 1470 | 0.0513 | 0.9386 | 0.9512 | 0.9449 | 0.9892 |
| 0.024 | 16.4771 | 1485 | 0.0502 | 0.9430 | 0.9536 | 0.9483 | 0.9899 |
| 0.0206 | 16.6436 | 1500 | 0.0459 | 0.9483 | 0.9545 | 0.9514 | 0.9908 |
| 0.0188 | 16.8100 | 1515 | 0.0469 | 0.9474 | 0.9540 | 0.9507 | 0.9906 |
| 0.016 | 16.9764 | 1530 | 0.0463 | 0.9468 | 0.9582 | 0.9524 | 0.9906 |
| 0.0161 | 17.1429 | 1545 | 0.0455 | 0.9516 | 0.9596 | 0.9556 | 0.9911 |
| 0.0135 | 17.3093 | 1560 | 0.0475 | 0.9524 | 0.9573 | 0.9548 | 0.9909 |
| 0.0148 | 17.4757 | 1575 | 0.0479 | 0.9440 | 0.9545 | 0.9492 | 0.9905 |
| 0.0173 | 17.6422 | 1590 | 0.0455 | 0.9539 | 0.9605 | 0.9572 | 0.9915 |
| 0.0173 | 17.8086 | 1605 | 0.0456 | 0.9475 | 0.9554 | 0.9514 | 0.9913 |
| 0.0185 | 17.9750 | 1620 | 0.0461 | 0.9498 | 0.9577 | 0.9537 | 0.9908 |
| 0.0153 | 18.1415 | 1635 | 0.0472 | 0.9491 | 0.9605 | 0.9548 | 0.9911 |
| 0.0148 | 18.3079 | 1650 | 0.0446 | 0.9507 | 0.9587 | 0.9547 | 0.9913 |
| 0.0136 | 18.4743 | 1665 | 0.0441 | 0.9486 | 0.9601 | 0.9543 | 0.9914 |
| 0.0185 | 18.6408 | 1680 | 0.0478 | 0.9528 | 0.9573 | 0.9551 | 0.9915 |
| 0.0147 | 18.8072 | 1695 | 0.0493 | 0.9515 | 0.9652 | 0.9583 | 0.9912 |
| 0.0156 | 18.9736 | 1710 | 0.0509 | 0.9440 | 0.9545 | 0.9492 | 0.9903 |
| 0.0113 | 19.1401 | 1725 | 0.0460 | 0.9559 | 0.9573 | 0.9566 | 0.9911 |
| 0.014 | 19.3065 | 1740 | 0.0493 | 0.9439 | 0.9526 | 0.9482 | 0.9905 |
| 0.0147 | 19.4730 | 1755 | 0.0498 | 0.9476 | 0.9568 | 0.9522 | 0.9906 |
| 0.0126 | 19.6394 | 1770 | 0.0493 | 0.9474 | 0.9531 | 0.9502 | 0.9906 |
| 0.0167 | 19.8058 | 1785 | 0.0491 | 0.9463 | 0.9577 | 0.9520 | 0.9904 |
| 0.0126 | 19.9723 | 1800 | 0.0474 | 0.9492 | 0.9540 | 0.9516 | 0.9908 |
| 0.0107 | 20.1387 | 1815 | 0.0462 | 0.9524 | 0.9577 | 0.9551 | 0.9914 |
| 0.0115 | 20.3051 | 1830 | 0.0481 | 0.9504 | 0.9614 | 0.9559 | 0.9911 |
| 0.0128 | 20.4716 | 1845 | 0.0486 | 0.9475 | 0.9563 | 0.9519 | 0.9907 |
| 0.0113 | 20.6380 | 1860 | 0.0491 | 0.9477 | 0.9591 | 0.9534 | 0.9910 |
| 0.0119 | 20.8044 | 1875 | 0.0514 | 0.9494 | 0.9503 | 0.9499 | 0.9901 |
| 0.0122 | 20.9709 | 1890 | 0.0480 | 0.9481 | 0.9591 | 0.9536 | 0.9911 |
| 0.0123 | 21.1373 | 1905 | 0.0477 | 0.9467 | 0.9577 | 0.9522 | 0.9909 |
| 0.0116 | 21.3037 | 1920 | 0.0486 | 0.9485 | 0.9582 | 0.9533 | 0.9910 |
| 0.0108 | 21.4702 | 1935 | 0.0488 | 0.9442 | 0.9582 | 0.9511 | 0.9905 |
| 0.0115 | 21.6366 | 1950 | 0.0472 | 0.9498 | 0.9587 | 0.9542 | 0.9913 |
| 0.0083 | 21.8031 | 1965 | 0.0476 | 0.9490 | 0.9596 | 0.9543 | 0.9911 |
| 0.0094 | 21.9695 | 1980 | 0.0475 | 0.9482 | 0.9605 | 0.9543 | 0.9909 |
| 0.0118 | 22.1359 | 1995 | 0.0492 | 0.9449 | 0.9554 | 0.9501 | 0.9904 |
| 0.01 | 22.3024 | 2010 | 0.0486 | 0.9492 | 0.9554 | 0.9523 | 0.9909 |
| 0.0114 | 22.4688 | 2025 | 0.0497 | 0.9502 | 0.9577 | 0.9540 | 0.9910 |
| 0.0091 | 22.6352 | 2040 | 0.0499 | 0.9503 | 0.9582 | 0.9542 | 0.9910 |
| 0.0077 | 22.8017 | 2055 | 0.0502 | 0.9513 | 0.9614 | 0.9563 | 0.9911 |
| 0.01 | 22.9681 | 2070 | 0.0513 | 0.9544 | 0.9628 | 0.9586 | 0.9913 |
| 0.0087 | 23.1345 | 2085 | 0.0485 | 0.9500 | 0.9610 | 0.9554 | 0.9912 |
| 0.0073 | 23.3010 | 2100 | 0.0485 | 0.9557 | 0.9628 | 0.9593 | 0.9917 |
| 0.0083 | 23.4674 | 2115 | 0.0485 | 0.9535 | 0.9610 | 0.9572 | 0.9913 |
| 0.0117 | 23.6338 | 2130 | 0.0479 | 0.9557 | 0.9624 | 0.9590 | 0.9916 |
| 0.0095 | 23.8003 | 2145 | 0.0508 | 0.9498 | 0.9587 | 0.9542 | 0.9911 |
| 0.009 | 23.9667 | 2160 | 0.0513 | 0.9492 | 0.9628 | 0.9560 | 0.9910 |
| 0.0077 | 24.1331 | 2175 | 0.0504 | 0.9553 | 0.9628 | 0.9591 | 0.9915 |
| 0.0087 | 24.2996 | 2190 | 0.0500 | 0.9521 | 0.9610 | 0.9565 | 0.9913 |
| 0.0068 | 24.4660 | 2205 | 0.0506 | 0.9539 | 0.9610 | 0.9574 | 0.9913 |
| 0.0094 | 24.6325 | 2220 | 0.0500 | 0.9507 | 0.9591 | 0.9549 | 0.9913 |
| 0.0088 | 24.7989 | 2235 | 0.0486 | 0.9508 | 0.9596 | 0.9552 | 0.9914 |
| 0.0089 | 24.9653 | 2250 | 0.0507 | 0.9508 | 0.9610 | 0.9559 | 0.9911 |
| 0.0063 | 25.1318 | 2265 | 0.0479 | 0.9561 | 0.9610 | 0.9585 | 0.9917 |
| 0.0058 | 25.2982 | 2280 | 0.0506 | 0.9526 | 0.9619 | 0.9572 | 0.9911 |
| 0.0102 | 25.4646 | 2295 | 0.0499 | 0.9526 | 0.9624 | 0.9575 | 0.9912 |
| 0.0079 | 25.6311 | 2310 | 0.0543 | 0.9469 | 0.9614 | 0.9541 | 0.9905 |
| 0.009 | 25.7975 | 2325 | 0.0498 | 0.9526 | 0.9619 | 0.9572 | 0.9915 |
| 0.0068 | 25.9639 | 2340 | 0.0511 | 0.9509 | 0.9619 | 0.9564 | 0.9911 |
| 0.007 | 26.1304 | 2355 | 0.0492 | 0.9527 | 0.9633 | 0.9580 | 0.9914 |
| 0.0086 | 26.2968 | 2370 | 0.0516 | 0.9500 | 0.9610 | 0.9554 | 0.9913 |
| 0.0078 | 26.4632 | 2385 | 0.0503 | 0.9504 | 0.9610 | 0.9557 | 0.9914 |
| 0.0067 | 26.6297 | 2400 | 0.0514 | 0.9527 | 0.9628 | 0.9577 | 0.9915 |
| 0.0059 | 26.7961 | 2415 | 0.0504 | 0.9549 | 0.9628 | 0.9588 | 0.9919 |
| 0.0089 | 26.9626 | 2430 | 0.0520 | 0.9517 | 0.9605 | 0.9561 | 0.9916 |
| 0.0059 | 27.1290 | 2445 | 0.0512 | 0.9522 | 0.9624 | 0.9573 | 0.9917 |
| 0.0073 | 27.2954 | 2460 | 0.0526 | 0.9530 | 0.9610 | 0.9570 | 0.9916 |
| 0.0065 | 27.4619 | 2475 | 0.0530 | 0.9527 | 0.9628 | 0.9577 | 0.9916 |
| 0.0064 | 27.6283 | 2490 | 0.0515 | 0.9535 | 0.9610 | 0.9572 | 0.9917 |
| 0.0072 | 27.7947 | 2505 | 0.0542 | 0.9482 | 0.9610 | 0.9546 | 0.9907 |
| 0.0066 | 27.9612 | 2520 | 0.0537 | 0.9491 | 0.9610 | 0.9550 | 0.9909 |
| 0.006 | 28.1276 | 2535 | 0.0518 | 0.9531 | 0.9628 | 0.9579 | 0.9915 |
| 0.0074 | 28.2940 | 2550 | 0.0523 | 0.9521 | 0.9610 | 0.9565 | 0.9914 |
| 0.0068 | 28.4605 | 2565 | 0.0534 | 0.9495 | 0.9614 | 0.9555 | 0.9913 |
| 0.0055 | 28.6269 | 2580 | 0.0521 | 0.9548 | 0.9619 | 0.9584 | 0.9917 |
| 0.0056 | 28.7933 | 2595 | 0.0526 | 0.9522 | 0.9614 | 0.9568 | 0.9913 |
| 0.0066 | 28.9598 | 2610 | 0.0527 | 0.9522 | 0.9619 | 0.9570 | 0.9913 |
| 0.0053 | 29.1262 | 2625 | 0.0533 | 0.9531 | 0.9628 | 0.9579 | 0.9913 |
| 0.0063 | 29.2926 | 2640 | 0.0520 | 0.9530 | 0.9610 | 0.9570 | 0.9913 |
| 0.0059 | 29.4591 | 2655 | 0.0533 | 0.9504 | 0.9605 | 0.9554 | 0.9910 |
| 0.0059 | 29.6255 | 2670 | 0.0532 | 0.9526 | 0.9619 | 0.9572 | 0.9912 |
| 0.0062 | 29.7920 | 2685 | 0.0516 | 0.9535 | 0.9624 | 0.9579 | 0.9917 |
| 0.0064 | 29.9584 | 2700 | 0.0515 | 0.9522 | 0.9624 | 0.9573 | 0.9915 |
| 0.0055 | 30.1248 | 2715 | 0.0513 | 0.9549 | 0.9633 | 0.9591 | 0.9917 |
| 0.0064 | 30.2913 | 2730 | 0.0524 | 0.9540 | 0.9628 | 0.9584 | 0.9916 |
| 0.0055 | 30.4577 | 2745 | 0.0530 | 0.9531 | 0.9633 | 0.9582 | 0.9915 |
| 0.0065 | 30.6241 | 2760 | 0.0528 | 0.9536 | 0.9642 | 0.9589 | 0.9917 |
| 0.0068 | 30.7906 | 2775 | 0.0530 | 0.9518 | 0.9633 | 0.9575 | 0.9916 |
| 0.0047 | 30.9570 | 2790 | 0.0545 | 0.9532 | 0.9647 | 0.9589 | 0.9916 |
| 0.0051 | 31.1234 | 2805 | 0.0534 | 0.9545 | 0.9647 | 0.9596 | 0.9917 |
| 0.0044 | 31.2899 | 2820 | 0.0532 | 0.9531 | 0.9633 | 0.9582 | 0.9914 |
| 0.0068 | 31.4563 | 2835 | 0.0532 | 0.9527 | 0.9633 | 0.9580 | 0.9913 |
| 0.0045 | 31.6227 | 2850 | 0.0531 | 0.9545 | 0.9638 | 0.9591 | 0.9915 |
| 0.0047 | 31.7892 | 2865 | 0.0530 | 0.9540 | 0.9633 | 0.9586 | 0.9916 |
| 0.0075 | 31.9556 | 2880 | 0.0533 | 0.9549 | 0.9638 | 0.9593 | 0.9916 |
| 0.0055 | 32.1221 | 2895 | 0.0525 | 0.9553 | 0.9638 | 0.9595 | 0.9917 |
| 0.006 | 32.2885 | 2910 | 0.0523 | 0.9553 | 0.9638 | 0.9595 | 0.9917 |
| 0.0062 | 32.4549 | 2925 | 0.0525 | 0.9544 | 0.9633 | 0.9589 | 0.9917 |
| 0.0059 | 32.6214 | 2940 | 0.0525 | 0.9549 | 0.9638 | 0.9593 | 0.9917 |
| 0.0058 | 32.7878 | 2955 | 0.0531 | 0.9549 | 0.9642 | 0.9596 | 0.9917 |
| 0.005 | 32.9542 | 2970 | 0.0533 | 0.9536 | 0.9633 | 0.9584 | 0.9916 |
| 0.007 | 33.1207 | 2985 | 0.0533 | 0.9536 | 0.9633 | 0.9584 | 0.9916 |
| 0.0047 | 33.2871 | 3000 | 0.0532 | 0.9536 | 0.9633 | 0.9584 | 0.9916 |
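
The card does not state how the table's metrics are computed, but the precision/recall/F1/accuracy naming matches the seqeval convention used by the Transformers token-classification examples. Below is a hedged sketch of such a `compute_metrics` function; `make_compute_metrics` and its `label_list` argument (the checkpoint's id-ordered label names) are hypothetical helpers, not part of this repository.

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

def make_compute_metrics(label_list):
    """Build a Trainer-compatible compute_metrics closure over the label names."""

    def compute_metrics(eval_pred):
        logits, labels = eval_pred
        predictions = np.argmax(logits, axis=-1)

        # Special/padding tokens are labelled -100 and must be excluded.
        true_predictions = [
            [label_list[p] for p, l in zip(pred, lab) if l != -100]
            for pred, lab in zip(predictions, labels)
        ]
        true_labels = [
            [label_list[l] for p, l in zip(pred, lab) if l != -100]
            for pred, lab in zip(predictions, labels)
        ]

        results = seqeval.compute(predictions=true_predictions, references=true_labels)
        return {
            "precision": results["overall_precision"],
            "recall": results["overall_recall"],
            "f1": results["overall_f1"],
            "accuracy": results["overall_accuracy"],
        }

    return compute_metrics
```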

### Framework versions

- Transformers 4.40.2
- PyTorch 2.5.1
- Datasets 2.19.0
- Tokenizers 0.19.1