---
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layoutlm-funsd
    results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the funsd dataset. It achieves the following results on the evaluation set:

- Loss: 1.3293
- Answer: {'precision': 0.11451135241855874, 'recall': 0.1433868974042027, 'f1': 0.12733260153677278, 'number': 809}
- Header: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}
- Question: {'precision': 0.41704374057315236, 'recall': 0.5192488262910798, 'f1': 0.46256796319531585, 'number': 1065}
- Overall Precision: 0.2860
- Overall Recall: 0.3357
- Overall F1: 0.3089
- Overall Accuracy: 0.5623
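As a quick sanity check, the Overall F1 above is the harmonic mean of Overall Precision and Overall Recall (a minimal sketch in plain Python, using the reported numbers):

```python
# Overall precision and recall from the evaluation results above.
precision, recall = 0.2860, 0.3357

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.3089, matching the reported Overall F1
```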

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
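The `generated_from_trainer` tag indicates the model was trained with the Hugging Face `Trainer`, so the settings above could be expressed as `transformers.TrainingArguments` keyword arguments. A sketch (the mapping to argument names is an assumption, not taken from the original training script):

```python
# Hyperparameters from the list above, as TrainingArguments-style kwargs
# (assumes the standard Hugging Face Trainer setup).
hyperparams = {
    "learning_rate": 5e-6,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,        # Adam betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 15,
}

# Usage (requires transformers):
#   from transformers import TrainingArguments
#   args = TrainingArguments(output_dir="layoutlm-funsd", **hyperparams)
```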

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.9774 | 1.0 | 10 | 1.9285 | {'precision': 0.018331226295828066, 'recall': 0.03584672435105068, 'f1': 0.024257632789627767, 'number': 809} | {'precision': 0.00787878787878788, 'recall': 0.1092436974789916, 'f1': 0.014697569248162805, 'number': 119} | {'precision': 0.06559356136820925, 'recall': 0.15305164319248826, 'f1': 0.09183098591549295, 'number': 1065} | 0.0359 | 0.1029 | 0.0532 | 0.1843 |
| 1.8918 | 2.0 | 20 | 1.8488 | {'precision': 0.02769385699899295, 'recall': 0.06798516687268233, 'f1': 0.03935599284436494, 'number': 809} | {'precision': 0.003703703703703704, 'recall': 0.008403361344537815, 'f1': 0.0051413881748071984, 'number': 119} | {'precision': 0.07554585152838428, 'recall': 0.1624413145539906, 'f1': 0.10312965722801788, 'number': 1065} | 0.0504 | 0.1149 | 0.0700 | 0.2606 |
| 1.8117 | 3.0 | 30 | 1.7797 | {'precision': 0.02564102564102564, 'recall': 0.0580964153275649, 'f1': 0.03557910673732021, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.0943496801705757, 'recall': 0.16619718309859155, 'f1': 0.120367222033322, 'number': 1065} | 0.0601 | 0.1124 | 0.0783 | 0.3026 |
| 1.7441 | 4.0 | 40 | 1.7198 | {'precision': 0.019028871391076115, 'recall': 0.03584672435105068, 'f1': 0.024860694384912133, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.12127512127512127, 'recall': 0.1643192488262911, 'f1': 0.13955342902711323, 'number': 1065} | 0.0686 | 0.1024 | 0.0822 | 0.3324 |
| 1.6818 | 5.0 | 50 | 1.6641 | {'precision': 0.0196078431372549, 'recall': 0.03337453646477132, 'f1': 0.024702653247941447, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.15128593040847202, 'recall': 0.18779342723004694, 'f1': 0.16757436112274823, 'number': 1065} | 0.0841 | 0.1139 | 0.0968 | 0.3537 |
| 1.6335 | 6.0 | 60 | 1.6097 | {'precision': 0.02643171806167401, 'recall': 0.04449938195302843, 'f1': 0.03316444035006909, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.18782870022539444, 'recall': 0.2347417840375587, 'f1': 0.20868113522537562, 'number': 1065} | 0.1062 | 0.1435 | 0.1221 | 0.3821 |
| 1.5742 | 7.0 | 70 | 1.5578 | {'precision': 0.033409263477600606, 'recall': 0.054388133498145856, 'f1': 0.04139228598306679, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.22088068181818182, 'recall': 0.292018779342723, 'f1': 0.2515163768701982, 'number': 1065} | 0.1303 | 0.1781 | 0.1505 | 0.4189 |
| 1.5302 | 8.0 | 80 | 1.5083 | {'precision': 0.0456656346749226, 'recall': 0.07292954264524104, 'f1': 0.05616373155640171, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.24610169491525424, 'recall': 0.3408450704225352, 'f1': 0.2858267716535433, 'number': 1065} | 0.1525 | 0.2117 | 0.1773 | 0.4559 |
| 1.4774 | 9.0 | 90 | 1.4639 | {'precision': 0.05325914149443561, 'recall': 0.08281829419035847, 'f1': 0.0648282535074988, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.28843537414965986, 'recall': 0.39812206572769954, 'f1': 0.33451676528599605, 'number': 1065} | 0.1800 | 0.2464 | 0.2080 | 0.4889 |
| 1.4389 | 10.0 | 100 | 1.4263 | {'precision': 0.059574468085106386, 'recall': 0.0865265760197775, 'f1': 0.07056451612903225, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.32748948106591863, 'recall': 0.4384976525821596, 'f1': 0.3749498193496587, 'number': 1065} | 0.2065 | 0.2694 | 0.2338 | 0.5120 |
| 1.4007 | 11.0 | 110 | 1.3933 | {'precision': 0.07123534715960325, 'recall': 0.09765142150803462, 'f1': 0.08237747653806049, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.360773085182534, 'recall': 0.4732394366197183, 'f1': 0.40942323314378554, 'number': 1065} | 0.2326 | 0.2925 | 0.2592 | 0.5334 |
| 1.3866 | 12.0 | 120 | 1.3665 | {'precision': 0.09439252336448598, 'recall': 0.12484548825710753, 'f1': 0.10750399148483236, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.38648052902277735, 'recall': 0.49389671361502346, 'f1': 0.4336356141797197, 'number': 1065} | 0.2579 | 0.3146 | 0.2835 | 0.5428 |
| 1.3482 | 13.0 | 130 | 1.3469 | {'precision': 0.10622009569377991, 'recall': 0.13720642768850433, 'f1': 0.11974110032362459, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.40044411547002223, 'recall': 0.507981220657277, 'f1': 0.44784768211920534, 'number': 1065} | 0.2721 | 0.3271 | 0.2971 | 0.5537 |
| 1.3355 | 14.0 | 140 | 1.3345 | {'precision': 0.11078431372549019, 'recall': 0.13967861557478367, 'f1': 0.12356478950246036, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.4114114114114114, 'recall': 0.5145539906103287, 'f1': 0.4572382144347101, 'number': 1065} | 0.2810 | 0.3317 | 0.3043 | 0.5588 |
| 1.3066 | 15.0 | 150 | 1.3293 | {'precision': 0.11451135241855874, 'recall': 0.1433868974042027, 'f1': 0.12733260153677278, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.41704374057315236, 'recall': 0.5192488262910798, 'f1': 0.46256796319531585, 'number': 1065} | 0.2860 | 0.3357 | 0.3089 | 0.5623 |
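The overall figures in the table are micro-averages over the three entity classes: true positives are summed across classes (recall × support per class) and divided by the total support. A sketch reproducing the final-epoch Overall Recall from the per-class entries above:

```python
# Final-epoch per-class recall and support ("number") from the table above.
classes = {
    "answer":   {"recall": 0.1433868974042027, "number": 809},
    "header":   {"recall": 0.0,                "number": 119},
    "question": {"recall": 0.5192488262910798, "number": 1065},
}

# Micro-average: total true positives over total support.
true_positives = sum(round(c["recall"] * c["number"]) for c in classes.values())
support = sum(c["number"] for c in classes.values())
overall_recall = true_positives / support  # 669 / 1993 ≈ 0.3357
```

The same micro-averaging over predicted spans yields Overall Precision, but the per-class predicted counts are not recoverable from the table (the header row reports precision 0.0).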

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3