---
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layoutlm-funsd
    results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:

- Loss: 1.3215
- Answer: {'precision': 0.10096818810511757, 'recall': 0.09023485784919653, 'f1': 0.09530026109660573, 'number': 809}
- Header: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}
- Question: {'precision': 0.3980815347721823, 'recall': 0.4676056338028169, 'f1': 0.43005181347150256, 'number': 1065}
- Overall Precision: 0.2891
- Overall Recall: 0.2865
- Overall F1: 0.2878
- Overall Accuracy: 0.5339
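To read these entries: each entity type reports precision, recall, and F1 at the entity level (seqeval-style), with `number` giving the count of gold entities. As a quick sanity check, the F1 values follow directly from precision and recall; a minimal sketch using the "Question" row above:

```python
# Entity-level F1 is the harmonic mean of precision and recall.
# Values are copied from the "Question" entry of the evaluation results.
precision = 0.3980815347721823
recall = 0.4676056338028169

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # ≈ 0.4301, matching the reported f1
```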

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
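For reference, a sketch of how these hyperparameters would map onto `transformers.TrainingArguments`. This is reconstructed from the list above, not the author's actual training script; `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
)
```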

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.9471 | 1.0 | 10 | 1.8844 | {'precision': 0.022006141248720572, 'recall': 0.05315203955500618, 'f1': 0.031125588128845458, 'number': 809} | {'precision': 0.00702576112412178, 'recall': 0.05042016806722689, 'f1': 0.012332990750256937, 'number': 119} | {'precision': 0.054583995760466346, 'recall': 0.09671361502347418, 'f1': 0.06978319783197831, 'number': 1065} | 0.0324 | 0.0763 | 0.0455 | 0.2491 |
| 1.8584 | 2.0 | 20 | 1.8099 | {'precision': 0.018408941485864562, 'recall': 0.034610630407911, 'f1': 0.024034334763948496, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.08241758241758242, 'recall': 0.11267605633802817, 'f1': 0.09520031733439112, 'number': 1065} | 0.0469 | 0.0743 | 0.0575 | 0.3139 |
| 1.7841 | 3.0 | 30 | 1.7444 | {'precision': 0.02190395956192081, 'recall': 0.032138442521631644, 'f1': 0.026052104208416832, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.10752688172043011, 'recall': 0.12206572769953052, 'f1': 0.11433597185576078, 'number': 1065} | 0.0645 | 0.0783 | 0.0707 | 0.3426 |
| 1.7255 | 4.0 | 40 | 1.6851 | {'precision': 0.026865671641791045, 'recall': 0.03337453646477132, 'f1': 0.029768467475192944, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.15547024952015356, 'recall': 0.15211267605633802, 'f1': 0.1537731371618415, 'number': 1065} | 0.0922 | 0.0948 | 0.0935 | 0.3647 |
| 1.6607 | 5.0 | 50 | 1.6287 | {'precision': 0.036458333333333336, 'recall': 0.04326328800988875, 'f1': 0.03957037874505371, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.2018348623853211, 'recall': 0.20657276995305165, 'f1': 0.20417633410672859, 'number': 1065} | 0.1244 | 0.1279 | 0.1261 | 0.3943 |
| 1.6127 | 6.0 | 60 | 1.5738 | {'precision': 0.045, 'recall': 0.05562422744128554, 'f1': 0.04975124378109452, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.24034334763948498, 'recall': 0.26291079812206575, 'f1': 0.25112107623318386, 'number': 1065} | 0.1501 | 0.1631 | 0.1563 | 0.4234 |
| 1.5582 | 7.0 | 70 | 1.5242 | {'precision': 0.05465587044534413, 'recall': 0.06674907292954264, 'f1': 0.060100166944908176, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.26282051282051283, 'recall': 0.307981220657277, 'f1': 0.2836143536532642, 'number': 1065} | 0.1708 | 0.1917 | 0.1807 | 0.4483 |
| 1.5135 | 8.0 | 80 | 1.4789 | {'precision': 0.05976520811099253, 'recall': 0.069221260815822, 'f1': 0.06414662084765177, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.29073482428115016, 'recall': 0.34178403755868547, 'f1': 0.31419939577039274, 'number': 1065} | 0.1919 | 0.2107 | 0.2009 | 0.4679 |
| 1.4676 | 9.0 | 90 | 1.4380 | {'precision': 0.06818181818181818, 'recall': 0.07416563658838071, 'f1': 0.07104795737122557, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.3149480415667466, 'recall': 0.3699530516431925, 'f1': 0.34024179620034545, 'number': 1065} | 0.2130 | 0.2278 | 0.2202 | 0.4851 |
| 1.4233 | 10.0 | 100 | 1.4035 | {'precision': 0.07664670658682635, 'recall': 0.07911001236093942, 'f1': 0.0778588807785888, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.3413848631239936, 'recall': 0.39812206572769954, 'f1': 0.3675769397485913, 'number': 1065} | 0.2350 | 0.2449 | 0.2398 | 0.4988 |
| 1.3864 | 11.0 | 110 | 1.3744 | {'precision': 0.0810126582278481, 'recall': 0.07911001236093942, 'f1': 0.08005003126954345, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.3583535108958838, 'recall': 0.4169014084507042, 'f1': 0.38541666666666663, 'number': 1065} | 0.2504 | 0.2549 | 0.2526 | 0.5113 |
| 1.3746 | 12.0 | 120 | 1.3519 | {'precision': 0.0870712401055409, 'recall': 0.0815822002472188, 'f1': 0.08423739629865987, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.3806818181818182, 'recall': 0.4403755868544601, 'f1': 0.40835872877666524, 'number': 1065} | 0.2688 | 0.2684 | 0.2686 | 0.5175 |
| 1.3417 | 13.0 | 130 | 1.3352 | {'precision': 0.09568733153638814, 'recall': 0.08776266996291718, 'f1': 0.09155383623468731, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.39403706688154716, 'recall': 0.4591549295774648, 'f1': 0.4241110147441457, 'number': 1065} | 0.2824 | 0.2810 | 0.2817 | 0.5272 |
| 1.3318 | 14.0 | 140 | 1.3254 | {'precision': 0.09686221009549795, 'recall': 0.08776266996291718, 'f1': 0.09208819714656291, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.3942307692307692, 'recall': 0.4619718309859155, 'f1': 0.4254215304798963, 'number': 1065} | 0.2841 | 0.2825 | 0.2833 | 0.5314 |
| 1.3086 | 15.0 | 150 | 1.3215 | {'precision': 0.10096818810511757, 'recall': 0.09023485784919653, 'f1': 0.09530026109660573, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.3980815347721823, 'recall': 0.4676056338028169, 'f1': 0.43005181347150256, 'number': 1065} | 0.2891 | 0.2865 | 0.2878 | 0.5339 |
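The table shows 10 optimization steps per epoch. Assuming the commonly cited FUNSD train split of 149 documents (an assumption; the card itself does not state the split size), this is consistent with the train_batch_size of 16 above:

```python
import math

# 149 training forms (assumed FUNSD train split) in batches of 16
# yields ceil(149 / 16) = 10 steps per epoch, matching the Step column.
steps_per_epoch = math.ceil(149 / 16)
print(steps_per_epoch)  # 10
```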

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3