---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
datasets:
- funsd
model-index:
- name: layoutlm-funsd
  results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0339
- Answer: precision 0.4002, recall 0.5600, F1 0.4668 (support: 809)
- Header: precision 0.3146, recall 0.2353, F1 0.2692 (support: 119)
- Question: precision 0.5092, recall 0.5962, F1 0.5493 (support: 1065)
- Overall Precision: 0.4522
- Overall Recall: 0.5600
- Overall F1: 0.5003
- Overall Accuracy: 0.6347
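
These per-entity numbers follow seqeval's entity-level format: precision, recall, F1, and support for each FUNSD label, plus micro-averaged overall scores. As a rough illustration (not the exact evaluation code used for this run), the same style of report can be produced with the `evaluate` library:

```python
import evaluate  # pip install evaluate seqeval

seqeval = evaluate.load("seqeval")

# Toy word-level label sequences in IOB2 format; FUNSD uses
# QUESTION / ANSWER / HEADER entities plus O for other tokens.
references = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "I-ANSWER"]]
predictions = [["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["ANSWER"])  # {'precision': ..., 'recall': ..., 'f1': ..., 'number': 1}
print(results["overall_f1"], results["overall_accuracy"])
```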

## Model description

LayoutLM extends BERT with 2-D position (bounding-box) embeddings so that it can model both the text and the layout of scanned documents. This checkpoint fine-tunes the base model for token classification on FUNSD, tagging each word of a form as part of a question, answer, or header entity (with `O` for other tokens).
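
A minimal inference sketch is shown below. It assumes the fine-tuned weights are available locally (or on the Hub) under `layoutlm-funsd`, and that OCR word boxes have already been normalized to the 0–1000 range LayoutLM expects:

```python
import torch
from transformers import LayoutLMTokenizer, LayoutLMForTokenClassification

tokenizer = LayoutLMTokenizer.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained("layoutlm-funsd")  # path to this checkpoint

# Toy OCR output: words plus bounding boxes normalized to 0-1000.
words = ["DATE:", "11/02/99"]
boxes = [[70, 62, 150, 80], [160, 62, 260, 80]]

# Expand word-level boxes to token level and add [CLS]/[SEP] boxes.
token_boxes = []
for word, box in zip(words, boxes):
    token_boxes.extend([box] * len(tokenizer.tokenize(word)))
token_boxes = [[0, 0, 0, 0]] + token_boxes + [[1000, 1000, 1000, 1000]]

encoding = tokenizer(" ".join(words), return_tensors="pt")
with torch.no_grad():
    outputs = model(
        input_ids=encoding["input_ids"],
        attention_mask=encoding["attention_mask"],
        token_type_ids=encoding["token_type_ids"],
        bbox=torch.tensor([token_boxes]),
    )
predicted = [model.config.id2label[i] for i in outputs.logits.argmax(-1)[0].tolist()]
print(list(zip(tokenizer.convert_ids_to_tokens(encoding["input_ids"][0]), predicted)))
```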

## Intended uses & limitations

The model is intended for entity extraction on English scanned forms similar to FUNSD. Given the modest evaluation scores (overall F1 ≈ 0.50, accuracy ≈ 0.63), it is best treated as a fine-tuning or experimentation baseline rather than a production-ready extractor.

## Training and evaluation data

The model was fine-tuned on the training split of FUNSD (Form Understanding in Noisy Scanned Documents, 199 annotated forms) and evaluated on its test split; the entity counts reported above (809 answers, 119 headers, 1065 questions) come from that evaluation split.
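
FUNSD can be loaded through the `datasets` library; the snippet below assumes the commonly used Hub copy `nielsr/funsd`, since this card only records the `funsd` tag:

```python
from datasets import load_dataset

# Assumption: the "nielsr/funsd" Hub dataset; the exact dataset id used
# for this run is not recorded in the card.
funsd = load_dataset("nielsr/funsd")
example = funsd["train"][0]
print(example.keys())  # e.g. words, bboxes, ner_tags, ... (exact fields depend on the copy)
```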

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
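
As a rough sketch (not the exact training script, which is not part of this card), these settings map onto `TrainingArguments` roughly as follows; the Adam betas and epsilon above are the optimizer defaults, and the results table below suggests one evaluation per epoch (10 steps each):

```python
from transformers import TrainingArguments

# Approximate reproduction of the hyperparameters listed above; model and
# dataset preparation (FUNSD words/bboxes/labels -> model inputs) is omitted.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumption: one eval per epoch, matching the table below
)
```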

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer                                                                                                      | Header                                                                                                      | Question                                                                                                    | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.6941        | 1.0   | 10   | 1.4585          | {'precision': 0.09797822706065319, 'recall': 0.1557478368355995, 'f1': 0.12028639618138426, 'number': 809}  | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}                                                 | {'precision': 0.2629193109700816, 'recall': 0.27230046948356806, 'f1': 0.26752767527675275, 'number': 1065} | 0.1741            | 0.2087         | 0.1899     | 0.3863           |
| 1.3912        | 2.0   | 20   | 1.3157          | {'precision': 0.19625137816979052, 'recall': 0.4400494437577256, 'f1': 0.27144491040792984, 'number': 809}  | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}                                                 | {'precision': 0.2574061882817643, 'recall': 0.3671361502347418, 'f1': 0.3026315789473684, 'number': 1065}   | 0.2231            | 0.3748         | 0.2797     | 0.4259           |
| 1.2646        | 3.0   | 30   | 1.1981          | {'precision': 0.23537234042553193, 'recall': 0.43757725587144625, 'f1': 0.30609597924773024, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119}                                                 | {'precision': 0.35086633663366334, 'recall': 0.532394366197183, 'f1': 0.42297650130548303, 'number': 1065}  | 0.2908            | 0.4621         | 0.3570     | 0.4979           |
| 1.1512        | 4.0   | 40   | 1.0937          | {'precision': 0.2754578754578755, 'recall': 0.4647713226205192, 'f1': 0.3459061637534499, 'number': 809}    | {'precision': 0.12048192771084337, 'recall': 0.08403361344537816, 'f1': 0.09900990099009901, 'number': 119} | {'precision': 0.3988563259471051, 'recall': 0.523943661971831, 'f1': 0.45292207792207795, 'number': 1065}   | 0.3316            | 0.4737         | 0.3901     | 0.5719           |
| 1.052         | 5.0   | 50   | 1.0996          | {'precision': 0.2841163310961969, 'recall': 0.47095179233621753, 'f1': 0.35441860465116287, 'number': 809}  | {'precision': 0.23529411764705882, 'recall': 0.13445378151260504, 'f1': 0.17112299465240638, 'number': 119} | {'precision': 0.40622929092113985, 'recall': 0.5755868544600939, 'f1': 0.47630147630147635, 'number': 1065} | 0.3461            | 0.5068         | 0.4113     | 0.5719           |
| 0.9901        | 6.0   | 60   | 1.0590          | {'precision': 0.3064992614475628, 'recall': 0.5129789864029666, 'f1': 0.3837263060564031, 'number': 809}    | {'precision': 0.2345679012345679, 'recall': 0.15966386554621848, 'f1': 0.18999999999999997, 'number': 119}  | {'precision': 0.4610441767068273, 'recall': 0.5389671361502347, 'f1': 0.496969696969697, 'number': 1065}    | 0.3761            | 0.5058         | 0.4314     | 0.6011           |
| 0.9158        | 7.0   | 70   | 1.0134          | {'precision': 0.3295238095238095, 'recall': 0.4276885043263288, 'f1': 0.3722431414739107, 'number': 809}    | {'precision': 0.26506024096385544, 'recall': 0.18487394957983194, 'f1': 0.21782178217821785, 'number': 119} | {'precision': 0.45186226282501757, 'recall': 0.603755868544601, 'f1': 0.5168810289389068, 'number': 1065}   | 0.3955            | 0.5073         | 0.4445     | 0.6314           |
| 0.8626        | 8.0   | 80   | 1.0097          | {'precision': 0.3275862068965517, 'recall': 0.46971569839307786, 'f1': 0.3859827323514474, 'number': 809}   | {'precision': 0.3157894736842105, 'recall': 0.20168067226890757, 'f1': 0.24615384615384614, 'number': 119}  | {'precision': 0.44047619047619047, 'recall': 0.6253521126760564, 'f1': 0.5168800931315483, 'number': 1065}  | 0.3894            | 0.5369         | 0.4514     | 0.6276           |
| 0.8026        | 9.0   | 90   | 1.0030          | {'precision': 0.372310570626754, 'recall': 0.4919653893695921, 'f1': 0.42385516506922255, 'number': 809}    | {'precision': 0.2736842105263158, 'recall': 0.2184873949579832, 'f1': 0.2429906542056075, 'number': 119}    | {'precision': 0.49289454001495886, 'recall': 0.6187793427230047, 'f1': 0.5487094088259784, 'number': 1065}  | 0.4330            | 0.5434         | 0.4820     | 0.6410           |
| 0.794         | 10.0  | 100  | 1.0143          | {'precision': 0.3772893772893773, 'recall': 0.5092707045735476, 'f1': 0.4334560757496055, 'number': 809}    | {'precision': 0.2857142857142857, 'recall': 0.20168067226890757, 'f1': 0.23645320197044337, 'number': 119}  | {'precision': 0.4923572003218021, 'recall': 0.5746478873239437, 'f1': 0.5303292894280762, 'number': 1065}   | 0.4332            | 0.5258         | 0.4751     | 0.6380           |
| 0.7156        | 11.0  | 110  | 1.0071          | {'precision': 0.38151875571820676, 'recall': 0.515451174289246, 'f1': 0.43848580441640383, 'number': 809}   | {'precision': 0.2828282828282828, 'recall': 0.23529411764705882, 'f1': 0.25688073394495414, 'number': 119}  | {'precision': 0.5, 'recall': 0.6131455399061033, 'f1': 0.5508224377899621, 'number': 1065}                  | 0.4396            | 0.5509         | 0.4890     | 0.6393           |
| 0.7015        | 12.0  | 120  | 1.0361          | {'precision': 0.3828867761452031, 'recall': 0.5475896168108776, 'f1': 0.45066124109867756, 'number': 809}   | {'precision': 0.3111111111111111, 'recall': 0.23529411764705882, 'f1': 0.2679425837320574, 'number': 119}   | {'precision': 0.49387442572741197, 'recall': 0.6056338028169014, 'f1': 0.5440742302825812, 'number': 1065}  | 0.4371            | 0.5600         | 0.4910     | 0.6326           |
| 0.681         | 13.0  | 130  | 1.0591          | {'precision': 0.38740293356341676, 'recall': 0.5550061804697157, 'f1': 0.4563008130081301, 'number': 809}   | {'precision': 0.345679012345679, 'recall': 0.23529411764705882, 'f1': 0.27999999999999997, 'number': 119}   | {'precision': 0.5167074164629177, 'recall': 0.5953051643192488, 'f1': 0.5532286212914486, 'number': 1065}   | 0.4503            | 0.5575         | 0.4982     | 0.6299           |
| 0.6461        | 14.0  | 140  | 1.0191          | {'precision': 0.38854625550660793, 'recall': 0.5451174289245982, 'f1': 0.45370370370370366, 'number': 809}  | {'precision': 0.3333333333333333, 'recall': 0.23529411764705882, 'f1': 0.27586206896551724, 'number': 119}  | {'precision': 0.49961330239752516, 'recall': 0.6065727699530516, 'f1': 0.547921967769296, 'number': 1065}   | 0.4439            | 0.5595         | 0.4950     | 0.6351           |
| 0.6518        | 15.0  | 150  | 1.0339          | {'precision': 0.4001766784452297, 'recall': 0.5599505562422744, 'f1': 0.46676970633693976, 'number': 809}   | {'precision': 0.3146067415730337, 'recall': 0.23529411764705882, 'f1': 0.2692307692307692, 'number': 119}   | {'precision': 0.5092221331194867, 'recall': 0.596244131455399, 'f1': 0.5493079584775085, 'number': 1065}    | 0.4522            | 0.5600         | 0.5003     | 0.6347           |


### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2