---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
- generated_from_trainer
model-index:
- name: layoutlm-funsd
  results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD form-understanding dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6097
- Answer: precision 0.4370, recall 0.6413, F1 0.5198 (support: 92)
- Header: precision 0.2895, recall 0.3438, F1 0.3143 (support: 32)
- Overall Precision: 0.4046
- Overall Recall: 0.5645
- Overall F1: 0.4714
- Overall Accuracy: 0.8656
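
As a quick sanity check, the overall F1 is the harmonic mean of the overall precision and recall:

```python
# Overall F1 as the harmonic mean of overall precision and recall.
precision, recall = 0.4046, 0.5645
f1 = 2 * precision * recall / (precision + recall)
print(f"{f1:.4f}")  # 0.4714, matching the reported Overall F1
```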

## Model description

LayoutLM jointly embeds document text and its 2-D layout (the bounding box of each token), which makes it well suited to scanned, form-like documents. This checkpoint fine-tunes the base uncased model for token classification with FUNSD-style entity labels; the evaluation above reports answer and header spans.

## Intended uses & limitations

The model is intended for token classification on English, form-like document images: given OCR words and their bounding boxes, it tags tokens with FUNSD-style entity labels. Its scores are modest (overall F1 of 0.4714 on a small evaluation set of 92 answer and 32 header spans), so predictions should be reviewed rather than used as-is, and performance on layouts unlike FUNSD forms is untested.
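
A minimal inference sketch follows. The local checkpoint path, example words, and bounding boxes are illustrative; LayoutLM expects one bounding box per token, normalized to a 0-1000 scale:

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizerFast

tokenizer = LayoutLMTokenizerFast.from_pretrained("microsoft/layoutlm-base-uncased")
model = LayoutLMForTokenClassification.from_pretrained("layoutlm-funsd")  # this repo

# Words and 0-1000-normalized boxes, e.g. from an OCR engine (values illustrative).
words = ["Date:", "January", "5,", "1998"]
boxes = [[57, 82, 112, 95], [118, 82, 166, 95], [170, 82, 186, 95], [192, 82, 222, 95]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Repeat each word's box for its sub-tokens; special tokens get the zero box.
bbox = [[0, 0, 0, 0] if i is None else boxes[i] for i in encoding.word_ids()]
encoding["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    logits = model(**encoding).logits
labels = [model.config.id2label[p] for p in logits.argmax(-1).squeeze().tolist()]
```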

## Training and evaluation data

The run was trained and evaluated on FUNSD-style annotations. The step counts in the results table below (two optimizer steps per epoch at batch size 16) imply a training set of roughly 17-32 examples, so this run appears to use a small slice of the full FUNSD training split.
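
The exact data pipeline is not recorded in the card; one common option is the public FUNSD loader on the Hub (the dataset ID below is an assumption):

```python
from datasets import load_dataset

# Assumption: the public FUNSD loader; fields typically include words,
# bboxes, ner_tags, and an image path per document.
dataset = load_dataset("nielsr/funsd")
print(dataset)
```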

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
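
Under Transformers 4.39 these hyperparameters map onto `TrainingArguments` roughly as follows. The model, the datasets, and `evaluation_strategy="epoch"` (inferred from the per-epoch results table below) are assumptions:

```python
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="layoutlm-funsd",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",  # the Adam betas/epsilon above are the library defaults
    num_train_epochs=15,
    evaluation_strategy="epoch",  # assumption: the table reports one eval per epoch
)
# model, train_ds, and eval_ds are placeholders defined elsewhere.
trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```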

### Training results

Per-entity columns report precision / recall / F1; support is constant across evaluations (92 Answer spans, 32 Header spans).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1)      | Header (P / R / F1)      | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------------:|:------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.4561        | 1.0   | 2    | 1.0789          | 0.0 / 0.0 / 0.0          | 0.0 / 0.0 / 0.0          | 0.0               | 0.0            | 0.0        | 0.8182           |
| 0.7649        | 2.0   | 4    | 0.9219          | 0.0 / 0.0 / 0.0          | 0.0 / 0.0 / 0.0          | 0.0               | 0.0            | 0.0        | 0.8182           |
| 0.5601        | 3.0   | 6    | 0.8338          | 0.0 / 0.0 / 0.0          | 0.0 / 0.0 / 0.0          | 0.0               | 0.0            | 0.0        | 0.8182           |
| 0.4611        | 4.0   | 8    | 0.7533          | 0.0 / 0.0 / 0.0          | 0.0 / 0.0 / 0.0          | 0.0               | 0.0            | 0.0        | 0.8182           |
| 0.3306        | 5.0   | 10   | 0.6861          | 0.7500 / 0.0326 / 0.0625 | 0.0 / 0.0 / 0.0          | 0.75              | 0.0242         | 0.0469     | 0.8207           |
| 0.3001        | 6.0   | 12   | 0.6509          | 0.4324 / 0.5217 / 0.4729 | 0.0 / 0.0 / 0.0          | 0.4324            | 0.3871         | 0.4085     | 0.8592           |
| 0.3436        | 7.0   | 14   | 0.6713          | 0.3369 / 0.6848 / 0.4516 | 0.1429 / 0.0312 / 0.0513 | 0.3299            | 0.5161         | 0.4025     | 0.8284           |
| 0.3624        | 8.0   | 16   | 0.6454          | 0.3516 / 0.6957 / 0.4672 | 0.4000 / 0.0625 / 0.1081 | 0.3529            | 0.5323         | 0.4244     | 0.8387           |
| 0.4258        | 9.0   | 18   | 0.6192          | 0.3669 / 0.6739 / 0.4751 | 0.5556 / 0.1562 / 0.2439 | 0.3764            | 0.5403         | 0.4437     | 0.8528           |
| 0.2221        | 10.0  | 20   | 0.6282          | 0.3694 / 0.6304 / 0.4659 | 0.3182 / 0.2188 / 0.2593 | 0.3631            | 0.5242         | 0.4290     | 0.8476           |
| 0.2069        | 11.0  | 22   | 0.6241          | 0.4056 / 0.6304 / 0.4936 | 0.3438 / 0.3438 / 0.3438 | 0.3943            | 0.5565         | 0.4615     | 0.8592           |
| 0.2035        | 12.0  | 24   | 0.6218          | 0.4085 / 0.6304 / 0.4957 | 0.3125 / 0.3125 / 0.3125 | 0.3908            | 0.5484         | 0.4564     | 0.8604           |
| 0.1729        | 13.0  | 26   | 0.6175          | 0.4184 / 0.6413 / 0.5064 | 0.3125 / 0.3125 / 0.3125 | 0.3988            | 0.5565         | 0.4646     | 0.8643           |
| 0.1759        | 14.0  | 28   | 0.6127          | 0.4275 / 0.6413 / 0.5130 | 0.3143 / 0.3438 / 0.3284 | 0.4046            | 0.5645         | 0.4714     | 0.8656           |
| 0.2299        | 15.0  | 30   | 0.6097          | 0.4370 / 0.6413 / 0.5198 | 0.2895 / 0.3438 / 0.3143 | 0.4046            | 0.5645         | 0.4714     | 0.8656           |


### Framework versions

- Transformers 4.39.0
- Pytorch 2.2.1
- Datasets 2.18.0
- Tokenizers 0.15.2