End of training
- README.md +25 -25
- logs/events.out.tfevents.1741102575.DESKTOP-HA84SVN.3577307.2 +2 -2
- model.safetensors +1 -1
README.md CHANGED
@@ -16,14 +16,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [pabloma09/layoutlm-funsd](https://huggingface.co/pabloma09/layoutlm-funsd) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Eader: {'precision': 0.
-- Nswer: {'precision': 0.
-- Uestion: {'precision': 0.
-- Overall Precision: 0.
-- Overall Recall: 0.
-- Overall F1: 0.
-- Overall Accuracy: 0.
+- Loss: 0.5379
+- Eader: {'precision': 0.7209302325581395, 'recall': 0.543859649122807, 'f1': 0.6200000000000001, 'number': 57}
+- Nswer: {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141}
+- Uestion: {'precision': 0.7290322580645161, 'recall': 0.7018633540372671, 'f1': 0.7151898734177216, 'number': 161}
+- Overall Precision: 0.7235
+- Overall Recall: 0.6852
+- Overall F1: 0.7039
+- Overall Accuracy: 0.9016
 
 ## Model description
 
@@ -53,23 +53,23 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Eader
-
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| Training Loss | Epoch | Step | Validation Loss | Eader | Nswer | Uestion | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 0.0751 | 1.0 | 12 | 0.4989 | {'precision': 0.5740740740740741, 'recall': 0.543859649122807, 'f1': 0.5585585585585585, 'number': 57} | {'precision': 0.673202614379085, 'recall': 0.7304964539007093, 'f1': 0.7006802721088436, 'number': 141} | {'precision': 0.6666666666666666, 'recall': 0.6708074534161491, 'f1': 0.6687306501547988, 'number': 161} | 0.6558 | 0.6741 | 0.6648 | 0.8675 |
+| 0.0681 | 2.0 | 24 | 0.4233 | {'precision': 0.6739130434782609, 'recall': 0.543859649122807, 'f1': 0.6019417475728156, 'number': 57} | {'precision': 0.7394366197183099, 'recall': 0.7446808510638298, 'f1': 0.7420494699646644, 'number': 141} | {'precision': 0.7044025157232704, 'recall': 0.6956521739130435, 'f1': 0.7, 'number': 161} | 0.7147 | 0.6908 | 0.7025 | 0.9004 |
+| 0.0499 | 3.0 | 36 | 0.4571 | {'precision': 0.775, 'recall': 0.543859649122807, 'f1': 0.6391752577319588, 'number': 57} | {'precision': 0.7083333333333334, 'recall': 0.723404255319149, 'f1': 0.7157894736842105, 'number': 141} | {'precision': 0.73125, 'recall': 0.7267080745341615, 'f1': 0.7289719626168223, 'number': 161} | 0.7267 | 0.6964 | 0.7112 | 0.8998 |
+| 0.037 | 4.0 | 48 | 0.4636 | {'precision': 0.7045454545454546, 'recall': 0.543859649122807, 'f1': 0.613861386138614, 'number': 57} | {'precision': 0.7142857142857143, 'recall': 0.7446808510638298, 'f1': 0.7291666666666666, 'number': 141} | {'precision': 0.7222222222222222, 'recall': 0.7267080745341615, 'f1': 0.7244582043343654, 'number': 161} | 0.7167 | 0.7047 | 0.7107 | 0.9016 |
+| 0.0329 | 5.0 | 60 | 0.5128 | {'precision': 0.6530612244897959, 'recall': 0.5614035087719298, 'f1': 0.6037735849056605, 'number': 57} | {'precision': 0.697986577181208, 'recall': 0.7375886524822695, 'f1': 0.7172413793103447, 'number': 141} | {'precision': 0.6706586826347305, 'recall': 0.6956521739130435, 'f1': 0.6829268292682926, 'number': 161} | 0.6795 | 0.6908 | 0.6851 | 0.8880 |
+| 0.0263 | 6.0 | 72 | 0.5192 | {'precision': 0.6904761904761905, 'recall': 0.5087719298245614, 'f1': 0.5858585858585859, 'number': 57} | {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141} | {'precision': 0.7484276729559748, 'recall': 0.7391304347826086, 'f1': 0.7437500000000001, 'number': 161} | 0.7289 | 0.6964 | 0.7123 | 0.8995 |
+| 0.023 | 7.0 | 84 | 0.5452 | {'precision': 0.6976744186046512, 'recall': 0.5263157894736842, 'f1': 0.6, 'number': 57} | {'precision': 0.7202797202797203, 'recall': 0.7304964539007093, 'f1': 0.7253521126760565, 'number': 141} | {'precision': 0.7, 'recall': 0.6956521739130435, 'f1': 0.6978193146417445, 'number': 161} | 0.7081 | 0.6825 | 0.6950 | 0.8956 |
+| 0.0205 | 8.0 | 96 | 0.5398 | {'precision': 0.6666666666666666, 'recall': 0.5614035087719298, 'f1': 0.6095238095238096, 'number': 57} | {'precision': 0.7083333333333334, 'recall': 0.723404255319149, 'f1': 0.7157894736842105, 'number': 141} | {'precision': 0.7151898734177216, 'recall': 0.7018633540372671, 'f1': 0.7084639498432601, 'number': 161} | 0.7057 | 0.6880 | 0.6968 | 0.8971 |
+| 0.0182 | 9.0 | 108 | 0.5025 | {'precision': 0.62, 'recall': 0.543859649122807, 'f1': 0.5794392523364487, 'number': 57} | {'precision': 0.7482014388489209, 'recall': 0.7375886524822695, 'f1': 0.7428571428571428, 'number': 141} | {'precision': 0.7088607594936709, 'recall': 0.6956521739130435, 'f1': 0.7021943573667712, 'number': 161} | 0.7118 | 0.6880 | 0.6997 | 0.9046 |
+| 0.0175 | 10.0 | 120 | 0.5017 | {'precision': 0.6888888888888889, 'recall': 0.543859649122807, 'f1': 0.6078431372549019, 'number': 57} | {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141} | {'precision': 0.7133757961783439, 'recall': 0.6956521739130435, 'f1': 0.7044025157232704, 'number': 161} | 0.7122 | 0.6825 | 0.6970 | 0.9031 |
+| 0.0157 | 11.0 | 132 | 0.5034 | {'precision': 0.7272727272727273, 'recall': 0.5614035087719298, 'f1': 0.6336633663366337, 'number': 57} | {'precision': 0.7357142857142858, 'recall': 0.7304964539007093, 'f1': 0.7330960854092528, 'number': 141} | {'precision': 0.7243589743589743, 'recall': 0.7018633540372671, 'f1': 0.7129337539432177, 'number': 161} | 0.7294 | 0.6908 | 0.7096 | 0.9037 |
+| 0.0151 | 12.0 | 144 | 0.5181 | {'precision': 0.7209302325581395, 'recall': 0.543859649122807, 'f1': 0.6200000000000001, 'number': 57} | {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141} | {'precision': 0.7290322580645161, 'recall': 0.7018633540372671, 'f1': 0.7151898734177216, 'number': 161} | 0.7235 | 0.6852 | 0.7039 | 0.9040 |
+| 0.0122 | 13.0 | 156 | 0.5368 | {'precision': 0.7209302325581395, 'recall': 0.543859649122807, 'f1': 0.6200000000000001, 'number': 57} | {'precision': 0.7394366197183099, 'recall': 0.7446808510638298, 'f1': 0.7420494699646644, 'number': 141} | {'precision': 0.7261146496815286, 'recall': 0.7080745341614907, 'f1': 0.7169811320754716, 'number': 161} | 0.7310 | 0.6964 | 0.7133 | 0.9019 |
+| 0.0114 | 14.0 | 168 | 0.5372 | {'precision': 0.7272727272727273, 'recall': 0.5614035087719298, 'f1': 0.6336633663366337, 'number': 57} | {'precision': 0.7272727272727273, 'recall': 0.7375886524822695, 'f1': 0.7323943661971831, 'number': 141} | {'precision': 0.7197452229299363, 'recall': 0.7018633540372671, 'f1': 0.7106918238993711, 'number': 161} | 0.7238 | 0.6936 | 0.7084 | 0.9022 |
+| 0.0126 | 15.0 | 180 | 0.5379 | {'precision': 0.7209302325581395, 'recall': 0.543859649122807, 'f1': 0.6200000000000001, 'number': 57} | {'precision': 0.7183098591549296, 'recall': 0.723404255319149, 'f1': 0.7208480565371025, 'number': 141} | {'precision': 0.7290322580645161, 'recall': 0.7018633540372671, 'f1': 0.7151898734177216, 'number': 161} | 0.7235 | 0.6852 | 0.7039 | 0.9016 |
 
 
 ### Framework versions
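The README above describes a LayoutLM token-classification checkpoint fine-tuned for FUNSD-style form labeling (the Eader/Nswer/Uestion entries correspond to header, answer, and question fields). As a rough, non-authoritative sketch of how such a checkpoint is typically used with `transformers`: the commit does not say where the fine-tuned weights are published, so the base checkpoint named in the card stands in as the repo id, and the words and bounding boxes below are made-up placeholders.

```python
import torch
from transformers import LayoutLMForTokenClassification, LayoutLMTokenizerFast

# Placeholder repo id: the base checkpoint from the card, used purely for illustration.
repo_id = "pabloma09/layoutlm-funsd"
tokenizer = LayoutLMTokenizerFast.from_pretrained(repo_id)
model = LayoutLMForTokenClassification.from_pretrained(repo_id)
model.eval()

# Dummy OCR output: one word and one bounding box per word, boxes normalized to 0-1000.
words = ["Invoice", "Number:", "12345"]
boxes = [[48, 52, 130, 78], [140, 52, 230, 78], [240, 52, 320, 78]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Expand word-level boxes to token level; special tokens get a zero box.
token_boxes = [boxes[i] if i is not None else [0, 0, 0, 0] for i in encoding.word_ids(0)]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits  # shape: (1, sequence_length, num_labels)
predictions = [model.config.id2label[i] for i in logits.argmax(-1)[0].tolist()]
print(list(zip(tokenizer.convert_ids_to_tokens(encoding["input_ids"][0].tolist()), predictions)))
```

The `bbox` tensor is required because LayoutLM conditions on token layout as well as text; if the weights from this commit are pushed to the Hub, only `repo_id` would need to change.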
logs/events.out.tfevents.1741102575.DESKTOP-HA84SVN.3577307.2 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size 
+oid sha256:ac2d4e3ecc8dc8d39463a9b229e601d3398a26ee30b4119eaec5a2edcd423c00
+size 16190
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:59a36a86148643326fc04e9ec26d92c5596012b296eaaa1dd76395a400055a40
 size 450548984
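Both binary files in this commit are tracked as Git LFS pointers: a `version` line, an `oid sha256:` digest, and a byte size. As a small sketch (the local file path is an assumption), a downloaded artifact can be checked against the digest recorded in its pointer:

```python
import hashlib

def sha256_hex(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file and return its hex SHA-256 digest (the LFS 'oid sha256:' value)."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder path: compare a locally downloaded model.safetensors against the pointer above.
expected = "59a36a86148643326fc04e9ec26d92c5596012b296eaaa1dd76395a400055a40"
assert sha256_hex("model.safetensors") == expected
```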