Commit 50aaadb · verified · 1 Parent(s): 2d8bbe7
ethangclark committed

End of training
README.md CHANGED
@@ -15,14 +15,13 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.5985
- - : {'precision': 0.17391304347826086, 'recall': 0.18181818181818182, 'f1': 0.17777777777777776, 'number': 22}
- - C: {'precision': 0.20408163265306123, 'recall': 0.2857142857142857, 'f1': 0.23809523809523808, 'number': 35}
- - H: {'precision': 0.41935483870967744, 'recall': 0.5, 'f1': 0.45614035087719296, 'number': 26}
- - Overall Precision: 0.2621
- - Overall Recall: 0.3253
- - Overall F1: 0.2903
- - Overall Accuracy: 0.8694
+ - Loss: 0.6328
+ - Answer: {'precision': 0.36627906976744184, 'recall': 0.6847826086956522, 'f1': 0.47727272727272735, 'number': 92}
+ - Header: {'precision': 0.8333333333333334, 'recall': 0.15625, 'f1': 0.2631578947368421, 'number': 32}
+ - Overall Precision: 0.3820
+ - Overall Recall: 0.5484
+ - Overall F1: 0.4503
+ - Overall Accuracy: 0.8540
 
  ## Model description
 
@@ -51,23 +50,23 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | | C | H | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 1.3414 | 1.0 | 2 | 0.9941 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0 | 0.0 | 0.0 | 0.8182 |
- | 0.6808 | 2.0 | 4 | 0.8831 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0 | 0.0 | 0.0 | 0.8182 |
- | 0.5134 | 3.0 | 6 | 0.7517 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0 | 0.0 | 0.0 | 0.8182 |
- | 0.4175 | 4.0 | 8 | 0.6992 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0 | 0.0 | 0.0 | 0.8182 |
- | 0.3048 | 5.0 | 10 | 0.6476 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.13333333333333333, 'recall': 0.05714285714285714, 'f1': 0.08, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0667 | 0.0241 | 0.0354 | 0.8310 |
- | 0.2767 | 6.0 | 12 | 0.6375 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0 | 0.0 | 0.0 | 0.8399 |
- | 0.3514 | 7.0 | 14 | 0.6033 | {'precision': 0.047619047619047616, 'recall': 0.045454545454545456, 'f1': 0.046511627906976744, 'number': 22} | {'precision': 0.047619047619047616, 'recall': 0.02857142857142857, 'f1': 0.03571428571428571, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0476 | 0.0241 | 0.032 | 0.8656 |
- | 0.3766 | 8.0 | 16 | 0.6462 | {'precision': 0.13333333333333333, 'recall': 0.09090909090909091, 'f1': 0.10810810810810811, 'number': 22} | {'precision': 0.06666666666666667, 'recall': 0.02857142857142857, 'f1': 0.04, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.1 | 0.0361 | 0.0531 | 0.8271 |
- | 0.4447 | 9.0 | 18 | 0.6570 | {'precision': 0.06666666666666667, 'recall': 0.045454545454545456, 'f1': 0.05405405405405406, 'number': 22} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 35} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 26} | 0.0333 | 0.0120 | 0.0177 | 0.8182 |
- | 0.2359 | 10.0 | 20 | 0.6297 | {'precision': 0.15, 'recall': 0.13636363636363635, 'f1': 0.14285714285714282, 'number': 22} | {'precision': 0.08333333333333333, 'recall': 0.05714285714285714, 'f1': 0.06779661016949153, 'number': 35} | {'precision': 0.5, 'recall': 0.07692307692307693, 'f1': 0.13333333333333336, 'number': 26} | 0.1458 | 0.0843 | 0.1069 | 0.8438 |
- | 0.2136 | 11.0 | 22 | 0.6072 | {'precision': 0.20833333333333334, 'recall': 0.22727272727272727, 'f1': 0.21739130434782608, 'number': 22} | {'precision': 0.16666666666666666, 'recall': 0.17142857142857143, 'f1': 0.16901408450704225, 'number': 35} | {'precision': 0.42857142857142855, 'recall': 0.23076923076923078, 'f1': 0.3, 'number': 26} | 0.2297 | 0.2048 | 0.2166 | 0.8617 |
- | 0.2114 | 12.0 | 24 | 0.5978 | {'precision': 0.17391304347826086, 'recall': 0.18181818181818182, 'f1': 0.17777777777777776, 'number': 22} | {'precision': 0.1951219512195122, 'recall': 0.22857142857142856, 'f1': 0.21052631578947367, 'number': 35} | {'precision': 0.4090909090909091, 'recall': 0.34615384615384615, 'f1': 0.37500000000000006, 'number': 26} | 0.2442 | 0.2530 | 0.2485 | 0.8656 |
- | 0.1826 | 13.0 | 26 | 0.5982 | {'precision': 0.17391304347826086, 'recall': 0.18181818181818182, 'f1': 0.17777777777777776, 'number': 22} | {'precision': 0.18181818181818182, 'recall': 0.22857142857142856, 'f1': 0.20253164556962025, 'number': 35} | {'precision': 0.4230769230769231, 'recall': 0.4230769230769231, 'f1': 0.4230769230769231, 'number': 26} | 0.2473 | 0.2771 | 0.2614 | 0.8668 |
- | 0.1861 | 14.0 | 28 | 0.5983 | {'precision': 0.17391304347826086, 'recall': 0.18181818181818182, 'f1': 0.17777777777777776, 'number': 22} | {'precision': 0.21739130434782608, 'recall': 0.2857142857142857, 'f1': 0.24691358024691357, 'number': 35} | {'precision': 0.4642857142857143, 'recall': 0.5, 'f1': 0.4814814814814815, 'number': 26} | 0.2784 | 0.3253 | 0.3000 | 0.8707 |
- | 0.2442 | 15.0 | 30 | 0.5985 | {'precision': 0.17391304347826086, 'recall': 0.18181818181818182, 'f1': 0.17777777777777776, 'number': 22} | {'precision': 0.20408163265306123, 'recall': 0.2857142857142857, 'f1': 0.23809523809523808, 'number': 35} | {'precision': 0.41935483870967744, 'recall': 0.5, 'f1': 0.45614035087719296, 'number': 26} | 0.2621 | 0.3253 | 0.2903 | 0.8694 |
+ | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 1.2939 | 1.0 | 2 | 0.9874 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+ | 0.6826 | 2.0 | 4 | 0.8669 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+ | 0.5215 | 3.0 | 6 | 0.7779 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+ | 0.4118 | 4.0 | 8 | 0.7111 | {'precision': 0.45454545454545453, 'recall': 0.10869565217391304, 'f1': 0.1754385964912281, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.4545 | 0.0806 | 0.1370 | 0.8259 |
+ | 0.304 | 5.0 | 10 | 0.7495 | {'precision': 0.3150684931506849, 'recall': 0.5, 'f1': 0.3865546218487395, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3151 | 0.3710 | 0.3407 | 0.8041 |
+ | 0.303 | 6.0 | 12 | 0.6655 | {'precision': 0.4017857142857143, 'recall': 0.4891304347826087, 'f1': 0.4411764705882353, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.4018 | 0.3629 | 0.3814 | 0.8438 |
+ | 0.3767 | 7.0 | 14 | 0.6449 | {'precision': 0.38333333333333336, 'recall': 0.5, 'f1': 0.43396226415094347, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3833 | 0.3710 | 0.3770 | 0.8451 |
+ | 0.4003 | 8.0 | 16 | 0.6512 | {'precision': 0.3425414364640884, 'recall': 0.6739130434782609, 'f1': 0.45421245421245415, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3425 | 0.5 | 0.4066 | 0.8361 |
+ | 0.4865 | 9.0 | 18 | 0.7034 | {'precision': 0.3165137614678899, 'recall': 0.75, 'f1': 0.4451612903225806, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3165 | 0.5565 | 0.4035 | 0.8079 |
+ | 0.268 | 10.0 | 20 | 0.7160 | {'precision': 0.3150684931506849, 'recall': 0.75, 'f1': 0.4437299035369775, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3151 | 0.5565 | 0.4023 | 0.8079 |
+ | 0.311 | 11.0 | 22 | 0.7009 | {'precision': 0.32701421800947866, 'recall': 0.75, 'f1': 0.45544554455445546, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.3270 | 0.5565 | 0.4119 | 0.8169 |
+ | 0.2535 | 12.0 | 24 | 0.6770 | {'precision': 0.3469387755102041, 'recall': 0.7391304347826086, 'f1': 0.4722222222222222, 'number': 92} | {'precision': 0.5, 'recall': 0.125, 'f1': 0.2, 'number': 32} | 0.3529 | 0.5806 | 0.4390 | 0.8284 |
+ | 0.2197 | 13.0 | 26 | 0.6522 | {'precision': 0.35638297872340424, 'recall': 0.7282608695652174, 'f1': 0.4785714285714286, 'number': 92} | {'precision': 0.5, 'recall': 0.09375, 'f1': 0.15789473684210525, 'number': 32} | 0.3608 | 0.5645 | 0.4403 | 0.8387 |
+ | 0.2244 | 14.0 | 28 | 0.6379 | {'precision': 0.3615819209039548, 'recall': 0.6956521739130435, 'f1': 0.4758364312267658, 'number': 92} | {'precision': 0.5714285714285714, 'recall': 0.125, 'f1': 0.20512820512820512, 'number': 32} | 0.3696 | 0.5484 | 0.4416 | 0.8476 |
+ | 0.3203 | 15.0 | 30 | 0.6328 | {'precision': 0.36627906976744184, 'recall': 0.6847826086956522, 'f1': 0.47727272727272735, 'number': 92} | {'precision': 0.8333333333333334, 'recall': 0.15625, 'f1': 0.2631578947368421, 'number': 32} | 0.3820 | 0.5484 | 0.4503 | 0.8540 |
 
 
  ### Framework versions
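The Overall Precision/Recall/F1 figures in the updated card are micro-averages of the two per-entity dicts. As a sanity check (this is a sketch, not the card-generation code; seqeval computes these aggregates internally), they can be reconstructed from the final-epoch Answer/Header metrics, assuming each dict's `number` field is the gold support for that entity type:

```python
# Rebuild the "Overall" metrics from the per-entity dicts in the card.
# Values are copied verbatim from the final evaluation row above.
per_entity = {
    "Answer": {"precision": 0.36627906976744184, "recall": 0.6847826086956522, "number": 92},
    "Header": {"precision": 0.8333333333333334, "recall": 0.15625, "number": 32},
}

tp = predicted = support = 0
for m in per_entity.values():
    true_pos = round(m["recall"] * m["number"])   # recall = TP / support
    tp += true_pos
    predicted += round(true_pos / m["precision"])  # precision = TP / predicted
    support += m["number"]

overall_precision = tp / predicted
overall_recall = tp / support
overall_f1 = 2 * overall_precision * overall_recall / (overall_precision + overall_recall)
print(round(overall_precision, 4), round(overall_recall, 4), round(overall_f1, 4))
# → 0.382 0.5484 0.4503
```

The micro-averaged numbers match the card's 0.3820 / 0.5484 / 0.4503, which confirms the aggregates are entity-count-weighted rather than a plain mean of the per-class scores.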
logs/events.out.tfevents.1711208295.ethanmbp.lan.35239.0 CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:95ab351b11c009f9308f1610812ea12e55bb747af207f9a60dc8e05d907a004f
3
- size 5462
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:ee44b3d5605517613cd8ba884efac0fc4e0280efd094db2ce8f3ef3d4604da11
3
+ size 15638
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:f84592a72b40e311042ce62e3dfaacde8aecad42266828f3cb69ba7ff9a8e035
3
  size 450552060
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:802c07c641cb86ae60fcde023995d6ee1f588ce18def5106a85b6066d873852c
3
  size 450552060
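The `logs/events.out.tfevents.…` and `model.safetensors` hunks above modify Git LFS pointer files, not the binaries themselves: a pointer is just three `key value` lines, which is why each diff touches only an `oid` and a `size`. A minimal sketch of reading those fields back, using the new values from this commit:

```python
# Parse a Git LFS pointer file (the text actually stored in the repo).
# The oid and size below are the post-commit values from the logs diff.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:ee44b3d5605517613cd8ba884efac0fc4e0280efd094db2ce8f3ef3d4604da11
size 15638
"""

# Each line is "<key> <value>"; split on the first space only,
# since the version value itself contains no further structure we need.
fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())
print(fields["size"])  # → 15638 (bytes of the real file stored in LFS)
```

So the ~15 KB growth of the TensorBoard log and the unchanged 450552060-byte model weights can be read straight off the pointer diffs without downloading the binaries.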