Theoreticallyhugo committed
Commit a7d5aef · verified · 1 Parent(s): 2ffa757

Training in progress, epoch 15

This view is limited to 50 files because the commit contains too many changes.
Files changed (50)
  1. README.md +31 -23
  2. meta_data/README_s42_e10.md +93 -0
  3. meta_data/README_s42_e11.md +94 -0
  4. meta_data/README_s42_e12.md +95 -0
  5. meta_data/README_s42_e13.md +96 -0
  6. meta_data/README_s42_e14.md +97 -0
  7. meta_data/README_s42_e15.md +98 -0
  8. meta_data/README_s42_e4.md +19 -19
  9. meta_data/README_s42_e5.md +20 -21
  10. meta_data/README_s42_e6.md +21 -22
  11. meta_data/README_s42_e7.md +22 -23
  12. meta_data/README_s42_e8.md +91 -0
  13. meta_data/README_s42_e9.md +92 -0
  14. meta_data/meta_s42_e10_cvi0.json +1 -0
  15. meta_data/meta_s42_e10_cvi1.json +1 -0
  16. meta_data/meta_s42_e10_cvi2.json +1 -0
  17. meta_data/meta_s42_e10_cvi3.json +1 -0
  18. meta_data/meta_s42_e10_cvi4.json +1 -0
  19. meta_data/meta_s42_e11_cvi0.json +1 -0
  20. meta_data/meta_s42_e11_cvi1.json +1 -0
  21. meta_data/meta_s42_e11_cvi2.json +1 -0
  22. meta_data/meta_s42_e11_cvi3.json +1 -0
  23. meta_data/meta_s42_e11_cvi4.json +1 -0
  24. meta_data/meta_s42_e12_cvi0.json +1 -0
  25. meta_data/meta_s42_e12_cvi1.json +1 -0
  26. meta_data/meta_s42_e12_cvi2.json +1 -0
  27. meta_data/meta_s42_e12_cvi3.json +1 -0
  28. meta_data/meta_s42_e12_cvi4.json +1 -0
  29. meta_data/meta_s42_e13_cvi0.json +1 -0
  30. meta_data/meta_s42_e13_cvi1.json +1 -0
  31. meta_data/meta_s42_e13_cvi2.json +1 -0
  32. meta_data/meta_s42_e13_cvi3.json +1 -0
  33. meta_data/meta_s42_e13_cvi4.json +1 -0
  34. meta_data/meta_s42_e14_cvi0.json +1 -0
  35. meta_data/meta_s42_e14_cvi1.json +1 -0
  36. meta_data/meta_s42_e14_cvi2.json +1 -0
  37. meta_data/meta_s42_e14_cvi3.json +1 -0
  38. meta_data/meta_s42_e14_cvi4.json +1 -0
  39. meta_data/meta_s42_e15_cvi0.json +1 -0
  40. meta_data/meta_s42_e15_cvi1.json +1 -0
  41. meta_data/meta_s42_e15_cvi2.json +1 -0
  42. meta_data/meta_s42_e15_cvi3.json +1 -0
  43. meta_data/meta_s42_e15_cvi4.json +1 -0
  44. meta_data/meta_s42_e16_cvi0.json +1 -0
  45. meta_data/meta_s42_e4_cvi0.json +1 -0
  46. meta_data/meta_s42_e4_cvi1.json +1 -0
  47. meta_data/meta_s42_e4_cvi2.json +1 -0
  48. meta_data/meta_s42_e4_cvi3.json +1 -0
  49. meta_data/meta_s42_e4_cvi4.json +1 -0
  50. meta_data/meta_s42_e5_cvi0.json +1 -0
README.md CHANGED
@@ -17,12 +17,12 @@ model-index:
17
  name: essays_su_g
18
  type: essays_su_g
19
  config: full_labels
20
- split: test
21
  args: full_labels
22
  metrics:
23
  - name: Accuracy
24
  type: accuracy
25
- value: 0.8431688702569063
26
  ---
27
 
28
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,17 +32,17 @@ should probably proofread and complete it, then remove this comment. -->
32
 
33
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
  It achieves the following results on the evaluation set:
35
- - Loss: 0.4438
36
- - B-claim: {'precision': 0.5970149253731343, 'recall': 0.5776173285198556, 'f1-score': 0.5871559633027523, 'support': 277.0}
37
- - B-majorclaim: {'precision': 0.6540880503144654, 'recall': 0.7375886524822695, 'f1-score': 0.6933333333333332, 'support': 141.0}
38
- - B-premise: {'precision': 0.7533039647577092, 'recall': 0.8003120124804992, 'f1-score': 0.7760968229954615, 'support': 641.0}
39
- - I-claim: {'precision': 0.6253114100647733, 'recall': 0.6153468987496935, 'f1-score': 0.6202891387618931, 'support': 4079.0}
40
- - I-majorclaim: {'precision': 0.7570308898109728, 'recall': 0.8045075943165115, 'f1-score': 0.7800475059382421, 'support': 2041.0}
41
- - I-premise: {'precision': 0.8764452113891286, 'recall': 0.8867743343518114, 'f1-score': 0.881579518333695, 'support': 11455.0}
42
- - O: {'precision': 0.9354231280460789, 'recall': 0.9105121293800539, 'f1-score': 0.9227995410588428, 'support': 9275.0}
43
- - Accuracy: 0.8432
44
- - Macro avg: {'precision': 0.7426596542508946, 'recall': 0.7618084214686707, 'f1-score': 0.7516145462463172, 'support': 27909.0}
45
- - Weighted avg: {'precision': 0.8438834099280034, 'recall': 0.8431688702569063, 'f1-score': 0.8433686534034867, 'support': 27909.0}
46
 
47
  ## Model description
48
 
@@ -67,19 +67,27 @@ The following hyperparameters were used during training:
67
  - seed: 42
68
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
  - lr_scheduler_type: linear
70
- - num_epochs: 7
71
 
72
  ### Training results
73
 
74
- | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
- |:-------------:|:-----:|:----:|:---------------:|:-----------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
- | No log | 1.0 | 41 | 0.7886 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.8888888888888888, 'recall': 0.0748829953198128, 'f1-score': 0.1381294964028777, 'support': 641.0} | {'precision': 0.47000821692686934, 'recall': 0.1402304486393724, 'f1-score': 0.21601208459214502, 'support': 4079.0} | {'precision': 0.5424354243542435, 'recall': 0.0720235178833905, 'f1-score': 0.12716262975778547, 'support': 2041.0} | {'precision': 0.7775630122158652, 'recall': 0.8779572239196858, 'f1-score': 0.8247160605190865, 'support': 11455.0} | {'precision': 0.6536142336038115, 'recall': 0.9466307277628032, 'f1-score': 0.7732957548000705, 'support': 9275.0} | 0.7024 | {'precision': 0.4760728251413826, 'recall': 0.3016749876464378, 'f1-score': 0.29704514658170933, 'support': 27909.0} | {'precision': 0.6651369922726568, 'recall': 0.7024257408004586, 'f1-score': 0.6395296795513288, 'support': 27909.0} |
77
- | No log | 2.0 | 82 | 0.5373 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.5765472312703583, 'recall': 0.828393135725429, 'f1-score': 0.679897567221511, 'support': 641.0} | {'precision': 0.5732105732105732, 'recall': 0.47315518509438587, 'f1-score': 0.5183991404781091, 'support': 4079.0} | {'precision': 0.5972927241962775, 'recall': 0.6918177364037237, 'f1-score': 0.6410896708286039, 'support': 2041.0} | {'precision': 0.858668504004823, 'recall': 0.870362287210825, 'f1-score': 0.8644758519032342, 'support': 11455.0} | {'precision': 0.8686365992742353, 'recall': 0.903288409703504, 'f1-score': 0.8856236786469344, 'support': 9275.0} | 0.7962 | {'precision': 0.4963365188508953, 'recall': 0.538145250591124, 'f1-score': 0.5127837012969133, 'support': 27909.0} | {'precision': 0.7818058448922789, 'recall': 0.7961947758787488, 'f1-score': 0.7874004427160499, 'support': 27909.0} |
78
- | No log | 3.0 | 123 | 0.4911 | {'precision': 0.34893617021276596, 'recall': 0.296028880866426, 'f1-score': 0.3203125, 'support': 277.0} | {'precision': 0.8333333333333334, 'recall': 0.03546099290780142, 'f1-score': 0.06802721088435375, 'support': 141.0} | {'precision': 0.6662371134020618, 'recall': 0.8065522620904836, 'f1-score': 0.7297106563161608, 'support': 641.0} | {'precision': 0.6018223234624146, 'recall': 0.3238538857563128, 'f1-score': 0.421102964615875, 'support': 4079.0} | {'precision': 0.7116279069767442, 'recall': 0.7496325330720235, 'f1-score': 0.7301360057265569, 'support': 2041.0} | {'precision': 0.7889374090247453, 'recall': 0.9463116542994325, 'f1-score': 0.8604881921016073, 'support': 11455.0} | {'precision': 0.9330078346769615, 'recall': 0.8859299191374663, 'f1-score': 0.908859639420418, 'support': 9275.0} | 0.8066 | {'precision': 0.6977002987270037, 'recall': 0.5776814468757066, 'f1-score': 0.5769481670092816, 'support': 27909.0} | {'precision': 0.7968542338095115, 'recall': 0.8066215199398044, 'f1-score': 0.7904444769227739, 'support': 27909.0} |
79
- | No log | 4.0 | 164 | 0.4471 | {'precision': 0.5464285714285714, 'recall': 0.5523465703971119, 'f1-score': 0.5493716337522441, 'support': 277.0} | {'precision': 0.6544117647058824, 'recall': 0.6312056737588653, 'f1-score': 0.6425992779783394, 'support': 141.0} | {'precision': 0.7510917030567685, 'recall': 0.8049921996879875, 'f1-score': 0.7771084337349398, 'support': 641.0} | {'precision': 0.6000949893137022, 'recall': 0.619514586908556, 'f1-score': 0.6096501809408926, 'support': 4079.0} | {'precision': 0.7037037037037037, 'recall': 0.8005879470847623, 'f1-score': 0.7490258996103598, 'support': 2041.0} | {'precision': 0.8949800652410294, 'recall': 0.8622435617634221, 'f1-score': 0.8783068783068784, 'support': 11455.0} | {'precision': 0.9216195734545848, 'recall': 0.9178436657681941, 'f1-score': 0.9197277441659464, 'support': 9275.0} | 0.8352 | {'precision': 0.7246186244148918, 'recall': 0.7412477436241284, 'f1-score': 0.7322557212128, 'support': 27909.0} | {'precision': 0.838766973613019, 'recall': 0.835178616216991, 'f1-score': 0.8365729339666597, 'support': 27909.0} |
80
- | No log | 5.0 | 205 | 0.4553 | {'precision': 0.5725490196078431, 'recall': 0.5270758122743683, 'f1-score': 0.5488721804511277, 'support': 277.0} | {'precision': 0.608433734939759, 'recall': 0.7163120567375887, 'f1-score': 0.6579804560260587, 'support': 141.0} | {'precision': 0.7355021216407355, 'recall': 0.8112324492979719, 'f1-score': 0.7715133531157271, 'support': 641.0} | {'precision': 0.5901240035429584, 'recall': 0.6533464084334396, 'f1-score': 0.6201279813845259, 'support': 4079.0} | {'precision': 0.7180370210934137, 'recall': 0.8172464478196962, 'f1-score': 0.7644362969752522, 'support': 2041.0} | {'precision': 0.8847149103239047, 'recall': 0.8655608904408555, 'f1-score': 0.8750330950489806, 'support': 11455.0} | {'precision': 0.9455065827132226, 'recall': 0.8904582210242588, 'f1-score': 0.9171571349250416, 'support': 9275.0} | 0.8339 | {'precision': 0.7221239134088339, 'recall': 0.7544617551468827, 'f1-score': 0.7364457854181019, 'support': 27909.0} | {'precision': 0.8417519193793559, 'recall': 0.8339245404708159, 'f1-score': 0.8369775321954073, 'support': 27909.0} |
81
- | No log | 6.0 | 246 | 0.4431 | {'precision': 0.5860805860805861, 'recall': 0.5776173285198556, 'f1-score': 0.5818181818181819, 'support': 277.0} | {'precision': 0.6503067484662577, 'recall': 0.75177304964539, 'f1-score': 0.6973684210526316, 'support': 141.0} | {'precision': 0.7481804949053857, 'recall': 0.8018720748829953, 'f1-score': 0.7740963855421686, 'support': 641.0} | {'precision': 0.634337807039757, 'recall': 0.614121108114734, 'f1-score': 0.6240657698056801, 'support': 4079.0} | {'precision': 0.7280740414279419, 'recall': 0.809407153356198, 'f1-score': 0.7665893271461718, 'support': 2041.0} | {'precision': 0.8748292349726776, 'recall': 0.8944565691837626, 'f1-score': 0.8845340354815039, 'support': 11455.0} | {'precision': 0.9417344173441734, 'recall': 0.8991913746630728, 'f1-score': 0.9199713198389498, 'support': 9275.0} | 0.8428 | {'precision': 0.7376490471766827, 'recall': 0.7640626654808583, 'f1-score': 0.7497776343836123, 'support': 27909.0} | {'precision': 0.8442738869920543, 'recall': 0.8428463936364614, 'f1-score': 0.8431306326473246, 'support': 27909.0} |
82
- | No log | 7.0 | 287 | 0.4438 | {'precision': 0.5970149253731343, 'recall': 0.5776173285198556, 'f1-score': 0.5871559633027523, 'support': 277.0} | {'precision': 0.6540880503144654, 'recall': 0.7375886524822695, 'f1-score': 0.6933333333333332, 'support': 141.0} | {'precision': 0.7533039647577092, 'recall': 0.8003120124804992, 'f1-score': 0.7760968229954615, 'support': 641.0} | {'precision': 0.6253114100647733, 'recall': 0.6153468987496935, 'f1-score': 0.6202891387618931, 'support': 4079.0} | {'precision': 0.7570308898109728, 'recall': 0.8045075943165115, 'f1-score': 0.7800475059382421, 'support': 2041.0} | {'precision': 0.8764452113891286, 'recall': 0.8867743343518114, 'f1-score': 0.881579518333695, 'support': 11455.0} | {'precision': 0.9354231280460789, 'recall': 0.9105121293800539, 'f1-score': 0.9227995410588428, 'support': 9275.0} | 0.8432 | {'precision': 0.7426596542508946, 'recall': 0.7618084214686707, 'f1-score': 0.7516145462463172, 'support': 27909.0} | {'precision': 0.8438834099280034, 'recall': 0.8431688702569063, 'f1-score': 0.8433686534034867, 'support': 27909.0} |
83
 
84
 
85
  ### Framework versions
 
17
  name: essays_su_g
18
  type: essays_su_g
19
  config: full_labels
20
+ split: train[80%:100%]
21
  args: full_labels
22
  metrics:
23
  - name: Accuracy
24
  type: accuracy
25
+ value: 0.8435497302581556
26
  ---
27
 
28
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 
32
 
33
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
  It achieves the following results on the evaluation set:
35
+ - Loss: 0.6473
36
+ - B-claim: {'precision': 0.5985130111524164, 'recall': 0.5940959409594095, 'f1-score': 0.5962962962962963, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.6707317073170732, 'recall': 0.7913669064748201, 'f1-score': 0.7260726072607261, 'support': 139.0}
38
+ - B-premise: {'precision': 0.7566765578635015, 'recall': 0.8056872037914692, 'f1-score': 0.7804131599081868, 'support': 633.0}
39
+ - I-claim: {'precision': 0.6212043232115285, 'recall': 0.6033491627093227, 'f1-score': 0.6121465703055661, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.7810361681329423, 'recall': 0.793840039741679, 'f1-score': 0.7873860556787385, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8689666893269884, 'recall': 0.9020818630910374, 'f1-score': 0.8852146814404431, 'support': 11336.0}
42
+ - O: {'precision': 0.9395142986836132, 'recall': 0.8973553002384566, 'f1-score': 0.9179509923494844, 'support': 9226.0}
43
+ - Accuracy: 0.8435
44
+ - Macro avg: {'precision': 0.748091822241152, 'recall': 0.7696823452865992, 'f1-score': 0.7579257661770632, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8440071909900311, 'recall': 0.8435497302581556, 'f1-score': 0.8434243803550634, 'support': 27619.0}
46
 
47
  ## Model description
48
 
 
67
  - seed: 42
68
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
  - lr_scheduler_type: linear
70
+ - num_epochs: 15
71
 
72
  ### Training results
73
 
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.7008 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.7865168539325843, 'recall': 0.11058451816745656, 'f1-score': 0.19390581717451524, 'support': 633.0} | {'precision': 0.40153452685422, 'recall': 0.23544113971507125, 'f1-score': 0.29683314951945805, 'support': 4001.0} | {'precision': 0.5759865659109992, 'recall': 0.34078489816194735, 'f1-score': 0.4282147315855181, 'support': 2013.0} | {'precision': 0.7355271176112127, 'recall': 0.9582745236414961, 'f1-score': 0.8322543574027964, 'support': 11336.0} | {'precision': 0.8612315698178664, 'recall': 0.8610448731844786, 'f1-score': 0.8611382113821139, 'support': 9226.0} | 0.7424 | {'precision': 0.4801138048752689, 'recall': 0.35801856469577853, 'f1-score': 0.3731923238663431, 'support': 27619.0} | {'precision': 0.7077563864021956, 'recall': 0.7424236938339549, 'f1-score': 0.707906318183495, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5146 | {'precision': 0.19047619047619047, 'recall': 0.014760147601476014, 'f1-score': 0.0273972602739726, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.588774341351661, 'recall': 0.8120063191153238, 'f1-score': 0.6826029216467464, 'support': 633.0} | {'precision': 0.5761194029850746, 'recall': 0.3859035241189703, 'f1-score': 0.4622062565484209, 'support': 4001.0} | {'precision': 0.6959531416400426, 'recall': 0.6492796820665673, 'f1-score': 0.6718067334875354, 'support': 2013.0} | {'precision': 0.8084617153368591, 'recall': 0.9304869442484122, 'f1-score': 0.865192962309806, 'support': 11336.0} | {'precision': 0.8987938596491228, 'recall': 0.8884673748103187, 'f1-score': 0.8936007849122425, 'support': 9226.0} | 0.8007 | {'precision': 0.5369398073484215, 'recall': 0.5258434274230097, 'f1-score': 0.5146867027398178, 'support': 27619.0} | {'precision': 0.7816110201434078, 'recall': 0.8006806908287772, 'f1-score': 0.7854496816047499, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4449 | {'precision': 0.4583333333333333, 'recall': 0.5276752767527675, 'f1-score': 0.49056603773584906, 'support': 271.0} | {'precision': 0.6956521739130435, 'recall': 0.2302158273381295, 'f1-score': 0.34594594594594597, 'support': 139.0} | {'precision': 0.7202898550724638, 'recall': 0.7851500789889415, 'f1-score': 0.7513227513227514, 'support': 633.0} | {'precision': 0.586923245134485, 'recall': 0.6708322919270182, 'f1-score': 0.6260788430137625, 'support': 4001.0} | {'precision': 0.7267942583732058, 'recall': 0.754595131644312, 'f1-score': 0.7404338289056787, 'support': 2013.0} | {'precision': 0.8974713916574382, 'recall': 0.8578863796753705, 'f1-score': 0.8772325455529497, 'support': 11336.0} | {'precision': 0.9236111111111112, 'recall': 0.9081942336874052, 'f1-score': 0.9158377964804897, 'support': 9226.0} | 0.8320 | {'precision': 0.7155821955135829, 'recall': 0.6763641742877063, 'f1-score': 0.6782025355653467, 'support': 27619.0} | {'precision': 0.8393908547230632, 'recall': 0.8319997103443282, 'f1-score': 0.8344212165358135, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4465 | {'precision': 0.5707762557077626, 'recall': 0.4612546125461255, 'f1-score': 0.5102040816326531, 'support': 271.0} | {'precision': 0.6581196581196581, 'recall': 0.5539568345323741, 'f1-score': 0.6015625, 'support': 139.0} | {'precision': 0.7303851640513552, 'recall': 0.8088467614533965, 'f1-score': 0.767616191904048, 'support': 633.0} | {'precision': 0.6494704475572258, 'recall': 0.47513121719570106, 'f1-score': 0.5487875288683602, 'support': 4001.0} | {'precision': 0.7996837111228255, 'recall': 0.7536015896671634, 'f1-score': 0.7759590792838875, 'support': 2013.0} | {'precision': 0.8320863196030558, 'recall': 0.9319865913902611, 'f1-score': 0.8792077560021636, 'support': 11336.0} | {'precision': 0.928153625427657, 'recall': 0.9115543030565793, 'f1-score': 0.9197790780335757, 'support': 9226.0} | 0.8366 | {'precision': 0.7383821687985057, 'recall': 0.6994759871202287, 'f1-score': 0.7147308879606697, 'support': 27619.0} | {'precision': 0.8295906167856352, 'recall': 0.8366342010934502, 'f1-score': 0.8297935829927507, 'support': 27619.0} |
80
+ | No log | 5.0 | 205 | 0.4358 | {'precision': 0.6, 'recall': 0.6088560885608856, 'f1-score': 0.6043956043956044, 'support': 271.0} | {'precision': 0.6374269005847953, 'recall': 0.7841726618705036, 'f1-score': 0.703225806451613, 'support': 139.0} | {'precision': 0.7808, 'recall': 0.7709320695102686, 'f1-score': 0.7758346581875996, 'support': 633.0} | {'precision': 0.6116646415552855, 'recall': 0.6290927268182954, 'f1-score': 0.6202562838836866, 'support': 4001.0} | {'precision': 0.7328410078192876, 'recall': 0.8380526577247889, 'f1-score': 0.7819235225955967, 'support': 2013.0} | {'precision': 0.8857651245551601, 'recall': 0.8782639378969654, 'f1-score': 0.8819985825655563, 'support': 11336.0} | {'precision': 0.9360026993589022, 'recall': 0.9020160416215044, 'f1-score': 0.9186951482033449, 'support': 9226.0} | 0.8416 | {'precision': 0.7406429105533473, 'recall': 0.7730551691433158, 'f1-score': 0.7551899437547146, 'support': 27619.0} | {'precision': 0.8452341603615893, 'recall': 0.8415945544733697, 'f1-score': 0.8429891649448389, 'support': 27619.0} |
81
+ | No log | 6.0 | 246 | 0.4656 | {'precision': 0.5668789808917197, 'recall': 0.6568265682656826, 'f1-score': 0.6085470085470085, 'support': 271.0} | {'precision': 0.6455696202531646, 'recall': 0.7338129496402878, 'f1-score': 0.6868686868686869, 'support': 139.0} | {'precision': 0.7620528771384136, 'recall': 0.7740916271721959, 'f1-score': 0.768025078369906, 'support': 633.0} | {'precision': 0.6074982642906734, 'recall': 0.6560859785053736, 'f1-score': 0.6308579668348955, 'support': 4001.0} | {'precision': 0.7940131912734653, 'recall': 0.7774465971187282, 'f1-score': 0.7856425702811245, 'support': 2013.0} | {'precision': 0.8894866417434121, 'recall': 0.8605328158080452, 'f1-score': 0.8747702102856119, 'support': 11336.0} | {'precision': 0.920605732828556, 'recall': 0.9225016258400174, 'f1-score': 0.9215527042390774, 'support': 9226.0} | 0.8409 | {'precision': 0.7408721869170579, 'recall': 0.7687568803357615, 'f1-score': 0.7537520322037585, 'support': 27619.0} | {'precision': 0.844759622854032, 'recall': 0.8409428292117745, 'f1-score': 0.8425631787461125, 'support': 27619.0} |
82
+ | No log | 7.0 | 287 | 0.4964 | {'precision': 0.5975103734439834, 'recall': 0.5313653136531366, 'f1-score': 0.5625, 'support': 271.0} | {'precision': 0.6805555555555556, 'recall': 0.7050359712230215, 'f1-score': 0.6925795053003534, 'support': 139.0} | {'precision': 0.7341772151898734, 'recall': 0.8246445497630331, 'f1-score': 0.7767857142857142, 'support': 633.0} | {'precision': 0.6605504587155964, 'recall': 0.5578605348662834, 'f1-score': 0.6048780487804877, 'support': 4001.0} | {'precision': 0.8427997705106138, 'recall': 0.7297565822155986, 'f1-score': 0.7822151224707135, 'support': 2013.0} | {'precision': 0.8541922793213671, 'recall': 0.9193719124911786, 'f1-score': 0.8855843990313124, 'support': 11336.0} | {'precision': 0.9198913043478261, 'recall': 0.917298937784522, 'f1-score': 0.9185932920872679, 'support': 9226.0} | 0.8454 | {'precision': 0.7556681367264021, 'recall': 0.7407619717138249, 'f1-score': 0.746162297422264, 'support': 27619.0} | {'precision': 0.8411135771135726, 'recall': 0.8454324921249864, 'f1-score': 0.841777543839385, 'support': 27619.0} |
83
+ | No log | 8.0 | 328 | 0.5402 | {'precision': 0.5895522388059702, 'recall': 0.5830258302583026, 'f1-score': 0.5862708719851578, 'support': 271.0} | {'precision': 0.6242774566473989, 'recall': 0.7769784172661871, 'f1-score': 0.6923076923076924, 'support': 139.0} | {'precision': 0.7537091988130564, 'recall': 0.8025276461295419, 'f1-score': 0.7773527161438408, 'support': 633.0} | {'precision': 0.6078381795195954, 'recall': 0.6008497875531117, 'f1-score': 0.604323780794369, 'support': 4001.0} | {'precision': 0.7680074836295603, 'recall': 0.8156979632389468, 'f1-score': 0.7911346663454588, 'support': 2013.0} | {'precision': 0.8649226297341198, 'recall': 0.8924664784756527, 'f1-score': 0.8784787044675031, 'support': 11336.0} | {'precision': 0.9406701859077347, 'recall': 0.8884673748103187, 'f1-score': 0.9138238573021181, 'support': 9226.0} | 0.8376 | {'precision': 0.7355681961510622, 'recall': 0.7657162139617232, 'f1-score': 0.74909889847802, 'support': 27619.0} | {'precision': 0.8394578671455889, 'recall': 0.8376117889858431, 'f1-score': 0.8380825329114897, 'support': 27619.0} |
84
+ | No log | 9.0 | 369 | 0.5573 | {'precision': 0.5833333333333334, 'recall': 0.5682656826568265, 'f1-score': 0.5757009345794393, 'support': 271.0} | {'precision': 0.6604938271604939, 'recall': 0.7697841726618705, 'f1-score': 0.7109634551495018, 'support': 139.0} | {'precision': 0.7518248175182481, 'recall': 0.8135860979462876, 'f1-score': 0.7814871016691957, 'support': 633.0} | {'precision': 0.617191404297851, 'recall': 0.6173456635841039, 'f1-score': 0.6172685243033862, 'support': 4001.0} | {'precision': 0.7709631049353138, 'recall': 0.799304520615996, 'f1-score': 0.7848780487804878, 'support': 2013.0} | {'precision': 0.8666893096713933, 'recall': 0.9004057868736768, 'f1-score': 0.8832258901916671, 'support': 11336.0} | {'precision': 0.9458458690118028, 'recall': 0.8859744201170605, 'f1-score': 0.9149317215133199, 'support': 9226.0} | 0.8413 | {'precision': 0.7423345237040623, 'recall': 0.7649523349222601, 'f1-score': 0.752636525169571, 'support': 27619.0} | {'precision': 0.8435603253400192, 'recall': 0.8413048988015497, 'f1-score': 0.841905204414389, 'support': 27619.0} |
85
+ | No log | 10.0 | 410 | 0.6014 | {'precision': 0.5836298932384342, 'recall': 0.6051660516605166, 'f1-score': 0.5942028985507246, 'support': 271.0} | {'precision': 0.6457142857142857, 'recall': 0.8129496402877698, 'f1-score': 0.7197452229299364, 'support': 139.0} | {'precision': 0.7774294670846394, 'recall': 0.7835703001579779, 'f1-score': 0.7804878048780488, 'support': 633.0} | {'precision': 0.619513418610484, 'recall': 0.6173456635841039, 'f1-score': 0.6184276414621932, 'support': 4001.0} | {'precision': 0.741311042674879, 'recall': 0.8370591157476404, 'f1-score': 0.7862809146056929, 'support': 2013.0} | {'precision': 0.8770470496490772, 'recall': 0.8929075511644319, 'f1-score': 0.884906237705993, 'support': 11336.0} | {'precision': 0.937757909215956, 'recall': 0.8867331454584869, 'f1-score': 0.9115320334261839, 'support': 9226.0} | 0.8411 | {'precision': 0.7403432951696793, 'recall': 0.7765330668658467, 'f1-score': 0.7565118219369674, 'support': 27619.0} | {'precision': 0.8438003903638766, 'recall': 0.8411238640066621, 'f1-score': 0.8419322378651984, 'support': 27619.0} |
86
+ | No log | 11.0 | 451 | 0.5827 | {'precision': 0.5910652920962199, 'recall': 0.6346863468634686, 'f1-score': 0.6120996441281139, 'support': 271.0} | {'precision': 0.65625, 'recall': 0.7553956834532374, 'f1-score': 0.7023411371237458, 'support': 139.0} | {'precision': 0.7770897832817337, 'recall': 0.7930489731437599, 'f1-score': 0.7849882720875684, 'support': 633.0} | {'precision': 0.6108468125594672, 'recall': 0.6418395401149712, 'f1-score': 0.6259597806215723, 'support': 4001.0} | {'precision': 0.7910066428206438, 'recall': 0.7690014903129657, 'f1-score': 0.7798488664987405, 'support': 2013.0} | {'precision': 0.8831168831168831, 'recall': 0.8817925194071983, 'f1-score': 0.8824542043698962, 'support': 11336.0} | {'precision': 0.9239106392391064, 'recall': 0.905484500325168, 'f1-score': 0.9146047733742062, 'support': 9226.0} | 0.8416 | {'precision': 0.7476122933020077, 'recall': 0.768749864802967, 'f1-score': 0.7574709540291205, 'support': 27619.0} | {'precision': 0.8441508487148985, 'recall': 0.8416307614323473, 'f1-score': 0.8427657535850971, 'support': 27619.0} |
87
+ | No log | 12.0 | 492 | 0.6254 | {'precision': 0.5899280575539568, 'recall': 0.6051660516605166, 'f1-score': 0.5974499089253188, 'support': 271.0} | {'precision': 0.6491228070175439, 'recall': 0.7985611510791367, 'f1-score': 0.7161290322580646, 'support': 139.0} | {'precision': 0.7659574468085106, 'recall': 0.7962085308056872, 'f1-score': 0.7807900852052672, 'support': 633.0} | {'precision': 0.6134020618556701, 'recall': 0.6245938515371158, 'f1-score': 0.6189473684210526, 'support': 4001.0} | {'precision': 0.7651515151515151, 'recall': 0.8027819175360159, 'f1-score': 0.7835151515151515, 'support': 2013.0} | {'precision': 0.8724084312370421, 'recall': 0.890878616796048, 'f1-score': 0.8815467877094972, 'support': 11336.0} | {'precision': 0.9412571428571429, 'recall': 0.8926945588554086, 'f1-score': 0.9163328882955052, 'support': 9226.0} | 0.8411 | {'precision': 0.7424610660687689, 'recall': 0.7729835254671328, 'f1-score': 0.7563873174756939, 'support': 27619.0} | {'precision': 0.8437337218432961, 'recall': 0.8410514500887071, 'f1-score': 0.842051378351113, 'support': 27619.0} |
88
+ | 0.325 | 13.0 | 533 | 0.6340 | {'precision': 0.599290780141844, 'recall': 0.6236162361623616, 'f1-score': 0.6112115732368897, 'support': 271.0} | {'precision': 0.6832298136645962, 'recall': 0.7913669064748201, 'f1-score': 0.7333333333333333, 'support': 139.0} | {'precision': 0.7525622254758418, 'recall': 0.8120063191153238, 'f1-score': 0.7811550151975684, 'support': 633.0} | {'precision': 0.6177615571776156, 'recall': 0.6345913521619595, 'f1-score': 0.626063370731106, 'support': 4001.0} | {'precision': 0.7841762643965949, 'recall': 0.7779433681073026, 'f1-score': 0.7810473815461346, 'support': 2013.0} | {'precision': 0.8783854621701904, 'recall': 0.886908962597036, 'f1-score': 0.8826266350627688, 'support': 11336.0} | {'precision': 0.9332214765100671, 'recall': 0.9042922176457836, 'f1-score': 0.9185291203346911, 'support': 9226.0} | 0.8434 | {'precision': 0.7498039399338216, 'recall': 0.7758179088949412, 'f1-score': 0.7619952042060703, 'support': 27619.0} | {'precision': 0.8454773303227912, 'recall': 0.8434411093812231, 'f1-score': 0.84430920449428, 'support': 27619.0} |
89
+ | 0.325 | 14.0 | 574 | 0.6438 | {'precision': 0.6007462686567164, 'recall': 0.5940959409594095, 'f1-score': 0.5974025974025974, 'support': 271.0} | {'precision': 0.6607142857142857, 'recall': 0.7985611510791367, 'f1-score': 0.7231270358306189, 'support': 139.0} | {'precision': 0.7559523809523809, 'recall': 0.8025276461295419, 'f1-score': 0.778544061302682, 'support': 633.0} | {'precision': 0.6356897008207573, 'recall': 0.6000999750062485, 'f1-score': 0.6173823605039856, 'support': 4001.0} | {'precision': 0.7802303262955854, 'recall': 0.8077496274217586, 'f1-score': 0.7937515255064681, 'support': 2013.0} | {'precision': 0.8701530612244898, 'recall': 0.9026993648553282, 'f1-score': 0.8861274679598199, 'support': 11336.0} | {'precision': 0.9353205849268842, 'recall': 0.901257316280078, 'f1-score': 0.9179730624862, 'support': 9226.0} | 0.8456 | {'precision': 0.7484009440844428, 'recall': 0.772427288818786, 'f1-score': 0.7591868729989103, 'support': 27619.0} | {'precision': 0.8450878141879223, 'recall': 0.845613526919874, 'f1-score': 0.8449820141638845, 'support': 27619.0} |
90
+ | 0.325 | 15.0 | 615 | 0.6473 | {'precision': 0.5985130111524164, 'recall': 0.5940959409594095, 'f1-score': 0.5962962962962963, 'support': 271.0} | {'precision': 0.6707317073170732, 'recall': 0.7913669064748201, 'f1-score': 0.7260726072607261, 'support': 139.0} | {'precision': 0.7566765578635015, 'recall': 0.8056872037914692, 'f1-score': 0.7804131599081868, 'support': 633.0} | {'precision': 0.6212043232115285, 'recall': 0.6033491627093227, 'f1-score': 0.6121465703055661, 'support': 4001.0} | {'precision': 0.7810361681329423, 'recall': 0.793840039741679, 'f1-score': 0.7873860556787385, 'support': 2013.0} | {'precision': 0.8689666893269884, 'recall': 0.9020818630910374, 'f1-score': 0.8852146814404431, 'support': 11336.0} | {'precision': 0.9395142986836132, 'recall': 0.8973553002384566, 'f1-score': 0.9179509923494844, 'support': 9226.0} | 0.8435 | {'precision': 0.748091822241152, 'recall': 0.7696823452865992, 'f1-score': 0.7579257661770632, 'support': 27619.0} | {'precision': 0.8440071909900311, 'recall': 0.8435497302581556, 'f1-score': 0.8434243803550634, 'support': 27619.0} |
91
 
92
 
93
  ### Framework versions
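The hyperparameter change in this commit (num_epochs raised from 7 to 15, everything else unchanged) maps directly onto a standard Hugging Face `Trainer` configuration. The following is a minimal sketch of that mapping, not the repository's actual training script; the label list and `output_dir` are assumptions inferred from the metric names and model name in the card.

```python
# Minimal sketch, not the repository's training script: the hyperparameters
# from the model card expressed as a Hugging Face Trainer configuration.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
    set_seed,
)

set_seed(42)  # "seed: 42" in the card

# Label inventory inferred from the reported per-class metrics (assumption).
labels = ["O", "B-claim", "I-claim", "B-majorclaim", "I-majorclaim", "B-premise", "I-premise"]

tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
model = AutoModelForTokenClassification.from_pretrained(
    "allenai/longformer-base-4096", num_labels=len(labels)
)

args = TrainingArguments(
    output_dir="longformer-full_labels",   # hypothetical path
    learning_rate=2e-5,                    # learning_rate: 2e-05
    per_device_train_batch_size=8,         # train_batch_size: 8
    per_device_eval_batch_size=8,          # eval_batch_size: 8
    num_train_epochs=15,                   # num_epochs: 15 (raised from 7 in this commit)
    lr_scheduler_type="linear",            # lr_scheduler_type: linear
    adam_beta1=0.9,                        # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                     # and epsilon=1e-08
    seed=42,
)

# train_dataset / eval_dataset would hold the tokenized essays_su_g splits;
# they are omitted because this is only a configuration sketch.
# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```

All of the argument names above are valid `TrainingArguments` fields in Transformers 4.37.2, the version listed under Framework versions.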
meta_data/README_s42_e10.md ADDED
@@ -0,0 +1,93 @@
1
+ ---
2
+ license: apache-2.0
3
+ base_model: allenai/longformer-base-4096
4
+ tags:
5
+ - generated_from_trainer
6
+ datasets:
7
+ - essays_su_g
8
+ metrics:
9
+ - accuracy
10
+ model-index:
11
+ - name: longformer-full_labels
12
+ results:
13
+ - task:
14
+ name: Token Classification
15
+ type: token-classification
16
+ dataset:
17
+ name: essays_su_g
18
+ type: essays_su_g
19
+ config: full_labels
20
+ split: train[80%:100%]
21
+ args: full_labels
22
+ metrics:
23
+ - name: Accuracy
24
+ type: accuracy
25
+ value: 0.8502842246279735
26
+ ---
27
+
28
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
29
+ should probably proofread and complete it, then remove this comment. -->
30
+
31
+ # longformer-full_labels
32
+
33
+ This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
+ It achieves the following results on the evaluation set:
35
+ - Loss: 0.5094
36
+ - B-claim: {'precision': 0.5934065934065934, 'recall': 0.5977859778597786, 'f1-score': 0.5955882352941178, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.6190476190476191, 'recall': 0.7482014388489209, 'f1-score': 0.6775244299674268, 'support': 139.0}
38
+ - B-premise: {'precision': 0.7654135338345864, 'recall': 0.8041074249605056, 'f1-score': 0.7842835130970724, 'support': 633.0}
39
+ - I-claim: {'precision': 0.6421972534332084, 'recall': 0.6428392901774557, 'f1-score': 0.6425181114164377, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.7737752161383286, 'recall': 0.8002980625931445, 'f1-score': 0.7868131868131869, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8804516462678849, 'recall': 0.9011115031757233, 'f1-score': 0.8906617839393146, 'support': 11336.0}
42
+ - O: {'precision': 0.9418631006346329, 'recall': 0.9008237589421201, 'f1-score': 0.9208864265927977, 'support': 9226.0}
43
+ - Accuracy: 0.8503
44
+ - Macro avg: {'precision': 0.7451649946804076, 'recall': 0.7707382080796641, 'f1-score': 0.7568965267314792, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8519076404793325, 'recall': 0.8502842246279735, 'f1-score': 0.8508360851093074, 'support': 27619.0}
46
+
47
+ ## Model description
48
+
49
+ More information needed
50
+
51
+ ## Intended uses & limitations
52
+
53
+ More information needed
54
+
55
+ ## Training and evaluation data
56
+
57
+ More information needed
58
+
59
+ ## Training procedure
60
+
61
+ ### Training hyperparameters
62
+
63
+ The following hyperparameters were used during training:
64
+ - learning_rate: 2e-05
65
+ - train_batch_size: 8
66
+ - eval_batch_size: 8
67
+ - seed: 42
68
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
+ - lr_scheduler_type: linear
70
+ - num_epochs: 10
71
+
72
+ ### Training results
73
+
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.6678 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.7840909090909091, 'recall': 0.10900473933649289, 'f1-score': 0.19140083217753123, 'support': 633.0} | {'precision': 0.4822983168891468, 'recall': 0.20769807548112973, 'f1-score': 0.29035639412997905, 'support': 4001.0} | {'precision': 0.5612813370473537, 'recall': 0.40039741679085944, 'f1-score': 0.4673818498115396, 'support': 2013.0} | {'precision': 0.7512732854252424, 'recall': 0.9498941425546931, 'f1-score': 0.8389886633682652, 'support': 11336.0} | {'precision': 0.8219942225321247, 'recall': 0.8944287882072404, 'f1-score': 0.8566831040747468, 'support': 9226.0} | 0.7504 | {'precision': 0.4858482958549681, 'recall': 0.36591759462434503, 'f1-score': 0.377830120508866, 'support': 27619.0} | {'precision': 0.7116846049265461, 'recall': 0.7504254317679858, 'f1-score': 0.711041172000772, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5097 | {'precision': 0.30303030303030304, 'recall': 0.03690036900369004, 'f1-score': 0.06578947368421052, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.5386138613861386, 'recall': 0.8593996840442338, 'f1-score': 0.6622032866707243, 'support': 633.0} | {'precision': 0.6033469018543646, 'recall': 0.33341664583854036, 'f1-score': 0.42949130714745654, 'support': 4001.0} | {'precision': 0.6347753743760399, 'recall': 0.7580725285643318, 'f1-score': 0.6909667194928685, 'support': 2013.0} | {'precision': 0.7955326460481099, 'recall': 0.9393966125617502, 'f1-score': 0.8614998786505945, 'support': 11336.0} | {'precision': 0.9321282798833819, 'recall': 0.8663559505744635, 'f1-score': 0.8980394359867422, 'support': 9226.0} | 0.7986 | {'precision': 0.5439181952254768, 'recall': 0.54193454151243, 'f1-score': 0.5154271573760851, 'support': 27619.0} | {'precision': 0.7868797260987861, 'recall': 0.7985806872080814, 'f1-score': 0.7819830122330254, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4526 | {'precision': 0.411522633744856, 'recall': 0.36900369003690037, 'f1-score': 0.38910505836575876, 'support': 271.0} | {'precision': 0.5555555555555556, 'recall': 0.03597122302158273, 'f1-score': 0.06756756756756756, 'support': 139.0} | {'precision': 0.6723016905071522, 'recall': 0.8167456556082149, 'f1-score': 0.7375178316690442, 'support': 633.0} | {'precision': 0.6122853368560106, 'recall': 0.4633841539615096, 'f1-score': 0.5275288092189502, 'support': 4001.0} | {'precision': 0.7304043934098852, 'recall': 0.726775956284153, 'f1-score': 0.728585657370518, 'support': 2013.0} | {'precision': 0.8314276639669684, 'recall': 0.9236944248412138, 'f1-score': 0.8751358127872962, 'support': 11336.0} | {'precision': 0.9284520227348713, 'recall': 0.9029915456319099, 'f1-score': 0.9155448101544041, 'support': 9226.0} | 0.8234 | {'precision': 0.677421328110757, 'recall': 0.6055095213407834, 'f1-score': 0.6058550781619341, 'support': 27619.0} | {'precision': 0.8155737667270567, 'recall': 0.8233824541076795, 'f1-score': 0.815609900299385, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4381 | {'precision': 0.5532786885245902, 'recall': 0.4981549815498155, 'f1-score': 0.5242718446601943, 'support': 271.0} | {'precision': 0.7291666666666666, 'recall': 0.5035971223021583, 'f1-score': 0.5957446808510638, 'support': 139.0} | {'precision': 0.7303370786516854, 'recall': 0.8214849921011058, 'f1-score': 0.7732342007434944, 'support': 633.0} | {'precision': 0.6406153846153846, 'recall': 0.5203699075231192, 'f1-score': 0.5742656185353744, 'support': 4001.0} | {'precision': 0.7463391591875296, 'recall': 0.7848981619473423, 'f1-score': 0.7651331719128329, 'support': 2013.0} | {'precision': 0.8485735186539868, 'recall': 0.9209597741707833, 'f1-score': 0.8832860950124793, 'support': 11336.0} | {'precision': 0.9354838709677419, 'recall': 0.9021244309559939, 'f1-score': 0.9185013518733102, 'support': 9226.0} | 0.8382 | {'precision': 0.7405420524667978, 'recall': 0.7073699100786169, 'f1-score': 0.7192052805126785, 'support': 27619.0} | {'precision': 0.8338202883646758, 'recall': 0.8381911003294833, 'f1-score': 0.8341800170128185, 'support': 27619.0} |
80
+ | No log | 5.0 | 205 | 0.4607 | {'precision': 0.565068493150685, 'recall': 0.6088560885608856, 'f1-score': 0.5861456483126111, 'support': 271.0} | {'precision': 0.6335403726708074, 'recall': 0.7338129496402878, 'f1-score': 0.68, 'support': 139.0} | {'precision': 0.7776, 'recall': 0.7677725118483413, 'f1-score': 0.7726550079491257, 'support': 633.0} | {'precision': 0.5777777777777777, 'recall': 0.6693326668332916, 'f1-score': 0.6201945345067159, 'support': 4001.0} | {'precision': 0.6935010482180294, 'recall': 0.821659215101838, 'f1-score': 0.7521600727603456, 'support': 2013.0} | {'precision': 0.9059289282684513, 'recall': 0.8478299223712068, 'f1-score': 0.8759170653907495, 'support': 11336.0} | {'precision': 0.9350314183123878, 'recall': 0.9032083243008888, 'f1-score': 0.9188444150402469, 'support': 9226.0} | 0.8338 | {'precision': 0.7269211483425913, 'recall': 0.764638811236677, 'f1-score': 0.7437023919942564, 'support': 27619.0} | {'precision': 0.8449738646800432, 'recall': 0.833810058293204, 'f1-score': 0.8379958389580837, 'support': 27619.0} |
81
+ | No log | 6.0 | 246 | 0.4511 | {'precision': 0.5901060070671378, 'recall': 0.6162361623616236, 'f1-score': 0.6028880866425992, 'support': 271.0} | {'precision': 0.6558441558441559, 'recall': 0.7266187050359713, 'f1-score': 0.689419795221843, 'support': 139.0} | {'precision': 0.7591463414634146, 'recall': 0.7867298578199052, 'f1-score': 0.7726920093095424, 'support': 633.0} | {'precision': 0.6242774566473989, 'recall': 0.6748312921769558, 'f1-score': 0.6485707422531829, 'support': 4001.0} | {'precision': 0.7863931965982992, 'recall': 0.7809239940387481, 'f1-score': 0.7836490528414757, 'support': 2013.0} | {'precision': 0.8978557857595223, 'recall': 0.8754410726887791, 'f1-score': 0.8865067667157979, 'support': 11336.0} | {'precision': 0.9268772543447371, 'recall': 0.9191415564708433, 'f1-score': 0.9229931972789116, 'support': 9226.0} | 0.8488 | {'precision': 0.748642885389238, 'recall': 0.7685603772275467, 'f1-score': 0.758102807180479, 'support': 27619.0} | {'precision': 0.8523779660551425, 'recall': 0.8487635323509178, 'f1-score': 0.8503464677801268, 'support': 27619.0} |
82
+ | No log | 7.0 | 287 | 0.4778 | {'precision': 0.610236220472441, 'recall': 0.5719557195571956, 'f1-score': 0.5904761904761905, 'support': 271.0} | {'precision': 0.64375, 'recall': 0.7410071942446043, 'f1-score': 0.6889632107023411, 'support': 139.0} | {'precision': 0.7492753623188406, 'recall': 0.8167456556082149, 'f1-score': 0.7815570672713531, 'support': 633.0} | {'precision': 0.6362002567394095, 'recall': 0.6193451637090728, 'f1-score': 0.6276595744680851, 'support': 4001.0} | {'precision': 0.8094989561586639, 'recall': 0.7704918032786885, 'f1-score': 0.7895138712140493, 'support': 2013.0} | {'precision': 0.869451476793249, 'recall': 0.9088743824982357, 'f1-score': 0.8887259553178642, 'support': 11336.0} | {'precision': 0.9413824260221368, 'recall': 0.9034251029698678, 'f1-score': 0.9220132743362831, 'support': 9226.0} | 0.8488 | {'precision': 0.7513992426435344, 'recall': 0.7616921459808399, 'f1-score': 0.7555584491123095, 'support': 27619.0} | {'precision': 0.8488866866818541, 'recall': 0.8487635323509178, 'f1-score': 0.8484072499438787, 'support': 27619.0} |
83
+ | No log | 8.0 | 328 | 0.4987 | {'precision': 0.5902255639097744, 'recall': 0.5793357933579336, 'f1-score': 0.584729981378026, 'support': 271.0} | {'precision': 0.5988372093023255, 'recall': 0.7410071942446043, 'f1-score': 0.6623794212218649, 'support': 139.0} | {'precision': 0.7652303120356612, 'recall': 0.8135860979462876, 'f1-score': 0.7886676875957122, 'support': 633.0} | {'precision': 0.6303245436105477, 'recall': 0.6213446638340415, 'f1-score': 0.625802391441158, 'support': 4001.0} | {'precision': 0.7663683466792275, 'recall': 0.8082463984103329, 'f1-score': 0.7867504835589942, 'support': 2013.0} | {'precision': 0.8685056693179896, 'recall': 0.9054340155257586, 'f1-score': 0.8865854711928823, 'support': 11336.0} | {'precision': 0.9521048359039778, 'recall': 0.889876436158682, 'f1-score': 0.9199394924085383, 'support': 9226.0} | 0.8459 | {'precision': 0.7387994972513576, 'recall': 0.7655472284968058, 'f1-score': 0.7506935612567395, 'support': 27619.0} | {'precision': 0.8480288117499208, 'recall': 0.8458669756327166, 'f1-score': 0.8463383164023096, 'support': 27619.0} |
84
+ | No log | 9.0 | 369 | 0.5090 | {'precision': 0.5892857142857143, 'recall': 0.6088560885608856, 'f1-score': 0.5989110707803992, 'support': 271.0} | {'precision': 0.6153846153846154, 'recall': 0.7482014388489209, 'f1-score': 0.6753246753246753, 'support': 139.0} | {'precision': 0.7649700598802395, 'recall': 0.8072669826224329, 'f1-score': 0.7855495772482706, 'support': 633.0} | {'precision': 0.6388339920948617, 'recall': 0.646338415396151, 'f1-score': 0.6425642937010809, 'support': 4001.0} | {'precision': 0.7708036138849262, 'recall': 0.8052657724788872, 'f1-score': 0.7876579203109816, 'support': 2013.0} | {'precision': 0.8802859357505813, 'recall': 0.9016407904022583, 'f1-score': 0.8908354033206956, 'support': 11336.0} | {'precision': 0.9471395881006865, 'recall': 0.897246910903967, 'f1-score': 0.9215184236891907, 'support': 9226.0} | 0.8504 | {'precision': 0.7438147884830892, 'recall': 0.7735451998876431, 'f1-score': 0.7574801949107562, 'support': 27619.0} | {'precision': 0.8528293791455703, 'recall': 0.8503566385459286, 'f1-score': 0.8512372697828916, 'support': 27619.0} |
85
+ | No log | 10.0 | 410 | 0.5094 | {'precision': 0.5934065934065934, 'recall': 0.5977859778597786, 'f1-score': 0.5955882352941178, 'support': 271.0} | {'precision': 0.6190476190476191, 'recall': 0.7482014388489209, 'f1-score': 0.6775244299674268, 'support': 139.0} | {'precision': 0.7654135338345864, 'recall': 0.8041074249605056, 'f1-score': 0.7842835130970724, 'support': 633.0} | {'precision': 0.6421972534332084, 'recall': 0.6428392901774557, 'f1-score': 0.6425181114164377, 'support': 4001.0} | {'precision': 0.7737752161383286, 'recall': 0.8002980625931445, 'f1-score': 0.7868131868131869, 'support': 2013.0} | {'precision': 0.8804516462678849, 'recall': 0.9011115031757233, 'f1-score': 0.8906617839393146, 'support': 11336.0} | {'precision': 0.9418631006346329, 'recall': 0.9008237589421201, 'f1-score': 0.9208864265927977, 'support': 9226.0} | 0.8503 | {'precision': 0.7451649946804076, 'recall': 0.7707382080796641, 'f1-score': 0.7568965267314792, 'support': 27619.0} | {'precision': 0.8519076404793325, 'recall': 0.8502842246279735, 'f1-score': 0.8508360851093074, 'support': 27619.0} |
86
+
87
+
88
+ ### Framework versions
89
+
90
+ - Transformers 4.37.2
91
+ - Pytorch 2.2.0+cu121
92
+ - Datasets 2.17.0
93
+ - Tokenizers 0.15.2
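The per-label dictionaries in these tables ({'precision': ..., 'recall': ..., 'f1-score': ..., 'support': ...}) have the shape produced by scikit-learn's `classification_report` with `output_dict=True`, and the `split: train[80%:100%]` entry uses the `datasets` slicing syntax. Below is a minimal sketch of both, under the assumption that an `essays_su_g` dataset script with the `full_labels` config is available to `load_dataset`; the tag sequences are toy stand-ins, not the repository's evaluation code.

```python
# Minimal sketch (assumptions noted inline), not the repository's evaluation code.
from sklearn.metrics import classification_report

# The evaluation split named in the card, using datasets' slicing syntax.
# Commented out because it assumes an essays_su_g dataset script is available
# locally or on the Hub:
# from datasets import load_dataset
# eval_split = load_dataset("essays_su_g", "full_labels", split="train[80%:100%]")

# Flat per-token tag sequences over the evaluation split (toy stand-ins here).
true_tags = ["O", "B-claim", "I-claim", "I-claim", "O", "B-premise", "I-premise"]
pred_tags = ["O", "B-claim", "I-claim", "O",       "O", "B-premise", "I-premise"]

report = classification_report(true_tags, pred_tags, output_dict=True, zero_division=0)
print(report["I-claim"])       # {'precision': ..., 'recall': ..., 'f1-score': ..., 'support': ...}
print(report["macro avg"])     # the "Macro avg" column of the tables
print(report["weighted avg"])  # the "Weighted avg" column
print(report["accuracy"])      # the single Accuracy number
```

Note that weighted-average recall equals overall accuracy in a multiclass report, which is why the same value (e.g. 0.8502842246279735 in this card) appears in both places.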
meta_data/README_s42_e11.md ADDED
@@ -0,0 +1,94 @@
1
+ ---
2
+ license: apache-2.0
3
+ base_model: allenai/longformer-base-4096
4
+ tags:
5
+ - generated_from_trainer
6
+ datasets:
7
+ - essays_su_g
8
+ metrics:
9
+ - accuracy
10
+ model-index:
11
+ - name: longformer-full_labels
12
+ results:
13
+ - task:
14
+ name: Token Classification
15
+ type: token-classification
16
+ dataset:
17
+ name: essays_su_g
18
+ type: essays_su_g
19
+ config: full_labels
20
+ split: train[80%:100%]
21
+ args: full_labels
22
+ metrics:
23
+ - name: Accuracy
24
+ type: accuracy
25
+ value: 0.8493790506535356
26
+ ---
27
+
28
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
29
+ should probably proofread and complete it, then remove this comment. -->
30
+
31
+ # longformer-full_labels
32
+
33
+ This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
+ It achieves the following results on the evaluation set:
35
+ - Loss: 0.5365
36
+ - B-claim: {'precision': 0.5977859778597786, 'recall': 0.5977859778597786, 'f1-score': 0.5977859778597786, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.6287425149700598, 'recall': 0.7553956834532374, 'f1-score': 0.6862745098039216, 'support': 139.0}
38
+ - B-premise: {'precision': 0.7628398791540786, 'recall': 0.7977883096366508, 'f1-score': 0.7799227799227799, 'support': 633.0}
39
+ - I-claim: {'precision': 0.6403061224489796, 'recall': 0.6273431642089478, 'f1-score': 0.6337583638429491, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.7770823302840636, 'recall': 0.8017883755588674, 'f1-score': 0.7892420537897311, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8788191754884241, 'recall': 0.9007586450247, 'f1-score': 0.8896536702243519, 'support': 11336.0}
42
+ - O: {'precision': 0.9381107491856677, 'recall': 0.9052677216561891, 'f1-score': 0.9213966572894258, 'support': 9226.0}
43
+ - Accuracy: 0.8494
44
+ - Macro avg: {'precision': 0.7462409641987217, 'recall': 0.7694468396283387, 'f1-score': 0.7568620018189912, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8499840082982476, 'recall': 0.8493790506535356, 'f1-score': 0.8494664654905584, 'support': 27619.0}
46
+
47
+ ## Model description
48
+
49
+ More information needed
50
+
51
+ ## Intended uses & limitations
52
+
53
+ More information needed
54
+
55
+ ## Training and evaluation data
56
+
57
+ More information needed
58
+
59
+ ## Training procedure
60
+
61
+ ### Training hyperparameters
62
+
63
+ The following hyperparameters were used during training:
64
+ - learning_rate: 2e-05
65
+ - train_batch_size: 8
66
+ - eval_batch_size: 8
67
+ - seed: 42
68
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
+ - lr_scheduler_type: linear
70
+ - num_epochs: 11
71
+
72
+ ### Training results
73
+
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:---------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.6695 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.797752808988764, 'recall': 0.11216429699842022, 'f1-score': 0.1966759002770083, 'support': 633.0} | {'precision': 0.4816326530612245, 'recall': 0.20644838790302425, 'f1-score': 0.28901329601119663, 'support': 4001.0} | {'precision': 0.5686968838526912, 'recall': 0.3989071038251366, 'f1-score': 0.4689051094890511, 'support': 2013.0} | {'precision': 0.7477048388210119, 'recall': 0.9555398729710657, 'f1-score': 0.838942028424273, 'support': 11336.0} | {'precision': 0.8295683743444937, 'recall': 0.8916106655105137, 'f1-score': 0.8594713196113257, 'support': 9226.0} | 0.7516 | {'precision': 0.48933650843831217, 'recall': 0.3663814753154515, 'f1-score': 0.37900109340183646, 'support': 27619.0} | {'precision': 0.7135072404779539, 'recall': 0.7515840544552663, 'f1-score': 0.7119907765150532, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5067 | {'precision': 0.2857142857142857, 'recall': 0.04428044280442804, 'f1-score': 0.07667731629392971, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.5347018572825024, 'recall': 0.8641390205371248, 'f1-score': 0.6606280193236715, 'support': 633.0} | {'precision': 0.6003445305770887, 'recall': 0.34841289677580606, 'f1-score': 0.44092993832041755, 'support': 4001.0} | {'precision': 0.630060120240481, 'recall': 0.7809239940387481, 'f1-score': 0.6974267968056788, 'support': 2013.0} | {'precision': 0.796954695469547, 'recall': 0.9372794636556104, 'f1-score': 0.8614399221663696, 'support': 11336.0} | {'precision': 0.9419393218322427, 'recall': 0.8581183611532626, 'f1-score': 0.8980772502977711, 'support': 9226.0} | 0.7990 | {'precision': 0.5413878301594496, 'recall': 0.5475934541378543, 'f1-score': 0.5193113204582626, 'support': 27619.0} | {'precision': 0.7897025579144238, 'recall': 0.7989789637568341, 'f1-score': 0.784169650713732, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4498 | {'precision': 0.4083969465648855, 'recall': 0.3948339483394834, 'f1-score': 0.40150093808630394, 'support': 271.0} | {'precision': 0.5555555555555556, 'recall': 0.03597122302158273, 'f1-score': 0.06756756756756756, 'support': 139.0} | {'precision': 0.6736566186107471, 'recall': 0.8120063191153238, 'f1-score': 0.7363896848137534, 'support': 633.0} | {'precision': 0.6048208055819854, 'recall': 0.47663084228942765, 'f1-score': 0.5331283198210792, 'support': 4001.0} | {'precision': 0.7395727365208545, 'recall': 0.7223050173869846, 'f1-score': 0.7308368936918824, 'support': 2013.0} | {'precision': 0.8348814862267777, 'recall': 0.9197247706422018, 'f1-score': 0.8752518468770988, 'support': 11336.0} | {'precision': 0.9292715526843395, 'recall': 0.9042922176457836, 'f1-score': 0.9166117336849043, 'support': 9226.0} | 0.8239 | {'precision': 0.6780222431064493, 'recall': 0.6093949054915412, 'f1-score': 0.6087552835060842, 'support': 27619.0} | {'precision': 0.8168523939680793, 'recall': 0.8239255584923423, 'f1-score': 0.8170849481292589, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4370 | {'precision': 0.5577689243027888, 'recall': 0.5166051660516605, 'f1-score': 0.5363984674329502, 'support': 271.0} | {'precision': 0.7264150943396226, 'recall': 0.5539568345323741, 'f1-score': 0.6285714285714284, 'support': 139.0} | {'precision': 0.7417503586800573, 'recall': 0.8167456556082149, 'f1-score': 0.7774436090225564, 'support': 633.0} | {'precision': 0.6413476263399693, 'recall': 0.5233691577105724, 'f1-score': 0.5763831544178365, 'support': 4001.0} | {'precision': 0.7377124483233808, 'recall': 0.7978142076502732, 'f1-score': 0.7665871121718377, 'support': 2013.0} | {'precision': 0.850122249388753, 'recall': 0.920165843330981, 'f1-score': 0.8837583665169871, 'support': 11336.0} | {'precision': 0.9384389472495199, 'recall': 0.9004985909386516, 'f1-score': 0.9190773825985951, 'support': 9226.0} | 0.8390 | {'precision': 0.7419365212320131, 'recall': 0.7184507794032469, 'f1-score': 0.7268885029617416, 'support': 27619.0} | {'precision': 0.8352121949201599, 'recall': 0.8390238603859662, 'f1-score': 0.8353596745021875, 'support': 27619.0} |
80
+ | No log | 5.0 | 205 | 0.4501 | {'precision': 0.5734767025089605, 'recall': 0.5904059040590406, 'f1-score': 0.5818181818181818, 'support': 271.0} | {'precision': 0.6242424242424243, 'recall': 0.7410071942446043, 'f1-score': 0.6776315789473684, 'support': 139.0} | {'precision': 0.7786624203821656, 'recall': 0.7725118483412322, 'f1-score': 0.7755749405233942, 'support': 633.0} | {'precision': 0.6045918367346939, 'recall': 0.651587103224194, 'f1-score': 0.627210393359798, 'support': 4001.0} | {'precision': 0.6996615905245347, 'recall': 0.821659215101838, 'f1-score': 0.7557687914096413, 'support': 2013.0} | {'precision': 0.8977737392094502, 'recall': 0.8715596330275229, 'f1-score': 0.8844724945168077, 'support': 11336.0} | {'precision': 0.9372885179336792, 'recall': 0.9007153696076307, 'f1-score': 0.9186380720760557, 'support': 9226.0} | 0.8401 | {'precision': 0.7308138902194153, 'recall': 0.764206609658009, 'f1-score': 0.7458734932358925, 'support': 27619.0} | {'precision': 0.8467740645963788, 'recall': 0.8401100691552916, 'f1-score': 0.8427303257125204, 'support': 27619.0} |
81
+ | No log | 6.0 | 246 | 0.4564 | {'precision': 0.5827814569536424, 'recall': 0.6494464944649446, 'f1-score': 0.6143106457242583, 'support': 271.0} | {'precision': 0.6455696202531646, 'recall': 0.7338129496402878, 'f1-score': 0.6868686868686869, 'support': 139.0} | {'precision': 0.7644305772230889, 'recall': 0.7740916271721959, 'f1-score': 0.7692307692307692, 'support': 633.0} | {'precision': 0.6035729476818376, 'recall': 0.7093226693326669, 'f1-score': 0.6521889003791796, 'support': 4001.0} | {'precision': 0.7793035801863659, 'recall': 0.7893691008445107, 'f1-score': 0.7843040473840079, 'support': 2013.0} | {'precision': 0.9100216023292946, 'recall': 0.8547106563161609, 'f1-score': 0.8814993403993995, 'support': 11336.0} | {'precision': 0.9284775465498357, 'recall': 0.9188163884673748, 'f1-score': 0.9236217040749619, 'support': 9226.0} | 0.8458 | {'precision': 0.7448796187396043, 'recall': 0.775652840891163, 'f1-score': 0.7588605848658947, 'support': 27619.0} | {'precision': 0.8543873676272021, 'recall': 0.8458307686737391, 'f1-score': 0.8490929509306417, 'support': 27619.0} |
82
+ | No log | 7.0 | 287 | 0.4905 | {'precision': 0.5934959349593496, 'recall': 0.5387453874538746, 'f1-score': 0.5647969052224371, 'support': 271.0} | {'precision': 0.6540880503144654, 'recall': 0.7482014388489209, 'f1-score': 0.6979865771812079, 'support': 139.0} | {'precision': 0.7386363636363636, 'recall': 0.8214849921011058, 'f1-score': 0.7778608825729245, 'support': 633.0} | {'precision': 0.6343303691727297, 'recall': 0.588352911772057, 'f1-score': 0.6104771784232366, 'support': 4001.0} | {'precision': 0.8261105092091008, 'recall': 0.7575757575757576, 'f1-score': 0.790360196942213, 'support': 2013.0} | {'precision': 0.861314475873544, 'recall': 0.9132851093860268, 'f1-score': 0.8865387908888509, 'support': 11336.0} | {'precision': 0.9371991492219859, 'recall': 0.9074355083459787, 'f1-score': 0.9220772068946528, 'support': 9226.0} | 0.8463 | {'precision': 0.7493106931982199, 'recall': 0.7535830150691031, 'f1-score': 0.7500139625893603, 'support': 27619.0} | {'precision': 0.8447332983407098, 'recall': 0.8463014591404467, 'f1-score': 0.8448122070261145, 'support': 27619.0} |
83
+ | No log | 8.0 | 328 | 0.5157 | {'precision': 0.5984555984555985, 'recall': 0.5719557195571956, 'f1-score': 0.5849056603773586, 'support': 271.0} | {'precision': 0.6184971098265896, 'recall': 0.7697841726618705, 'f1-score': 0.6858974358974358, 'support': 139.0} | {'precision': 0.7532656023222061, 'recall': 0.8199052132701422, 'f1-score': 0.7851739788199698, 'support': 633.0} | {'precision': 0.6262678803641092, 'recall': 0.6018495376155961, 'f1-score': 0.613815957175631, 'support': 4001.0} | {'precision': 0.767628963558921, 'recall': 0.8057625434674615, 'f1-score': 0.786233640329617, 'support': 2013.0} | {'precision': 0.8621727858158401, 'recall': 0.9094036697247706, 'f1-score': 0.8851586313484738, 'support': 11336.0} | {'precision': 0.9516486077129209, 'recall': 0.8853240841101235, 'f1-score': 0.9172890111741254, 'support': 9226.0} | 0.8432 | {'precision': 0.7397052211508834, 'recall': 0.7662835629153086, 'f1-score': 0.7512106164460874, 'support': 27619.0} | {'precision': 0.8446868220958761, 'recall': 0.8431876606683805, 'f1-score': 0.8431332391052191, 'support': 27619.0} |
84
+ | No log | 9.0 | 369 | 0.5184 | {'precision': 0.594306049822064, 'recall': 0.6162361623616236, 'f1-score': 0.6050724637681159, 'support': 271.0} | {'precision': 0.6347305389221557, 'recall': 0.762589928057554, 'f1-score': 0.69281045751634, 'support': 139.0} | {'precision': 0.770739064856712, 'recall': 0.8072669826224329, 'f1-score': 0.7885802469135803, 'support': 633.0} | {'precision': 0.6278113663845224, 'recall': 0.6488377905523619, 'f1-score': 0.6381514257620453, 'support': 4001.0} | {'precision': 0.7695293546821931, 'recall': 0.7878787878787878, 'f1-score': 0.778595974472263, 'support': 2013.0} | {'precision': 0.8851227395824234, 'recall': 0.8937896965419901, 'f1-score': 0.8894351051222402, 'support': 11336.0} | {'precision': 0.9399887196841512, 'recall': 0.9032083243008888, 'f1-score': 0.9212315516002433, 'support': 9226.0} | 0.8484 | {'precision': 0.7460325477048889, 'recall': 0.7742582389022342, 'f1-score': 0.7591253178792611, 'support': 27619.0} | {'precision': 0.8510150796212144, 'recall': 0.8483652558021652, 'f1-score': 0.8494848758241932, 'support': 27619.0} |
85
+ | No log | 10.0 | 410 | 0.5304 | {'precision': 0.6036363636363636, 'recall': 0.6125461254612546, 'f1-score': 0.6080586080586081, 'support': 271.0} | {'precision': 0.630057803468208, 'recall': 0.7841726618705036, 'f1-score': 0.6987179487179487, 'support': 139.0} | {'precision': 0.7739938080495357, 'recall': 0.7898894154818326, 'f1-score': 0.7818608287724785, 'support': 633.0} | {'precision': 0.6411689961880559, 'recall': 0.630592351912022, 'f1-score': 0.6358366935483871, 'support': 4001.0} | {'precision': 0.7681839511966213, 'recall': 0.8132141082960755, 'f1-score': 0.7900579150579151, 'support': 2013.0} | {'precision': 0.8835794960903562, 'recall': 0.8971418489767113, 'f1-score': 0.8903090256500045, 'support': 11336.0} | {'precision': 0.9374231757738295, 'recall': 0.9092781270323, 'f1-score': 0.9231361760660248, 'support': 9226.0} | 0.8506 | {'precision': 0.7482919420575672, 'recall': 0.7766906627186714, 'f1-score': 0.7611395994101952, 'support': 27619.0} | {'precision': 0.8515042689670057, 'recall': 0.8506462942177486, 'f1-score': 0.8508849071769771, 'support': 27619.0} |
86
+ | No log | 11.0 | 451 | 0.5365 | {'precision': 0.5977859778597786, 'recall': 0.5977859778597786, 'f1-score': 0.5977859778597786, 'support': 271.0} | {'precision': 0.6287425149700598, 'recall': 0.7553956834532374, 'f1-score': 0.6862745098039216, 'support': 139.0} | {'precision': 0.7628398791540786, 'recall': 0.7977883096366508, 'f1-score': 0.7799227799227799, 'support': 633.0} | {'precision': 0.6403061224489796, 'recall': 0.6273431642089478, 'f1-score': 0.6337583638429491, 'support': 4001.0} | {'precision': 0.7770823302840636, 'recall': 0.8017883755588674, 'f1-score': 0.7892420537897311, 'support': 2013.0} | {'precision': 0.8788191754884241, 'recall': 0.9007586450247, 'f1-score': 0.8896536702243519, 'support': 11336.0} | {'precision': 0.9381107491856677, 'recall': 0.9052677216561891, 'f1-score': 0.9213966572894258, 'support': 9226.0} | 0.8494 | {'precision': 0.7462409641987217, 'recall': 0.7694468396283387, 'f1-score': 0.7568620018189912, 'support': 27619.0} | {'precision': 0.8499840082982476, 'recall': 0.8493790506535356, 'f1-score': 0.8494664654905584, 'support': 27619.0} |
87
+
88
+
89
+ ### Framework versions
90
+
91
+ - Transformers 4.37.2
92
+ - Pytorch 2.2.0+cu121
93
+ - Datasets 2.17.0
94
+ - Tokenizers 0.15.2
meta_data/README_s42_e12.md ADDED
@@ -0,0 +1,95 @@
1
+ ---
2
+ license: apache-2.0
3
+ base_model: allenai/longformer-base-4096
4
+ tags:
5
+ - generated_from_trainer
6
+ datasets:
7
+ - essays_su_g
8
+ metrics:
9
+ - accuracy
10
+ model-index:
11
+ - name: longformer-full_labels
12
+ results:
13
+ - task:
14
+ name: Token Classification
15
+ type: token-classification
16
+ dataset:
17
+ name: essays_su_g
18
+ type: essays_su_g
19
+ config: full_labels
20
+ split: train[80%:100%]
21
+ args: full_labels
22
+ metrics:
23
+ - name: Accuracy
24
+ type: accuracy
25
+ value: 0.8491980158586481
26
+ ---
27
+
28
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
29
+ should probably proofread and complete it, then remove this comment. -->
30
+
31
+ # longformer-full_labels
32
+
33
+ This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
+ It achieves the following results on the evaluation set:
35
+ - Loss: 0.5617
36
+ - B-claim: {'precision': 0.6123188405797102, 'recall': 0.6236162361623616, 'f1-score': 0.6179159049360147, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.6358381502890174, 'recall': 0.7913669064748201, 'f1-score': 0.7051282051282051, 'support': 139.0}
38
+ - B-premise: {'precision': 0.7745398773006135, 'recall': 0.7977883096366508, 'f1-score': 0.7859922178988326, 'support': 633.0}
39
+ - I-claim: {'precision': 0.6346679930365581, 'recall': 0.6378405398650338, 'f1-score': 0.6362503116429817, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.7706247019551741, 'recall': 0.8027819175360159, 'f1-score': 0.7863746958637469, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8815161765981439, 'recall': 0.8965243472124206, 'f1-score': 0.8889569210583862, 'support': 11336.0}
42
+ - O: {'precision': 0.940029308984331, 'recall': 0.9038586603078257, 'f1-score': 0.9215892136818257, 'support': 9226.0}
43
+ - Accuracy: 0.8492
44
+ - Macro avg: {'precision': 0.7499335783919354, 'recall': 0.7791109881707328, 'f1-score': 0.7631724957442847, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8508908939063541, 'recall': 0.8491980158586481, 'f1-score': 0.8498283285739571, 'support': 27619.0}
46
+
47
+ ## Model description
48
+
49
+ More information needed
50
+
51
+ ## Intended uses & limitations
52
+
53
+ More information needed
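
The card itself leaves intended uses unspecified. As a minimal sketch, the checkpoint can be loaded like any 🤗 Transformers token-classification model; the repository id below is an assumption (it is not stated on this card), so substitute the real repo id or a local path to the checkpoint. Longformer's 4096-token window lets whole essays fit in a single forward pass.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Assumed repository id; replace with the real repo id or a local checkpoint path.
checkpoint = "Theoreticallyhugo/longformer-full_labels"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

# Tags each token with B-/I-claim, B-/I-majorclaim, B-/I-premise or O,
# and groups consecutive tokens of one unit into a single span.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(tagger("Schools should serve healthier food because diet strongly affects learning."))
```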
54
+
55
+ ## Training and evaluation data
56
+
57
+ More information needed
58
+
59
+ ## Training procedure
60
+
61
+ ### Training hyperparameters
62
+
63
+ The following hyperparameters were used during training:
64
+ - learning_rate: 2e-05
65
+ - train_batch_size: 8
66
+ - eval_batch_size: 8
67
+ - seed: 42
68
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
+ - lr_scheduler_type: linear
70
+ - num_epochs: 12
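
As a rough sketch, the list above maps onto `transformers.TrainingArguments` as follows; `output_dir` and `evaluation_strategy` are assumptions (the card only implies per-epoch evaluation through the table below), and the dataset, collator and `Trainer` wiring are omitted because the training script is not part of this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="longformer-full_labels",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=12,
    evaluation_strategy="epoch",  # assumed from the per-epoch validation rows below
    logging_steps=500,            # Trainer default; with roughly 41 steps per epoch this is
                                  # likely why the table below shows "No log" for the
                                  # training loss in the early epochs
)
```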
71
+
72
+ ### Training results
73
+
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:---------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.6687 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.8152173913043478, 'recall': 0.11848341232227488, 'f1-score': 0.20689655172413793, 'support': 633.0} | {'precision': 0.4819484240687679, 'recall': 0.21019745063734066, 'f1-score': 0.292725374173338, 'support': 4001.0} | {'precision': 0.5730337078651685, 'recall': 0.38002980625931443, 'f1-score': 0.4569892473118279, 'support': 2013.0} | {'precision': 0.7500173454520225, 'recall': 0.9535991531404375, 'f1-score': 0.8396442580294381, 'support': 11336.0} | {'precision': 0.8226031492924059, 'recall': 0.8946455668762194, 'f1-score': 0.8571131879543095, 'support': 9226.0} | 0.7511 | {'precision': 0.49183143114038747, 'recall': 0.3652793413193696, 'f1-score': 0.37905265988472164, 'support': 27619.0} | {'precision': 0.7128917915472407, 'recall': 0.7511133639885585, 'f1-score': 0.7113947889219662, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5052 | {'precision': 0.23684210526315788, 'recall': 0.033210332103321034, 'f1-score': 0.05825242718446602, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.5299806576402321, 'recall': 0.8657187993680885, 'f1-score': 0.6574685062987402, 'support': 633.0} | {'precision': 0.5969750889679716, 'recall': 0.3354161459635091, 'f1-score': 0.4295087213954233, 'support': 4001.0} | {'precision': 0.6373355263157895, 'recall': 0.7699950322901142, 'f1-score': 0.6974128233970754, 'support': 2013.0} | {'precision': 0.7942319187089062, 'recall': 0.9377205363443896, 'f1-score': 0.860032362459547, 'support': 11336.0} | {'precision': 0.9385830484498409, 'recall': 0.8629958812052894, 'f1-score': 0.8992037946806707, 'support': 9226.0} | 0.7980 | {'precision': 0.5334211921922711, 'recall': 0.543579532467816, 'f1-score': 0.5145540907737033, 'support': 27619.0} | {'precision': 0.7869182790010324, 'recall': 0.7980375828234186, 'f1-score': 0.7820595043492083, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4368 | {'precision': 0.42758620689655175, 'recall': 0.4575645756457565, 'f1-score': 0.44206773618538325, 'support': 271.0} | {'precision': 0.6666666666666666, 'recall': 0.05755395683453238, 'f1-score': 0.10596026490066225, 'support': 139.0} | {'precision': 0.6956521739130435, 'recall': 0.8088467614533965, 'f1-score': 0.747991234477721, 'support': 633.0} | {'precision': 0.604012671594509, 'recall': 0.5718570357410647, 'f1-score': 0.5874951855180383, 'support': 4001.0} | {'precision': 0.7543684473290065, 'recall': 0.7506209637357178, 'f1-score': 0.7524900398406374, 'support': 2013.0} | {'precision': 0.864777465747596, 'recall': 0.8964361326746648, 'f1-score': 0.8803222592801144, 'support': 11336.0} | {'precision': 0.9297488660250027, 'recall': 0.9109039670496423, 'f1-score': 0.9202299479879551, 'support': 9226.0} | 0.8331 | {'precision': 0.7061160711674823, 'recall': 0.636254770447825, 'f1-score': 0.6337938097415016, 'support': 27619.0} | {'precision': 0.8314953158335541, 'recall': 0.8330859191136536, 'f1-score': 0.8306858540694795, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4355 | {'precision': 0.573076923076923, 'recall': 0.5498154981549815, 'f1-score': 0.5612052730696798, 'support': 271.0} | {'precision': 0.6747967479674797, 'recall': 0.5971223021582733, 'f1-score': 0.633587786259542, 'support': 139.0} | {'precision': 0.7654135338345864, 'recall': 0.8041074249605056, 'f1-score': 0.7842835130970724, 'support': 633.0} | {'precision': 0.6417287630402384, 'recall': 0.5381154711322169, 'f1-score': 0.5853724850462207, 'support': 4001.0} | {'precision': 0.7207167832167832, 'recall': 0.8191753601589667, 'f1-score': 0.766798418972332, 'support': 2013.0} | {'precision': 0.8538233355306526, 'recall': 0.9140790402258292, 'f1-score': 0.882924335378323, 'support': 11336.0} | {'precision': 0.941196542311192, 'recall': 0.8969217429004986, 'f1-score': 0.9185259185259186, 'support': 9226.0} | 0.8393 | {'precision': 0.7386789469968366, 'recall': 0.7313338342416102, 'f1-score': 0.7332425329070126, 'support': 27619.0} | {'precision': 0.8369016857060912, 'recall': 0.8392773090988088, 'f1-score': 0.8365761872374973, 'support': 27619.0} |
80
+ | No log | 5.0 | 205 | 0.4430 | {'precision': 0.5812274368231047, 'recall': 0.5940959409594095, 'f1-score': 0.5875912408759124, 'support': 271.0} | {'precision': 0.6352201257861635, 'recall': 0.7266187050359713, 'f1-score': 0.6778523489932886, 'support': 139.0} | {'precision': 0.779874213836478, 'recall': 0.7835703001579779, 'f1-score': 0.7817178881008668, 'support': 633.0} | {'precision': 0.6160631383472609, 'recall': 0.6633341664583854, 'f1-score': 0.6388253700806353, 'support': 4001.0} | {'precision': 0.7192906574394463, 'recall': 0.8261301539990065, 'f1-score': 0.7690173410404624, 'support': 2013.0} | {'precision': 0.8944205238607822, 'recall': 0.8795871559633027, 'f1-score': 0.8869418252979896, 'support': 11336.0} | {'precision': 0.9421346394805786, 'recall': 0.8964881855625406, 'f1-score': 0.9187447931130241, 'support': 9226.0} | 0.8442 | {'precision': 0.738318676510545, 'recall': 0.7671178011623707, 'f1-score': 0.751527258214597, 'support': 27619.0} | {'precision': 0.8502680966909907, 'recall': 0.8442376624787284, 'f1-score': 0.8466262475832265, 'support': 27619.0} |
81
+ | No log | 6.0 | 246 | 0.4543 | {'precision': 0.5979020979020979, 'recall': 0.6309963099630996, 'f1-score': 0.6140035906642729, 'support': 271.0} | {'precision': 0.6624203821656051, 'recall': 0.7482014388489209, 'f1-score': 0.7027027027027026, 'support': 139.0} | {'precision': 0.7640791476407914, 'recall': 0.7930489731437599, 'f1-score': 0.7782945736434108, 'support': 633.0} | {'precision': 0.6197598005891684, 'recall': 0.683579105223694, 'f1-score': 0.6501069645828381, 'support': 4001.0} | {'precision': 0.7838104639684107, 'recall': 0.7888723298559364, 'f1-score': 0.7863332508046547, 'support': 2013.0} | {'precision': 0.9023183359296252, 'recall': 0.8686485532815809, 'f1-score': 0.8851633781293541, 'support': 11336.0} | {'precision': 0.927675357259736, 'recall': 0.921742900498591, 'f1-score': 0.9246996139835807, 'support': 9226.0} | 0.8491 | {'precision': 0.7511379407793478, 'recall': 0.7764413729736547, 'f1-score': 0.7630434392158306, 'support': 27619.0} | {'precision': 0.8538561472323885, 'recall': 0.8490893949817155, 'f1-score': 0.8510876065793312, 'support': 27619.0} |
82
+ | No log | 7.0 | 287 | 0.5158 | {'precision': 0.6069868995633187, 'recall': 0.5129151291512916, 'f1-score': 0.5559999999999999, 'support': 271.0} | {'precision': 0.65625, 'recall': 0.7553956834532374, 'f1-score': 0.7023411371237458, 'support': 139.0} | {'precision': 0.7251381215469613, 'recall': 0.8293838862559242, 'f1-score': 0.7737656595431099, 'support': 633.0} | {'precision': 0.6443714541654225, 'recall': 0.5393651587103224, 'f1-score': 0.5872108843537415, 'support': 4001.0} | {'precision': 0.8169618894256575, 'recall': 0.7560854446100348, 'f1-score': 0.7853457172342622, 'support': 2013.0} | {'precision': 0.8469115865966895, 'recall': 0.9252822865208187, 'f1-score': 0.8843640655958855, 'support': 11336.0} | {'precision': 0.937927938040184, 'recall': 0.905701278994147, 'f1-score': 0.921532947339399, 'support': 9226.0} | 0.8434 | {'precision': 0.7477925556197478, 'recall': 0.7463041239565393, 'f1-score': 0.7443657730271634, 'support': 27619.0} | {'precision': 0.8396868823733444, 'recall': 0.8434049024222455, 'f1-score': 0.8398436140841861, 'support': 27619.0} |
83
+ | No log | 8.0 | 328 | 0.5215 | {'precision': 0.6023622047244095, 'recall': 0.5645756457564576, 'f1-score': 0.582857142857143, 'support': 271.0} | {'precision': 0.6358381502890174, 'recall': 0.7913669064748201, 'f1-score': 0.7051282051282051, 'support': 139.0} | {'precision': 0.7474747474747475, 'recall': 0.8183254344391785, 'f1-score': 0.7812971342383106, 'support': 633.0} | {'precision': 0.6325317481761686, 'recall': 0.5851037240689827, 'f1-score': 0.6078940534925993, 'support': 4001.0} | {'precision': 0.7624360762436077, 'recall': 0.8147044212617983, 'f1-score': 0.787704130643612, 'support': 2013.0} | {'precision': 0.8591619554373129, 'recall': 0.9116090331686661, 'f1-score': 0.8846087998630371, 'support': 11336.0} | {'precision': 0.9510384035270913, 'recall': 0.8884673748103187, 'f1-score': 0.9186887083216588, 'support': 9226.0} | 0.8434 | {'precision': 0.7415490408389079, 'recall': 0.7677360771400318, 'f1-score': 0.7525968820777951, 'support': 27619.0} | {'precision': 0.8437690270911894, 'recall': 0.8433686954632681, 'f1-score': 0.8426122630592147, 'support': 27619.0} |
84
+ | No log | 9.0 | 369 | 0.5319 | {'precision': 0.5919732441471572, 'recall': 0.6531365313653137, 'f1-score': 0.6210526315789474, 'support': 271.0} | {'precision': 0.6463414634146342, 'recall': 0.762589928057554, 'f1-score': 0.6996699669966997, 'support': 139.0} | {'precision': 0.7822706065318819, 'recall': 0.7946287519747235, 'f1-score': 0.7884012539184952, 'support': 633.0} | {'precision': 0.610633484162896, 'recall': 0.6745813546613346, 'f1-score': 0.6410165063531648, 'support': 4001.0} | {'precision': 0.7723537941034316, 'recall': 0.793840039741679, 'f1-score': 0.7829495345418912, 'support': 2013.0} | {'precision': 0.8914777309794785, 'recall': 0.8775582215949188, 'f1-score': 0.8844632140475661, 'support': 11336.0} | {'precision': 0.9402143260011281, 'recall': 0.9034251029698678, 'f1-score': 0.9214526560168039, 'support': 9226.0} | 0.8460 | {'precision': 0.7478949499058011, 'recall': 0.779965704337913, 'f1-score': 0.7627151090647954, 'support': 27619.0} | {'precision': 0.8517160358539021, 'recall': 0.8460118034686267, 'f1-score': 0.8484376348204832, 'support': 27619.0} |
85
+ | No log | 10.0 | 410 | 0.5510 | {'precision': 0.6185185185185185, 'recall': 0.6162361623616236, 'f1-score': 0.6173752310536044, 'support': 271.0} | {'precision': 0.6416184971098265, 'recall': 0.7985611510791367, 'f1-score': 0.7115384615384616, 'support': 139.0} | {'precision': 0.7849293563579278, 'recall': 0.7898894154818326, 'f1-score': 0.7874015748031495, 'support': 633.0} | {'precision': 0.6464102564102564, 'recall': 0.6300924768807799, 'f1-score': 0.6381470699911405, 'support': 4001.0} | {'precision': 0.7707271885132005, 'recall': 0.8266269249875807, 'f1-score': 0.7976989453499521, 'support': 2013.0} | {'precision': 0.8825823223570191, 'recall': 0.8984650670430487, 'f1-score': 0.8904528763769891, 'support': 11336.0} | {'precision': 0.9364653243847875, 'recall': 0.9074355083459787, 'f1-score': 0.9217218980513047, 'support': 9226.0} | 0.8516 | {'precision': 0.7544644948073624, 'recall': 0.7810438151685688, 'f1-score': 0.766333722452086, 'support': 27619.0} | {'precision': 0.852174493195955, 'recall': 0.8515876751511641, 'f1-score': 0.8516460470210601, 'support': 27619.0} |
86
+ | No log | 11.0 | 451 | 0.5596 | {'precision': 0.6131386861313869, 'recall': 0.6199261992619927, 'f1-score': 0.6165137614678898, 'support': 271.0} | {'precision': 0.6342857142857142, 'recall': 0.7985611510791367, 'f1-score': 0.7070063694267515, 'support': 139.0} | {'precision': 0.7740458015267175, 'recall': 0.8009478672985783, 'f1-score': 0.7872670807453416, 'support': 633.0} | {'precision': 0.6319064211050274, 'recall': 0.6345913521619595, 'f1-score': 0.633246040653448, 'support': 4001.0} | {'precision': 0.763653483992467, 'recall': 0.8057625434674615, 'f1-score': 0.784143098863911, 'support': 2013.0} | {'precision': 0.8795128692347556, 'recall': 0.898288637967537, 'f1-score': 0.8888016060050623, 'support': 11336.0} | {'precision': 0.9422399090392268, 'recall': 0.8982224149143724, 'f1-score': 0.9197047888574441, 'support': 9226.0} | 0.8479 | {'precision': 0.7483975550450422, 'recall': 0.7794714523072911, 'f1-score': 0.7623832494314069, 'support': 27619.0} | {'precision': 0.849887853693214, 'recall': 0.8478583583764799, 'f1-score': 0.8485621503732786, 'support': 27619.0} |
87
+ | No log | 12.0 | 492 | 0.5617 | {'precision': 0.6123188405797102, 'recall': 0.6236162361623616, 'f1-score': 0.6179159049360147, 'support': 271.0} | {'precision': 0.6358381502890174, 'recall': 0.7913669064748201, 'f1-score': 0.7051282051282051, 'support': 139.0} | {'precision': 0.7745398773006135, 'recall': 0.7977883096366508, 'f1-score': 0.7859922178988326, 'support': 633.0} | {'precision': 0.6346679930365581, 'recall': 0.6378405398650338, 'f1-score': 0.6362503116429817, 'support': 4001.0} | {'precision': 0.7706247019551741, 'recall': 0.8027819175360159, 'f1-score': 0.7863746958637469, 'support': 2013.0} | {'precision': 0.8815161765981439, 'recall': 0.8965243472124206, 'f1-score': 0.8889569210583862, 'support': 11336.0} | {'precision': 0.940029308984331, 'recall': 0.9038586603078257, 'f1-score': 0.9215892136818257, 'support': 9226.0} | 0.8492 | {'precision': 0.7499335783919354, 'recall': 0.7791109881707328, 'f1-score': 0.7631724957442847, 'support': 27619.0} | {'precision': 0.8508908939063541, 'recall': 0.8491980158586481, 'f1-score': 0.8498283285739571, 'support': 27619.0} |
88
+
89
+
90
+ ### Framework versions
91
+
92
+ - Transformers 4.37.2
93
+ - Pytorch 2.2.0+cu121
94
+ - Datasets 2.17.0
95
+ - Tokenizers 0.15.2
meta_data/README_s42_e13.md ADDED
@@ -0,0 +1,96 @@
1
+ ---
2
+ license: apache-2.0
3
+ base_model: allenai/longformer-base-4096
4
+ tags:
5
+ - generated_from_trainer
6
+ datasets:
7
+ - essays_su_g
8
+ metrics:
9
+ - accuracy
10
+ model-index:
11
+ - name: longformer-full_labels
12
+ results:
13
+ - task:
14
+ name: Token Classification
15
+ type: token-classification
16
+ dataset:
17
+ name: essays_su_g
18
+ type: essays_su_g
19
+ config: full_labels
20
+ split: train[80%:100%]
21
+ args: full_labels
22
+ metrics:
23
+ - name: Accuracy
24
+ type: accuracy
25
+ value: 0.8498135341612658
26
+ ---
27
+
28
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
29
+ should probably proofread and complete it, then remove this comment. -->
30
+
31
+ # longformer-full_labels
32
+
33
+ This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
+ It achieves the following results on the evaluation set:
35
+ - Loss: 0.5984
36
+ - B-claim: {'precision': 0.6085271317829457, 'recall': 0.5793357933579336, 'f1-score': 0.5935727788279772, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.6473988439306358, 'recall': 0.8057553956834532, 'f1-score': 0.717948717948718, 'support': 139.0}
38
+ - B-premise: {'precision': 0.7566371681415929, 'recall': 0.8104265402843602, 'f1-score': 0.782608695652174, 'support': 633.0}
39
+ - I-claim: {'precision': 0.6461824953445066, 'recall': 0.6070982254436391, 'f1-score': 0.6260309278350515, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.7872444011684518, 'recall': 0.8032786885245902, 'f1-score': 0.7951807228915663, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8729847308709374, 'recall': 0.902787579393084, 'f1-score': 0.8876360640097141, 'support': 11336.0}
42
+ - O: {'precision': 0.9370403387564074, 'recall': 0.9114459137220897, 'f1-score': 0.924065934065934, 'support': 9226.0}
43
+ - Accuracy: 0.8498
44
+ - Macro avg: {'precision': 0.7508593014279253, 'recall': 0.7743040194870214, 'f1-score': 0.7610062630330193, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8488808008037289, 'recall': 0.8498135341612658, 'f1-score': 0.849023051738306, 'support': 27619.0}
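
The per-label dictionaries above follow the same layout as the output of scikit-learn's `classification_report(..., output_dict=True)` over the flattened token-level tags, including the `macro avg`, `weighted avg` and `accuracy` entries. A small illustrative sketch with toy sequences (not the real evaluation data):

```python
from sklearn.metrics import classification_report

# Toy token-level labels for illustration only.
y_true = ["O", "B-claim", "I-claim", "I-claim", "O", "B-premise", "I-premise"]
y_pred = ["O", "B-claim", "I-claim", "O", "O", "B-premise", "I-premise"]

report = classification_report(y_true, y_pred, output_dict=True, zero_division=0)
print(report["I-claim"])    # {'precision': ..., 'recall': ..., 'f1-score': ..., 'support': ...}
print(report["macro avg"])  # averaged over labels, as in the Macro avg entry above
print(report["accuracy"])   # single float, as in the Accuracy line above
```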
46
+
47
+ ## Model description
48
+
49
+ More information needed
50
+
51
+ ## Intended uses & limitations
52
+
53
+ More information needed
54
+
55
+ ## Training and evaluation data
56
+
57
+ More information needed
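
The YAML header above declares evaluation on the `train[80%:100%]` slice of `essays_su_g` with the `full_labels` config. A hedged sketch of loading that slice with 🤗 Datasets is below; it assumes the dataset resolves under that name (it may in practice be a local loading script), and the complementary `train[:80%]` training slice is an assumption rather than something stated on this card.

```python
from datasets import load_dataset

# Evaluation slice exactly as declared in the YAML header of this card.
eval_ds = load_dataset("essays_su_g", "full_labels", split="train[80%:100%]")

# Plausible complementary training slice (assumption, not stated on this card).
train_ds = load_dataset("essays_su_g", "full_labels", split="train[:80%]")

print(eval_ds)
```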
58
+
59
+ ## Training procedure
60
+
61
+ ### Training hyperparameters
62
+
63
+ The following hyperparameters were used during training:
64
+ - learning_rate: 2e-05
65
+ - train_batch_size: 8
66
+ - eval_batch_size: 8
67
+ - seed: 42
68
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
+ - lr_scheduler_type: linear
70
+ - num_epochs: 13
71
+
72
+ ### Training results
73
+
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.7101 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.8404255319148937, 'recall': 0.12480252764612954, 'f1-score': 0.21733149931224208, 'support': 633.0} | {'precision': 0.38269030239833157, 'recall': 0.18345413646588352, 'f1-score': 0.24801486737624598, 'support': 4001.0} | {'precision': 0.5911330049261084, 'recall': 0.35767511177347244, 'f1-score': 0.4456824512534819, 'support': 2013.0} | {'precision': 0.7166710319539026, 'recall': 0.9655081157374735, 'f1-score': 0.8226849067949489, 'support': 11336.0} | {'precision': 0.8668421629922124, 'recall': 0.8566009104704098, 'f1-score': 0.8616911083247015, 'support': 9226.0} | 0.7379 | {'precision': 0.4853945763122069, 'recall': 0.35543440029905266, 'f1-score': 0.3707721190088029, 'support': 27619.0} | {'precision': 0.7015008731130634, 'recall': 0.737934030920743, 'f1-score': 0.6989013131048012, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5175 | {'precision': 0.15151515151515152, 'recall': 0.01845018450184502, 'f1-score': 0.03289473684210526, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.5121722846441947, 'recall': 0.8641390205371248, 'f1-score': 0.643151087595532, 'support': 633.0} | {'precision': 0.5933900851276915, 'recall': 0.2961759560109973, 'f1-score': 0.3951317105701901, 'support': 4001.0} | {'precision': 0.6496097137901128, 'recall': 0.7441629408842524, 'f1-score': 0.693679092382496, 'support': 2013.0} | {'precision': 0.7886889460154242, 'recall': 0.9472477064220184, 'f1-score': 0.8607270249689393, 'support': 11336.0} | {'precision': 0.9334883720930233, 'recall': 0.8701495772815955, 'f1-score': 0.9007068327162572, 'support': 9226.0} | 0.7966 | {'precision': 0.5184092218836568, 'recall': 0.5343321979482619, 'f1-score': 0.5037557835822172, 'support': 27619.0} | {'precision': 0.7820712321103896, 'recall': 0.7965893044643181, 'f1-score': 0.7770176289068236, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4528 | {'precision': 0.3985239852398524, 'recall': 0.3985239852398524, 'f1-score': 0.3985239852398524, 'support': 271.0} | {'precision': 0.7142857142857143, 'recall': 0.07194244604316546, 'f1-score': 0.13071895424836602, 'support': 139.0} | {'precision': 0.6731266149870802, 'recall': 0.8230647709320695, 'f1-score': 0.740582800284293, 'support': 633.0} | {'precision': 0.5998698763825634, 'recall': 0.46088477880529866, 'f1-score': 0.5212720848056537, 'support': 4001.0} | {'precision': 0.7392622536634664, 'recall': 0.726775956284153, 'f1-score': 0.7329659318637275, 'support': 2013.0} | {'precision': 0.8280879329432231, 'recall': 0.9237826393789696, 'f1-score': 0.8733216579100993, 'support': 11336.0} | {'precision': 0.9327389685137117, 'recall': 0.8958378495556037, 'f1-score': 0.9139160723171339, 'support': 9226.0} | 0.8213 | {'precision': 0.6979850494308016, 'recall': 0.6144017751770161, 'f1-score': 0.6159002123813037, 'support': 27619.0} | {'precision': 0.8151722975109748, 'recall': 0.8212824504869836, 'f1-score': 0.814214594179237, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4361 | {'precision': 0.5652173913043478, 'recall': 0.5276752767527675, 'f1-score': 0.5458015267175571, 'support': 271.0} | {'precision': 0.6846846846846847, 'recall': 0.5467625899280576, 'f1-score': 0.6080000000000001, 'support': 139.0} | {'precision': 0.7442528735632183, 'recall': 0.8183254344391785, 'f1-score': 0.7795334838224228, 'support': 633.0} | {'precision': 0.6549935149156939, 'recall': 0.5048737815546114, 'f1-score': 0.5702187720536345, 'support': 4001.0} | {'precision': 0.7530747398297067, 'recall': 0.7908594138102335, 'f1-score': 0.771504724981827, 'support': 2013.0} | {'precision': 0.8468891769280622, 'recall': 0.9221947776993649, 'f1-score': 0.8829391891891892, 'support': 11336.0} | {'precision': 0.9297992680492403, 'recall': 0.9087361803598526, 'f1-score': 0.919147070109083, 'support': 9226.0} | 0.8395 | {'precision': 0.7398445213249935, 'recall': 0.7170610649348665, 'f1-score': 0.7253063952676734, 'support': 27619.0} | {'precision': 0.8340160546838721, 'recall': 0.8395307578116514, 'f1-score': 0.8345487796390205, 'support': 27619.0} |
80
+ | No log | 5.0 | 205 | 0.4563 | {'precision': 0.5789473684210527, 'recall': 0.6088560885608856, 'f1-score': 0.5935251798561151, 'support': 271.0} | {'precision': 0.6264367816091954, 'recall': 0.7841726618705036, 'f1-score': 0.6964856230031948, 'support': 139.0} | {'precision': 0.7835218093699515, 'recall': 0.7661927330173776, 'f1-score': 0.7747603833865815, 'support': 633.0} | {'precision': 0.5847826086956521, 'recall': 0.6723319170207448, 'f1-score': 0.6255086617835135, 'support': 4001.0} | {'precision': 0.7051926298157454, 'recall': 0.8365623447590661, 'f1-score': 0.7652806180413542, 'support': 2013.0} | {'precision': 0.8987082984852709, 'recall': 0.8531227946365562, 'f1-score': 0.8753224419604472, 'support': 11336.0} | {'precision': 0.9433575978161965, 'recall': 0.8989811402557988, 'f1-score': 0.9206349206349206, 'support': 9226.0} | 0.8363 | {'precision': 0.7315638706018665, 'recall': 0.7743170971601333, 'f1-score': 0.7502168326665896, 'support': 27619.0} | {'precision': 0.8468945727618169, 'recall': 0.8363083384626525, 'f1-score': 0.8402796324188655, 'support': 27619.0} |
81
+ | No log | 6.0 | 246 | 0.4648 | {'precision': 0.5723270440251572, 'recall': 0.6715867158671587, 'f1-score': 0.6179966044142615, 'support': 271.0} | {'precision': 0.6688311688311688, 'recall': 0.7410071942446043, 'f1-score': 0.7030716723549487, 'support': 139.0} | {'precision': 0.766295707472178, 'recall': 0.7614533965244866, 'f1-score': 0.7638668779714738, 'support': 633.0} | {'precision': 0.584217225286381, 'recall': 0.6883279180204949, 'f1-score': 0.632013769363167, 'support': 4001.0} | {'precision': 0.8006103763987793, 'recall': 0.7819175360158966, 'f1-score': 0.7911535561698919, 'support': 2013.0} | {'precision': 0.9042451932229202, 'recall': 0.8380381086803105, 'f1-score': 0.8698837102829411, 'support': 11336.0} | {'precision': 0.9182383197599657, 'recall': 0.9287882072404076, 'f1-score': 0.9234831339584008, 'support': 9226.0} | 0.8387 | {'precision': 0.7449664335709356, 'recall': 0.7730170109419084, 'f1-score': 0.7573527606450121, 'support': 27619.0} | {'precision': 0.8474023461664165, 'recall': 0.8386979977551685, 'f1-score': 0.8418504692229696, 'support': 27619.0} |
82
+ | No log | 7.0 | 287 | 0.5209 | {'precision': 0.6069868995633187, 'recall': 0.5129151291512916, 'f1-score': 0.5559999999999999, 'support': 271.0} | {'precision': 0.673202614379085, 'recall': 0.7410071942446043, 'f1-score': 0.7054794520547945, 'support': 139.0} | {'precision': 0.7305555555555555, 'recall': 0.8309636650868878, 'f1-score': 0.7775314116777531, 'support': 633.0} | {'precision': 0.6616257088846881, 'recall': 0.5248687828042989, 'f1-score': 0.5853658536585367, 'support': 4001.0} | {'precision': 0.8349673202614379, 'recall': 0.7615499254843517, 'f1-score': 0.7965705378020265, 'support': 2013.0} | {'precision': 0.8421978725105974, 'recall': 0.9288990825688074, 'f1-score': 0.883426318218046, 'support': 11336.0} | {'precision': 0.9342514438027544, 'recall': 0.9117710817255582, 'f1-score': 0.9228743828853538, 'support': 9226.0} | 0.8452 | {'precision': 0.7548267735653482, 'recall': 0.7445678372951142, 'f1-score': 0.7467497080423586, 'support': 27619.0} | {'precision': 0.8405453803571916, 'recall': 0.8451790434121438, 'f1-score': 0.8405597632184714, 'support': 27619.0} |
83
+ | No log | 8.0 | 328 | 0.5321 | {'precision': 0.6086956521739131, 'recall': 0.5682656826568265, 'f1-score': 0.5877862595419848, 'support': 271.0} | {'precision': 0.6257309941520468, 'recall': 0.7697841726618705, 'f1-score': 0.6903225806451613, 'support': 139.0} | {'precision': 0.7475035663338089, 'recall': 0.8278041074249605, 'f1-score': 0.7856071964017991, 'support': 633.0} | {'precision': 0.6394031500414479, 'recall': 0.5783554111472132, 'f1-score': 0.6073490813648293, 'support': 4001.0} | {'precision': 0.7633802816901408, 'recall': 0.8077496274217586, 'f1-score': 0.7849384503982622, 'support': 2013.0} | {'precision': 0.8544407894736842, 'recall': 0.9165490472829922, 'f1-score': 0.8844058563159686, 'support': 11336.0} | {'precision': 0.9488642981945253, 'recall': 0.8829395187513549, 'f1-score': 0.9147156251754535, 'support': 9226.0} | 0.8422 | {'precision': 0.7411455331513668, 'recall': 0.7644925096209967, 'f1-score': 0.7507321499776369, 'support': 27619.0} | {'precision': 0.8421811869506509, 'recall': 0.8422100727759876, 'f1-score': 0.8409938879259506, 'support': 27619.0} |
84
+ | No log | 9.0 | 369 | 0.5343 | {'precision': 0.6086956521739131, 'recall': 0.6199261992619927, 'f1-score': 0.6142595978062158, 'support': 271.0} | {'precision': 0.6545454545454545, 'recall': 0.7769784172661871, 'f1-score': 0.7105263157894737, 'support': 139.0} | {'precision': 0.7649700598802395, 'recall': 0.8072669826224329, 'f1-score': 0.7855495772482706, 'support': 633.0} | {'precision': 0.6530825496342738, 'recall': 0.6248437890527369, 'f1-score': 0.6386511687316387, 'support': 4001.0} | {'precision': 0.7928818586258033, 'recall': 0.7968206656731247, 'f1-score': 0.7948463825569871, 'support': 2013.0} | {'precision': 0.8815653075328266, 'recall': 0.9002293577981652, 'f1-score': 0.8907995810055866, 'support': 11336.0} | {'precision': 0.933722338434438, 'recall': 0.9192499458053327, 'f1-score': 0.9264296247747009, 'support': 9226.0} | 0.8537 | {'precision': 0.7556376029752785, 'recall': 0.7779021939257103, 'f1-score': 0.7658660354161249, 'support': 27619.0} | {'precision': 0.8529333238319222, 'recall': 0.8536514718128825, 'f1-score': 0.8531479508284731, 'support': 27619.0} |
85
+ | No log | 10.0 | 410 | 0.5675 | {'precision': 0.6072727272727273, 'recall': 0.6162361623616236, 'f1-score': 0.6117216117216118, 'support': 271.0} | {'precision': 0.6408839779005525, 'recall': 0.8345323741007195, 'f1-score': 0.725, 'support': 139.0} | {'precision': 0.7824726134585289, 'recall': 0.7898894154818326, 'f1-score': 0.7861635220125786, 'support': 633.0} | {'precision': 0.6368590537527867, 'recall': 0.6425893526618346, 'f1-score': 0.639711370987808, 'support': 4001.0} | {'precision': 0.7728544776119403, 'recall': 0.8231495280675608, 'f1-score': 0.7972095261005533, 'support': 2013.0} | {'precision': 0.8844841165660279, 'recall': 0.8915843330980946, 'f1-score': 0.8880200325088962, 'support': 11336.0} | {'precision': 0.9394347240915208, 'recall': 0.9078690656839367, 'f1-score': 0.923382207033403, 'support': 9226.0} | 0.8506 | {'precision': 0.7520373843791549, 'recall': 0.786550033065086, 'f1-score': 0.7673154671949788, 'support': 27619.0} | {'precision': 0.852548057268436, 'recall': 0.8506462942177486, 'f1-score': 0.8513769639807444, 'support': 27619.0} |
86
+ | No log | 11.0 | 451 | 0.5722 | {'precision': 0.608540925266904, 'recall': 0.6309963099630996, 'f1-score': 0.6195652173913043, 'support': 271.0} | {'precision': 0.653179190751445, 'recall': 0.8129496402877698, 'f1-score': 0.7243589743589745, 'support': 139.0} | {'precision': 0.7749244712990937, 'recall': 0.8104265402843602, 'f1-score': 0.7922779922779922, 'support': 633.0} | {'precision': 0.6357715430861723, 'recall': 0.6343414146463384, 'f1-score': 0.6350556737145002, 'support': 4001.0} | {'precision': 0.7891918208373905, 'recall': 0.8052657724788872, 'f1-score': 0.7971477747725597, 'support': 2013.0} | {'precision': 0.8812837015133067, 'recall': 0.8938779110797459, 'f1-score': 0.8875361303319611, 'support': 11336.0} | {'precision': 0.9391673177810024, 'recall': 0.9119878603945372, 'f1-score': 0.925378058839703, 'support': 9226.0} | 0.8510 | {'precision': 0.7545798529336165, 'recall': 0.7856922070192482, 'f1-score': 0.7687599745267136, 'support': 27619.0} | {'precision': 0.8520796727625882, 'recall': 0.8509721568485463, 'f1-score': 0.8513799850069879, 'support': 27619.0} |
87
+ | No log | 12.0 | 492 | 0.5920 | {'precision': 0.6021897810218978, 'recall': 0.6088560885608856, 'f1-score': 0.6055045871559632, 'support': 271.0} | {'precision': 0.6491228070175439, 'recall': 0.7985611510791367, 'f1-score': 0.7161290322580646, 'support': 139.0} | {'precision': 0.7657657657657657, 'recall': 0.8056872037914692, 'f1-score': 0.7852193995381064, 'support': 633.0} | {'precision': 0.6368209255533199, 'recall': 0.6328417895526118, 'f1-score': 0.634825122226401, 'support': 4001.0} | {'precision': 0.785024154589372, 'recall': 0.8072528564331843, 'f1-score': 0.7959833455792311, 'support': 2013.0} | {'precision': 0.8778249199064854, 'recall': 0.894318983768525, 'f1-score': 0.8859951933580948, 'support': 11336.0} | {'precision': 0.9398631212835185, 'recall': 0.9079774550184262, 'f1-score': 0.9236451844092839, 'support': 9226.0} | 0.8493 | {'precision': 0.7509444964482718, 'recall': 0.7793565040291771, 'f1-score': 0.7639002663607349, 'support': 27619.0} | {'precision': 0.8504480910210725, 'recall': 0.8493428436945581, 'f1-score': 0.8497092338772945, 'support': 27619.0} |
88
+ | 0.3248 | 13.0 | 533 | 0.5984 | {'precision': 0.6085271317829457, 'recall': 0.5793357933579336, 'f1-score': 0.5935727788279772, 'support': 271.0} | {'precision': 0.6473988439306358, 'recall': 0.8057553956834532, 'f1-score': 0.717948717948718, 'support': 139.0} | {'precision': 0.7566371681415929, 'recall': 0.8104265402843602, 'f1-score': 0.782608695652174, 'support': 633.0} | {'precision': 0.6461824953445066, 'recall': 0.6070982254436391, 'f1-score': 0.6260309278350515, 'support': 4001.0} | {'precision': 0.7872444011684518, 'recall': 0.8032786885245902, 'f1-score': 0.7951807228915663, 'support': 2013.0} | {'precision': 0.8729847308709374, 'recall': 0.902787579393084, 'f1-score': 0.8876360640097141, 'support': 11336.0} | {'precision': 0.9370403387564074, 'recall': 0.9114459137220897, 'f1-score': 0.924065934065934, 'support': 9226.0} | 0.8498 | {'precision': 0.7508593014279253, 'recall': 0.7743040194870214, 'f1-score': 0.7610062630330193, 'support': 27619.0} | {'precision': 0.8488808008037289, 'recall': 0.8498135341612658, 'f1-score': 0.849023051738306, 'support': 27619.0} |
89
+
90
+
91
+ ### Framework versions
92
+
93
+ - Transformers 4.37.2
94
+ - Pytorch 2.2.0+cu121
95
+ - Datasets 2.17.0
96
+ - Tokenizers 0.15.2
meta_data/README_s42_e14.md ADDED
@@ -0,0 +1,97 @@
1
+ ---
2
+ license: apache-2.0
3
+ base_model: allenai/longformer-base-4096
4
+ tags:
5
+ - generated_from_trainer
6
+ datasets:
7
+ - essays_su_g
8
+ metrics:
9
+ - accuracy
10
+ model-index:
11
+ - name: longformer-full_labels
12
+ results:
13
+ - task:
14
+ name: Token Classification
15
+ type: token-classification
16
+ dataset:
17
+ name: essays_su_g
18
+ type: essays_su_g
19
+ config: full_labels
20
+ split: train[80%:100%]
21
+ args: full_labels
22
+ metrics:
23
+ - name: Accuracy
24
+ type: accuracy
25
+ value: 0.8502480176689959
26
+ ---
27
+
28
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
29
+ should probably proofread and complete it, then remove this comment. -->
30
+
31
+ # longformer-full_labels
32
+
33
+ This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
+ It achieves the following results on the evaluation set:
35
+ - Loss: 0.6161
36
+ - B-claim: {'precision': 0.608058608058608, 'recall': 0.6125461254612546, 'f1-score': 0.6102941176470588, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.650887573964497, 'recall': 0.7913669064748201, 'f1-score': 0.7142857142857143, 'support': 139.0}
38
+ - B-premise: {'precision': 0.77526395173454, 'recall': 0.8120063191153238, 'f1-score': 0.7932098765432098, 'support': 633.0}
39
+ - I-claim: {'precision': 0.6432855280312908, 'recall': 0.6165958510372407, 'f1-score': 0.6296579887697805, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.7802516940948693, 'recall': 0.8007948335817189, 'f1-score': 0.7903898014219172, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8767545361177679, 'recall': 0.9036697247706422, 'f1-score': 0.8900086880973067, 'support': 11336.0}
42
+ - O: {'precision': 0.9373950050397581, 'recall': 0.9072187296769998, 'f1-score': 0.9220600385568715, 'support': 9226.0}
43
+ - Accuracy: 0.8502
44
+ - Macro avg: {'precision': 0.7531281281487615, 'recall': 0.7777426414454286, 'f1-score': 0.7642723179031227, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8500571031828417, 'recall': 0.8502480176689959, 'f1-score': 0.8498916673068139, 'support': 27619.0}
46
+
47
+ ## Model description
48
+
49
+ More information needed
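
The card does not spell out the label scheme, but the metrics sections show a BIO-style tag set over argumentative units (claims, major claims, premises) plus `O` for non-argumentative tokens. A hypothetical mapping is sketched below; the label names come from this card's tables, while the index order is an assumption and should be read from the checkpoint's own `config.json` (`config.id2label`) instead.

```python
# Label names as reported in the metrics sections of this card.
# The index order here is illustrative only; the authoritative mapping is
# the checkpoint's own config (transformers.AutoConfig.from_pretrained(...).id2label).
labels = ["O", "B-claim", "I-claim", "B-majorclaim", "I-majorclaim", "B-premise", "I-premise"]
id2label = dict(enumerate(labels))
label2id = {label: idx for idx, label in id2label.items()}
print(id2label)
```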
50
+
51
+ ## Intended uses & limitations
52
+
53
+ More information needed
54
+
55
+ ## Training and evaluation data
56
+
57
+ More information needed
58
+
59
+ ## Training procedure
60
+
61
+ ### Training hyperparameters
62
+
63
+ The following hyperparameters were used during training:
64
+ - learning_rate: 2e-05
65
+ - train_batch_size: 8
66
+ - eval_batch_size: 8
67
+ - seed: 42
68
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
+ - lr_scheduler_type: linear
70
+ - num_epochs: 14
71
+
72
+ ### Training results
73
+
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:-----------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:---------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.6666 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.8426966292134831, 'recall': 0.11848341232227488, 'f1-score': 0.2077562326869806, 'support': 633.0} | {'precision': 0.47888829502939606, 'recall': 0.22394401399650088, 'f1-score': 0.3051771117166212, 'support': 4001.0} | {'precision': 0.5680592991913747, 'recall': 0.4187779433681073, 'f1-score': 0.48212753788961965, 'support': 2013.0} | {'precision': 0.7552974304822245, 'recall': 0.946453775582216, 'f1-score': 0.840139383735954, 'support': 11336.0} | {'precision': 0.8265797392176529, 'recall': 0.893236505527856, 'f1-score': 0.8586163784121692, 'support': 9226.0} | 0.7525 | {'precision': 0.49593162759059023, 'recall': 0.37155652154242214, 'f1-score': 0.3848309492059064, 'support': 27619.0} | {'precision': 0.7162112585519226, 'recall': 0.7525254353886817, 'f1-score': 0.715755849752066, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5095 | {'precision': 0.15625, 'recall': 0.01845018450184502, 'f1-score': 0.033003300330033, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.5287356321839081, 'recall': 0.8720379146919431, 'f1-score': 0.6583184257602862, 'support': 633.0} | {'precision': 0.5973516429622364, 'recall': 0.3044238940264934, 'f1-score': 0.40331125827814573, 'support': 4001.0} | {'precision': 0.6463730569948186, 'recall': 0.7436661698956781, 'f1-score': 0.6916146916146916, 'support': 2013.0} | {'precision': 0.7885180829167892, 'recall': 0.9462773465067043, 'f1-score': 0.860224538893344, 'support': 11336.0} | {'precision': 0.9335973904939422, 'recall': 0.8686321265987427, 'f1-score': 0.8999438517686693, 'support': 9226.0} | 0.7970 | {'precision': 0.5215465436502421, 'recall': 0.536212519460201, 'f1-score': 0.5066308666635957, 'support': 27619.0} | {'precision': 0.7828015788057759, 'recall': 0.7970237879720482, 'f1-score': 0.7779396620369899, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4365 | {'precision': 0.4470198675496689, 'recall': 0.4981549815498155, 'f1-score': 0.4712041884816754, 'support': 271.0} | {'precision': 0.6190476190476191, 'recall': 0.09352517985611511, 'f1-score': 0.1625, 'support': 139.0} | {'precision': 0.7084507042253522, 'recall': 0.7946287519747235, 'f1-score': 0.7490692479523456, 'support': 633.0} | {'precision': 0.6072289156626506, 'recall': 0.6298425393651587, 'f1-score': 0.6183290393816709, 'support': 4001.0} | {'precision': 0.7582529202640934, 'recall': 0.741679085941381, 'f1-score': 0.7498744349573079, 'support': 2013.0} | {'precision': 0.8812472551602987, 'recall': 0.8850564573041637, 'f1-score': 0.883147748778663, 'support': 11336.0} | {'precision': 0.9257872715260955, 'recall': 0.9113375243876003, 'f1-score': 0.9185055713349356, 'support': 9226.0} | 0.8366 | {'precision': 0.706719221919397, 'recall': 0.6506035029112798, 'f1-score': 0.6503757472695141, 'support': 27619.0} | {'precision': 0.8379252532887874, 'recall': 0.8365617871754951, 'f1-score': 0.8361407608696381, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4331 | {'precision': 0.5692883895131086, 'recall': 0.5608856088560885, 'f1-score': 0.5650557620817843, 'support': 271.0} | {'precision': 0.6716417910447762, 'recall': 0.6474820143884892, 'f1-score': 0.6593406593406593, 'support': 139.0} | {'precision': 0.7767441860465116, 'recall': 0.7914691943127962, 'f1-score': 0.784037558685446, 'support': 633.0} | {'precision': 0.6376226197345644, 'recall': 0.5523619095226193, 'f1-score': 0.5919378599169679, 'support': 4001.0} | {'precision': 0.7044233154195949, 'recall': 0.8464977645305514, 'f1-score': 0.7689530685920577, 'support': 2013.0} | {'precision': 0.8617065994115174, 'recall': 0.9041990119971771, 'f1-score': 0.882441565150015, 'support': 11336.0} | {'precision': 0.9424542249516661, 'recall': 0.8982224149143724, 'f1-score': 0.9198068705255563, 'support': 9226.0} | 0.8398 | {'precision': 0.737697303731677, 'recall': 0.7430168455031563, 'f1-score': 0.7387961920417838, 'support': 27619.0} | {'precision': 0.8389816922448817, 'recall': 0.839784206524494, 'f1-score': 0.838075814201577, 'support': 27619.0} |
80
+ | No log | 5.0 | 205 | 0.4395 | {'precision': 0.6, 'recall': 0.6088560885608856, 'f1-score': 0.6043956043956044, 'support': 271.0} | {'precision': 0.6459627329192547, 'recall': 0.7482014388489209, 'f1-score': 0.6933333333333332, 'support': 139.0} | {'precision': 0.7828843106180665, 'recall': 0.7804107424960506, 'f1-score': 0.7816455696202531, 'support': 633.0} | {'precision': 0.6231617647058824, 'recall': 0.6778305423644089, 'f1-score': 0.6493475398060576, 'support': 4001.0} | {'precision': 0.7367718986216096, 'recall': 0.8231495280675608, 'f1-score': 0.7775692163303614, 'support': 2013.0} | {'precision': 0.8981981981981982, 'recall': 0.879498941425547, 'f1-score': 0.888750222856124, 'support': 11336.0} | {'precision': 0.940684668399051, 'recall': 0.9024495989594624, 'f1-score': 0.9211705482104332, 'support': 9226.0} | 0.8483 | {'precision': 0.7468090819231518, 'recall': 0.7743424115318336, 'f1-score': 0.759458862078881, 'support': 27619.0} | {'precision': 0.8539439576536069, 'recall': 0.8482566349252326, 'f1-score': 0.8505675271015494, 'support': 27619.0} |
81
+ | No log | 6.0 | 246 | 0.4533 | {'precision': 0.597864768683274, 'recall': 0.6199261992619927, 'f1-score': 0.6086956521739131, 'support': 271.0} | {'precision': 0.660377358490566, 'recall': 0.7553956834532374, 'f1-score': 0.7046979865771812, 'support': 139.0} | {'precision': 0.7571214392803598, 'recall': 0.7977883096366508, 'f1-score': 0.7769230769230769, 'support': 633.0} | {'precision': 0.6311822892133773, 'recall': 0.6698325418645339, 'f1-score': 0.6499333090820903, 'support': 4001.0} | {'precision': 0.8006134969325154, 'recall': 0.7779433681073026, 'f1-score': 0.7891156462585035, 'support': 2013.0} | {'precision': 0.89441998382604, 'recall': 0.8780875088214538, 'f1-score': 0.8861784998887158, 'support': 11336.0} | {'precision': 0.9252804705369786, 'recall': 0.9207673964881855, 'f1-score': 0.9230184169066116, 'support': 9226.0} | 0.8499 | {'precision': 0.7524085438518731, 'recall': 0.7742487153761939, 'f1-score': 0.7626517982585845, 'support': 27619.0} | {'precision': 0.8525236084761163, 'recall': 0.8498859480792208, 'f1-score': 0.8510468229928801, 'support': 27619.0} |
82
+ | No log | 7.0 | 287 | 0.5152 | {'precision': 0.6147186147186147, 'recall': 0.5239852398523985, 'f1-score': 0.5657370517928287, 'support': 271.0} | {'precision': 0.6645569620253164, 'recall': 0.7553956834532374, 'f1-score': 0.7070707070707071, 'support': 139.0} | {'precision': 0.7232876712328767, 'recall': 0.8341232227488151, 'f1-score': 0.7747615553925166, 'support': 633.0} | {'precision': 0.6592077411551255, 'recall': 0.5448637840539865, 'f1-score': 0.5966064586754242, 'support': 4001.0} | {'precision': 0.8155080213903744, 'recall': 0.7575757575757576, 'f1-score': 0.7854751480813804, 'support': 2013.0} | {'precision': 0.8490016975183898, 'recall': 0.9265172900494001, 'f1-score': 0.8860674062513181, 'support': 11336.0} | {'precision': 0.9357685433422699, 'recall': 0.9079774550184262, 'f1-score': 0.9216635493453625, 'support': 9226.0} | 0.8458 | {'precision': 0.7517213216261383, 'recall': 0.7500626332502888, 'f1-score': 0.7481974109442197, 'support': 27619.0} | {'precision': 0.8419419566807417, 'recall': 0.8457945617147615, 'f1-score': 0.8420990467307141, 'support': 27619.0} |
83
+ | No log | 8.0 | 328 | 0.5139 | {'precision': 0.6188679245283019, 'recall': 0.6051660516605166, 'f1-score': 0.6119402985074627, 'support': 271.0} | {'precision': 0.6453488372093024, 'recall': 0.7985611510791367, 'f1-score': 0.7138263665594855, 'support': 139.0} | {'precision': 0.7691154422788605, 'recall': 0.8104265402843602, 'f1-score': 0.7892307692307692, 'support': 633.0} | {'precision': 0.6418028465998946, 'recall': 0.6085978505373657, 'f1-score': 0.6247594611930725, 'support': 4001.0} | {'precision': 0.7742395437262357, 'recall': 0.8092399403874814, 'f1-score': 0.7913529268885109, 'support': 2013.0} | {'precision': 0.8695688926458157, 'recall': 0.9074629498941426, 'f1-score': 0.8881118881118881, 'support': 11336.0} | {'precision': 0.9440081939228405, 'recall': 0.8990895295902883, 'f1-score': 0.9210014989174485, 'support': 9226.0} | 0.8485 | {'precision': 0.7518502401301788, 'recall': 0.7769348590618987, 'f1-score': 0.7628890299155195, 'support': 27619.0} | {'precision': 0.8486012066263791, 'recall': 0.8484738766790977, 'f1-score': 0.8480427604721086, 'support': 27619.0} |
84
+ | No log | 9.0 | 369 | 0.5370 | {'precision': 0.6145454545454545, 'recall': 0.6236162361623616, 'f1-score': 0.619047619047619, 'support': 271.0} | {'precision': 0.6484848484848484, 'recall': 0.7697841726618705, 'f1-score': 0.7039473684210525, 'support': 139.0} | {'precision': 0.7679640718562875, 'recall': 0.8104265402843602, 'f1-score': 0.7886241352805534, 'support': 633.0} | {'precision': 0.6341463414634146, 'recall': 0.6368407898025493, 'f1-score': 0.6354907095647836, 'support': 4001.0} | {'precision': 0.7907324364723468, 'recall': 0.7883755588673621, 'f1-score': 0.7895522388059701, 'support': 2013.0} | {'precision': 0.8828484585323491, 'recall': 0.8967889908256881, 'f1-score': 0.889764124108354, 'support': 11336.0} | {'precision': 0.9367963437743841, 'recall': 0.9109039670496423, 'f1-score': 0.9236687366049349, 'support': 9226.0} | 0.8506 | {'precision': 0.7536454221612979, 'recall': 0.7766766079505477, 'f1-score': 0.7642992759761811, 'support': 27619.0} | {'precision': 0.8516825218148875, 'recall': 0.8506462942177486, 'f1-score': 0.8510413312248658, 'support': 27619.0} |
85
+ | No log | 10.0 | 410 | 0.5749 | {'precision': 0.635036496350365, 'recall': 0.6420664206642066, 'f1-score': 0.6385321100917432, 'support': 271.0} | {'precision': 0.6368715083798883, 'recall': 0.8201438848920863, 'f1-score': 0.7169811320754718, 'support': 139.0} | {'precision': 0.8061889250814332, 'recall': 0.7819905213270142, 'f1-score': 0.793905372894948, 'support': 633.0} | {'precision': 0.6558861578266494, 'recall': 0.6335916020994752, 'f1-score': 0.6445461479786423, 'support': 4001.0} | {'precision': 0.76497277676951, 'recall': 0.8375558867362146, 'f1-score': 0.7996205833530947, 'support': 2013.0} | {'precision': 0.8854955190115723, 'recall': 0.8977593507410021, 'f1-score': 0.8915852643567392, 'support': 11336.0} | {'precision': 0.9348164627363738, 'recall': 0.9109039670496423, 'f1-score': 0.9227053140096618, 'support': 9226.0} | 0.8539 | {'precision': 0.7598954065936846, 'recall': 0.7891445190728058, 'f1-score': 0.7725537035371859, 'support': 27619.0} | {'precision': 0.8543981398882914, 'recall': 0.8539411274847025, 'f1-score': 0.8538904318182887, 'support': 27619.0} |
86
+ | No log | 11.0 | 451 | 0.5764 | {'precision': 0.6126760563380281, 'recall': 0.6420664206642066, 'f1-score': 0.627027027027027, 'support': 271.0} | {'precision': 0.6488095238095238, 'recall': 0.7841726618705036, 'f1-score': 0.7100977198697069, 'support': 139.0} | {'precision': 0.7900466562986003, 'recall': 0.8025276461295419, 'f1-score': 0.7962382445141066, 'support': 633.0} | {'precision': 0.6296115318837039, 'recall': 0.6440889777555611, 'f1-score': 0.6367679762787251, 'support': 4001.0} | {'precision': 0.7710437710437711, 'recall': 0.7963238946845504, 'f1-score': 0.7834799608993158, 'support': 2013.0} | {'precision': 0.887019020071873, 'recall': 0.8927311220889202, 'f1-score': 0.8898659045944164, 'support': 11336.0} | {'precision': 0.935480263893548, 'recall': 0.9067851723390419, 'f1-score': 0.9209092410149156, 'support': 9226.0} | 0.8493 | {'precision': 0.7535266890484354, 'recall': 0.7812422707903323, 'f1-score': 0.7663408677426019, 'support': 27619.0} | {'precision': 0.8513521360262348, 'recall': 0.8493066367355806, 'f1-score': 0.8501875195565032, 'support': 27619.0} |
87
+ | No log | 12.0 | 492 | 0.6054 | {'precision': 0.6133828996282528, 'recall': 0.6088560885608856, 'f1-score': 0.611111111111111, 'support': 271.0} | {'precision': 0.6285714285714286, 'recall': 0.7913669064748201, 'f1-score': 0.7006369426751592, 'support': 139.0} | {'precision': 0.7722473604826546, 'recall': 0.8088467614533965, 'f1-score': 0.7901234567901234, 'support': 633.0} | {'precision': 0.6284111196123437, 'recall': 0.6158460384903774, 'f1-score': 0.6220651350669023, 'support': 4001.0} | {'precision': 0.7676864244741873, 'recall': 0.7978142076502732, 'f1-score': 0.7824604141291109, 'support': 2013.0} | {'precision': 0.871963174494928, 'recall': 0.9023465067043048, 'f1-score': 0.8868946980534964, 'support': 11336.0} | {'precision': 0.9429744525547445, 'recall': 0.8961630175590722, 'f1-score': 0.918972990996999, 'support': 9226.0} | 0.8456 | {'precision': 0.7464624085455057, 'recall': 0.7744627895561614, 'f1-score': 0.7588949641175574, 'support': 27619.0} | {'precision': 0.8467545269899819, 'recall': 0.8455773199608965, 'f1-score': 0.8457730665631786, 'support': 27619.0} |
88
+ | 0.3148 | 13.0 | 533 | 0.6185 | {'precision': 0.6159695817490495, 'recall': 0.5977859778597786, 'f1-score': 0.6067415730337079, 'support': 271.0} | {'precision': 0.6473988439306358, 'recall': 0.8057553956834532, 'f1-score': 0.717948717948718, 'support': 139.0} | {'precision': 0.7628865979381443, 'recall': 0.8183254344391785, 'f1-score': 0.7896341463414633, 'support': 633.0} | {'precision': 0.6313704779508846, 'recall': 0.5976005998500374, 'f1-score': 0.6140215716486902, 'support': 4001.0} | {'precision': 0.7710437710437711, 'recall': 0.7963238946845504, 'f1-score': 0.7834799608993158, 'support': 2013.0} | {'precision': 0.8684410646387832, 'recall': 0.9066690190543402, 'f1-score': 0.8871434120236503, 'support': 11336.0} | {'precision': 0.9418380097693968, 'recall': 0.8986559722523304, 'f1-score': 0.919740418215098, 'support': 9226.0} | 0.8456 | {'precision': 0.7484211924315236, 'recall': 0.7744451848319527, 'f1-score': 0.759815685730092, 'support': 27619.0} | {'precision': 0.8455082802681305, 'recall': 0.845613526919874, 'f1-score': 0.8450736282751178, 'support': 27619.0} |
89
+ | 0.3148 | 14.0 | 574 | 0.6161 | {'precision': 0.608058608058608, 'recall': 0.6125461254612546, 'f1-score': 0.6102941176470588, 'support': 271.0} | {'precision': 0.650887573964497, 'recall': 0.7913669064748201, 'f1-score': 0.7142857142857143, 'support': 139.0} | {'precision': 0.77526395173454, 'recall': 0.8120063191153238, 'f1-score': 0.7932098765432098, 'support': 633.0} | {'precision': 0.6432855280312908, 'recall': 0.6165958510372407, 'f1-score': 0.6296579887697805, 'support': 4001.0} | {'precision': 0.7802516940948693, 'recall': 0.8007948335817189, 'f1-score': 0.7903898014219172, 'support': 2013.0} | {'precision': 0.8767545361177679, 'recall': 0.9036697247706422, 'f1-score': 0.8900086880973067, 'support': 11336.0} | {'precision': 0.9373950050397581, 'recall': 0.9072187296769998, 'f1-score': 0.9220600385568715, 'support': 9226.0} | 0.8502 | {'precision': 0.7531281281487615, 'recall': 0.7777426414454286, 'f1-score': 0.7642723179031227, 'support': 27619.0} | {'precision': 0.8500571031828417, 'recall': 0.8502480176689959, 'f1-score': 0.8498916673068139, 'support': 27619.0} |
90
+
91
+
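The per-label cells in the table above (and the summary lists in these cards) follow the structure of scikit-learn's `classification_report(..., output_dict=True)` over token-level labels: one precision/recall/F1/support block per label plus `accuracy`, `macro avg` and `weighted avg`. Below is a minimal sketch of producing such a report; the label sequences are illustrative and the actual `compute_metrics` code used during training is not part of this card.

```python
# Minimal sketch (assumption): per-label blocks like the ones in the table above can be
# produced with scikit-learn's classification_report over flattened token-level labels.
# The actual compute_metrics function used during training is not shown in this card.
from sklearn.metrics import classification_report

# Hypothetical flattened gold and predicted label sequences (special tokens already removed).
y_true = ["O", "B-majorclaim", "I-majorclaim", "O", "B-premise", "I-premise"]
y_pred = ["O", "B-majorclaim", "I-majorclaim", "O", "B-claim", "I-premise"]

report = classification_report(y_true, y_pred, output_dict=True, zero_division=0)
# `report` holds one dict per label plus "accuracy", "macro avg" and "weighted avg",
# matching the columns reported in this card.
print(report["macro avg"])
```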
92
+ ### Framework versions
93
+
94
+ - Transformers 4.37.2
95
+ - Pytorch 2.2.0+cu121
96
+ - Datasets 2.17.0
97
+ - Tokenizers 0.15.2
meta_data/README_s42_e15.md ADDED
@@ -0,0 +1,98 @@
1
+ ---
2
+ license: apache-2.0
3
+ base_model: allenai/longformer-base-4096
4
+ tags:
5
+ - generated_from_trainer
6
+ datasets:
7
+ - essays_su_g
8
+ metrics:
9
+ - accuracy
10
+ model-index:
11
+ - name: longformer-full_labels
12
+ results:
13
+ - task:
14
+ name: Token Classification
15
+ type: token-classification
16
+ dataset:
17
+ name: essays_su_g
18
+ type: essays_su_g
19
+ config: full_labels
20
+ split: train[80%:100%]
21
+ args: full_labels
22
+ metrics:
23
+ - name: Accuracy
24
+ type: accuracy
25
+ value: 0.8435497302581556
26
+ ---
27
+
28
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
29
+ should probably proofread and complete it, then remove this comment. -->
30
+
31
+ # longformer-full_labels
32
+
33
+ This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
+ It achieves the following results on the evaluation set:
35
+ - Loss: 0.6473
36
+ - B-claim: {'precision': 0.5985130111524164, 'recall': 0.5940959409594095, 'f1-score': 0.5962962962962963, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.6707317073170732, 'recall': 0.7913669064748201, 'f1-score': 0.7260726072607261, 'support': 139.0}
38
+ - B-premise: {'precision': 0.7566765578635015, 'recall': 0.8056872037914692, 'f1-score': 0.7804131599081868, 'support': 633.0}
39
+ - I-claim: {'precision': 0.6212043232115285, 'recall': 0.6033491627093227, 'f1-score': 0.6121465703055661, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.7810361681329423, 'recall': 0.793840039741679, 'f1-score': 0.7873860556787385, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8689666893269884, 'recall': 0.9020818630910374, 'f1-score': 0.8852146814404431, 'support': 11336.0}
42
+ - O: {'precision': 0.9395142986836132, 'recall': 0.8973553002384566, 'f1-score': 0.9179509923494844, 'support': 9226.0}
43
+ - Accuracy: 0.8435
44
+ - Macro avg: {'precision': 0.748091822241152, 'recall': 0.7696823452865992, 'f1-score': 0.7579257661770632, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8440071909900311, 'recall': 0.8435497302581556, 'f1-score': 0.8434243803550634, 'support': 27619.0}
46
+
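As a hedged illustration (not taken from the training code), a token-classification checkpoint such as this one is typically loaded as shown below; the repository id is a placeholder, not a confirmed model id.

```python
# Minimal usage sketch (assumption): loading a longformer token-classification
# checkpoint such as this one. The model id below is a placeholder, not a confirmed repo.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "path/to/longformer-full_labels"  # placeholder: substitute the actual checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

text = "Some people believe that school uniforms should be mandatory."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print([(tok, model.config.id2label[i]) for tok, i in zip(tokens, predicted_ids)])
```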
47
+ ## Model description
48
+
49
+ More information needed
50
+
51
+ ## Intended uses & limitations
52
+
53
+ More information needed
54
+
55
+ ## Training and evaluation data
56
+
57
+ More information needed
58
+
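The metadata at the top of this card records the evaluation split as `train[80%:100%]`, i.e. the last 20% of the `full_labels` train split of essays_su_g, expressed in the Hugging Face `datasets` slicing syntax. A hedged loading sketch follows; the exact hub id of the dataset is an assumption.

```python
# Minimal sketch (assumption): loading the evaluation slice named in the card metadata
# with the `datasets` split-slicing syntax. The hub id below may need a namespace prefix.
from datasets import load_dataset

eval_data = load_dataset(
    "essays_su_g",            # assumption: may require the full `namespace/essays_su_g` id
    "full_labels",            # dataset config named in the card
    split="train[80%:100%]",  # last 20% of the train split, as recorded in the metadata
)
print(eval_data)
```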
59
+ ## Training procedure
60
+
61
+ ### Training hyperparameters
62
+
63
+ The following hyperparameters were used during training:
64
+ - learning_rate: 2e-05
65
+ - train_batch_size: 8
66
+ - eval_batch_size: 8
67
+ - seed: 42
68
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
+ - lr_scheduler_type: linear
70
+ - num_epochs: 15
71
+
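A minimal sketch of how the values above map onto `transformers.TrainingArguments`; only the listed hyperparameters are taken from this card, while the output directory and evaluation strategy are assumptions.

```python
# Minimal sketch (assumption): the hyperparameters listed above expressed as
# TrainingArguments. output_dir and evaluation_strategy are illustrative, not from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="longformer-full_labels",   # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    adam_beta1=0.9,                        # Adam betas/epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",           # assumption: metrics are reported once per epoch
)
```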
72
+ ### Training results
73
+
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.7008 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.7865168539325843, 'recall': 0.11058451816745656, 'f1-score': 0.19390581717451524, 'support': 633.0} | {'precision': 0.40153452685422, 'recall': 0.23544113971507125, 'f1-score': 0.29683314951945805, 'support': 4001.0} | {'precision': 0.5759865659109992, 'recall': 0.34078489816194735, 'f1-score': 0.4282147315855181, 'support': 2013.0} | {'precision': 0.7355271176112127, 'recall': 0.9582745236414961, 'f1-score': 0.8322543574027964, 'support': 11336.0} | {'precision': 0.8612315698178664, 'recall': 0.8610448731844786, 'f1-score': 0.8611382113821139, 'support': 9226.0} | 0.7424 | {'precision': 0.4801138048752689, 'recall': 0.35801856469577853, 'f1-score': 0.3731923238663431, 'support': 27619.0} | {'precision': 0.7077563864021956, 'recall': 0.7424236938339549, 'f1-score': 0.707906318183495, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5146 | {'precision': 0.19047619047619047, 'recall': 0.014760147601476014, 'f1-score': 0.0273972602739726, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.588774341351661, 'recall': 0.8120063191153238, 'f1-score': 0.6826029216467464, 'support': 633.0} | {'precision': 0.5761194029850746, 'recall': 0.3859035241189703, 'f1-score': 0.4622062565484209, 'support': 4001.0} | {'precision': 0.6959531416400426, 'recall': 0.6492796820665673, 'f1-score': 0.6718067334875354, 'support': 2013.0} | {'precision': 0.8084617153368591, 'recall': 0.9304869442484122, 'f1-score': 0.865192962309806, 'support': 11336.0} | {'precision': 0.8987938596491228, 'recall': 0.8884673748103187, 'f1-score': 0.8936007849122425, 'support': 9226.0} | 0.8007 | {'precision': 0.5369398073484215, 'recall': 0.5258434274230097, 'f1-score': 0.5146867027398178, 'support': 27619.0} | {'precision': 0.7816110201434078, 'recall': 0.8006806908287772, 'f1-score': 0.7854496816047499, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4449 | {'precision': 0.4583333333333333, 'recall': 0.5276752767527675, 'f1-score': 0.49056603773584906, 'support': 271.0} | {'precision': 0.6956521739130435, 'recall': 0.2302158273381295, 'f1-score': 0.34594594594594597, 'support': 139.0} | {'precision': 0.7202898550724638, 'recall': 0.7851500789889415, 'f1-score': 0.7513227513227514, 'support': 633.0} | {'precision': 0.586923245134485, 'recall': 0.6708322919270182, 'f1-score': 0.6260788430137625, 'support': 4001.0} | {'precision': 0.7267942583732058, 'recall': 0.754595131644312, 'f1-score': 0.7404338289056787, 'support': 2013.0} | {'precision': 0.8974713916574382, 'recall': 0.8578863796753705, 'f1-score': 0.8772325455529497, 'support': 11336.0} | {'precision': 0.9236111111111112, 'recall': 0.9081942336874052, 'f1-score': 0.9158377964804897, 'support': 9226.0} | 0.8320 | {'precision': 0.7155821955135829, 'recall': 0.6763641742877063, 'f1-score': 0.6782025355653467, 'support': 27619.0} | {'precision': 0.8393908547230632, 'recall': 0.8319997103443282, 'f1-score': 0.8344212165358135, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4465 | {'precision': 0.5707762557077626, 'recall': 0.4612546125461255, 'f1-score': 0.5102040816326531, 'support': 271.0} | {'precision': 0.6581196581196581, 'recall': 0.5539568345323741, 'f1-score': 0.6015625, 'support': 139.0} | {'precision': 0.7303851640513552, 'recall': 0.8088467614533965, 'f1-score': 0.767616191904048, 'support': 633.0} | {'precision': 0.6494704475572258, 'recall': 0.47513121719570106, 'f1-score': 0.5487875288683602, 'support': 4001.0} | {'precision': 0.7996837111228255, 'recall': 0.7536015896671634, 'f1-score': 0.7759590792838875, 'support': 2013.0} | {'precision': 0.8320863196030558, 'recall': 0.9319865913902611, 'f1-score': 0.8792077560021636, 'support': 11336.0} | {'precision': 0.928153625427657, 'recall': 0.9115543030565793, 'f1-score': 0.9197790780335757, 'support': 9226.0} | 0.8366 | {'precision': 0.7383821687985057, 'recall': 0.6994759871202287, 'f1-score': 0.7147308879606697, 'support': 27619.0} | {'precision': 0.8295906167856352, 'recall': 0.8366342010934502, 'f1-score': 0.8297935829927507, 'support': 27619.0} |
80
+ | No log | 5.0 | 205 | 0.4358 | {'precision': 0.6, 'recall': 0.6088560885608856, 'f1-score': 0.6043956043956044, 'support': 271.0} | {'precision': 0.6374269005847953, 'recall': 0.7841726618705036, 'f1-score': 0.703225806451613, 'support': 139.0} | {'precision': 0.7808, 'recall': 0.7709320695102686, 'f1-score': 0.7758346581875996, 'support': 633.0} | {'precision': 0.6116646415552855, 'recall': 0.6290927268182954, 'f1-score': 0.6202562838836866, 'support': 4001.0} | {'precision': 0.7328410078192876, 'recall': 0.8380526577247889, 'f1-score': 0.7819235225955967, 'support': 2013.0} | {'precision': 0.8857651245551601, 'recall': 0.8782639378969654, 'f1-score': 0.8819985825655563, 'support': 11336.0} | {'precision': 0.9360026993589022, 'recall': 0.9020160416215044, 'f1-score': 0.9186951482033449, 'support': 9226.0} | 0.8416 | {'precision': 0.7406429105533473, 'recall': 0.7730551691433158, 'f1-score': 0.7551899437547146, 'support': 27619.0} | {'precision': 0.8452341603615893, 'recall': 0.8415945544733697, 'f1-score': 0.8429891649448389, 'support': 27619.0} |
81
+ | No log | 6.0 | 246 | 0.4656 | {'precision': 0.5668789808917197, 'recall': 0.6568265682656826, 'f1-score': 0.6085470085470085, 'support': 271.0} | {'precision': 0.6455696202531646, 'recall': 0.7338129496402878, 'f1-score': 0.6868686868686869, 'support': 139.0} | {'precision': 0.7620528771384136, 'recall': 0.7740916271721959, 'f1-score': 0.768025078369906, 'support': 633.0} | {'precision': 0.6074982642906734, 'recall': 0.6560859785053736, 'f1-score': 0.6308579668348955, 'support': 4001.0} | {'precision': 0.7940131912734653, 'recall': 0.7774465971187282, 'f1-score': 0.7856425702811245, 'support': 2013.0} | {'precision': 0.8894866417434121, 'recall': 0.8605328158080452, 'f1-score': 0.8747702102856119, 'support': 11336.0} | {'precision': 0.920605732828556, 'recall': 0.9225016258400174, 'f1-score': 0.9215527042390774, 'support': 9226.0} | 0.8409 | {'precision': 0.7408721869170579, 'recall': 0.7687568803357615, 'f1-score': 0.7537520322037585, 'support': 27619.0} | {'precision': 0.844759622854032, 'recall': 0.8409428292117745, 'f1-score': 0.8425631787461125, 'support': 27619.0} |
82
+ | No log | 7.0 | 287 | 0.4964 | {'precision': 0.5975103734439834, 'recall': 0.5313653136531366, 'f1-score': 0.5625, 'support': 271.0} | {'precision': 0.6805555555555556, 'recall': 0.7050359712230215, 'f1-score': 0.6925795053003534, 'support': 139.0} | {'precision': 0.7341772151898734, 'recall': 0.8246445497630331, 'f1-score': 0.7767857142857142, 'support': 633.0} | {'precision': 0.6605504587155964, 'recall': 0.5578605348662834, 'f1-score': 0.6048780487804877, 'support': 4001.0} | {'precision': 0.8427997705106138, 'recall': 0.7297565822155986, 'f1-score': 0.7822151224707135, 'support': 2013.0} | {'precision': 0.8541922793213671, 'recall': 0.9193719124911786, 'f1-score': 0.8855843990313124, 'support': 11336.0} | {'precision': 0.9198913043478261, 'recall': 0.917298937784522, 'f1-score': 0.9185932920872679, 'support': 9226.0} | 0.8454 | {'precision': 0.7556681367264021, 'recall': 0.7407619717138249, 'f1-score': 0.746162297422264, 'support': 27619.0} | {'precision': 0.8411135771135726, 'recall': 0.8454324921249864, 'f1-score': 0.841777543839385, 'support': 27619.0} |
83
+ | No log | 8.0 | 328 | 0.5402 | {'precision': 0.5895522388059702, 'recall': 0.5830258302583026, 'f1-score': 0.5862708719851578, 'support': 271.0} | {'precision': 0.6242774566473989, 'recall': 0.7769784172661871, 'f1-score': 0.6923076923076924, 'support': 139.0} | {'precision': 0.7537091988130564, 'recall': 0.8025276461295419, 'f1-score': 0.7773527161438408, 'support': 633.0} | {'precision': 0.6078381795195954, 'recall': 0.6008497875531117, 'f1-score': 0.604323780794369, 'support': 4001.0} | {'precision': 0.7680074836295603, 'recall': 0.8156979632389468, 'f1-score': 0.7911346663454588, 'support': 2013.0} | {'precision': 0.8649226297341198, 'recall': 0.8924664784756527, 'f1-score': 0.8784787044675031, 'support': 11336.0} | {'precision': 0.9406701859077347, 'recall': 0.8884673748103187, 'f1-score': 0.9138238573021181, 'support': 9226.0} | 0.8376 | {'precision': 0.7355681961510622, 'recall': 0.7657162139617232, 'f1-score': 0.74909889847802, 'support': 27619.0} | {'precision': 0.8394578671455889, 'recall': 0.8376117889858431, 'f1-score': 0.8380825329114897, 'support': 27619.0} |
84
+ | No log | 9.0 | 369 | 0.5573 | {'precision': 0.5833333333333334, 'recall': 0.5682656826568265, 'f1-score': 0.5757009345794393, 'support': 271.0} | {'precision': 0.6604938271604939, 'recall': 0.7697841726618705, 'f1-score': 0.7109634551495018, 'support': 139.0} | {'precision': 0.7518248175182481, 'recall': 0.8135860979462876, 'f1-score': 0.7814871016691957, 'support': 633.0} | {'precision': 0.617191404297851, 'recall': 0.6173456635841039, 'f1-score': 0.6172685243033862, 'support': 4001.0} | {'precision': 0.7709631049353138, 'recall': 0.799304520615996, 'f1-score': 0.7848780487804878, 'support': 2013.0} | {'precision': 0.8666893096713933, 'recall': 0.9004057868736768, 'f1-score': 0.8832258901916671, 'support': 11336.0} | {'precision': 0.9458458690118028, 'recall': 0.8859744201170605, 'f1-score': 0.9149317215133199, 'support': 9226.0} | 0.8413 | {'precision': 0.7423345237040623, 'recall': 0.7649523349222601, 'f1-score': 0.752636525169571, 'support': 27619.0} | {'precision': 0.8435603253400192, 'recall': 0.8413048988015497, 'f1-score': 0.841905204414389, 'support': 27619.0} |
85
+ | No log | 10.0 | 410 | 0.6014 | {'precision': 0.5836298932384342, 'recall': 0.6051660516605166, 'f1-score': 0.5942028985507246, 'support': 271.0} | {'precision': 0.6457142857142857, 'recall': 0.8129496402877698, 'f1-score': 0.7197452229299364, 'support': 139.0} | {'precision': 0.7774294670846394, 'recall': 0.7835703001579779, 'f1-score': 0.7804878048780488, 'support': 633.0} | {'precision': 0.619513418610484, 'recall': 0.6173456635841039, 'f1-score': 0.6184276414621932, 'support': 4001.0} | {'precision': 0.741311042674879, 'recall': 0.8370591157476404, 'f1-score': 0.7862809146056929, 'support': 2013.0} | {'precision': 0.8770470496490772, 'recall': 0.8929075511644319, 'f1-score': 0.884906237705993, 'support': 11336.0} | {'precision': 0.937757909215956, 'recall': 0.8867331454584869, 'f1-score': 0.9115320334261839, 'support': 9226.0} | 0.8411 | {'precision': 0.7403432951696793, 'recall': 0.7765330668658467, 'f1-score': 0.7565118219369674, 'support': 27619.0} | {'precision': 0.8438003903638766, 'recall': 0.8411238640066621, 'f1-score': 0.8419322378651984, 'support': 27619.0} |
86
+ | No log | 11.0 | 451 | 0.5827 | {'precision': 0.5910652920962199, 'recall': 0.6346863468634686, 'f1-score': 0.6120996441281139, 'support': 271.0} | {'precision': 0.65625, 'recall': 0.7553956834532374, 'f1-score': 0.7023411371237458, 'support': 139.0} | {'precision': 0.7770897832817337, 'recall': 0.7930489731437599, 'f1-score': 0.7849882720875684, 'support': 633.0} | {'precision': 0.6108468125594672, 'recall': 0.6418395401149712, 'f1-score': 0.6259597806215723, 'support': 4001.0} | {'precision': 0.7910066428206438, 'recall': 0.7690014903129657, 'f1-score': 0.7798488664987405, 'support': 2013.0} | {'precision': 0.8831168831168831, 'recall': 0.8817925194071983, 'f1-score': 0.8824542043698962, 'support': 11336.0} | {'precision': 0.9239106392391064, 'recall': 0.905484500325168, 'f1-score': 0.9146047733742062, 'support': 9226.0} | 0.8416 | {'precision': 0.7476122933020077, 'recall': 0.768749864802967, 'f1-score': 0.7574709540291205, 'support': 27619.0} | {'precision': 0.8441508487148985, 'recall': 0.8416307614323473, 'f1-score': 0.8427657535850971, 'support': 27619.0} |
87
+ | No log | 12.0 | 492 | 0.6254 | {'precision': 0.5899280575539568, 'recall': 0.6051660516605166, 'f1-score': 0.5974499089253188, 'support': 271.0} | {'precision': 0.6491228070175439, 'recall': 0.7985611510791367, 'f1-score': 0.7161290322580646, 'support': 139.0} | {'precision': 0.7659574468085106, 'recall': 0.7962085308056872, 'f1-score': 0.7807900852052672, 'support': 633.0} | {'precision': 0.6134020618556701, 'recall': 0.6245938515371158, 'f1-score': 0.6189473684210526, 'support': 4001.0} | {'precision': 0.7651515151515151, 'recall': 0.8027819175360159, 'f1-score': 0.7835151515151515, 'support': 2013.0} | {'precision': 0.8724084312370421, 'recall': 0.890878616796048, 'f1-score': 0.8815467877094972, 'support': 11336.0} | {'precision': 0.9412571428571429, 'recall': 0.8926945588554086, 'f1-score': 0.9163328882955052, 'support': 9226.0} | 0.8411 | {'precision': 0.7424610660687689, 'recall': 0.7729835254671328, 'f1-score': 0.7563873174756939, 'support': 27619.0} | {'precision': 0.8437337218432961, 'recall': 0.8410514500887071, 'f1-score': 0.842051378351113, 'support': 27619.0} |
88
+ | 0.325 | 13.0 | 533 | 0.6340 | {'precision': 0.599290780141844, 'recall': 0.6236162361623616, 'f1-score': 0.6112115732368897, 'support': 271.0} | {'precision': 0.6832298136645962, 'recall': 0.7913669064748201, 'f1-score': 0.7333333333333333, 'support': 139.0} | {'precision': 0.7525622254758418, 'recall': 0.8120063191153238, 'f1-score': 0.7811550151975684, 'support': 633.0} | {'precision': 0.6177615571776156, 'recall': 0.6345913521619595, 'f1-score': 0.626063370731106, 'support': 4001.0} | {'precision': 0.7841762643965949, 'recall': 0.7779433681073026, 'f1-score': 0.7810473815461346, 'support': 2013.0} | {'precision': 0.8783854621701904, 'recall': 0.886908962597036, 'f1-score': 0.8826266350627688, 'support': 11336.0} | {'precision': 0.9332214765100671, 'recall': 0.9042922176457836, 'f1-score': 0.9185291203346911, 'support': 9226.0} | 0.8434 | {'precision': 0.7498039399338216, 'recall': 0.7758179088949412, 'f1-score': 0.7619952042060703, 'support': 27619.0} | {'precision': 0.8454773303227912, 'recall': 0.8434411093812231, 'f1-score': 0.84430920449428, 'support': 27619.0} |
89
+ | 0.325 | 14.0 | 574 | 0.6438 | {'precision': 0.6007462686567164, 'recall': 0.5940959409594095, 'f1-score': 0.5974025974025974, 'support': 271.0} | {'precision': 0.6607142857142857, 'recall': 0.7985611510791367, 'f1-score': 0.7231270358306189, 'support': 139.0} | {'precision': 0.7559523809523809, 'recall': 0.8025276461295419, 'f1-score': 0.778544061302682, 'support': 633.0} | {'precision': 0.6356897008207573, 'recall': 0.6000999750062485, 'f1-score': 0.6173823605039856, 'support': 4001.0} | {'precision': 0.7802303262955854, 'recall': 0.8077496274217586, 'f1-score': 0.7937515255064681, 'support': 2013.0} | {'precision': 0.8701530612244898, 'recall': 0.9026993648553282, 'f1-score': 0.8861274679598199, 'support': 11336.0} | {'precision': 0.9353205849268842, 'recall': 0.901257316280078, 'f1-score': 0.9179730624862, 'support': 9226.0} | 0.8456 | {'precision': 0.7484009440844428, 'recall': 0.772427288818786, 'f1-score': 0.7591868729989103, 'support': 27619.0} | {'precision': 0.8450878141879223, 'recall': 0.845613526919874, 'f1-score': 0.8449820141638845, 'support': 27619.0} |
90
+ | 0.325 | 15.0 | 615 | 0.6473 | {'precision': 0.5985130111524164, 'recall': 0.5940959409594095, 'f1-score': 0.5962962962962963, 'support': 271.0} | {'precision': 0.6707317073170732, 'recall': 0.7913669064748201, 'f1-score': 0.7260726072607261, 'support': 139.0} | {'precision': 0.7566765578635015, 'recall': 0.8056872037914692, 'f1-score': 0.7804131599081868, 'support': 633.0} | {'precision': 0.6212043232115285, 'recall': 0.6033491627093227, 'f1-score': 0.6121465703055661, 'support': 4001.0} | {'precision': 0.7810361681329423, 'recall': 0.793840039741679, 'f1-score': 0.7873860556787385, 'support': 2013.0} | {'precision': 0.8689666893269884, 'recall': 0.9020818630910374, 'f1-score': 0.8852146814404431, 'support': 11336.0} | {'precision': 0.9395142986836132, 'recall': 0.8973553002384566, 'f1-score': 0.9179509923494844, 'support': 9226.0} | 0.8435 | {'precision': 0.748091822241152, 'recall': 0.7696823452865992, 'f1-score': 0.7579257661770632, 'support': 27619.0} | {'precision': 0.8440071909900311, 'recall': 0.8435497302581556, 'f1-score': 0.8434243803550634, 'support': 27619.0} |
91
+
92
+
93
+ ### Framework versions
94
+
95
+ - Transformers 4.37.2
96
+ - Pytorch 2.2.0+cu121
97
+ - Datasets 2.17.0
98
+ - Tokenizers 0.15.2
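To reproduce results against the framework versions listed above, a quick runtime check such as the following can help; it is a hedged sketch that only compares installed versions with the ones recorded in this card.

```python
# Minimal sketch: compare installed library versions with the ones these results
# were produced with (versions taken from the "Framework versions" list above).
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.37.2",
    "torch": "2.2.0+cu121",
    "datasets": "2.17.0",
    "tokenizers": "0.15.2",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]}, card was produced with {want}")
```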
meta_data/README_s42_e4.md CHANGED
@@ -17,12 +17,12 @@ model-index:
17
  name: essays_su_g
18
  type: essays_su_g
19
  config: full_labels
20
- split: test
21
  args: full_labels
22
  metrics:
23
  - name: Accuracy
24
  type: accuracy
25
- value: 0.81812318606901
26
  ---
27
 
28
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,17 +32,17 @@ should probably proofread and complete it, then remove this comment. -->
32
 
33
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
  It achieves the following results on the evaluation set:
35
- - Loss: 0.4814
36
- - B-claim: {'precision': 0.4462809917355372, 'recall': 0.19494584837545126, 'f1-score': 0.271356783919598, 'support': 277.0}
37
- - B-majorclaim: {'precision': 0.5, 'recall': 0.0070921985815602835, 'f1-score': 0.013986013986013986, 'support': 141.0}
38
- - B-premise: {'precision': 0.6366995073891626, 'recall': 0.8065522620904836, 'f1-score': 0.7116311080523056, 'support': 641.0}
39
- - I-claim: {'precision': 0.5918424753867791, 'recall': 0.5158126991909782, 'f1-score': 0.5512182342153522, 'support': 4079.0}
40
- - I-majorclaim: {'precision': 0.6724854530340815, 'recall': 0.792748652621264, 'f1-score': 0.7276815830897234, 'support': 2041.0}
41
- - I-premise: {'precision': 0.8480509807167095, 'recall': 0.8945438673068529, 'f1-score': 0.8706772028209704, 'support': 11455.0}
42
- - O: {'precision': 0.928555431131019, 'recall': 0.8940161725067386, 'f1-score': 0.9109585278769569, 'support': 9275.0}
43
- - Accuracy: 0.8181
44
- - Macro avg: {'precision': 0.6605592627704697, 'recall': 0.5865302429533327, 'f1-score': 0.579644207708703, 'support': 27909.0}
45
- - Weighted avg: {'precision': 0.813919814165414, 'recall': 0.81812318606901, 'f1-score': 0.812987150747172, 'support': 27909.0}
46
 
47
  ## Model description
48
 
@@ -71,12 +71,12 @@ The following hyperparameters were used during training:
71
 
72
  ### Training results
73
 
74
- | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
- |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
- | No log | 1.0 | 41 | 0.7395 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.8888888888888888, 'recall': 0.0499219968798752, 'f1-score': 0.09453471196454949, 'support': 641.0} | {'precision': 0.44852021675698206, 'recall': 0.2637901446432949, 'f1-score': 0.3322012966965112, 'support': 4079.0} | {'precision': 0.6627906976744186, 'recall': 0.05585497305242528, 'f1-score': 0.10302756439222775, 'support': 2041.0} | {'precision': 0.7670441927603819, 'recall': 0.9045831514622436, 'f1-score': 0.8301554238102867, 'support': 11455.0} | {'precision': 0.7261935046213855, 'recall': 0.9233423180592992, 'f1-score': 0.8129865198405164, 'support': 9275.0} | 0.7219 | {'precision': 0.4990625001002938, 'recall': 0.31392751201387686, 'f1-score': 0.3104150738148702, 'support': 27909.0} | {'precision': 0.6906010082524104, 'recall': 0.7219176609695797, 'f1-score': 0.6691678472817553, 'support': 27909.0} |
77
- | No log | 2.0 | 82 | 0.5461 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.6170763260025873, 'recall': 0.7441497659906396, 'f1-score': 0.6746817538896746, 'support': 641.0} | {'precision': 0.5907648924920691, 'recall': 0.4108850208384408, 'f1-score': 0.4846732215153268, 'support': 4079.0} | {'precision': 0.6679045092838196, 'recall': 0.6168544830965214, 'f1-score': 0.6413652572592969, 'support': 2041.0} | {'precision': 0.8288555617932473, 'recall': 0.902226102138804, 'f1-score': 0.8639859555258319, 'support': 11455.0} | {'precision': 0.8542986425339366, 'recall': 0.916010781671159, 'f1-score': 0.8840790842872008, 'support': 9275.0} | 0.7970 | {'precision': 0.5084142760150943, 'recall': 0.5128751648193663, 'f1-score': 0.5069693246396187, 'support': 27909.0} | {'precision': 0.7734648104459132, 'recall': 0.7969830520620589, 'f1-score': 0.7816572500692507, 'support': 27909.0} |
78
- | No log | 3.0 | 123 | 0.4979 | {'precision': 0.4406779661016949, 'recall': 0.09386281588447654, 'f1-score': 0.15476190476190477, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.6197854588796186, 'recall': 0.8112324492979719, 'f1-score': 0.7027027027027026, 'support': 641.0} | {'precision': 0.5727699530516432, 'recall': 0.4486393723951949, 'f1-score': 0.5031619466593347, 'support': 4079.0} | {'precision': 0.7464440321583179, 'recall': 0.5913767760901519, 'f1-score': 0.6599234554401313, 'support': 2041.0} | {'precision': 0.8286169788760462, 'recall': 0.9074639895242252, 'f1-score': 0.86625, 'support': 11455.0} | {'precision': 0.8804516730550088, 'recall': 0.9163342318059299, 'f1-score': 0.8980346576500422, 'support': 9275.0} | 0.8054 | {'precision': 0.58410658030319, 'recall': 0.5384156621425643, 'f1-score': 0.5406906667448738, 'support': 27909.0} | {'precision': 0.7896079381022287, 'recall': 0.8053674441936293, 'f1-score': 0.7934633284149326, 'support': 27909.0} |
79
- | No log | 4.0 | 164 | 0.4814 | {'precision': 0.4462809917355372, 'recall': 0.19494584837545126, 'f1-score': 0.271356783919598, 'support': 277.0} | {'precision': 0.5, 'recall': 0.0070921985815602835, 'f1-score': 0.013986013986013986, 'support': 141.0} | {'precision': 0.6366995073891626, 'recall': 0.8065522620904836, 'f1-score': 0.7116311080523056, 'support': 641.0} | {'precision': 0.5918424753867791, 'recall': 0.5158126991909782, 'f1-score': 0.5512182342153522, 'support': 4079.0} | {'precision': 0.6724854530340815, 'recall': 0.792748652621264, 'f1-score': 0.7276815830897234, 'support': 2041.0} | {'precision': 0.8480509807167095, 'recall': 0.8945438673068529, 'f1-score': 0.8706772028209704, 'support': 11455.0} | {'precision': 0.928555431131019, 'recall': 0.8940161725067386, 'f1-score': 0.9109585278769569, 'support': 9275.0} | 0.8181 | {'precision': 0.6605592627704697, 'recall': 0.5865302429533327, 'f1-score': 0.579644207708703, 'support': 27909.0} | {'precision': 0.813919814165414, 'recall': 0.81812318606901, 'f1-score': 0.812987150747172, 'support': 27909.0} |
80
 
81
 
82
  ### Framework versions
 
17
  name: essays_su_g
18
  type: essays_su_g
19
  config: full_labels
20
+ split: train[80%:100%]
21
  args: full_labels
22
  metrics:
23
  - name: Accuracy
24
  type: accuracy
25
+ value: 0.8239255584923423
26
  ---
27
 
28
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 
32
 
33
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
  It achieves the following results on the evaluation set:
35
+ - Loss: 0.4650
36
+ - B-claim: {'precision': 0.49019607843137253, 'recall': 0.18450184501845018, 'f1-score': 0.2680965147453083, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0}
38
+ - B-premise: {'precision': 0.6389937106918239, 'recall': 0.8025276461295419, 'f1-score': 0.711484593837535, 'support': 633.0}
39
+ - I-claim: {'precision': 0.5901814300960512, 'recall': 0.5528617845538615, 'f1-score': 0.5709123757904246, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.6800699300699301, 'recall': 0.7729756582215599, 'f1-score': 0.723552662171588, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8548009367681498, 'recall': 0.9015525758645024, 'f1-score': 0.8775545251588528, 'support': 11336.0}
42
+ - O: {'precision': 0.9404352806414662, 'recall': 0.889876436158682, 'f1-score': 0.9144575629316106, 'support': 9226.0}
43
+ - Accuracy: 0.8239
44
+ - Macro avg: {'precision': 0.5992396238141133, 'recall': 0.5863279922780854, 'f1-score': 0.5808654620907598, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8195120078775411, 'recall': 0.8239255584923423, 'f1-score': 0.8200332887031329, 'support': 27619.0}
46
 
47
  ## Model description
48
 
 
71
 
72
  ### Training results
73
 
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:---------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.6798 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.8333333333333334, 'recall': 0.023696682464454975, 'f1-score': 0.046082949308755755, 'support': 633.0} | {'precision': 0.478515625, 'recall': 0.12246938265433642, 'f1-score': 0.19502487562189058, 'support': 4001.0} | {'precision': 0.5305555555555556, 'recall': 0.4744162940884252, 'f1-score': 0.5009179124049304, 'support': 2013.0} | {'precision': 0.7172481252037822, 'recall': 0.9702717007762879, 'f1-score': 0.8247909714671365, 'support': 11336.0} | {'precision': 0.8570218174115654, 'recall': 0.8770864946889226, 'f1-score': 0.8669380758517248, 'support': 9226.0} | 0.7441 | {'precision': 0.48809635092917664, 'recall': 0.3525629363817753, 'f1-score': 0.34767925495063395, 'support': 27619.0} | {'precision': 0.7077612289984253, 'recall': 0.7440892139469206, 'f1-score': 0.6939430802095016, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5172 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.6075268817204301, 'recall': 0.7140600315955766, 'f1-score': 0.6564996368917938, 'support': 633.0} | {'precision': 0.5780830098536877, 'recall': 0.4838790302424394, 'f1-score': 0.5268027210884353, 'support': 4001.0} | {'precision': 0.6370745170193193, 'recall': 0.6880278191753602, 'f1-score': 0.6615715309290662, 'support': 2013.0} | {'precision': 0.8294960553856062, 'recall': 0.9089625970359916, 'f1-score': 0.8674130819092518, 'support': 11336.0} | {'precision': 0.9126539753639418, 'recall': 0.8833730760893128, 'f1-score': 0.8977748402731879, 'support': 9226.0} | 0.8048 | {'precision': 0.5092620627632836, 'recall': 0.5254717934483829, 'f1-score': 0.5157231158702479, 'support': 27619.0} | {'precision': 0.7894282378751035, 'recall': 0.8047720771932365, 'f1-score': 0.7954998668261435, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4747 | {'precision': 0.5384615384615384, 'recall': 0.07749077490774908, 'f1-score': 0.13548387096774195, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.6037735849056604, 'recall': 0.8088467614533965, 'f1-score': 0.6914247130317352, 'support': 633.0} | {'precision': 0.6086181277860326, 'recall': 0.511872031992002, 'f1-score': 0.5560684224816727, 'support': 4001.0} | {'precision': 0.6582597730138714, 'recall': 0.7779433681073026, 'f1-score': 0.7131147540983607, 'support': 2013.0} | {'precision': 0.8480489417989417, 'recall': 0.9049047282992237, 'f1-score': 0.8755547968589962, 'support': 11336.0} | {'precision': 0.927013045434098, 'recall': 0.893453284196835, 'f1-score': 0.9099238326526107, 'support': 9226.0} | 0.8200 | {'precision': 0.5977392873428775, 'recall': 0.5677872784223584, 'f1-score': 0.5545100557273025, 'support': 27619.0} | {'precision': 0.8130046334018246, 'recall': 0.8200152069227705, 'f1-score': 0.8130259671956656, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4650 | {'precision': 0.49019607843137253, 'recall': 0.18450184501845018, 'f1-score': 0.2680965147453083, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.6389937106918239, 'recall': 0.8025276461295419, 'f1-score': 0.711484593837535, 'support': 633.0} | {'precision': 0.5901814300960512, 'recall': 0.5528617845538615, 'f1-score': 0.5709123757904246, 'support': 4001.0} | {'precision': 0.6800699300699301, 'recall': 0.7729756582215599, 'f1-score': 0.723552662171588, 'support': 2013.0} | {'precision': 0.8548009367681498, 'recall': 0.9015525758645024, 'f1-score': 0.8775545251588528, 'support': 11336.0} | {'precision': 0.9404352806414662, 'recall': 0.889876436158682, 'f1-score': 0.9144575629316106, 'support': 9226.0} | 0.8239 | {'precision': 0.5992396238141133, 'recall': 0.5863279922780854, 'f1-score': 0.5808654620907598, 'support': 27619.0} | {'precision': 0.8195120078775411, 'recall': 0.8239255584923423, 'f1-score': 0.8200332887031329, 'support': 27619.0} |
80
 
81
 
82
  ### Framework versions
meta_data/README_s42_e5.md CHANGED
@@ -1,5 +1,4 @@
1
  ---
2
- license: apache-2.0
3
  base_model: allenai/longformer-base-4096
4
  tags:
5
  - generated_from_trainer
@@ -17,12 +16,12 @@ model-index:
17
  name: essays_su_g
18
  type: essays_su_g
19
  config: full_labels
20
- split: test
21
  args: full_labels
22
  metrics:
23
  - name: Accuracy
24
  type: accuracy
25
- value: 0.8265434089361855
26
  ---
27
 
28
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,17 +31,17 @@ should probably proofread and complete it, then remove this comment. -->
32
 
33
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
  It achieves the following results on the evaluation set:
35
- - Loss: 0.4535
36
- - B-claim: {'precision': 0.48535564853556484, 'recall': 0.4187725631768953, 'f1-score': 0.44961240310077516, 'support': 277.0}
37
- - B-majorclaim: {'precision': 0.6835443037974683, 'recall': 0.3829787234042553, 'f1-score': 0.4909090909090909, 'support': 141.0}
38
- - B-premise: {'precision': 0.6908850726552179, 'recall': 0.8159126365054602, 'f1-score': 0.748211731044349, 'support': 641.0}
39
- - I-claim: {'precision': 0.607398910238027, 'recall': 0.5192449129688649, 'f1-score': 0.5598731165741475, 'support': 4079.0}
40
- - I-majorclaim: {'precision': 0.7189189189189189, 'recall': 0.781969622733954, 'f1-score': 0.749119924900258, 'support': 2041.0}
41
- - I-premise: {'precision': 0.848167970358172, 'recall': 0.899257965953732, 'f1-score': 0.8729661016949152, 'support': 11455.0}
42
- - O: {'precision': 0.9307503896682253, 'recall': 0.9013477088948787, 'f1-score': 0.9158131127786602, 'support': 9275.0}
43
- - Accuracy: 0.8265
44
- - Macro avg: {'precision': 0.7092887448816564, 'recall': 0.6742120190911487, 'f1-score': 0.683786497286028, 'support': 27909.0}
45
- - Weighted avg: {'precision': 0.822926232614994, 'recall': 0.8265434089361855, 'f1-score': 0.8233915246781047, 'support': 27909.0}
46
 
47
  ## Model description
48
 
@@ -71,13 +70,13 @@ The following hyperparameters were used during training:
71
 
72
  ### Training results
73
 
74
- | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
- |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
- | No log | 1.0 | 41 | 0.7482 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.8505747126436781, 'recall': 0.11544461778471139, 'f1-score': 0.2032967032967033, 'support': 641.0} | {'precision': 0.455, 'recall': 0.26771267467516546, 'f1-score': 0.3370890569532335, 'support': 4079.0} | {'precision': 0.6081504702194357, 'recall': 0.0950514453699167, 'f1-score': 0.16440677966101694, 'support': 2041.0} | {'precision': 0.7819490672652682, 'recall': 0.8818856394587516, 'f1-score': 0.8289160580946912, 'support': 11455.0} | {'precision': 0.7102757715036113, 'recall': 0.9330458221024259, 'f1-score': 0.8065613495503053, 'support': 9275.0} | 0.7208 | {'precision': 0.48656428880457053, 'recall': 0.327591457055853, 'f1-score': 0.3343242782222786, 'support': 27909.0} | {'precision': 0.6874998332703467, 'recall': 0.7207710774302196, 'f1-score': 0.6742249328162002, 'support': 27909.0} |
77
- | No log | 2.0 | 82 | 0.5451 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.5961070559610706, 'recall': 0.7644305772230889, 'f1-score': 0.6698564593301436, 'support': 641.0} | {'precision': 0.5515320334261838, 'recall': 0.43687178229958323, 'f1-score': 0.48755129958960325, 'support': 4079.0} | {'precision': 0.6582010582010582, 'recall': 0.6095051445369917, 'f1-score': 0.6329178326125668, 'support': 2041.0} | {'precision': 0.8171697373578989, 'recall': 0.9099083369707551, 'f1-score': 0.8610491532424618, 'support': 11455.0} | {'precision': 0.8903484963630441, 'recall': 0.8842048517520216, 'f1-score': 0.8872660391647733, 'support': 9275.0} | 0.7933 | {'precision': 0.5019083401870365, 'recall': 0.5149886703974914, 'f1-score': 0.5055201119913642, 'support': 27909.0} | {'precision': 0.7737236659216302, 'recall': 0.7932924862947436, 'f1-score': 0.7812030384988546, 'support': 27909.0} |
78
- | No log | 3.0 | 123 | 0.4863 | {'precision': 0.3375, 'recall': 0.19494584837545126, 'f1-score': 0.2471395881006865, 'support': 277.0} | {'precision': 0.5, 'recall': 0.0070921985815602835, 'f1-score': 0.013986013986013986, 'support': 141.0} | {'precision': 0.6352509179926561, 'recall': 0.8096723868954758, 'f1-score': 0.7119341563786008, 'support': 641.0} | {'precision': 0.5800391389432485, 'recall': 0.3633243442020103, 'f1-score': 0.4467892674103105, 'support': 4079.0} | {'precision': 0.7292197858235594, 'recall': 0.7006369426751592, 'f1-score': 0.7146426786606696, 'support': 2041.0} | {'precision': 0.8073848496383708, 'recall': 0.9257965953731995, 'f1-score': 0.8625457503050019, 'support': 11455.0} | {'precision': 0.9044078025649316, 'recall': 0.9047978436657682, 'f1-score': 0.9046027810714671, 'support': 9275.0} | 0.8056 | {'precision': 0.6419717849946809, 'recall': 0.5580380228240892, 'f1-score': 0.5573771765589643, 'support': 27909.0} | {'precision': 0.7905147583376675, 'recall': 0.8055824286072593, 'f1-score': 0.7910878562904201, 'support': 27909.0} |
79
- | No log | 4.0 | 164 | 0.4628 | {'precision': 0.4874551971326165, 'recall': 0.49097472924187724, 'f1-score': 0.48920863309352514, 'support': 277.0} | {'precision': 0.6333333333333333, 'recall': 0.2695035460992908, 'f1-score': 0.37810945273631846, 'support': 141.0} | {'precision': 0.7059639389736477, 'recall': 0.7940717628705148, 'f1-score': 0.7474302496328927, 'support': 641.0} | {'precision': 0.5995037220843672, 'recall': 0.592302034812454, 'f1-score': 0.5958811197434949, 'support': 4079.0} | {'precision': 0.665748031496063, 'recall': 0.828515433610975, 'f1-score': 0.7382667539838464, 'support': 2041.0} | {'precision': 0.8763167134831461, 'recall': 0.8714971628109995, 'f1-score': 0.8739002932551321, 'support': 11455.0} | {'precision': 0.9352987509845843, 'recall': 0.8961725067385444, 'f1-score': 0.9153176962889549, 'support': 9275.0} | 0.8272 | {'precision': 0.7005170982125369, 'recall': 0.6775767394549508, 'f1-score': 0.6768734569620235, 'support': 27909.0} | {'precision': 0.831062354705826, 'recall': 0.8271525314414705, 'f1-score': 0.8278844831004247, 'support': 27909.0} |
80
- | No log | 5.0 | 205 | 0.4535 | {'precision': 0.48535564853556484, 'recall': 0.4187725631768953, 'f1-score': 0.44961240310077516, 'support': 277.0} | {'precision': 0.6835443037974683, 'recall': 0.3829787234042553, 'f1-score': 0.4909090909090909, 'support': 141.0} | {'precision': 0.6908850726552179, 'recall': 0.8159126365054602, 'f1-score': 0.748211731044349, 'support': 641.0} | {'precision': 0.607398910238027, 'recall': 0.5192449129688649, 'f1-score': 0.5598731165741475, 'support': 4079.0} | {'precision': 0.7189189189189189, 'recall': 0.781969622733954, 'f1-score': 0.749119924900258, 'support': 2041.0} | {'precision': 0.848167970358172, 'recall': 0.899257965953732, 'f1-score': 0.8729661016949152, 'support': 11455.0} | {'precision': 0.9307503896682253, 'recall': 0.9013477088948787, 'f1-score': 0.9158131127786602, 'support': 9275.0} | 0.8265 | {'precision': 0.7092887448816564, 'recall': 0.6742120190911487, 'f1-score': 0.683786497286028, 'support': 27909.0} | {'precision': 0.822926232614994, 'recall': 0.8265434089361855, 'f1-score': 0.8233915246781047, 'support': 27909.0} |
81
 
82
 
83
  ### Framework versions
 
1
  ---
 
2
  base_model: allenai/longformer-base-4096
3
  tags:
4
  - generated_from_trainer
 
16
  name: essays_su_g
17
  type: essays_su_g
18
  config: full_labels
19
+ split: train[80%:100%]
20
  args: full_labels
21
  metrics:
22
  - name: Accuracy
23
  type: accuracy
24
+ value: 0.8354393714471922
25
  ---
26
 
27
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 
31
 
32
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
33
  It achieves the following results on the evaluation set:
34
+ - Loss: 0.4449
35
+ - B-claim: {'precision': 0.5258620689655172, 'recall': 0.45018450184501846, 'f1-score': 0.485089463220676, 'support': 271.0}
36
+ - B-majorclaim: {'precision': 0.7142857142857143, 'recall': 0.07194244604316546, 'f1-score': 0.13071895424836602, 'support': 139.0}
37
+ - B-premise: {'precision': 0.7081604426002767, 'recall': 0.8088467614533965, 'f1-score': 0.7551622418879057, 'support': 633.0}
38
+ - I-claim: {'precision': 0.622454448017149, 'recall': 0.580604848787803, 'f1-score': 0.6008017586964955, 'support': 4001.0}
39
+ - I-majorclaim: {'precision': 0.6968287526427062, 'recall': 0.8186785891703925, 'f1-score': 0.7528551850159891, 'support': 2013.0}
40
+ - I-premise: {'precision': 0.8654449817595656, 'recall': 0.8998764996471419, 'f1-score': 0.8823249578341911, 'support': 11336.0}
41
+ - O: {'precision': 0.9420488250057039, 'recall': 0.8950791242141773, 'f1-score': 0.9179635393508226, 'support': 9226.0}
42
+ - Accuracy: 0.8354
43
+ - Macro avg: {'precision': 0.725012176182376, 'recall': 0.6464589673087279, 'f1-score': 0.6464165857506352, 'support': 27619.0}
44
+ - Weighted avg: {'precision': 0.8358464532914583, 'recall': 0.8354393714471922, 'f1-score': 0.8334161098638371, 'support': 27619.0}
45
 
46
  ## Model description
47
 
 
70
 
71
  ### Training results
72
 
73
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
74
+ |:-------------:|:-----:|:----:|:---------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:----------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
75
+ | No log | 1.0 | 41 | 0.6799 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.875, 'recall': 0.044233807266982623, 'f1-score': 0.08421052631578947, 'support': 633.0} | {'precision': 0.44683080146673654, 'recall': 0.2131967008247938, 'f1-score': 0.28866328257191204, 'support': 4001.0} | {'precision': 0.592, 'recall': 0.36761053154495776, 'f1-score': 0.4535703340484217, 'support': 2013.0} | {'precision': 0.7292961700421094, 'recall': 0.9625088214537756, 'f1-score': 0.8298284975472487, 'support': 11336.0} | {'precision': 0.8543361149255307, 'recall': 0.8766529373509646, 'f1-score': 0.8653506660247152, 'support': 9226.0} | 0.7466 | {'precision': 0.49963758377633954, 'recall': 0.35202897120592486, 'f1-score': 0.36023190092972673, 'support': 27619.0} | {'precision': 0.7126524282765021, 'recall': 0.7465874941163692, 'f1-score': 0.706468200590435, 'support': 27619.0} |
76
+ | No log | 2.0 | 82 | 0.5045 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.5941676792223572, 'recall': 0.7725118483412322, 'f1-score': 0.6717032967032966, 'support': 633.0} | {'precision': 0.5916276346604216, 'recall': 0.5051237190702325, 'f1-score': 0.5449642712687071, 'support': 4001.0} | {'precision': 0.65738555922605, 'recall': 0.6920019870839543, 'f1-score': 0.6742497579864472, 'support': 2013.0} | {'precision': 0.8346545866364666, 'recall': 0.910197600564573, 'f1-score': 0.8707907840324077, 'support': 11336.0} | {'precision': 0.9139132389300967, 'recall': 0.8814220680685021, 'f1-score': 0.8973736482012802, 'support': 9226.0} | 0.8093 | {'precision': 0.5131069569536274, 'recall': 0.5373224604469277, 'f1-score': 0.5227259654560198, 'support': 27619.0} | {'precision': 0.7951024792507403, 'recall': 0.8093341540244035, 'f1-score': 0.800655657521358, 'support': 27619.0} |
77
+ | No log | 3.0 | 123 | 0.4710 | {'precision': 0.5217391304347826, 'recall': 0.17712177121771217, 'f1-score': 0.2644628099173554, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.6571798188874515, 'recall': 0.8025276461295419, 'f1-score': 0.7226173541963017, 'support': 633.0} | {'precision': 0.6227746053073564, 'recall': 0.4633841539615096, 'f1-score': 0.531384350816853, 'support': 4001.0} | {'precision': 0.6513105639396346, 'recall': 0.8147044212617983, 'f1-score': 0.7239020083866696, 'support': 2013.0} | {'precision': 0.8291489025738197, 'recall': 0.9264290755116443, 'f1-score': 0.8750937421881511, 'support': 11336.0} | {'precision': 0.9421622250669149, 'recall': 0.8775200520268805, 'f1-score': 0.9086929681800325, 'support': 9226.0} | 0.8200 | {'precision': 0.6034736066014228, 'recall': 0.5802410171584409, 'f1-score': 0.5751647476693377, 'support': 27619.0} | {'precision': 0.8129119859079974, 'recall': 0.8200152069227705, 'f1-score': 0.8116164134497381, 'support': 27619.0} |
78
+ | No log | 4.0 | 164 | 0.4437 | {'precision': 0.4723618090452261, 'recall': 0.34686346863468637, 'f1-score': 0.4, 'support': 271.0} | {'precision': 0.8571428571428571, 'recall': 0.04316546762589928, 'f1-score': 0.08219178082191782, 'support': 139.0} | {'precision': 0.6771653543307087, 'recall': 0.8151658767772512, 'f1-score': 0.739784946236559, 'support': 633.0} | {'precision': 0.6176223776223776, 'recall': 0.5518620344913772, 'f1-score': 0.5828933474128827, 'support': 4001.0} | {'precision': 0.7292452830188679, 'recall': 0.7680079483358172, 'f1-score': 0.7481248487781272, 'support': 2013.0} | {'precision': 0.8598264678628591, 'recall': 0.9004057868736768, 'f1-score': 0.879648381953721, 'support': 11336.0} | {'precision': 0.9251513483764446, 'recall': 0.9110123563841318, 'f1-score': 0.9180274152148981, 'support': 9226.0} | 0.8321 | {'precision': 0.7340736424856201, 'recall': 0.6194975627318342, 'f1-score': 0.621524388631158, 'support': 27619.0} | {'precision': 0.8290421682205733, 'recall': 0.8321083312212607, 'f1-score': 0.8279682509392567, 'support': 27619.0} |
79
+ | No log | 5.0 | 205 | 0.4449 | {'precision': 0.5258620689655172, 'recall': 0.45018450184501846, 'f1-score': 0.485089463220676, 'support': 271.0} | {'precision': 0.7142857142857143, 'recall': 0.07194244604316546, 'f1-score': 0.13071895424836602, 'support': 139.0} | {'precision': 0.7081604426002767, 'recall': 0.8088467614533965, 'f1-score': 0.7551622418879057, 'support': 633.0} | {'precision': 0.622454448017149, 'recall': 0.580604848787803, 'f1-score': 0.6008017586964955, 'support': 4001.0} | {'precision': 0.6968287526427062, 'recall': 0.8186785891703925, 'f1-score': 0.7528551850159891, 'support': 2013.0} | {'precision': 0.8654449817595656, 'recall': 0.8998764996471419, 'f1-score': 0.8823249578341911, 'support': 11336.0} | {'precision': 0.9420488250057039, 'recall': 0.8950791242141773, 'f1-score': 0.9179635393508226, 'support': 9226.0} | 0.8354 | {'precision': 0.725012176182376, 'recall': 0.6464589673087279, 'f1-score': 0.6464165857506352, 'support': 27619.0} | {'precision': 0.8358464532914583, 'recall': 0.8354393714471922, 'f1-score': 0.8334161098638371, 'support': 27619.0} |
80
 
81
 
82
  ### Framework versions
meta_data/README_s42_e6.md CHANGED
@@ -1,5 +1,4 @@
1
  ---
2
- license: apache-2.0
3
  base_model: allenai/longformer-base-4096
4
  tags:
5
  - generated_from_trainer
@@ -17,12 +16,12 @@ model-index:
17
  name: essays_su_g
18
  type: essays_su_g
19
  config: full_labels
20
- split: test
21
  args: full_labels
22
  metrics:
23
  - name: Accuracy
24
  type: accuracy
25
- value: 0.837507614031316
26
  ---
27
 
28
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,17 +31,17 @@ should probably proofread and complete it, then remove this comment. -->
32
 
33
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
  It achieves the following results on the evaluation set:
35
- - Loss: 0.4465
36
- - B-claim: {'precision': 0.5551020408163265, 'recall': 0.49097472924187724, 'f1-score': 0.5210727969348659, 'support': 277.0}
37
- - B-majorclaim: {'precision': 0.696969696969697, 'recall': 0.6524822695035462, 'f1-score': 0.673992673992674, 'support': 141.0}
38
- - B-premise: {'precision': 0.7235213204951857, 'recall': 0.8205928237129485, 'f1-score': 0.7690058479532164, 'support': 641.0}
39
- - I-claim: {'precision': 0.6258634982039237, 'recall': 0.5552831576366757, 'f1-score': 0.5884645362431801, 'support': 4079.0}
40
- - I-majorclaim: {'precision': 0.7485029940119761, 'recall': 0.7961783439490446, 'f1-score': 0.7716049382716049, 'support': 2041.0}
41
- - I-premise: {'precision': 0.8608122758735719, 'recall': 0.9010912265386294, 'f1-score': 0.8804913418067046, 'support': 11455.0}
42
- - O: {'precision': 0.9317375886524822, 'recall': 0.906522911051213, 'f1-score': 0.918957320072135, 'support': 9275.0}
43
- - Accuracy: 0.8375
44
- - Macro avg: {'precision': 0.7346442021461661, 'recall': 0.7318750659477049, 'f1-score': 0.731941350753483, 'support': 27909.0}
45
- - Weighted avg: {'precision': 0.8348158563134491, 'recall': 0.837507614031316, 'f1-score': 0.8354599902087165, 'support': 27909.0}
46
 
47
  ## Model description
48
 
@@ -71,14 +70,14 @@ The following hyperparameters were used during training:
71
 
72
  ### Training results
73
 
74
- | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
- |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:----------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
- | No log | 1.0 | 41 | 0.7283 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.8, 'recall': 0.1747269890795632, 'f1-score': 0.28681177976952626, 'support': 641.0} | {'precision': 0.43048740595354923, 'recall': 0.32262809512135326, 'f1-score': 0.3688340807174888, 'support': 4079.0} | {'precision': 0.612691466083151, 'recall': 0.13718765311122, 'f1-score': 0.2241793434747798, 'support': 2041.0} | {'precision': 0.7836619287788477, 'recall': 0.8952422522915757, 'f1-score': 0.83574426469989, 'support': 11455.0} | {'precision': 0.7604082728982003, 'recall': 0.9156873315363881, 'f1-score': 0.8308550185873605, 'support': 9275.0} | 0.7330 | {'precision': 0.48389272481624973, 'recall': 0.34935318873430005, 'f1-score': 0.36377492674986367, 'support': 27909.0} | {'precision': 0.7004513073364416, 'recall': 0.7329535275359204, 'f1-score': 0.6960300066518306, 'support': 27909.0} |
77
- | No log | 2.0 | 82 | 0.5262 | {'precision': 0.25, 'recall': 0.0036101083032490976, 'f1-score': 0.007117437722419929, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.5701754385964912, 'recall': 0.8112324492979719, 'f1-score': 0.6696716033483581, 'support': 641.0} | {'precision': 0.5822991508817766, 'recall': 0.4371169404265751, 'f1-score': 0.499369836157401, 'support': 4079.0} | {'precision': 0.6713729308666018, 'recall': 0.6756491915727585, 'f1-score': 0.6735042735042734, 'support': 2041.0} | {'precision': 0.8282341604432667, 'recall': 0.9003928415539066, 'f1-score': 0.862807428475824, 'support': 11455.0} | {'precision': 0.884974533106961, 'recall': 0.8991913746630728, 'f1-score': 0.8920263115674635, 'support': 9275.0} | 0.8004 | {'precision': 0.5410080305564425, 'recall': 0.5324561294025049, 'f1-score': 0.5149281272536771, 'support': 27909.0} | {'precision': 0.7838247141399024, 'recall': 0.800351141208929, 'f1-score': 0.7882685135577218, 'support': 27909.0} |
78
- | No log | 3.0 | 123 | 0.4882 | {'precision': 0.3162393162393162, 'recall': 0.26714801444043323, 'f1-score': 0.2896281800391389, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.6475609756097561, 'recall': 0.828393135725429, 'f1-score': 0.7268993839835729, 'support': 641.0} | {'precision': 0.5622895622895623, 'recall': 0.3684726648688404, 'f1-score': 0.4452014218009479, 'support': 4079.0} | {'precision': 0.7279187817258883, 'recall': 0.7025967662910338, 'f1-score': 0.7150336574420344, 'support': 2041.0} | {'precision': 0.8009592326139089, 'recall': 0.9330423395896988, 'f1-score': 0.8619702407355135, 'support': 11455.0} | {'precision': 0.9306417051990526, 'recall': 0.8897035040431267, 'f1-score': 0.9097122698710176, 'support': 9275.0} | 0.8055 | {'precision': 0.5693727962396407, 'recall': 0.569908060708366, 'f1-score': 0.564063593410318, 'support': 27909.0} | {'precision': 0.7914520785180175, 'recall': 0.8055465978716543, 'f1-score': 0.7930409622719756, 'support': 27909.0} |
79
- | No log | 4.0 | 164 | 0.4589 | {'precision': 0.4944649446494465, 'recall': 0.48375451263537905, 'f1-score': 0.48905109489051096, 'support': 277.0} | {'precision': 0.7263157894736842, 'recall': 0.48936170212765956, 'f1-score': 0.5847457627118644, 'support': 141.0} | {'precision': 0.733044733044733, 'recall': 0.7925117004680188, 'f1-score': 0.7616191904047976, 'support': 641.0} | {'precision': 0.6057815298030187, 'recall': 0.5805344447168423, 'f1-score': 0.5928893340010015, 'support': 4079.0} | {'precision': 0.6938000843525939, 'recall': 0.8059774620284175, 'f1-score': 0.7456935630099728, 'support': 2041.0} | {'precision': 0.8850901340911109, 'recall': 0.8701003928415539, 'f1-score': 0.8775312555027294, 'support': 11455.0} | {'precision': 0.9134171232140939, 'recall': 0.9167654986522911, 'f1-score': 0.9150882479552304, 'support': 9275.0} | 0.8311 | {'precision': 0.7217020483755258, 'recall': 0.7055722447814518, 'f1-score': 0.7095169212108725, 'support': 27909.0} | {'precision': 0.831521700022212, 'recall': 0.8310580816224157, 'f1-score': 0.8307726680976919, 'support': 27909.0} |
80
- | No log | 5.0 | 205 | 0.4503 | {'precision': 0.5175097276264592, 'recall': 0.48014440433212996, 'f1-score': 0.49812734082397003, 'support': 277.0} | {'precision': 0.6929824561403509, 'recall': 0.5602836879432624, 'f1-score': 0.6196078431372548, 'support': 141.0} | {'precision': 0.7054886211512718, 'recall': 0.8221528861154446, 'f1-score': 0.7593659942363112, 'support': 641.0} | {'precision': 0.606861499364676, 'recall': 0.5854376072566806, 'f1-score': 0.5959570751185427, 'support': 4079.0} | {'precision': 0.7482582443102648, 'recall': 0.7893189612934836, 'f1-score': 0.7682403433476394, 'support': 2041.0} | {'precision': 0.8633981403212172, 'recall': 0.8916630292448713, 'f1-score': 0.8773029847541336, 'support': 11455.0} | {'precision': 0.938915812013975, 'recall': 0.8982210242587602, 'f1-score': 0.9181176989199912, 'support': 9275.0} | 0.8342 | {'precision': 0.7247735001326021, 'recall': 0.7181745143492331, 'f1-score': 0.719531325762549, 'support': 27909.0} | {'precision': 0.8346602140306427, 'recall': 0.8342470170912609, 'f1-score': 0.833997433789052, 'support': 27909.0} |
81
- | No log | 6.0 | 246 | 0.4465 | {'precision': 0.5551020408163265, 'recall': 0.49097472924187724, 'f1-score': 0.5210727969348659, 'support': 277.0} | {'precision': 0.696969696969697, 'recall': 0.6524822695035462, 'f1-score': 0.673992673992674, 'support': 141.0} | {'precision': 0.7235213204951857, 'recall': 0.8205928237129485, 'f1-score': 0.7690058479532164, 'support': 641.0} | {'precision': 0.6258634982039237, 'recall': 0.5552831576366757, 'f1-score': 0.5884645362431801, 'support': 4079.0} | {'precision': 0.7485029940119761, 'recall': 0.7961783439490446, 'f1-score': 0.7716049382716049, 'support': 2041.0} | {'precision': 0.8608122758735719, 'recall': 0.9010912265386294, 'f1-score': 0.8804913418067046, 'support': 11455.0} | {'precision': 0.9317375886524822, 'recall': 0.906522911051213, 'f1-score': 0.918957320072135, 'support': 9275.0} | 0.8375 | {'precision': 0.7346442021461661, 'recall': 0.7318750659477049, 'f1-score': 0.731941350753483, 'support': 27909.0} | {'precision': 0.8348158563134491, 'recall': 0.837507614031316, 'f1-score': 0.8354599902087165, 'support': 27909.0} |
82
 
83
 
84
  ### Framework versions
 
1
  ---
 
2
  base_model: allenai/longformer-base-4096
3
  tags:
4
  - generated_from_trainer
 
16
  name: essays_su_g
17
  type: essays_su_g
18
  config: full_labels
19
+ split: train[80%:100%]
20
  args: full_labels
21
  metrics:
22
  - name: Accuracy
23
  type: accuracy
24
+ value: 0.8425721423657627
25
  ---
26
 
27
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 
31
 
32
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
33
  It achieves the following results on the evaluation set:
34
+ - Loss: 0.4437
35
+ - B-claim: {'precision': 0.5596707818930041, 'recall': 0.5018450184501845, 'f1-score': 0.5291828793774319, 'support': 271.0}
36
+ - B-majorclaim: {'precision': 0.6754385964912281, 'recall': 0.5539568345323741, 'f1-score': 0.608695652173913, 'support': 139.0}
37
+ - B-premise: {'precision': 0.7251381215469613, 'recall': 0.8293838862559242, 'f1-score': 0.7737656595431099, 'support': 633.0}
38
+ - I-claim: {'precision': 0.6368131868131868, 'recall': 0.5793551612096975, 'f1-score': 0.606726868210967, 'support': 4001.0}
39
+ - I-majorclaim: {'precision': 0.7290465631929046, 'recall': 0.8166915052160953, 'f1-score': 0.7703842549203374, 'support': 2013.0}
40
+ - I-premise: {'precision': 0.868999323867478, 'recall': 0.9070218772053634, 'f1-score': 0.887603591160221, 'support': 11336.0}
41
+ - O: {'precision': 0.940755873340143, 'recall': 0.8984391935833514, 'f1-score': 0.9191107168597882, 'support': 9226.0}
42
+ - Accuracy: 0.8426
43
+ - Macro avg: {'precision': 0.7336946353064151, 'recall': 0.7266704966361416, 'f1-score': 0.7279242317493955, 'support': 27619.0}
44
+ - Weighted avg: {'precision': 0.841826984781827, 'recall': 0.8425721423657627, 'f1-score': 0.8413663929346332, 'support': 27619.0}
45
 
46
  ## Model description
47
 
 
70
 
71
  ### Training results
72
 
73
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
74
+ |:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
75
+ | No log | 1.0 | 41 | 0.6604 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.8192771084337349, 'recall': 0.10742496050552923, 'f1-score': 0.1899441340782123, 'support': 633.0} | {'precision': 0.4472741742936729, 'recall': 0.2809297675581105, 'f1-score': 0.34510285538839425, 'support': 4001.0} | {'precision': 0.5929878048780488, 'recall': 0.38648782911077995, 'f1-score': 0.4679699248120301, 'support': 2013.0} | {'precision': 0.7589671029359746, 'recall': 0.9463655610444601, 'f1-score': 0.8423697538376977, 'support': 11336.0} | {'precision': 0.8500417710944027, 'recall': 0.8822891827444179, 'f1-score': 0.8658653334751623, 'support': 9226.0} | 0.7545 | {'precision': 0.495506851662262, 'recall': 0.3719281858518997, 'f1-score': 0.38732171451307096, 'support': 27619.0} | {'precision': 0.7222552333975241, 'recall': 0.7544806111734675, 'f1-score': 0.7234364646103437, 'support': 27619.0} |
76
+ | No log | 2.0 | 82 | 0.5105 | {'precision': 0.3333333333333333, 'recall': 0.007380073800738007, 'f1-score': 0.014440433212996389, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.5504201680672269, 'recall': 0.8278041074249605, 'f1-score': 0.661198738170347, 'support': 633.0} | {'precision': 0.5798838053740014, 'recall': 0.39915021244688825, 'f1-score': 0.47283493708364177, 'support': 4001.0} | {'precision': 0.651589789520824, 'recall': 0.7228017883755589, 'f1-score': 0.6853509185115403, 'support': 2013.0} | {'precision': 0.8067625458996328, 'recall': 0.9303105151729005, 'f1-score': 0.8641429039659129, 'support': 11336.0} | {'precision': 0.93048128342246, 'recall': 0.8675482332538478, 'f1-score': 0.8979133946600854, 'support': 9226.0} | 0.8012 | {'precision': 0.5503529893739255, 'recall': 0.536427847210699, 'f1-score': 0.5136973322292177, 'support': 27619.0} | {'precision': 0.7893332558202881, 'recall': 0.8011875882544625, 'f1-score': 0.7883684810959655, 'support': 27619.0} |
77
+ | No log | 3.0 | 123 | 0.4599 | {'precision': 0.4838709677419355, 'recall': 0.2767527675276753, 'f1-score': 0.352112676056338, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.6763540290620872, 'recall': 0.8088467614533965, 'f1-score': 0.7366906474820145, 'support': 633.0} | {'precision': 0.6095063025210085, 'recall': 0.5801049737565609, 'f1-score': 0.5944423101549494, 'support': 4001.0} | {'precision': 0.6323583857830433, 'recall': 0.8484848484848485, 'f1-score': 0.724649978786593, 'support': 2013.0} | {'precision': 0.8667463910480909, 'recall': 0.8951129146083274, 'f1-score': 0.8807012975741006, 'support': 11336.0} | {'precision': 0.9468786808009423, 'recall': 0.8713418599609798, 'f1-score': 0.9075412056897719, 'support': 9226.0} | 0.8256 | {'precision': 0.6022449652795868, 'recall': 0.6115205893988269, 'f1-score': 0.5994483022491097, 'support': 27619.0} | {'precision': 0.8266835539886613, 'recall': 0.8255910786053079, 'f1-score': 0.8239051695676377, 'support': 27619.0} |
78
+ | No log | 4.0 | 164 | 0.4403 | {'precision': 0.5138888888888888, 'recall': 0.4095940959409594, 'f1-score': 0.45585215605749485, 'support': 271.0} | {'precision': 0.7857142857142857, 'recall': 0.23741007194244604, 'f1-score': 0.36464088397790057, 'support': 139.0} | {'precision': 0.6914893617021277, 'recall': 0.8214849921011058, 'f1-score': 0.7509025270758122, 'support': 633.0} | {'precision': 0.6289361702127659, 'recall': 0.554111472131967, 'f1-score': 0.5891575870316237, 'support': 4001.0} | {'precision': 0.759545923632611, 'recall': 0.7312468951813215, 'f1-score': 0.7451278157428499, 'support': 2013.0} | {'precision': 0.8634677011300388, 'recall': 0.9032286520818631, 'f1-score': 0.8829007501940157, 'support': 11336.0} | {'precision': 0.9131136950904393, 'recall': 0.9192499458053327, 'f1-score': 0.9161715458571891, 'support': 9226.0} | 0.8354 | {'precision': 0.7365937180530224, 'recall': 0.653760875026428, 'f1-score': 0.672107609419555, 'support': 27619.0} | {'precision': 0.830739248805853, 'recall': 0.8354031644882146, 'f1-score': 0.831596571269241, 'support': 27619.0} |
79
+ | No log | 5.0 | 205 | 0.4412 | {'precision': 0.549800796812749, 'recall': 0.5092250922509225, 'f1-score': 0.528735632183908, 'support': 271.0} | {'precision': 0.6701030927835051, 'recall': 0.4676258992805755, 'f1-score': 0.5508474576271185, 'support': 139.0} | {'precision': 0.7202797202797203, 'recall': 0.8135860979462876, 'f1-score': 0.7640949554896144, 'support': 633.0} | {'precision': 0.6236191478169385, 'recall': 0.5926018495376156, 'f1-score': 0.6077149814174035, 'support': 4001.0} | {'precision': 0.7192829033668562, 'recall': 0.8171882762046696, 'f1-score': 0.7651162790697674, 'support': 2013.0} | {'precision': 0.8738498581133374, 'recall': 0.8964361326746648, 'f1-score': 0.8849989113868931, 'support': 11336.0} | {'precision': 0.9396922380629101, 'recall': 0.9001734229351832, 'f1-score': 0.9195084145261293, 'support': 9226.0} | 0.8400 | {'precision': 0.7280896796051451, 'recall': 0.7138338244042741, 'f1-score': 0.7172880902429765, 'support': 27619.0} | {'precision': 0.840604536138328, 'recall': 0.8400376552373366, 'f1-score': 0.8396721916823671, 'support': 27619.0} |
80
+ | No log | 6.0 | 246 | 0.4437 | {'precision': 0.5596707818930041, 'recall': 0.5018450184501845, 'f1-score': 0.5291828793774319, 'support': 271.0} | {'precision': 0.6754385964912281, 'recall': 0.5539568345323741, 'f1-score': 0.608695652173913, 'support': 139.0} | {'precision': 0.7251381215469613, 'recall': 0.8293838862559242, 'f1-score': 0.7737656595431099, 'support': 633.0} | {'precision': 0.6368131868131868, 'recall': 0.5793551612096975, 'f1-score': 0.606726868210967, 'support': 4001.0} | {'precision': 0.7290465631929046, 'recall': 0.8166915052160953, 'f1-score': 0.7703842549203374, 'support': 2013.0} | {'precision': 0.868999323867478, 'recall': 0.9070218772053634, 'f1-score': 0.887603591160221, 'support': 11336.0} | {'precision': 0.940755873340143, 'recall': 0.8984391935833514, 'f1-score': 0.9191107168597882, 'support': 9226.0} | 0.8426 | {'precision': 0.7336946353064151, 'recall': 0.7266704966361416, 'f1-score': 0.7279242317493955, 'support': 27619.0} | {'precision': 0.841826984781827, 'recall': 0.8425721423657627, 'f1-score': 0.8413663929346332, 'support': 27619.0} |
81
 
82
 
83
  ### Framework versions
meta_data/README_s42_e7.md CHANGED
@@ -1,5 +1,4 @@
1
  ---
2
- license: apache-2.0
3
  base_model: allenai/longformer-base-4096
4
  tags:
5
  - generated_from_trainer
@@ -17,12 +16,12 @@ model-index:
17
  name: essays_su_g
18
  type: essays_su_g
19
  config: full_labels
20
- split: test
21
  args: full_labels
22
  metrics:
23
  - name: Accuracy
24
  type: accuracy
25
- value: 0.8431688702569063
26
  ---
27
 
28
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,17 +31,17 @@ should probably proofread and complete it, then remove this comment. -->
32
 
33
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
  It achieves the following results on the evaluation set:
35
- - Loss: 0.4438
36
- - B-claim: {'precision': 0.5970149253731343, 'recall': 0.5776173285198556, 'f1-score': 0.5871559633027523, 'support': 277.0}
37
- - B-majorclaim: {'precision': 0.6540880503144654, 'recall': 0.7375886524822695, 'f1-score': 0.6933333333333332, 'support': 141.0}
38
- - B-premise: {'precision': 0.7533039647577092, 'recall': 0.8003120124804992, 'f1-score': 0.7760968229954615, 'support': 641.0}
39
- - I-claim: {'precision': 0.6253114100647733, 'recall': 0.6153468987496935, 'f1-score': 0.6202891387618931, 'support': 4079.0}
40
- - I-majorclaim: {'precision': 0.7570308898109728, 'recall': 0.8045075943165115, 'f1-score': 0.7800475059382421, 'support': 2041.0}
41
- - I-premise: {'precision': 0.8764452113891286, 'recall': 0.8867743343518114, 'f1-score': 0.881579518333695, 'support': 11455.0}
42
- - O: {'precision': 0.9354231280460789, 'recall': 0.9105121293800539, 'f1-score': 0.9227995410588428, 'support': 9275.0}
43
- - Accuracy: 0.8432
44
- - Macro avg: {'precision': 0.7426596542508946, 'recall': 0.7618084214686707, 'f1-score': 0.7516145462463172, 'support': 27909.0}
45
- - Weighted avg: {'precision': 0.8438834099280034, 'recall': 0.8431688702569063, 'f1-score': 0.8433686534034867, 'support': 27909.0}
46
 
47
  ## Model description
48
 
@@ -71,15 +70,15 @@ The following hyperparameters were used during training:
71
 
72
  ### Training results
73
 
74
- | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
- |:-------------:|:-----:|:----:|:---------------:|:-----------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
- | No log | 1.0 | 41 | 0.7886 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.8888888888888888, 'recall': 0.0748829953198128, 'f1-score': 0.1381294964028777, 'support': 641.0} | {'precision': 0.47000821692686934, 'recall': 0.1402304486393724, 'f1-score': 0.21601208459214502, 'support': 4079.0} | {'precision': 0.5424354243542435, 'recall': 0.0720235178833905, 'f1-score': 0.12716262975778547, 'support': 2041.0} | {'precision': 0.7775630122158652, 'recall': 0.8779572239196858, 'f1-score': 0.8247160605190865, 'support': 11455.0} | {'precision': 0.6536142336038115, 'recall': 0.9466307277628032, 'f1-score': 0.7732957548000705, 'support': 9275.0} | 0.7024 | {'precision': 0.4760728251413826, 'recall': 0.3016749876464378, 'f1-score': 0.29704514658170933, 'support': 27909.0} | {'precision': 0.6651369922726568, 'recall': 0.7024257408004586, 'f1-score': 0.6395296795513288, 'support': 27909.0} |
77
- | No log | 2.0 | 82 | 0.5373 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 277.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 141.0} | {'precision': 0.5765472312703583, 'recall': 0.828393135725429, 'f1-score': 0.679897567221511, 'support': 641.0} | {'precision': 0.5732105732105732, 'recall': 0.47315518509438587, 'f1-score': 0.5183991404781091, 'support': 4079.0} | {'precision': 0.5972927241962775, 'recall': 0.6918177364037237, 'f1-score': 0.6410896708286039, 'support': 2041.0} | {'precision': 0.858668504004823, 'recall': 0.870362287210825, 'f1-score': 0.8644758519032342, 'support': 11455.0} | {'precision': 0.8686365992742353, 'recall': 0.903288409703504, 'f1-score': 0.8856236786469344, 'support': 9275.0} | 0.7962 | {'precision': 0.4963365188508953, 'recall': 0.538145250591124, 'f1-score': 0.5127837012969133, 'support': 27909.0} | {'precision': 0.7818058448922789, 'recall': 0.7961947758787488, 'f1-score': 0.7874004427160499, 'support': 27909.0} |
78
- | No log | 3.0 | 123 | 0.4911 | {'precision': 0.34893617021276596, 'recall': 0.296028880866426, 'f1-score': 0.3203125, 'support': 277.0} | {'precision': 0.8333333333333334, 'recall': 0.03546099290780142, 'f1-score': 0.06802721088435375, 'support': 141.0} | {'precision': 0.6662371134020618, 'recall': 0.8065522620904836, 'f1-score': 0.7297106563161608, 'support': 641.0} | {'precision': 0.6018223234624146, 'recall': 0.3238538857563128, 'f1-score': 0.421102964615875, 'support': 4079.0} | {'precision': 0.7116279069767442, 'recall': 0.7496325330720235, 'f1-score': 0.7301360057265569, 'support': 2041.0} | {'precision': 0.7889374090247453, 'recall': 0.9463116542994325, 'f1-score': 0.8604881921016073, 'support': 11455.0} | {'precision': 0.9330078346769615, 'recall': 0.8859299191374663, 'f1-score': 0.908859639420418, 'support': 9275.0} | 0.8066 | {'precision': 0.6977002987270037, 'recall': 0.5776814468757066, 'f1-score': 0.5769481670092816, 'support': 27909.0} | {'precision': 0.7968542338095115, 'recall': 0.8066215199398044, 'f1-score': 0.7904444769227739, 'support': 27909.0} |
79
- | No log | 4.0 | 164 | 0.4471 | {'precision': 0.5464285714285714, 'recall': 0.5523465703971119, 'f1-score': 0.5493716337522441, 'support': 277.0} | {'precision': 0.6544117647058824, 'recall': 0.6312056737588653, 'f1-score': 0.6425992779783394, 'support': 141.0} | {'precision': 0.7510917030567685, 'recall': 0.8049921996879875, 'f1-score': 0.7771084337349398, 'support': 641.0} | {'precision': 0.6000949893137022, 'recall': 0.619514586908556, 'f1-score': 0.6096501809408926, 'support': 4079.0} | {'precision': 0.7037037037037037, 'recall': 0.8005879470847623, 'f1-score': 0.7490258996103598, 'support': 2041.0} | {'precision': 0.8949800652410294, 'recall': 0.8622435617634221, 'f1-score': 0.8783068783068784, 'support': 11455.0} | {'precision': 0.9216195734545848, 'recall': 0.9178436657681941, 'f1-score': 0.9197277441659464, 'support': 9275.0} | 0.8352 | {'precision': 0.7246186244148918, 'recall': 0.7412477436241284, 'f1-score': 0.7322557212128, 'support': 27909.0} | {'precision': 0.838766973613019, 'recall': 0.835178616216991, 'f1-score': 0.8365729339666597, 'support': 27909.0} |
80
- | No log | 5.0 | 205 | 0.4553 | {'precision': 0.5725490196078431, 'recall': 0.5270758122743683, 'f1-score': 0.5488721804511277, 'support': 277.0} | {'precision': 0.608433734939759, 'recall': 0.7163120567375887, 'f1-score': 0.6579804560260587, 'support': 141.0} | {'precision': 0.7355021216407355, 'recall': 0.8112324492979719, 'f1-score': 0.7715133531157271, 'support': 641.0} | {'precision': 0.5901240035429584, 'recall': 0.6533464084334396, 'f1-score': 0.6201279813845259, 'support': 4079.0} | {'precision': 0.7180370210934137, 'recall': 0.8172464478196962, 'f1-score': 0.7644362969752522, 'support': 2041.0} | {'precision': 0.8847149103239047, 'recall': 0.8655608904408555, 'f1-score': 0.8750330950489806, 'support': 11455.0} | {'precision': 0.9455065827132226, 'recall': 0.8904582210242588, 'f1-score': 0.9171571349250416, 'support': 9275.0} | 0.8339 | {'precision': 0.7221239134088339, 'recall': 0.7544617551468827, 'f1-score': 0.7364457854181019, 'support': 27909.0} | {'precision': 0.8417519193793559, 'recall': 0.8339245404708159, 'f1-score': 0.8369775321954073, 'support': 27909.0} |
81
- | No log | 6.0 | 246 | 0.4431 | {'precision': 0.5860805860805861, 'recall': 0.5776173285198556, 'f1-score': 0.5818181818181819, 'support': 277.0} | {'precision': 0.6503067484662577, 'recall': 0.75177304964539, 'f1-score': 0.6973684210526316, 'support': 141.0} | {'precision': 0.7481804949053857, 'recall': 0.8018720748829953, 'f1-score': 0.7740963855421686, 'support': 641.0} | {'precision': 0.634337807039757, 'recall': 0.614121108114734, 'f1-score': 0.6240657698056801, 'support': 4079.0} | {'precision': 0.7280740414279419, 'recall': 0.809407153356198, 'f1-score': 0.7665893271461718, 'support': 2041.0} | {'precision': 0.8748292349726776, 'recall': 0.8944565691837626, 'f1-score': 0.8845340354815039, 'support': 11455.0} | {'precision': 0.9417344173441734, 'recall': 0.8991913746630728, 'f1-score': 0.9199713198389498, 'support': 9275.0} | 0.8428 | {'precision': 0.7376490471766827, 'recall': 0.7640626654808583, 'f1-score': 0.7497776343836123, 'support': 27909.0} | {'precision': 0.8442738869920543, 'recall': 0.8428463936364614, 'f1-score': 0.8431306326473246, 'support': 27909.0} |
82
- | No log | 7.0 | 287 | 0.4438 | {'precision': 0.5970149253731343, 'recall': 0.5776173285198556, 'f1-score': 0.5871559633027523, 'support': 277.0} | {'precision': 0.6540880503144654, 'recall': 0.7375886524822695, 'f1-score': 0.6933333333333332, 'support': 141.0} | {'precision': 0.7533039647577092, 'recall': 0.8003120124804992, 'f1-score': 0.7760968229954615, 'support': 641.0} | {'precision': 0.6253114100647733, 'recall': 0.6153468987496935, 'f1-score': 0.6202891387618931, 'support': 4079.0} | {'precision': 0.7570308898109728, 'recall': 0.8045075943165115, 'f1-score': 0.7800475059382421, 'support': 2041.0} | {'precision': 0.8764452113891286, 'recall': 0.8867743343518114, 'f1-score': 0.881579518333695, 'support': 11455.0} | {'precision': 0.9354231280460789, 'recall': 0.9105121293800539, 'f1-score': 0.9227995410588428, 'support': 9275.0} | 0.8432 | {'precision': 0.7426596542508946, 'recall': 0.7618084214686707, 'f1-score': 0.7516145462463172, 'support': 27909.0} | {'precision': 0.8438834099280034, 'recall': 0.8431688702569063, 'f1-score': 0.8433686534034867, 'support': 27909.0} |
83
 
84
 
85
  ### Framework versions
 
1
  ---
 
2
  base_model: allenai/longformer-base-4096
3
  tags:
4
  - generated_from_trainer
 
16
  name: essays_su_g
17
  type: essays_su_g
18
  config: full_labels
19
+ split: train[80%:100%]
20
  args: full_labels
21
  metrics:
22
  - name: Accuracy
23
  type: accuracy
24
+ value: 0.8489807741047829
25
  ---
26
 
27
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 
31
 
32
  This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
33
  It achieves the following results on the evaluation set:
34
+ - Loss: 0.4462
35
+ - B-claim: {'precision': 0.5791505791505791, 'recall': 0.5535055350553506, 'f1-score': 0.5660377358490567, 'support': 271.0}
36
+ - B-majorclaim: {'precision': 0.6291390728476821, 'recall': 0.6834532374100719, 'f1-score': 0.6551724137931034, 'support': 139.0}
37
+ - B-premise: {'precision': 0.7467438494934877, 'recall': 0.8151658767772512, 'f1-score': 0.7794561933534744, 'support': 633.0}
38
+ - I-claim: {'precision': 0.6527263102170461, 'recall': 0.6163459135216196, 'f1-score': 0.6340146548399537, 'support': 4001.0}
39
+ - I-majorclaim: {'precision': 0.730102267674522, 'recall': 0.8156979632389468, 'f1-score': 0.7705302674800563, 'support': 2013.0}
40
+ - I-premise: {'precision': 0.8786896434055622, 'recall': 0.9086097388849682, 'f1-score': 0.8933992540549918, 'support': 11336.0}
41
+ - O: {'precision': 0.9441213365263998, 'recall': 0.8973553002384566, 'f1-score': 0.9201444845790497, 'support': 9226.0}
42
+ - Accuracy: 0.8490
43
+ - Macro avg: {'precision': 0.7372390084736112, 'recall': 0.7557333664466664, 'f1-score': 0.7455364291356693, 'support': 27619.0}
44
+ - Weighted avg: {'precision': 0.849764005765967, 'recall': 0.8489807741047829, 'f1-score': 0.8487801145396572, 'support': 27619.0}
45
 
46
  ## Model description
47
 
 
70
 
71
  ### Training results
72
 
73
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
74
+ |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:---------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
75
+ | No log | 1.0 | 41 | 0.6718 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.8023255813953488, 'recall': 0.10900473933649289, 'f1-score': 0.19193324061196104, 'support': 633.0} | {'precision': 0.48851894374282434, 'recall': 0.21269682579355162, 'f1-score': 0.2963607870450984, 'support': 4001.0} | {'precision': 0.5525238744884038, 'recall': 0.40238450074515647, 'f1-score': 0.46565104915205513, 'support': 2013.0} | {'precision': 0.7549047282992237, 'recall': 0.9436309103740297, 'f1-score': 0.8387830314435818, 'support': 11336.0} | {'precision': 0.8141802067946824, 'recall': 0.8961630175590722, 'f1-score': 0.8532067488777668, 'support': 9226.0} | 0.7493 | {'precision': 0.48749333353149765, 'recall': 0.3662685705440433, 'f1-score': 0.37799069387578044, 'support': 27619.0} | {'precision': 0.7112456473504178, 'recall': 0.7493030160396829, 'f1-score': 0.7105513857058047, 'support': 27619.0} |
76
+ | No log | 2.0 | 82 | 0.5177 | {'precision': 0.3333333333333333, 'recall': 0.02214022140221402, 'f1-score': 0.04152249134948097, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.554968287526427, 'recall': 0.8293838862559242, 'f1-score': 0.6649778340721976, 'support': 633.0} | {'precision': 0.597047329570126, 'recall': 0.3436640839790053, 'f1-score': 0.4362309644670051, 'support': 4001.0} | {'precision': 0.6440988106129918, 'recall': 0.6994535519125683, 'f1-score': 0.6706358656823053, 'support': 2013.0} | {'precision': 0.7990675289517221, 'recall': 0.9373676781933663, 'f1-score': 0.862710075505399, 'support': 11336.0} | {'precision': 0.913509246729815, 'recall': 0.878061998699328, 'f1-score': 0.895434950812424, 'support': 9226.0} | 0.7980 | {'precision': 0.5488606481034879, 'recall': 0.5300102029203437, 'f1-score': 0.510216025984116, 'support': 27619.0} | {'precision': 0.7825501049725762, 'recall': 0.7980375828234186, 'f1-score': 0.7809297194937906, 'support': 27619.0} |
77
+ | No log | 3.0 | 123 | 0.4635 | {'precision': 0.4563106796116505, 'recall': 0.34686346863468637, 'f1-score': 0.39412997903563946, 'support': 271.0} | {'precision': 0.5, 'recall': 0.014388489208633094, 'f1-score': 0.027972027972027972, 'support': 139.0} | {'precision': 0.6818181818181818, 'recall': 0.8056872037914692, 'f1-score': 0.7385952208544532, 'support': 633.0} | {'precision': 0.5960642154324184, 'recall': 0.57535616095976, 'f1-score': 0.5855271524863284, 'support': 4001.0} | {'precision': 0.6403406891211769, 'recall': 0.821659215101838, 'f1-score': 0.7197563098346389, 'support': 2013.0} | {'precision': 0.8593168752113629, 'recall': 0.8966125617501765, 'f1-score': 0.8775686409946469, 'support': 11336.0} | {'precision': 0.9511206485455412, 'recall': 0.8647301105571212, 'f1-score': 0.9058703304189849, 'support': 9226.0} | 0.8220 | {'precision': 0.6692816128200473, 'recall': 0.6178996014290978, 'f1-score': 0.6070599516566741, 'support': 27619.0} | {'precision': 0.8260568824826705, 'recall': 0.8220427966255114, 'f1-score': 0.8210097933510785, 'support': 27619.0} |
78
+ | No log | 4.0 | 164 | 0.4425 | {'precision': 0.5324074074074074, 'recall': 0.42435424354243545, 'f1-score': 0.4722792607802875, 'support': 271.0} | {'precision': 0.7424242424242424, 'recall': 0.35251798561151076, 'f1-score': 0.4780487804878048, 'support': 139.0} | {'precision': 0.7050938337801609, 'recall': 0.8309636650868878, 'f1-score': 0.7628716461203771, 'support': 633.0} | {'precision': 0.6349449204406364, 'recall': 0.5186203449137715, 'f1-score': 0.5709175952675746, 'support': 4001.0} | {'precision': 0.7780149413020278, 'recall': 0.7242921013412816, 'f1-score': 0.7501929508618471, 'support': 2013.0} | {'precision': 0.8509357737653558, 'recall': 0.9104622441778405, 'f1-score': 0.8796931600255701, 'support': 11336.0} | {'precision': 0.9124463519313305, 'recall': 0.921742900498591, 'f1-score': 0.9170710665372588, 'support': 9226.0} | 0.8345 | {'precision': 0.7366096387215945, 'recall': 0.668993355024617, 'f1-score': 0.6901534942972457, 'support': 27619.0} | {'precision': 0.8278648919850001, 'recall': 0.8344979905137767, 'f1-score': 0.8293136334706738, 'support': 27619.0} |
79
+ | No log | 5.0 | 205 | 0.4397 | {'precision': 0.5591836734693878, 'recall': 0.5055350553505535, 'f1-score': 0.5310077519379846, 'support': 271.0} | {'precision': 0.6722689075630253, 'recall': 0.5755395683453237, 'f1-score': 0.62015503875969, 'support': 139.0} | {'precision': 0.7339971550497866, 'recall': 0.8151658767772512, 'f1-score': 0.7724550898203593, 'support': 633.0} | {'precision': 0.6357277374226527, 'recall': 0.5906023494126469, 'f1-score': 0.6123348017621145, 'support': 4001.0} | {'precision': 0.7521286660359509, 'recall': 0.789865871833085, 'f1-score': 0.7705354979403926, 'support': 2013.0} | {'precision': 0.8708142104368519, 'recall': 0.9038461538461539, 'f1-score': 0.8870227685914638, 'support': 11336.0} | {'precision': 0.9345616973757678, 'recall': 0.9071103403425103, 'f1-score': 0.9206314284142787, 'support': 9226.0} | 0.8437 | {'precision': 0.7369545781933462, 'recall': 0.7268093165582178, 'f1-score': 0.7305917681751833, 'support': 27619.0} | {'precision': 0.8422101504206296, 'recall': 0.8436583511350881, 'f1-score': 0.8425049381051551, 'support': 27619.0} |
80
+ | No log | 6.0 | 246 | 0.4509 | {'precision': 0.573076923076923, 'recall': 0.5498154981549815, 'f1-score': 0.5612052730696798, 'support': 271.0} | {'precision': 0.6590909090909091, 'recall': 0.6258992805755396, 'f1-score': 0.6420664206642066, 'support': 139.0} | {'precision': 0.7467438494934877, 'recall': 0.8151658767772512, 'f1-score': 0.7794561933534744, 'support': 633.0} | {'precision': 0.6449959536012948, 'recall': 0.5976005998500374, 'f1-score': 0.6203943954333161, 'support': 4001.0} | {'precision': 0.757604117922321, 'recall': 0.8042722305017387, 'f1-score': 0.7802409638554216, 'support': 2013.0} | {'precision': 0.8661364587693595, 'recall': 0.912667607621736, 'f1-score': 0.8887934367080451, 'support': 11336.0} | {'precision': 0.9436378186806905, 'recall': 0.8946455668762194, 'f1-score': 0.9184888443776775, 'support': 9226.0} | 0.8459 | {'precision': 0.7416122900907122, 'recall': 0.7428666657653578, 'f1-score': 0.7415207896374031, 'support': 27619.0} | {'precision': 0.8454258898128384, 'recall': 0.8458669756327166, 'f1-score': 0.8449579327632236, 'support': 27619.0} |
81
+ | No log | 7.0 | 287 | 0.4462 | {'precision': 0.5791505791505791, 'recall': 0.5535055350553506, 'f1-score': 0.5660377358490567, 'support': 271.0} | {'precision': 0.6291390728476821, 'recall': 0.6834532374100719, 'f1-score': 0.6551724137931034, 'support': 139.0} | {'precision': 0.7467438494934877, 'recall': 0.8151658767772512, 'f1-score': 0.7794561933534744, 'support': 633.0} | {'precision': 0.6527263102170461, 'recall': 0.6163459135216196, 'f1-score': 0.6340146548399537, 'support': 4001.0} | {'precision': 0.730102267674522, 'recall': 0.8156979632389468, 'f1-score': 0.7705302674800563, 'support': 2013.0} | {'precision': 0.8786896434055622, 'recall': 0.9086097388849682, 'f1-score': 0.8933992540549918, 'support': 11336.0} | {'precision': 0.9441213365263998, 'recall': 0.8973553002384566, 'f1-score': 0.9201444845790497, 'support': 9226.0} | 0.8490 | {'precision': 0.7372390084736112, 'recall': 0.7557333664466664, 'f1-score': 0.7455364291356693, 'support': 27619.0} | {'precision': 0.849764005765967, 'recall': 0.8489807741047829, 'f1-score': 0.8487801145396572, 'support': 27619.0} |
82
 
83
 
84
  ### Framework versions
meta_data/README_s42_e8.md ADDED
@@ -0,0 +1,91 @@
1
+ ---
2
+ license: apache-2.0
3
+ base_model: allenai/longformer-base-4096
4
+ tags:
5
+ - generated_from_trainer
6
+ datasets:
7
+ - essays_su_g
8
+ metrics:
9
+ - accuracy
10
+ model-index:
11
+ - name: longformer-full_labels
12
+ results:
13
+ - task:
14
+ name: Token Classification
15
+ type: token-classification
16
+ dataset:
17
+ name: essays_su_g
18
+ type: essays_su_g
19
+ config: full_labels
20
+ split: train[80%:100%]
21
+ args: full_labels
22
+ metrics:
23
+ - name: Accuracy
24
+ type: accuracy
25
+ value: 0.8484738766790977
26
+ ---
27
+
28
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
29
+ should probably proofread and complete it, then remove this comment. -->
30
+
31
+ # longformer-full_labels
32
+
33
+ This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
34
+ It achieves the following results on the evaluation set:
35
+ - Loss: 0.4627
36
+ - B-claim: {'precision': 0.5947955390334573, 'recall': 0.5904059040590406, 'f1-score': 0.5925925925925926, 'support': 271.0}
37
+ - B-majorclaim: {'precision': 0.6121212121212121, 'recall': 0.7266187050359713, 'f1-score': 0.6644736842105262, 'support': 139.0}
38
+ - B-premise: {'precision': 0.7612612612612613, 'recall': 0.8009478672985783, 'f1-score': 0.7806004618937644, 'support': 633.0}
39
+ - I-claim: {'precision': 0.6395901024743814, 'recall': 0.6395901024743814, 'f1-score': 0.6395901024743814, 'support': 4001.0}
40
+ - I-majorclaim: {'precision': 0.7495404411764706, 'recall': 0.8102334823646299, 'f1-score': 0.7787061351157795, 'support': 2013.0}
41
+ - I-premise: {'precision': 0.8801381692573402, 'recall': 0.8990825688073395, 'f1-score': 0.8895095130040147, 'support': 11336.0}
42
+ - O: {'precision': 0.9454462451495093, 'recall': 0.897897246910904, 'f1-score': 0.9210584834334, 'support': 9226.0}
43
+ - Accuracy: 0.8485
44
+ - Macro avg: {'precision': 0.7404132814962331, 'recall': 0.7663965538501206, 'f1-score': 0.7523615675320655, 'support': 27619.0}
45
+ - Weighted avg: {'precision': 0.8507154882682251, 'recall': 0.8484738766790977, 'f1-score': 0.8492260901783096, 'support': 27619.0}
46
+
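The label set summarised above (B-/I- spans for claims, major claims and premises, plus O) corresponds to a standard token-classification head on top of Longformer. Below is a minimal, hedged inference sketch; the checkpoint id `Theoreticallyhugo/longformer-full_labels` is an assumption based on the model name in this card and the committing user, and may differ from the actual repository id.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Assumption: the fine-tuned checkpoint is published under this repo id;
# substitute the real model path if it differs.
model_id = "Theoreticallyhugo/longformer-full_labels"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Longformer accepts long inputs (up to 4096 tokens), so a whole essay can be
# tagged in one call; aggregation groups B-/I- tokens into labelled spans.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

essay = "Some people believe that ... Therefore, schools should ..."
for span in tagger(essay):
    print(span["entity_group"], round(span["score"], 3), span["word"])
```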
47
+ ## Model description
48
+
49
+ More information needed
50
+
51
+ ## Intended uses & limitations
52
+
53
+ More information needed
54
+
55
+ ## Training and evaluation data
56
+
57
+ More information needed
58
+
59
+ ## Training procedure
60
+
61
+ ### Training hyperparameters
62
+
63
+ The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
64
+ - learning_rate: 2e-05
65
+ - train_batch_size: 8
66
+ - eval_batch_size: 8
67
+ - seed: 42
68
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
69
+ - lr_scheduler_type: linear
70
+ - num_epochs: 8
71
+
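The values listed above map directly onto `transformers.TrainingArguments`. A minimal sketch of the corresponding setup is shown below; the `label_list` is a placeholder inferred from the metric columns in this card, not the actual training script, and the dataset preprocessing is omitted.

```python
from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                          TrainingArguments)

# Placeholder label set inferred from the metric columns above; the exact
# strings and ordering used by the essays_su_g "full_labels" config may differ.
label_list = ["O", "B-Claim", "I-Claim", "B-MajorClaim", "I-MajorClaim",
              "B-Premise", "I-Premise"]

tokenizer = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
model = AutoModelForTokenClassification.from_pretrained(
    "allenai/longformer-base-4096", num_labels=len(label_list))

# Hyperparameter values copied from the list above; Adam betas=(0.9, 0.999),
# epsilon=1e-08 and the linear schedule are the TrainingArguments defaults.
args = TrainingArguments(
    output_dir="longformer-full_labels",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=8,
    lr_scheduler_type="linear",
    seed=42,
)

# train_ds and eval_ds stand in for the tokenized essays_su_g splits:
# trainer = transformers.Trainer(model=model, args=args,
#                                train_dataset=train_ds, eval_dataset=eval_ds,
#                                tokenizer=tokenizer)
# trainer.train()
```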
72
+ ### Training results
73
+
74
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
75
+ |:-------------:|:-----:|:----:|:---------------:|:-------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
76
+ | No log | 1.0 | 41 | 0.6678 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.7934782608695652, 'recall': 0.11532385466034756, 'f1-score': 0.2013793103448276, 'support': 633.0} | {'precision': 0.5032258064516129, 'recall': 0.21444638840289929, 'f1-score': 0.3007360672975815, 'support': 4001.0} | {'precision': 0.5472203616878768, 'recall': 0.40586189766517633, 'f1-score': 0.46605818596691384, 'support': 2013.0} | {'precision': 0.7588924387646432, 'recall': 0.942925194071983, 'f1-score': 0.8409582628535461, 'support': 11336.0} | {'precision': 0.8103279968762203, 'recall': 0.8997398655972252, 'f1-score': 0.8526964560862866, 'support': 9226.0} | 0.7509 | {'precision': 0.4875921235214169, 'recall': 0.36832817148537594, 'f1-score': 0.3802611832213079, 'support': 27619.0} | {'precision': 0.7131367378919236, 'recall': 0.7508599152757159, 'f1-score': 0.7121537205770366, 'support': 27619.0} |
77
+ | No log | 2.0 | 82 | 0.5160 | {'precision': 0.3, 'recall': 0.02214022140221402, 'f1-score': 0.041237113402061855, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.5445240532241555, 'recall': 0.8404423380726699, 'f1-score': 0.6608695652173913, 'support': 633.0} | {'precision': 0.6044260027662517, 'recall': 0.3276680829792552, 'f1-score': 0.42495948136142625, 'support': 4001.0} | {'precision': 0.6442350829968596, 'recall': 0.7133631395926477, 'f1-score': 0.6770391324846771, 'support': 2013.0} | {'precision': 0.7956139042219902, 'recall': 0.9408962597035991, 'f1-score': 0.8621776735914639, 'support': 11336.0} | {'precision': 0.9174415967339533, 'recall': 0.8768697160199437, 'f1-score': 0.8966969629793837, 'support': 9226.0} | 0.7980 | {'precision': 0.54374866284903, 'recall': 0.5316256796814757, 'f1-score': 0.5089971327194863, 'support': 27619.0} | {'precision': 0.7829585710764065, 'recall': 0.7980375828234186, 'f1-score': 0.7798696780989144, 'support': 27619.0} |
78
+ | No log | 3.0 | 123 | 0.4464 | {'precision': 0.44680851063829785, 'recall': 0.3874538745387454, 'f1-score': 0.41501976284584985, 'support': 271.0} | {'precision': 0.6666666666666666, 'recall': 0.02877697841726619, 'f1-score': 0.05517241379310344, 'support': 139.0} | {'precision': 0.6974219810040706, 'recall': 0.8120063191153238, 'f1-score': 0.7503649635036497, 'support': 633.0} | {'precision': 0.5958631662688942, 'recall': 0.5616095976005998, 'f1-score': 0.5782295419454451, 'support': 4001.0} | {'precision': 0.7014858171994597, 'recall': 0.7739692001987084, 'f1-score': 0.7359470949456779, 'support': 2013.0} | {'precision': 0.8603077963808557, 'recall': 0.8974947071277346, 'f1-score': 0.8785079008721182, 'support': 11336.0} | {'precision': 0.9359628244361329, 'recall': 0.8950791242141773, 'f1-score': 0.9150645465122722, 'support': 9226.0} | 0.8277 | {'precision': 0.7006452517991967, 'recall': 0.6223414001732223, 'f1-score': 0.6183294606311595, 'support': 27619.0} | {'precision': 0.8269307926902033, 'recall': 0.8276910822260038, 'f1-score': 0.8252011047830917, 'support': 27619.0} |
79
+ | No log | 4.0 | 164 | 0.4402 | {'precision': 0.5475113122171946, 'recall': 0.44649446494464945, 'f1-score': 0.491869918699187, 'support': 271.0} | {'precision': 0.7534246575342466, 'recall': 0.39568345323741005, 'f1-score': 0.5188679245283018, 'support': 139.0} | {'precision': 0.717983651226158, 'recall': 0.8325434439178515, 'f1-score': 0.7710314557425019, 'support': 633.0} | {'precision': 0.6395672333848532, 'recall': 0.517120719820045, 'f1-score': 0.5718629076838032, 'support': 4001.0} | {'precision': 0.7842926304464766, 'recall': 0.7242921013412816, 'f1-score': 0.7530991735537189, 'support': 2013.0} | {'precision': 0.8484153631971173, 'recall': 0.9139026111503176, 'f1-score': 0.879942243173228, 'support': 11336.0} | {'precision': 0.9154641395649364, 'recall': 0.9214177324951225, 'f1-score': 0.9184312878133103, 'support': 9226.0} | 0.8361 | {'precision': 0.7438084267958548, 'recall': 0.6787792181295255, 'f1-score': 0.7007292730277215, 'support': 27619.0} | {'precision': 0.8294646264862761, 'recall': 0.8360548897498099, 'f1-score': 0.830803677212997, 'support': 27619.0} |
80
+ | No log | 5.0 | 205 | 0.4402 | {'precision': 0.5736434108527132, 'recall': 0.5461254612546126, 'f1-score': 0.5595463137996219, 'support': 271.0} | {'precision': 0.6690647482014388, 'recall': 0.6690647482014388, 'f1-score': 0.6690647482014388, 'support': 139.0} | {'precision': 0.7511111111111111, 'recall': 0.8009478672985783, 'f1-score': 0.775229357798165, 'support': 633.0} | {'precision': 0.6177680948922779, 'recall': 0.6378405398650338, 'f1-score': 0.6276438760452534, 'support': 4001.0} | {'precision': 0.7445221445221445, 'recall': 0.7933432687531048, 'f1-score': 0.7681577681577682, 'support': 2013.0} | {'precision': 0.8894235053016127, 'recall': 0.8805575158786167, 'f1-score': 0.884968305332683, 'support': 11336.0} | {'precision': 0.9308134394341291, 'recall': 0.9128549750704531, 'f1-score': 0.9217467440078799, 'support': 9226.0} | 0.8437 | {'precision': 0.7394780649022038, 'recall': 0.7486763394745484, 'f1-score': 0.7437653019061158, 'support': 27619.0} | {'precision': 0.8459579843795886, 'recall': 0.8436583511350881, 'f1-score': 0.8446684579221759, 'support': 27619.0} |
81
+ | No log | 6.0 | 246 | 0.4510 | {'precision': 0.5868725868725869, 'recall': 0.5608856088560885, 'f1-score': 0.5735849056603772, 'support': 271.0} | {'precision': 0.6571428571428571, 'recall': 0.6618705035971223, 'f1-score': 0.6594982078853047, 'support': 139.0} | {'precision': 0.7551319648093842, 'recall': 0.8135860979462876, 'f1-score': 0.7832699619771862, 'support': 633.0} | {'precision': 0.6399897066392177, 'recall': 0.6215946013496626, 'f1-score': 0.6306580448839864, 'support': 4001.0} | {'precision': 0.7567693744164332, 'recall': 0.8052657724788872, 'f1-score': 0.7802647412755717, 'support': 2013.0} | {'precision': 0.8719802653963933, 'recall': 0.904287226534933, 'f1-score': 0.887839944569548, 'support': 11336.0} | {'precision': 0.944368288782271, 'recall': 0.8960546282245827, 'f1-score': 0.9195773081201334, 'support': 9226.0} | 0.8467 | {'precision': 0.7446078634370206, 'recall': 0.7519349198553663, 'f1-score': 0.7478133020531582, 'support': 27619.0} | {'precision': 0.8476001864554185, 'recall': 0.8466997356891994, 'f1-score': 0.8467153504611636, 'support': 27619.0} |
82
+ | No log | 7.0 | 287 | 0.4622 | {'precision': 0.5871212121212122, 'recall': 0.5719557195571956, 'f1-score': 0.5794392523364487, 'support': 271.0} | {'precision': 0.64, 'recall': 0.6906474820143885, 'f1-score': 0.6643598615916955, 'support': 139.0} | {'precision': 0.7442857142857143, 'recall': 0.8230647709320695, 'f1-score': 0.781695423855964, 'support': 633.0} | {'precision': 0.6431836515192256, 'recall': 0.5978505373656586, 'f1-score': 0.6196891191709846, 'support': 4001.0} | {'precision': 0.7794045876037091, 'recall': 0.7933432687531048, 'f1-score': 0.7863121614967996, 'support': 2013.0} | {'precision': 0.8665099487868357, 'recall': 0.9104622441778405, 'f1-score': 0.887942530218953, 'support': 11336.0} | {'precision': 0.9424427826875141, 'recall': 0.9015824842835465, 'f1-score': 0.921559937957013, 'support': 9226.0} | 0.8472 | {'precision': 0.7432782710006016, 'recall': 0.7555580724405433, 'f1-score': 0.7487140409468368, 'support': 27619.0} | {'precision': 0.8464917564982428, 'recall': 0.8472428400738622, 'f1-score': 0.8463173293202095, 'support': 27619.0} |
83
+ | No log | 8.0 | 328 | 0.4627 | {'precision': 0.5947955390334573, 'recall': 0.5904059040590406, 'f1-score': 0.5925925925925926, 'support': 271.0} | {'precision': 0.6121212121212121, 'recall': 0.7266187050359713, 'f1-score': 0.6644736842105262, 'support': 139.0} | {'precision': 0.7612612612612613, 'recall': 0.8009478672985783, 'f1-score': 0.7806004618937644, 'support': 633.0} | {'precision': 0.6395901024743814, 'recall': 0.6395901024743814, 'f1-score': 0.6395901024743814, 'support': 4001.0} | {'precision': 0.7495404411764706, 'recall': 0.8102334823646299, 'f1-score': 0.7787061351157795, 'support': 2013.0} | {'precision': 0.8801381692573402, 'recall': 0.8990825688073395, 'f1-score': 0.8895095130040147, 'support': 11336.0} | {'precision': 0.9454462451495093, 'recall': 0.897897246910904, 'f1-score': 0.9210584834334, 'support': 9226.0} | 0.8485 | {'precision': 0.7404132814962331, 'recall': 0.7663965538501206, 'f1-score': 0.7523615675320655, 'support': 27619.0} | {'precision': 0.8507154882682251, 'recall': 0.8484738766790977, 'f1-score': 0.8492260901783096, 'support': 27619.0} |
84
+
85
+
86
+ ### Framework versions
87
+
88
+ - Transformers 4.37.2
89
+ - Pytorch 2.2.0+cu121
+ - Datasets 2.17.0
+ - Tokenizers 0.15.2
meta_data/README_s42_e9.md ADDED
@@ -0,0 +1,92 @@
+ ---
+ license: apache-2.0
+ base_model: allenai/longformer-base-4096
+ tags:
+ - generated_from_trainer
+ datasets:
+ - essays_su_g
+ metrics:
+ - accuracy
+ model-index:
+ - name: longformer-full_labels
+ results:
+ - task:
+ name: Token Classification
+ type: token-classification
+ dataset:
+ name: essays_su_g
+ type: essays_su_g
+ config: full_labels
+ split: train[80%:100%]
+ args: full_labels
+ metrics:
+ - name: Accuracy
+ type: accuracy
+ value: 0.8512980194793439
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # longformer-full_labels
+
+ This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the essays_su_g dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.4843
+ - B-claim: {'precision': 0.5827338129496403, 'recall': 0.5977859778597786, 'f1-score': 0.5901639344262295, 'support': 271.0}
+ - B-majorclaim: {'precision': 0.6162790697674418, 'recall': 0.762589928057554, 'f1-score': 0.6816720257234727, 'support': 139.0}
+ - B-premise: {'precision': 0.7659574468085106, 'recall': 0.7962085308056872, 'f1-score': 0.7807900852052672, 'support': 633.0}
+ - I-claim: {'precision': 0.6394124847001224, 'recall': 0.6528367908022994, 'f1-score': 0.6460549097205045, 'support': 4001.0}
+ - I-majorclaim: {'precision': 0.7709125475285171, 'recall': 0.8057625434674615, 'f1-score': 0.7879523925188244, 'support': 2013.0}
+ - I-premise: {'precision': 0.8850074048262043, 'recall': 0.8961714890613973, 'f1-score': 0.8905544597852291, 'support': 11336.0}
+ - O: {'precision': 0.9439104376342871, 'recall': 0.9047257749837416, 'f1-score': 0.923902816979357, 'support': 9226.0}
+ - Accuracy: 0.8513
+ - Macro avg: {'precision': 0.7434590291735319, 'recall': 0.7737258621482742, 'f1-score': 0.7572986606226978, 'support': 27619.0}
+ - Weighted avg: {'precision': 0.8537431719475974, 'recall': 0.8512980194793439, 'f1-score': 0.8522826158531822, 'support': 27619.0}
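Illustrative only (not part of the generated card): a minimal inference sketch for a checkpoint like this one, doing token classification with the seven labels reported above. The checkpoint path is a placeholder, not a published repo id, and the example sentence is arbitrary.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Placeholder path: point this at wherever the fine-tuned weights are stored.
checkpoint = "path/to/longformer-full_labels"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

text = "Some people believe that schools should ban homework."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# One predicted label per token, taken from the model's id2label mapping.
pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print([(tok, model.config.id2label[i]) for tok, i in zip(tokens, pred_ids)])
```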
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 9
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | B-claim | B-majorclaim | B-premise | I-claim | I-majorclaim | I-premise | O | Accuracy | Macro avg | Weighted avg |
+ |:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------:|:--------:|:----------------------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------:|
+ | No log | 1.0 | 41 | 0.6667 | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.8021978021978022, 'recall': 0.11532385466034756, 'f1-score': 0.20165745856353592, 'support': 633.0} | {'precision': 0.4947674418604651, 'recall': 0.21269682579355162, 'f1-score': 0.29750043698654083, 'support': 4001.0} | {'precision': 0.5498652291105122, 'recall': 0.40536512667660207, 'f1-score': 0.46668573062625107, 'support': 2013.0} | {'precision': 0.7564971751412429, 'recall': 0.944954128440367, 'f1-score': 0.8402886727329778, 'support': 11336.0} | {'precision': 0.8145415190869736, 'recall': 0.8973553002384566, 'f1-score': 0.8539453326456936, 'support': 9226.0} | 0.7506 | {'precision': 0.48826702391385657, 'recall': 0.36795646225847495, 'f1-score': 0.38001109022214274, 'support': 27619.0} | {'precision': 0.7127284290659307, 'recall': 0.7506064665628733, 'f1-score': 0.7118794608238792, 'support': 27619.0} |
+ | No log | 2.0 | 82 | 0.5137 | {'precision': 0.34615384615384615, 'recall': 0.033210332103321034, 'f1-score': 0.06060606060606061, 'support': 271.0} | {'precision': 0.0, 'recall': 0.0, 'f1-score': 0.0, 'support': 139.0} | {'precision': 0.540785498489426, 'recall': 0.8483412322274881, 'f1-score': 0.6605166051660517, 'support': 633.0} | {'precision': 0.6073273837482386, 'recall': 0.3231692076980755, 'f1-score': 0.42185970636215336, 'support': 4001.0} | {'precision': 0.6427943760984183, 'recall': 0.726775956284153, 'f1-score': 0.6822103054325018, 'support': 2013.0} | {'precision': 0.7947533164406022, 'recall': 0.9407198306280875, 'f1-score': 0.8615981255554658, 'support': 11336.0} | {'precision': 0.9200182294633702, 'recall': 0.8752438760026013, 'f1-score': 0.8970727101038716, 'support': 9226.0} | 0.7980 | {'precision': 0.5502618071991288, 'recall': 0.5353514907062467, 'f1-score': 0.5119805018894436, 'support': 27619.0} | {'precision': 0.7841485439195495, 'recall': 0.7980375828234186, 'f1-score': 0.7798671370505824, 'support': 27619.0} |
+ | No log | 3.0 | 123 | 0.4499 | {'precision': 0.401673640167364, 'recall': 0.35424354243542433, 'f1-score': 0.3764705882352941, 'support': 271.0} | {'precision': 0.7142857142857143, 'recall': 0.03597122302158273, 'f1-score': 0.0684931506849315, 'support': 139.0} | {'precision': 0.6745098039215687, 'recall': 0.8151658767772512, 'f1-score': 0.7381974248927039, 'support': 633.0} | {'precision': 0.6081670822942643, 'recall': 0.4876280929767558, 'f1-score': 0.5412678596199194, 'support': 4001.0} | {'precision': 0.7265586647029946, 'recall': 0.7352210630899155, 'f1-score': 0.7308641975308642, 'support': 2013.0} | {'precision': 0.8397316738058677, 'recall': 0.9165490472829922, 'f1-score': 0.8764604158758277, 'support': 11336.0} | {'precision': 0.9282536151279199, 'recall': 0.9045089963147627, 'f1-score': 0.916227492314449, 'support': 9226.0} | 0.8249 | {'precision': 0.6990257420436705, 'recall': 0.6070411202712407, 'f1-score': 0.6068544470219985, 'support': 27619.0} | {'precision': 0.8187917438138024, 'recall': 0.8249031463847352, 'f1-score': 0.8184342482256587, 'support': 27619.0} |
+ | No log | 4.0 | 164 | 0.4384 | {'precision': 0.5627705627705628, 'recall': 0.4797047970479705, 'f1-score': 0.5179282868525897, 'support': 271.0} | {'precision': 0.7558139534883721, 'recall': 0.4676258992805755, 'f1-score': 0.5777777777777777, 'support': 139.0} | {'precision': 0.7251381215469613, 'recall': 0.8293838862559242, 'f1-score': 0.7737656595431099, 'support': 633.0} | {'precision': 0.6380688806888068, 'recall': 0.5186203449137715, 'f1-score': 0.5721770301944022, 'support': 4001.0} | {'precision': 0.7735368956743003, 'recall': 0.7550919026328863, 'f1-score': 0.7642031171442937, 'support': 2013.0} | {'precision': 0.8475748697916666, 'recall': 0.9187544107268878, 'f1-score': 0.881730443616661, 'support': 11336.0} | {'precision': 0.927036261435027, 'recall': 0.9116626923910687, 'f1-score': 0.9192852068419038, 'support': 9226.0} | 0.8379 | {'precision': 0.7471342207708139, 'recall': 0.6972634190355835, 'f1-score': 0.7152667888529626, 'support': 27619.0} | {'precision': 0.8323100049810314, 'recall': 0.8378652376986857, 'f1-score': 0.8332925210586948, 'support': 27619.0} |
+ | No log | 5.0 | 205 | 0.4499 | {'precision': 0.5625, 'recall': 0.5645756457564576, 'f1-score': 0.56353591160221, 'support': 271.0} | {'precision': 0.6513157894736842, 'recall': 0.7122302158273381, 'f1-score': 0.6804123711340206, 'support': 139.0} | {'precision': 0.765793528505393, 'recall': 0.7851500789889415, 'f1-score': 0.7753510140405616, 'support': 633.0} | {'precision': 0.598067849921366, 'recall': 0.6653336665833541, 'f1-score': 0.6299100804543304, 'support': 4001.0} | {'precision': 0.7299465240641712, 'recall': 0.8137108792846498, 'f1-score': 0.7695560253699788, 'support': 2013.0} | {'precision': 0.8991627564633361, 'recall': 0.86212067748765, 'f1-score': 0.880252195451475, 'support': 11336.0} | {'precision': 0.9330883990202627, 'recall': 0.9084110123563841, 'f1-score': 0.9205843585237259, 'support': 9226.0} | 0.8401 | {'precision': 0.7342678353497447, 'recall': 0.7587903108978251, 'f1-score': 0.7456574223680432, 'support': 27619.0} | {'precision': 0.8469369671380803, 'recall': 0.8401100691552916, 'f1-score': 0.8428737258360137, 'support': 27619.0} |
+ | No log | 6.0 | 246 | 0.4510 | {'precision': 0.5871886120996441, 'recall': 0.6088560885608856, 'f1-score': 0.5978260869565217, 'support': 271.0} | {'precision': 0.6712328767123288, 'recall': 0.7050359712230215, 'f1-score': 0.687719298245614, 'support': 139.0} | {'precision': 0.76103500761035, 'recall': 0.7898894154818326, 'f1-score': 0.7751937984496124, 'support': 633.0} | {'precision': 0.6215966774342409, 'recall': 0.6733316670832292, 'f1-score': 0.6464307138572285, 'support': 4001.0} | {'precision': 0.7734109655507035, 'recall': 0.791852955787382, 'f1-score': 0.7825233186057928, 'support': 2013.0} | {'precision': 0.895002251238181, 'recall': 0.8767642907551164, 'f1-score': 0.8857894033242726, 'support': 11336.0} | {'precision': 0.9332595462091865, 'recall': 0.9139388684153479, 'f1-score': 0.9234981654892941, 'support': 9226.0} | 0.8480 | {'precision': 0.7489608481220907, 'recall': 0.7656670367581164, 'f1-score': 0.7569972549897622, 'support': 27619.0} | {'precision': 0.85209628578114, 'recall': 0.8480393931713676, 'f1-score': 0.8498277636346131, 'support': 27619.0} |
+ | No log | 7.0 | 287 | 0.4763 | {'precision': 0.609375, 'recall': 0.5756457564575646, 'f1-score': 0.5920303605313093, 'support': 271.0} | {'precision': 0.6451612903225806, 'recall': 0.7194244604316546, 'f1-score': 0.6802721088435374, 'support': 139.0} | {'precision': 0.7536023054755043, 'recall': 0.8262243285939969, 'f1-score': 0.7882441597588545, 'support': 633.0} | {'precision': 0.6364850427350427, 'recall': 0.5956010997250687, 'f1-score': 0.61536475145255, 'support': 4001.0} | {'precision': 0.8008171603677222, 'recall': 0.7789369100844511, 'f1-score': 0.7897255099471165, 'support': 2013.0} | {'precision': 0.8622444722569879, 'recall': 0.9116090331686661, 'f1-score': 0.8862398696453839, 'support': 11336.0} | {'precision': 0.9419961481817152, 'recall': 0.901257316280078, 'f1-score': 0.9211765357558299, 'support': 9226.0} | 0.8465 | {'precision': 0.749954488477079, 'recall': 0.7583855578202116, 'f1-score': 0.7532933279906545, 'support': 27619.0} | {'precision': 0.845639947288232, 'recall': 0.8464824939353344, 'f1-score': 0.8454664653763405, 'support': 27619.0} |
+ | No log | 8.0 | 328 | 0.4751 | {'precision': 0.5851063829787234, 'recall': 0.6088560885608856, 'f1-score': 0.5967450271247738, 'support': 271.0} | {'precision': 0.6130952380952381, 'recall': 0.7410071942446043, 'f1-score': 0.6710097719869708, 'support': 139.0} | {'precision': 0.7784615384615384, 'recall': 0.7993680884676145, 'f1-score': 0.7887763055339049, 'support': 633.0} | {'precision': 0.626434034416826, 'recall': 0.6550862284428893, 'f1-score': 0.6404398289554062, 'support': 4001.0} | {'precision': 0.7732253454025727, 'recall': 0.8062593144560357, 'f1-score': 0.7893968871595329, 'support': 2013.0} | {'precision': 0.8827230621340557, 'recall': 0.8910550458715596, 'f1-score': 0.8868694850520217, 'support': 11336.0} | {'precision': 0.9454111224837939, 'recall': 0.901040537611099, 'f1-score': 0.922692713247128, 'support': 9226.0} | 0.8484 | {'precision': 0.7434938177103927, 'recall': 0.7718103568078126, 'f1-score': 0.7565614312942482, 'support': 27619.0} | {'precision': 0.8518891727474877, 'recall': 0.8484014627611427, 'f1-score': 0.8498517254980413, 'support': 27619.0} |
+ | No log | 9.0 | 369 | 0.4843 | {'precision': 0.5827338129496403, 'recall': 0.5977859778597786, 'f1-score': 0.5901639344262295, 'support': 271.0} | {'precision': 0.6162790697674418, 'recall': 0.762589928057554, 'f1-score': 0.6816720257234727, 'support': 139.0} | {'precision': 0.7659574468085106, 'recall': 0.7962085308056872, 'f1-score': 0.7807900852052672, 'support': 633.0} | {'precision': 0.6394124847001224, 'recall': 0.6528367908022994, 'f1-score': 0.6460549097205045, 'support': 4001.0} | {'precision': 0.7709125475285171, 'recall': 0.8057625434674615, 'f1-score': 0.7879523925188244, 'support': 2013.0} | {'precision': 0.8850074048262043, 'recall': 0.8961714890613973, 'f1-score': 0.8905544597852291, 'support': 11336.0} | {'precision': 0.9439104376342871, 'recall': 0.9047257749837416, 'f1-score': 0.923902816979357, 'support': 9226.0} | 0.8513 | {'precision': 0.7434590291735319, 'recall': 0.7737258621482742, 'f1-score': 0.7572986606226978, 'support': 27619.0} | {'precision': 0.8537431719475974, 'recall': 0.8512980194793439, 'f1-score': 0.8522826158531822, 'support': 27619.0} |
+
+
+ ### Framework versions
+
+ - Transformers 4.37.2
+ - Pytorch 2.2.0+cu121
+ - Datasets 2.17.0
+ - Tokenizers 0.15.2
meta_data/meta_s42_e10_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5174603174603175, "recall": 0.573943661971831, "f1-score": 0.5442404006677797, "support": 284.0}, "B-MajorClaim": {"precision": 0.6573426573426573, "recall": 0.6666666666666666, "f1-score": 0.6619718309859154, "support": 141.0}, "B-Premise": {"precision": 0.7438867438867439, "recall": 0.8163841807909604, "f1-score": 0.7784511784511785, "support": 708.0}, "I-Claim": {"precision": 0.5750120598166908, "recall": 0.5847436840814324, "f1-score": 0.5798370424419311, "support": 4077.0}, "I-MajorClaim": {"precision": 0.8073394495412844, "recall": 0.7391304347826086, "f1-score": 0.7717307196285789, "support": 2024.0}, "I-Premise": {"precision": 0.872694165149236, "recall": 0.901160889470242, "f1-score": 0.8866991111289868, "support": 12232.0}, "O": {"precision": 0.9210053859964094, "recall": 0.8837657073368463, "f1-score": 0.9020013445725811, "support": 9868.0}, "accuracy": 0.8338105952137451, "macro avg": {"precision": 0.7278201113133341, "recall": 0.7379707464429411, "f1-score": 0.7321330896967073, "support": 29334.0}, "weighted avg": {"precision": 0.835480031716714, "recall": 0.8338105952137451, "f1-score": 0.8342563963468361, "support": 29334.0}}
meta_data/meta_s42_e10_cvi1.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.569620253164557, "recall": 0.6101694915254238, "f1-score": 0.5891980360065466, "support": 295.0}, "B-MajorClaim": {"precision": 0.6559139784946236, "recall": 0.782051282051282, "f1-score": 0.7134502923976609, "support": 156.0}, "B-Premise": {"precision": 0.7689393939393939, "recall": 0.8376891334250344, "f1-score": 0.8018433179723502, "support": 727.0}, "I-Claim": {"precision": 0.6154603643525357, "recall": 0.6034274680183442, "f1-score": 0.6093845216331506, "support": 4143.0}, "I-MajorClaim": {"precision": 0.7738396624472574, "recall": 0.836297309621523, "f1-score": 0.8038571115494193, "support": 2193.0}, "I-Premise": {"precision": 0.8917087011349306, "recall": 0.9005810714001433, "f1-score": 0.8961229258247198, "support": 12563.0}, "O": {"precision": 0.9356903383114904, "recall": 0.9047151277013753, "f1-score": 0.9199420666233832, "support": 10180.0}, "accuracy": 0.8516706877747298, "macro avg": {"precision": 0.7444532416921127, "recall": 0.7821329833918752, "f1-score": 0.7619711817153186, "support": 30257.0}, "weighted avg": {"precision": 0.8528316164970468, "recall": 0.8516706877747298, "f1-score": 0.8519877394493136, "support": 30257.0}}
meta_data/meta_s42_e10_cvi2.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5870967741935483, "recall": 0.5741324921135647, "f1-score": 0.580542264752791, "support": 317.0}, "B-MajorClaim": {"precision": 0.711764705882353, "recall": 0.7806451612903226, "f1-score": 0.7446153846153847, "support": 155.0}, "B-Premise": {"precision": 0.779379157427938, "recall": 0.8541919805589308, "f1-score": 0.815072463768116, "support": 823.0}, "I-Claim": {"precision": 0.6206587497199193, "recall": 0.6376611418047882, "f1-score": 0.6290450777790393, "support": 4344.0}, "I-MajorClaim": {"precision": 0.8247663551401869, "recall": 0.8349101229895932, "f1-score": 0.8298072402444758, "support": 2114.0}, "I-Premise": {"precision": 0.8902735786446427, "recall": 0.8872639082825017, "f1-score": 0.888766195524146, "support": 13607.0}, "O": {"precision": 0.9162145871006631, "recall": 0.8961207404787171, "f1-score": 0.9060562708631379, "support": 8481.0}, "accuracy": 0.8449448745015248, "macro avg": {"precision": 0.7614505583013216, "recall": 0.7807036496454883, "f1-score": 0.7705578425067271, "support": 29841.0}, "weighted avg": {"precision": 0.8465510014665183, "recall": 0.8449448745015248, "f1-score": 0.8456399282751742, "support": 29841.0}}
meta_data/meta_s42_e10_cvi3.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.598159509202454, "recall": 0.5752212389380531, "f1-score": 0.5864661654135338, "support": 339.0}, "B-MajorClaim": {"precision": 0.7664670658682635, "recall": 0.8, "f1-score": 0.7828746177370032, "support": 160.0}, "B-Premise": {"precision": 0.7480842911877394, "recall": 0.8299681190223167, "f1-score": 0.7869017632241814, "support": 941.0}, "I-Claim": {"precision": 0.6306038426349497, "recall": 0.586845466155811, "f1-score": 0.6079382579933849, "support": 4698.0}, "I-MajorClaim": {"precision": 0.8337314859053989, "recall": 0.8604536489151874, "f1-score": 0.8468818247998059, "support": 2028.0}, "I-Premise": {"precision": 0.8645224811289793, "recall": 0.8862795235852231, "f1-score": 0.8752658160552896, "support": 14861.0}, "O": {"precision": 0.922439832407678, "recall": 0.9039434736942614, "f1-score": 0.9130979938271604, "support": 10473.0}, "accuracy": 0.8431044776119403, "macro avg": {"precision": 0.7662869297622089, "recall": 0.7775302100444075, "f1-score": 0.7713466341500513, "support": 33500.0}, "weighted avg": {"precision": 0.8415260711983502, "recall": 0.8431044776119403, "f1-score": 0.8420393249732389, "support": 33500.0}}
meta_data/meta_s42_e10_cvi4.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5934065934065934, "recall": 0.5977859778597786, "f1-score": 0.5955882352941178, "support": 271.0}, "B-MajorClaim": {"precision": 0.6190476190476191, "recall": 0.7482014388489209, "f1-score": 0.6775244299674268, "support": 139.0}, "B-Premise": {"precision": 0.7654135338345864, "recall": 0.8041074249605056, "f1-score": 0.7842835130970724, "support": 633.0}, "I-Claim": {"precision": 0.6421972534332084, "recall": 0.6428392901774557, "f1-score": 0.6425181114164377, "support": 4001.0}, "I-MajorClaim": {"precision": 0.7737752161383286, "recall": 0.8002980625931445, "f1-score": 0.7868131868131869, "support": 2013.0}, "I-Premise": {"precision": 0.8804516462678849, "recall": 0.9011115031757233, "f1-score": 0.8906617839393146, "support": 11336.0}, "O": {"precision": 0.9418631006346329, "recall": 0.9008237589421201, "f1-score": 0.9208864265927977, "support": 9226.0}, "accuracy": 0.8502842246279735, "macro avg": {"precision": 0.7451649946804076, "recall": 0.7707382080796641, "f1-score": 0.7568965267314792, "support": 27619.0}, "weighted avg": {"precision": 0.8519076404793325, "recall": 0.8502842246279735, "f1-score": 0.8508360851093074, "support": 27619.0}}
meta_data/meta_s42_e11_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5283018867924528, "recall": 0.5915492957746479, "f1-score": 0.5581395348837209, "support": 284.0}, "B-MajorClaim": {"precision": 0.6778523489932886, "recall": 0.7163120567375887, "f1-score": 0.6965517241379311, "support": 141.0}, "B-Premise": {"precision": 0.7549148099606815, "recall": 0.8135593220338984, "f1-score": 0.7831407205982326, "support": 708.0}, "I-Claim": {"precision": 0.5790840415486308, "recall": 0.6016678930586216, "f1-score": 0.5901599903765187, "support": 4077.0}, "I-MajorClaim": {"precision": 0.8180309734513275, "recall": 0.7307312252964426, "f1-score": 0.7719206680584549, "support": 2024.0}, "I-Premise": {"precision": 0.8779549643400913, "recall": 0.89568345323741, "f1-score": 0.8867306058030836, "support": 12232.0}, "O": {"precision": 0.9155620498904081, "recall": 0.8889339278475882, "f1-score": 0.9020515193583217, "support": 9868.0}, "accuracy": 0.8353787413922411, "macro avg": {"precision": 0.7359572964252685, "recall": 0.7483481677123139, "f1-score": 0.7412421090308948, "support": 29334.0}, "weighted avg": {"precision": 0.8376159528974935, "recall": 0.8353787413922411, "f1-score": 0.8361482214263514, "support": 29334.0}}
meta_data/meta_s42_e11_cvi1.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5792880258899676, "recall": 0.6067796610169491, "f1-score": 0.5927152317880795, "support": 295.0}, "B-MajorClaim": {"precision": 0.6507936507936508, "recall": 0.7884615384615384, "f1-score": 0.7130434782608696, "support": 156.0}, "B-Premise": {"precision": 0.7594458438287154, "recall": 0.8294360385144429, "f1-score": 0.7928994082840236, "support": 727.0}, "I-Claim": {"precision": 0.6233605543182381, "recall": 0.6080135167752836, "f1-score": 0.6155913978494624, "support": 4143.0}, "I-MajorClaim": {"precision": 0.7915752523036419, "recall": 0.8226174190606476, "f1-score": 0.8067978533094812, "support": 2193.0}, "I-Premise": {"precision": 0.8929933269780743, "recall": 0.8947703573987105, "f1-score": 0.8938809590075941, "support": 12563.0}, "O": {"precision": 0.9298995724371085, "recall": 0.918664047151277, "f1-score": 0.924247665167762, "support": 10180.0}, "accuracy": 0.8533892983441849, "macro avg": {"precision": 0.7467651752213423, "recall": 0.781248939768407, "f1-score": 0.7627394276667533, "support": 30257.0}, "weighted avg": {"precision": 0.8536236581519149, "recall": 0.8533892983441849, "f1-score": 0.8533858022549129, "support": 30257.0}}
meta_data/meta_s42_e11_cvi2.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.597444089456869, "recall": 0.5899053627760252, "f1-score": 0.5936507936507937, "support": 317.0}, "B-MajorClaim": {"precision": 0.6971428571428572, "recall": 0.7870967741935484, "f1-score": 0.7393939393939394, "support": 155.0}, "B-Premise": {"precision": 0.7808676307007787, "recall": 0.8529769137302552, "f1-score": 0.8153310104529617, "support": 823.0}, "I-Claim": {"precision": 0.629570747217806, "recall": 0.638121546961326, "f1-score": 0.6338173087915856, "support": 4344.0}, "I-MajorClaim": {"precision": 0.8208530805687204, "recall": 0.8192999053926207, "f1-score": 0.8200757575757577, "support": 2114.0}, "I-Premise": {"precision": 0.8912360207587164, "recall": 0.8960828985081208, "f1-score": 0.8936528877162123, "support": 13607.0}, "O": {"precision": 0.9185230024213075, "recall": 0.8945879023700035, "f1-score": 0.9063974672958605, "support": 8481.0}, "accuracy": 0.8476592607486344, "macro avg": {"precision": 0.762233918323865, "recall": 0.7825816148474142, "f1-score": 0.7717598806967302, "support": 29841.0}, "weighted avg": {"precision": 0.8487410554444255, "recall": 0.8476592607486344, "f1-score": 0.8480897117386362, "support": 29841.0}}
meta_data/meta_s42_e11_cvi3.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5889212827988338, "recall": 0.5958702064896755, "f1-score": 0.5923753665689149, "support": 339.0}, "B-MajorClaim": {"precision": 0.7818181818181819, "recall": 0.80625, "f1-score": 0.7938461538461539, "support": 160.0}, "B-Premise": {"precision": 0.7514563106796116, "recall": 0.822529224229543, "f1-score": 0.7853881278538812, "support": 941.0}, "I-Claim": {"precision": 0.6238159675236806, "recall": 0.5887611749680716, "f1-score": 0.6057818659658345, "support": 4698.0}, "I-MajorClaim": {"precision": 0.8430992736077482, "recall": 0.8584812623274162, "f1-score": 0.8507207427314928, "support": 2028.0}, "I-Premise": {"precision": 0.8650553213909379, "recall": 0.8838570755669202, "f1-score": 0.8743551339657182, "support": 14861.0}, "O": {"precision": 0.9203229886175698, "recall": 0.9032750883223527, "f1-score": 0.9117193523515806, "support": 10473.0}, "accuracy": 0.842, "macro avg": {"precision": 0.7677841894909376, "recall": 0.7798605759862828, "f1-score": 0.7734552490405109, "support": 33500.0}, "weighted avg": {"precision": 0.8407903924058069, "recall": 0.842, "f1-score": 0.8412040047105178, "support": 33500.0}}
meta_data/meta_s42_e11_cvi4.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5977859778597786, "recall": 0.5977859778597786, "f1-score": 0.5977859778597786, "support": 271.0}, "B-MajorClaim": {"precision": 0.6287425149700598, "recall": 0.7553956834532374, "f1-score": 0.6862745098039216, "support": 139.0}, "B-Premise": {"precision": 0.7628398791540786, "recall": 0.7977883096366508, "f1-score": 0.7799227799227799, "support": 633.0}, "I-Claim": {"precision": 0.6403061224489796, "recall": 0.6273431642089478, "f1-score": 0.6337583638429491, "support": 4001.0}, "I-MajorClaim": {"precision": 0.7770823302840636, "recall": 0.8017883755588674, "f1-score": 0.7892420537897311, "support": 2013.0}, "I-Premise": {"precision": 0.8788191754884241, "recall": 0.9007586450247, "f1-score": 0.8896536702243519, "support": 11336.0}, "O": {"precision": 0.9381107491856677, "recall": 0.9052677216561891, "f1-score": 0.9213966572894258, "support": 9226.0}, "accuracy": 0.8493790506535356, "macro avg": {"precision": 0.7462409641987217, "recall": 0.7694468396283387, "f1-score": 0.7568620018189912, "support": 27619.0}, "weighted avg": {"precision": 0.8499840082982476, "recall": 0.8493790506535356, "f1-score": 0.8494664654905584, "support": 27619.0}}
meta_data/meta_s42_e12_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5352564102564102, "recall": 0.5880281690140845, "f1-score": 0.5604026845637584, "support": 284.0}, "B-MajorClaim": {"precision": 0.7062937062937062, "recall": 0.7163120567375887, "f1-score": 0.7112676056338029, "support": 141.0}, "B-Premise": {"precision": 0.7406931964056482, "recall": 0.8149717514124294, "f1-score": 0.7760591795561534, "support": 708.0}, "I-Claim": {"precision": 0.6026818971939409, "recall": 0.5952906548933039, "f1-score": 0.5989634748272458, "support": 4077.0}, "I-MajorClaim": {"precision": 0.8060735215769845, "recall": 0.7475296442687747, "f1-score": 0.7756985388361958, "support": 2024.0}, "I-Premise": {"precision": 0.8732427736534513, "recall": 0.9039404839764552, "f1-score": 0.8883265043785651, "support": 12232.0}, "O": {"precision": 0.9140969162995595, "recall": 0.8831576813944062, "f1-score": 0.8983609937119886, "support": 9868.0}, "accuracy": 0.8371514283766278, "macro avg": {"precision": 0.7397626316685287, "recall": 0.7498900630995775, "f1-score": 0.7441541402153872, "support": 29334.0}, "weighted avg": {"precision": 0.8374736447828465, "recall": 0.8371514283766278, "f1-score": 0.8369781485961643, "support": 29334.0}}
meta_data/meta_s42_e12_cvi1.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5642633228840125, "recall": 0.6101694915254238, "f1-score": 0.5863192182410423, "support": 295.0}, "B-MajorClaim": {"precision": 0.6354166666666666, "recall": 0.782051282051282, "f1-score": 0.7011494252873562, "support": 156.0}, "B-Premise": {"precision": 0.7597977243994943, "recall": 0.8266850068775791, "f1-score": 0.7918313570487484, "support": 727.0}, "I-Claim": {"precision": 0.6139733601407389, "recall": 0.589669321747526, "f1-score": 0.6015759665107117, "support": 4143.0}, "I-MajorClaim": {"precision": 0.7746124842899036, "recall": 0.8431372549019608, "f1-score": 0.807423580786026, "support": 2193.0}, "I-Premise": {"precision": 0.8924645531304763, "recall": 0.8918252009870254, "f1-score": 0.8921447625114465, "support": 12563.0}, "O": {"precision": 0.9214748380667663, "recall": 0.9083497053045186, "f1-score": 0.9148651991095721, "support": 10180.0}, "accuracy": 0.8476055127739036, "macro avg": {"precision": 0.7374289927968655, "recall": 0.7788410376279022, "f1-score": 0.7564727870707005, "support": 30257.0}, "weighted avg": {"precision": 0.8478374745512745, "recall": 0.8476055127739036, "f1-score": 0.8474850909404354, "support": 30257.0}}
meta_data/meta_s42_e12_cvi2.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5856697819314641, "recall": 0.5930599369085173, "f1-score": 0.5893416927899686, "support": 317.0}, "B-MajorClaim": {"precision": 0.7151162790697675, "recall": 0.7935483870967742, "f1-score": 0.7522935779816515, "support": 155.0}, "B-Premise": {"precision": 0.7823660714285714, "recall": 0.8517618469015796, "f1-score": 0.8155904595695173, "support": 823.0}, "I-Claim": {"precision": 0.6304700162074555, "recall": 0.626841620626151, "f1-score": 0.6286505829389357, "support": 4344.0}, "I-MajorClaim": {"precision": 0.812900274473925, "recall": 0.8405865657521286, "f1-score": 0.8265116279069767, "support": 2114.0}, "I-Premise": {"precision": 0.8907231555880204, "recall": 0.8961563900933344, "f1-score": 0.8934315126204345, "support": 13607.0}, "O": {"precision": 0.9194622744338138, "recall": 0.8951774554887395, "f1-score": 0.907157366471502, "support": 8481.0}, "accuracy": 0.8477597935726014, "macro avg": {"precision": 0.7623868361618598, "recall": 0.7853046004096036, "f1-score": 0.7732824028969979, "support": 29841.0}, "weighted avg": {"precision": 0.8483512643382002, "recall": 0.8477597935726014, "f1-score": 0.8479369223678976, "support": 29841.0}}
meta_data/meta_s42_e12_cvi3.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5982404692082112, "recall": 0.6017699115044248, "f1-score": 0.6000000000000001, "support": 339.0}, "B-MajorClaim": {"precision": 0.7878787878787878, "recall": 0.8125, "f1-score": 0.8, "support": 160.0}, "B-Premise": {"precision": 0.7573099415204678, "recall": 0.8257173219978746, "f1-score": 0.7900355871886121, "support": 941.0}, "I-Claim": {"precision": 0.6340368208352043, "recall": 0.6011068539804172, "f1-score": 0.6171328671328671, "support": 4698.0}, "I-MajorClaim": {"precision": 0.8382559774964838, "recall": 0.8816568047337278, "f1-score": 0.8594087959625091, "support": 2028.0}, "I-Premise": {"precision": 0.868578964721398, "recall": 0.8863468138079537, "f1-score": 0.8773729434490108, "support": 14861.0}, "O": {"precision": 0.923061863743148, "recall": 0.9004105795856011, "f1-score": 0.9115955338585723, "support": 10473.0}, "accuracy": 0.8455223880597015, "macro avg": {"precision": 0.7724804036291001, "recall": 0.7870726122299999, "f1-score": 0.7793636753702246, "support": 33500.0}, "weighted avg": {"precision": 0.8446376053864566, "recall": 0.8455223880597015, "f1-score": 0.8448589275893504, "support": 33500.0}}
meta_data/meta_s42_e12_cvi4.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.6123188405797102, "recall": 0.6236162361623616, "f1-score": 0.6179159049360147, "support": 271.0}, "B-MajorClaim": {"precision": 0.6358381502890174, "recall": 0.7913669064748201, "f1-score": 0.7051282051282051, "support": 139.0}, "B-Premise": {"precision": 0.7745398773006135, "recall": 0.7977883096366508, "f1-score": 0.7859922178988326, "support": 633.0}, "I-Claim": {"precision": 0.6346679930365581, "recall": 0.6378405398650338, "f1-score": 0.6362503116429817, "support": 4001.0}, "I-MajorClaim": {"precision": 0.7706247019551741, "recall": 0.8027819175360159, "f1-score": 0.7863746958637469, "support": 2013.0}, "I-Premise": {"precision": 0.8815161765981439, "recall": 0.8965243472124206, "f1-score": 0.8889569210583862, "support": 11336.0}, "O": {"precision": 0.940029308984331, "recall": 0.9038586603078257, "f1-score": 0.9215892136818257, "support": 9226.0}, "accuracy": 0.8491980158586481, "macro avg": {"precision": 0.7499335783919354, "recall": 0.7791109881707328, "f1-score": 0.7631724957442847, "support": 27619.0}, "weighted avg": {"precision": 0.8508908939063541, "recall": 0.8491980158586481, "f1-score": 0.8498283285739571, "support": 27619.0}}
meta_data/meta_s42_e13_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5305466237942122, "recall": 0.5809859154929577, "f1-score": 0.5546218487394958, "support": 284.0}, "B-MajorClaim": {"precision": 0.7153284671532847, "recall": 0.6950354609929078, "f1-score": 0.7050359712230215, "support": 141.0}, "B-Premise": {"precision": 0.7493506493506493, "recall": 0.8149717514124294, "f1-score": 0.7807848443843031, "support": 708.0}, "I-Claim": {"precision": 0.5895616924246263, "recall": 0.5707628157959284, "f1-score": 0.5800099700897308, "support": 4077.0}, "I-MajorClaim": {"precision": 0.8096035734226689, "recall": 0.7164031620553359, "f1-score": 0.7601572739187418, "support": 2024.0}, "I-Premise": {"precision": 0.871970952719236, "recall": 0.9031229561805101, "f1-score": 0.88727360346974, "support": 12232.0}, "O": {"precision": 0.9119373776908023, "recall": 0.8972436157276044, "f1-score": 0.9045308269908565, "support": 9868.0}, "accuracy": 0.8358219131383378, "macro avg": {"precision": 0.7397570480793542, "recall": 0.7397893825225248, "f1-score": 0.7389163341165557, "support": 29334.0}, "weighted avg": {"precision": 0.8348436696019514, "recall": 0.8358219131383378, "f1-score": 0.8349361913023641, "support": 29334.0}}
meta_data/meta_s42_e13_cvi1.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5741935483870968, "recall": 0.6033898305084746, "f1-score": 0.5884297520661158, "support": 295.0}, "B-MajorClaim": {"precision": 0.6564102564102564, "recall": 0.8205128205128205, "f1-score": 0.7293447293447293, "support": 156.0}, "B-Premise": {"precision": 0.7581863979848866, "recall": 0.828060522696011, "f1-score": 0.7915844838921761, "support": 727.0}, "I-Claim": {"precision": 0.6114157527417746, "recall": 0.5920830316195993, "f1-score": 0.6015941140404659, "support": 4143.0}, "I-MajorClaim": {"precision": 0.7891566265060241, "recall": 0.836297309621523, "f1-score": 0.8120433916316139, "support": 2193.0}, "I-Premise": {"precision": 0.8920251252285919, "recall": 0.8930191833160869, "f1-score": 0.8925218774860779, "support": 12563.0}, "O": {"precision": 0.9221503235440518, "recall": 0.9099214145383104, "f1-score": 0.9159950556242274, "support": 10180.0}, "accuracy": 0.8486300690749248, "macro avg": {"precision": 0.7433625758289547, "recall": 0.7833263018304036, "f1-score": 0.7616447720122009, "support": 30257.0}, "weighted avg": {"precision": 0.8487525695069351, "recall": 0.8486300690749248, "f1-score": 0.8485191545710714, "support": 30257.0}}
meta_data/meta_s42_e13_cvi2.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.6006600660066007, "recall": 0.5741324921135647, "f1-score": 0.5870967741935484, "support": 317.0}, "B-MajorClaim": {"precision": 0.7167630057803468, "recall": 0.8, "f1-score": 0.7560975609756098, "support": 155.0}, "B-Premise": {"precision": 0.7854748603351955, "recall": 0.8541919805589308, "f1-score": 0.8183934807916182, "support": 823.0}, "I-Claim": {"precision": 0.6409460105112279, "recall": 0.617633517495396, "f1-score": 0.629073856975381, "support": 4344.0}, "I-MajorClaim": {"precision": 0.8405521180390291, "recall": 0.8353831598864712, "f1-score": 0.8379596678529062, "support": 2114.0}, "I-Premise": {"precision": 0.8891789962286046, "recall": 0.9010068347174248, "f1-score": 0.8950538419419602, "support": 13607.0}, "O": {"precision": 0.9091125670041692, "recall": 0.8998938804386275, "f1-score": 0.9044797345342499, "support": 8481.0}, "accuracy": 0.8495023625213632, "macro avg": {"precision": 0.7689553748435962, "recall": 0.7831774093157735, "f1-score": 0.7754507024664676, "support": 29841.0}, "weighted avg": {"precision": 0.8484431569490358, "recall": 0.8495023625213632, "f1-score": 0.8488615147781041, "support": 29841.0}}
meta_data/meta_s42_e13_cvi3.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.6021798365122616, "recall": 0.6519174041297935, "f1-score": 0.6260623229461757, "support": 339.0}, "B-MajorClaim": {"precision": 0.7590361445783133, "recall": 0.7875, "f1-score": 0.7730061349693251, "support": 160.0}, "B-Premise": {"precision": 0.7682317682317682, "recall": 0.8172157279489904, "f1-score": 0.7919670442842431, "support": 941.0}, "I-Claim": {"precision": 0.6284153005464481, "recall": 0.6119625372498936, "f1-score": 0.6200798015744635, "support": 4698.0}, "I-MajorClaim": {"precision": 0.8337236533957846, "recall": 0.8777120315581854, "f1-score": 0.8551525342301226, "support": 2028.0}, "I-Premise": {"precision": 0.8702178533475027, "recall": 0.8816364982168091, "f1-score": 0.8758899622288332, "support": 14861.0}, "O": {"precision": 0.9249019607843137, "recall": 0.9007925140838347, "f1-score": 0.9126880472113386, "support": 10473.0}, "accuracy": 0.8449850746268657, "macro avg": {"precision": 0.7695295024851988, "recall": 0.7898195304553582, "f1-score": 0.7792636924920718, "support": 33500.0}, "weighted avg": {"precision": 0.8450860670615373, "recall": 0.8449850746268657, "f1-score": 0.8448872833459066, "support": 33500.0}}
meta_data/meta_s42_e13_cvi4.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.6085271317829457, "recall": 0.5793357933579336, "f1-score": 0.5935727788279772, "support": 271.0}, "B-MajorClaim": {"precision": 0.6473988439306358, "recall": 0.8057553956834532, "f1-score": 0.717948717948718, "support": 139.0}, "B-Premise": {"precision": 0.7566371681415929, "recall": 0.8104265402843602, "f1-score": 0.782608695652174, "support": 633.0}, "I-Claim": {"precision": 0.6461824953445066, "recall": 0.6070982254436391, "f1-score": 0.6260309278350515, "support": 4001.0}, "I-MajorClaim": {"precision": 0.7872444011684518, "recall": 0.8032786885245902, "f1-score": 0.7951807228915663, "support": 2013.0}, "I-Premise": {"precision": 0.8729847308709374, "recall": 0.902787579393084, "f1-score": 0.8876360640097141, "support": 11336.0}, "O": {"precision": 0.9370403387564074, "recall": 0.9114459137220897, "f1-score": 0.924065934065934, "support": 9226.0}, "accuracy": 0.8498135341612658, "macro avg": {"precision": 0.7508593014279253, "recall": 0.7743040194870214, "f1-score": 0.7610062630330193, "support": 27619.0}, "weighted avg": {"precision": 0.8488808008037289, "recall": 0.8498135341612658, "f1-score": 0.849023051738306, "support": 27619.0}}
meta_data/meta_s42_e14_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5419354838709678, "recall": 0.5915492957746479, "f1-score": 0.5656565656565656, "support": 284.0}, "B-MajorClaim": {"precision": 0.7285714285714285, "recall": 0.723404255319149, "f1-score": 0.7259786476868327, "support": 141.0}, "B-Premise": {"precision": 0.7503250975292588, "recall": 0.8149717514124294, "f1-score": 0.7813134732566014, "support": 708.0}, "I-Claim": {"precision": 0.5926021727884118, "recall": 0.5619327937208732, "f1-score": 0.5768601284149566, "support": 4077.0}, "I-MajorClaim": {"precision": 0.7985193019566367, "recall": 0.7460474308300395, "f1-score": 0.7713920817369093, "support": 2024.0}, "I-Premise": {"precision": 0.8698108765596798, "recall": 0.9061478090255068, "f1-score": 0.8876076076076077, "support": 12232.0}, "O": {"precision": 0.9144045761830473, "recall": 0.8909606809890556, "f1-score": 0.9025304111276498, "support": 9868.0}, "accuracy": 0.8360264539442286, "macro avg": {"precision": 0.7423098482084901, "recall": 0.7478591452959574, "f1-score": 0.7444769879267319, "support": 29334.0}, "weighted avg": {"precision": 0.8346281292482969, "recall": 0.8360264539442286, "f1-score": 0.8349601848804515, "support": 29334.0}}
meta_data/meta_s42_e14_cvi1.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.56, "recall": 0.6169491525423729, "f1-score": 0.5870967741935484, "support": 295.0}, "B-MajorClaim": {"precision": 0.671957671957672, "recall": 0.8141025641025641, "f1-score": 0.736231884057971, "support": 156.0}, "B-Premise": {"precision": 0.7611749680715197, "recall": 0.8198074277854195, "f1-score": 0.7894039735099337, "support": 727.0}, "I-Claim": {"precision": 0.6131707317073171, "recall": 0.6068066618392469, "f1-score": 0.6099720975373044, "support": 4143.0}, "I-MajorClaim": {"precision": 0.8008714596949891, "recall": 0.8381212950296397, "f1-score": 0.8190730837789661, "support": 2193.0}, "I-Premise": {"precision": 0.891461764939922, "recall": 0.8917456021650879, "f1-score": 0.8916036609629925, "support": 12563.0}, "O": {"precision": 0.9253850770154031, "recall": 0.9088408644400786, "f1-score": 0.9170383586083855, "support": 10180.0}, "accuracy": 0.849786826188981, "macro avg": {"precision": 0.7462888104838319, "recall": 0.7851962239863443, "f1-score": 0.7643456903784431, "support": 30257.0}, "weighted avg": {"precision": 0.8507099609394506, "recall": 0.849786826188981, "f1-score": 0.850115720896904, "support": 30257.0}}
meta_data/meta_s42_e14_cvi2.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5758513931888545, "recall": 0.5867507886435331, "f1-score": 0.58125, "support": 317.0}, "B-MajorClaim": {"precision": 0.72, "recall": 0.8129032258064516, "f1-score": 0.7636363636363636, "support": 155.0}, "B-Premise": {"precision": 0.7849099099099099, "recall": 0.8469015795868773, "f1-score": 0.814728229105786, "support": 823.0}, "I-Claim": {"precision": 0.625531914893617, "recall": 0.6429558011049724, "f1-score": 0.6341241911681235, "support": 4344.0}, "I-MajorClaim": {"precision": 0.8161496350364964, "recall": 0.8462630085146642, "f1-score": 0.8309335810496981, "support": 2114.0}, "I-Premise": {"precision": 0.8961986149992633, "recall": 0.894025134122143, "f1-score": 0.8951105551672124, "support": 13607.0}, "O": {"precision": 0.9208414396887159, "recall": 0.8929371536375428, "f1-score": 0.9066746483088896, "support": 8481.0}, "accuracy": 0.848798632753594, "macro avg": {"precision": 0.7627832725309797, "recall": 0.7889623844880262, "f1-score": 0.7752082240622962, "support": 29841.0}, "weighted avg": {"precision": 0.8507425193042031, "recall": 0.848798632753594, "f1-score": 0.849624587385109, "support": 29841.0}}
meta_data/meta_s42_e14_cvi3.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.6080691642651297, "recall": 0.6224188790560472, "f1-score": 0.6151603498542274, "support": 339.0}, "B-MajorClaim": {"precision": 0.7687861271676301, "recall": 0.83125, "f1-score": 0.7987987987987989, "support": 160.0}, "B-Premise": {"precision": 0.7608267716535433, "recall": 0.8214665249734325, "f1-score": 0.7899846704138989, "support": 941.0}, "I-Claim": {"precision": 0.6311366651923928, "recall": 0.6074925500212857, "f1-score": 0.6190889370932755, "support": 4698.0}, "I-MajorClaim": {"precision": 0.8506069094304388, "recall": 0.8984220907297831, "f1-score": 0.8738609112709832, "support": 2028.0}, "I-Premise": {"precision": 0.8681159420289855, "recall": 0.8867505551443375, "f1-score": 0.8773343097766386, "support": 14861.0}, "O": {"precision": 0.9317193675889328, "recall": 0.9003150959610426, "f1-score": 0.9157480697324334, "support": 10473.0}, "accuracy": 0.8477611940298507, "macro avg": {"precision": 0.7741801353324362, "recall": 0.7954450994122756, "f1-score": 0.7842822924200367, "support": 33500.0}, "weighted avg": {"precision": 0.8475868070390783, "recall": 0.8477611940298507, "f1-score": 0.8474354390354637, "support": 33500.0}}
meta_data/meta_s42_e14_cvi4.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.608058608058608, "recall": 0.6125461254612546, "f1-score": 0.6102941176470588, "support": 271.0}, "B-MajorClaim": {"precision": 0.650887573964497, "recall": 0.7913669064748201, "f1-score": 0.7142857142857143, "support": 139.0}, "B-Premise": {"precision": 0.77526395173454, "recall": 0.8120063191153238, "f1-score": 0.7932098765432098, "support": 633.0}, "I-Claim": {"precision": 0.6432855280312908, "recall": 0.6165958510372407, "f1-score": 0.6296579887697805, "support": 4001.0}, "I-MajorClaim": {"precision": 0.7802516940948693, "recall": 0.8007948335817189, "f1-score": 0.7903898014219172, "support": 2013.0}, "I-Premise": {"precision": 0.8767545361177679, "recall": 0.9036697247706422, "f1-score": 0.8900086880973067, "support": 11336.0}, "O": {"precision": 0.9373950050397581, "recall": 0.9072187296769998, "f1-score": 0.9220600385568715, "support": 9226.0}, "accuracy": 0.8502480176689959, "macro avg": {"precision": 0.7531281281487615, "recall": 0.7777426414454286, "f1-score": 0.7642723179031227, "support": 27619.0}, "weighted avg": {"precision": 0.8500571031828417, "recall": 0.8502480176689959, "f1-score": 0.8498916673068139, "support": 27619.0}}
meta_data/meta_s42_e15_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5235109717868338, "recall": 0.5880281690140845, "f1-score": 0.5538971807628524, "support": 284.0}, "B-MajorClaim": {"precision": 0.704225352112676, "recall": 0.7092198581560284, "f1-score": 0.7067137809187278, "support": 141.0}, "B-Premise": {"precision": 0.7470967741935484, "recall": 0.8177966101694916, "f1-score": 0.7808496291301417, "support": 708.0}, "I-Claim": {"precision": 0.5874384236453202, "recall": 0.5849889624724062, "f1-score": 0.5862111343246897, "support": 4077.0}, "I-MajorClaim": {"precision": 0.775950668036999, "recall": 0.7460474308300395, "f1-score": 0.760705289672544, "support": 2024.0}, "I-Premise": {"precision": 0.8748707342295761, "recall": 0.8991170699803793, "f1-score": 0.8868282062653711, "support": 12232.0}, "O": {"precision": 0.9163953366243042, "recall": 0.8841710579651398, "f1-score": 0.899994842436433, "support": 9868.0}, "accuracy": 0.8339810458853207, "macro avg": {"precision": 0.732784037232751, "recall": 0.7470527369410813, "f1-score": 0.7393142947872514, "support": 29334.0}, "weighted avg": {"precision": 0.8347595287031446, "recall": 0.8339810458853207, "f1-score": 0.8341268495605801, "support": 29334.0}}
meta_data/meta_s42_e15_cvi1.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5936507936507937, "recall": 0.6338983050847458, "f1-score": 0.6131147540983606, "support": 295.0}, "B-MajorClaim": {"precision": 0.6632124352331606, "recall": 0.8205128205128205, "f1-score": 0.7335243553008597, "support": 156.0}, "B-Premise": {"precision": 0.7657430730478589, "recall": 0.8363136176066025, "f1-score": 0.7994740302432611, "support": 727.0}, "I-Claim": {"precision": 0.6200743494423792, "recall": 0.6039102099927589, "f1-score": 0.611885546588408, "support": 4143.0}, "I-MajorClaim": {"precision": 0.789873417721519, "recall": 0.853625170998632, "f1-score": 0.8205128205128205, "support": 2193.0}, "I-Premise": {"precision": 0.8934943944418128, "recall": 0.9008198678659556, "f1-score": 0.8971421776527012, "support": 12563.0}, "O": {"precision": 0.933630109267503, "recall": 0.906483300589391, "f1-score": 0.9198564593301436, "support": 10180.0}, "accuracy": 0.8540833526126186, "macro avg": {"precision": 0.751382653257861, "recall": 0.7936518989501294, "f1-score": 0.7707871633895078, "support": 30257.0}, "weighted avg": {"precision": 0.8548689018292591, "recall": 0.8540833526126186, "f1-score": 0.8542115424729795, "support": 30257.0}}
meta_data/meta_s42_e15_cvi2.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5899053627760252, "recall": 0.5899053627760252, "f1-score": 0.5899053627760252, "support": 317.0}, "B-MajorClaim": {"precision": 0.7426900584795322, "recall": 0.8193548387096774, "f1-score": 0.7791411042944786, "support": 155.0}, "B-Premise": {"precision": 0.7868480725623582, "recall": 0.8432563791008505, "f1-score": 0.8140762463343107, "support": 823.0}, "I-Claim": {"precision": 0.6322775966464834, "recall": 0.625, "f1-score": 0.6286177355869413, "support": 4344.0}, "I-MajorClaim": {"precision": 0.8225430833721472, "recall": 0.8353831598864712, "f1-score": 0.8289134006101855, "support": 2114.0}, "I-Premise": {"precision": 0.8930642201834862, "recall": 0.8942456088777835, "f1-score": 0.8936545240893067, "support": 13607.0}, "O": {"precision": 0.9061273051754908, "recall": 0.8980073104586723, "f1-score": 0.9020490347033044, "support": 8481.0}, "accuracy": 0.8469220200395429, "macro avg": {"precision": 0.7676365284565033, "recall": 0.786450379972783, "f1-score": 0.7766224869135074, "support": 29841.0}, "weighted avg": {"precision": 0.8468869474915125, "recall": 0.8469220200395429, "f1-score": 0.8468558348172082, "support": 29841.0}}
meta_data/meta_s42_e15_cvi3.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.6197604790419161, "recall": 0.6106194690265486, "f1-score": 0.6151560178306091, "support": 339.0}, "B-MajorClaim": {"precision": 0.7941176470588235, "recall": 0.84375, "f1-score": 0.8181818181818182, "support": 160.0}, "B-Premise": {"precision": 0.758220502901354, "recall": 0.8331562167906482, "f1-score": 0.7939240506329114, "support": 941.0}, "I-Claim": {"precision": 0.6368613138686131, "recall": 0.5942954448701575, "f1-score": 0.6148425456947809, "support": 4698.0}, "I-MajorClaim": {"precision": 0.8533653846153846, "recall": 0.8752465483234714, "f1-score": 0.8641674780915287, "support": 2028.0}, "I-Premise": {"precision": 0.8667277007585665, "recall": 0.891864612071866, "f1-score": 0.8791165058203164, "support": 14861.0}, "O": {"precision": 0.9239662943366647, "recall": 0.9004105795856011, "f1-score": 0.9120363653948451, "support": 10473.0}, "accuracy": 0.8470746268656716, "macro avg": {"precision": 0.7790027603687603, "recall": 0.7927632672383275, "f1-score": 0.7853463973781157, "support": 33500.0}, "weighted avg": {"precision": 0.8456830427841937, "recall": 0.8470746268656716, "f1-score": 0.8460865279289216, "support": 33500.0}}
meta_data/meta_s42_e15_cvi4.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5985130111524164, "recall": 0.5940959409594095, "f1-score": 0.5962962962962963, "support": 271.0}, "B-MajorClaim": {"precision": 0.6707317073170732, "recall": 0.7913669064748201, "f1-score": 0.7260726072607261, "support": 139.0}, "B-Premise": {"precision": 0.7566765578635015, "recall": 0.8056872037914692, "f1-score": 0.7804131599081868, "support": 633.0}, "I-Claim": {"precision": 0.6212043232115285, "recall": 0.6033491627093227, "f1-score": 0.6121465703055661, "support": 4001.0}, "I-MajorClaim": {"precision": 0.7810361681329423, "recall": 0.793840039741679, "f1-score": 0.7873860556787385, "support": 2013.0}, "I-Premise": {"precision": 0.8689666893269884, "recall": 0.9020818630910374, "f1-score": 0.8852146814404431, "support": 11336.0}, "O": {"precision": 0.9395142986836132, "recall": 0.8973553002384566, "f1-score": 0.9179509923494844, "support": 9226.0}, "accuracy": 0.8435497302581556, "macro avg": {"precision": 0.748091822241152, "recall": 0.7696823452865992, "f1-score": 0.7579257661770632, "support": 27619.0}, "weighted avg": {"precision": 0.8440071909900311, "recall": 0.8435497302581556, "f1-score": 0.8434243803550634, "support": 27619.0}}
meta_data/meta_s42_e16_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5466666666666666, "recall": 0.5774647887323944, "f1-score": 0.5616438356164384, "support": 284.0}, "B-MajorClaim": {"precision": 0.7299270072992701, "recall": 0.7092198581560284, "f1-score": 0.7194244604316545, "support": 141.0}, "B-Premise": {"precision": 0.754863813229572, "recall": 0.8220338983050848, "f1-score": 0.7870182555780934, "support": 708.0}, "I-Claim": {"precision": 0.5857645875251509, "recall": 0.5712533725778759, "f1-score": 0.5784179808766918, "support": 4077.0}, "I-MajorClaim": {"precision": 0.7987048030221263, "recall": 0.7312252964426877, "f1-score": 0.7634769151405727, "support": 2024.0}, "I-Premise": {"precision": 0.8739856801909308, "recall": 0.8981360366252452, "f1-score": 0.8858962986855898, "support": 12232.0}, "O": {"precision": 0.9090161406394571, "recall": 0.896027563842724, "f1-score": 0.9024751212043889, "support": 9868.0}, "accuracy": 0.8346287584373082, "macro avg": {"precision": 0.7427040997961676, "recall": 0.7436229735260057, "f1-score": 0.7426218382190614, "support": 29334.0}, "weighted avg": {"precision": 0.8337806464072924, "recall": 0.8346287584373082, "f1-score": 0.8339657063145517, "support": 29334.0}}
meta_data/meta_s42_e4_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.45161290322580644, "recall": 0.19718309859154928, "f1-score": 0.2745098039215686, "support": 284.0}, "B-MajorClaim": {"precision": 0.0, "recall": 0.0, "f1-score": 0.0, "support": 141.0}, "B-Premise": {"precision": 0.6189495365602472, "recall": 0.8488700564971752, "f1-score": 0.7159023228111973, "support": 708.0}, "I-Claim": {"precision": 0.5432489451476793, "recall": 0.5052734854059358, "f1-score": 0.5235735163299021, "support": 4077.0}, "I-MajorClaim": {"precision": 0.612832824094282, "recall": 0.6936758893280632, "f1-score": 0.6507531865585168, "support": 2024.0}, "I-Premise": {"precision": 0.8590851933354291, "recall": 0.8936396337475474, "f1-score": 0.8760217983651226, "support": 12232.0}, "O": {"precision": 0.9105173876166243, "recall": 0.8702877989460883, "f1-score": 0.8899481865284974, "support": 9868.0}, "accuracy": 0.8058907752096544, "macro avg": {"precision": 0.5708923985685811, "recall": 0.5727042803594798, "f1-score": 0.5615298306449722, "support": 29334.0}, "weighted avg": {"precision": 0.8016291534606435, "recall": 0.8058907752096544, "f1-score": 0.802279288429839, "support": 29334.0}}
meta_data/meta_s42_e4_cvi1.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.44019138755980863, "recall": 0.31186440677966104, "f1-score": 0.3650793650793651, "support": 295.0}, "B-MajorClaim": {"precision": 0.25, "recall": 0.00641025641025641, "f1-score": 0.0125, "support": 156.0}, "B-Premise": {"precision": 0.6536842105263158, "recall": 0.8541953232462174, "f1-score": 0.740608228980322, "support": 727.0}, "I-Claim": {"precision": 0.5942634363686995, "recall": 0.5150856867004586, "f1-score": 0.5518489785363331, "support": 4143.0}, "I-MajorClaim": {"precision": 0.6617754952311079, "recall": 0.8226174190606476, "f1-score": 0.7334824151250255, "support": 2193.0}, "I-Premise": {"precision": 0.8746502953061859, "recall": 0.8958847409058346, "f1-score": 0.8851401832409265, "support": 12563.0}, "O": {"precision": 0.9236048037137955, "recall": 0.8990176817288802, "f1-score": 0.911145402956842, "support": 10180.0}, "accuracy": 0.8282050434610173, "macro avg": {"precision": 0.6283099469579875, "recall": 0.6150107878331365, "f1-score": 0.5999720819884019, "support": 30257.0}, "weighted avg": {"precision": 0.8245338440704025, "recall": 0.8282050434610173, "f1-score": 0.8242186658878515, "support": 30257.0}}
meta_data/meta_s42_e4_cvi2.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.5431034482758621, "recall": 0.19873817034700317, "f1-score": 0.2909930715935335, "support": 317.0}, "B-MajorClaim": {"precision": 0.0, "recall": 0.0, "f1-score": 0.0, "support": 155.0}, "B-Premise": {"precision": 0.6691519105312209, "recall": 0.8724179829890644, "f1-score": 0.7573839662447258, "support": 823.0}, "I-Claim": {"precision": 0.6121104636432734, "recall": 0.5561694290976059, "f1-score": 0.5828006271861054, "support": 4344.0}, "I-MajorClaim": {"precision": 0.6880234505862647, "recall": 0.7771996215704825, "f1-score": 0.7298978231896935, "support": 2114.0}, "I-Premise": {"precision": 0.8733275787656453, "recall": 0.8922613360770192, "f1-score": 0.8826929368570287, "support": 13607.0}, "O": {"precision": 0.8817587641117053, "recall": 0.8748968282042212, "f1-score": 0.878314393939394, "support": 8481.0}, "accuracy": 0.8177004792064609, "macro avg": {"precision": 0.609639373701996, "recall": 0.5959547668979138, "f1-score": 0.5888689741443545, "support": 29841.0}, "weighted avg": {"precision": 0.8108954018555643, "recall": 0.8177004792064609, "f1-score": 0.8126419656662848, "support": 29841.0}}
meta_data/meta_s42_e4_cvi3.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.3854166666666667, "recall": 0.10914454277286136, "f1-score": 0.17011494252873563, "support": 339.0}, "B-MajorClaim": {"precision": 0.0, "recall": 0.0, "f1-score": 0.0, "support": 160.0}, "B-Premise": {"precision": 0.6150234741784038, "recall": 0.8352816153028693, "f1-score": 0.7084272194682288, "support": 941.0}, "I-Claim": {"precision": 0.6174849699398798, "recall": 0.5246913580246914, "f1-score": 0.567318757192175, "support": 4698.0}, "I-MajorClaim": {"precision": 0.6872174270448007, "recall": 0.8244575936883629, "f1-score": 0.749607711275499, "support": 2028.0}, "I-Premise": {"precision": 0.8566045091287116, "recall": 0.8871542964807213, "f1-score": 0.8716117942615363, "support": 14861.0}, "O": {"precision": 0.8987292656901736, "recall": 0.884655781533467, "f1-score": 0.8916369935521125, "support": 10473.0}, "accuracy": 0.8181791044776119, "macro avg": {"precision": 0.5800680446640909, "recall": 0.5807693125432819, "f1-score": 0.5655310597540409, "support": 33500.0}, "weighted avg": {"precision": 0.8103404740227241, "recall": 0.8181791044776119, "f1-score": 0.8119672849786376, "support": 33500.0}}
meta_data/meta_s42_e4_cvi4.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.49019607843137253, "recall": 0.18450184501845018, "f1-score": 0.2680965147453083, "support": 271.0}, "B-MajorClaim": {"precision": 0.0, "recall": 0.0, "f1-score": 0.0, "support": 139.0}, "B-Premise": {"precision": 0.6389937106918239, "recall": 0.8025276461295419, "f1-score": 0.711484593837535, "support": 633.0}, "I-Claim": {"precision": 0.5901814300960512, "recall": 0.5528617845538615, "f1-score": 0.5709123757904246, "support": 4001.0}, "I-MajorClaim": {"precision": 0.6800699300699301, "recall": 0.7729756582215599, "f1-score": 0.723552662171588, "support": 2013.0}, "I-Premise": {"precision": 0.8548009367681498, "recall": 0.9015525758645024, "f1-score": 0.8775545251588528, "support": 11336.0}, "O": {"precision": 0.9404352806414662, "recall": 0.889876436158682, "f1-score": 0.9144575629316106, "support": 9226.0}, "accuracy": 0.8239255584923423, "macro avg": {"precision": 0.5992396238141133, "recall": 0.5863279922780854, "f1-score": 0.5808654620907598, "support": 27619.0}, "weighted avg": {"precision": 0.8195120078775411, "recall": 0.8239255584923423, "f1-score": 0.8200332887031329, "support": 27619.0}}
meta_data/meta_s42_e5_cvi0.json ADDED
@@ -0,0 +1 @@
+ {"B-Claim": {"precision": 0.44933920704845814, "recall": 0.3591549295774648, "f1-score": 0.3992172211350294, "support": 284.0}, "B-MajorClaim": {"precision": 0.8333333333333334, "recall": 0.03546099290780142, "f1-score": 0.06802721088435375, "support": 141.0}, "B-Premise": {"precision": 0.6659364731653888, "recall": 0.8587570621468926, "f1-score": 0.7501542257865514, "support": 708.0}, "I-Claim": {"precision": 0.5467836257309941, "recall": 0.5045376502330144, "f1-score": 0.5248118382446739, "support": 4077.0}, "I-MajorClaim": {"precision": 0.6616847826086957, "recall": 0.7218379446640316, "f1-score": 0.6904536862003781, "support": 2024.0}, "I-Premise": {"precision": 0.8573203883495145, "recall": 0.9023871811641596, "f1-score": 0.8792766957422232, "support": 12232.0}, "O": {"precision": 0.9172642620143423, "recall": 0.8684637211187677, "f1-score": 0.8921971786997033, "support": 9868.0}, "accuracy": 0.8127428922069952, "macro avg": {"precision": 0.7045231531786753, "recall": 0.6072284974017332, "f1-score": 0.6005911509561305, "support": 29334.0}, "weighted avg": {"precision": 0.812142528388795, "recall": 0.8127428922069952, "f1-score": 0.8096655466869356, "support": 29334.0}}