lilyyellow committed (verified)
Commit 8eaaf8f · 1 Parent(s): 2c07d35

End of training

Files changed (4)
  1. README.md +26 -29
  2. config.json +1 -1
  3. model.safetensors +1 -1
  4. training_args.bin +1 -1
README.md CHANGED
@@ -1,5 +1,5 @@
  ---
- base_model: NlpHUST/ner-vietnamese-electra-base
  tags:
  - generated_from_trainer
  model-index:
@@ -12,27 +12,27 @@ should probably proofread and complete it, then remove this comment. -->

  # my_awesome_ner-token_classification_v1.0.7-5

- This model is a fine-tuned version of [NlpHUST/ner-vietnamese-electra-base](https://huggingface.co/NlpHUST/ner-vietnamese-electra-base) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.4278
- - Age: {'precision': 0.8680555555555556, 'recall': 0.946969696969697, 'f1': 0.9057971014492754, 'number': 132}
- - Datetime: {'precision': 0.7073403241182078, 'recall': 0.7540650406504065, 'f1': 0.7299557304476143, 'number': 984}
- - Disease: {'precision': 0.684981684981685, 'recall': 0.6607773851590106, 'f1': 0.6726618705035972, 'number': 283}
- - Event: {'precision': 0.30670926517571884, 'recall': 0.36363636363636365, 'f1': 0.3327556325823223, 'number': 264}
- - Gender: {'precision': 0.7583333333333333, 'recall': 0.7982456140350878, 'f1': 0.7777777777777778, 'number': 114}
- - Law: {'precision': 0.5238095238095238, 'recall': 0.6956521739130435, 'f1': 0.597623089983022, 'number': 253}
- - Location: {'precision': 0.6987080103359173, 'recall': 0.7392017495899399, 'f1': 0.7183846971307121, 'number': 1829}
- - Organization: {'precision': 0.6421254801536491, 'recall': 0.713371266002845, 'f1': 0.6758760107816711, 'number': 1406}
- - Person: {'precision': 0.6771269177126917, 'recall': 0.7273408239700374, 'f1': 0.701336222462983, 'number': 1335}
- - Phone: {'precision': 0.8314606741573034, 'recall': 0.9487179487179487, 'f1': 0.8862275449101796, 'number': 78}
- - Product: {'precision': 0.3771186440677966, 'recall': 0.34765625, 'f1': 0.36178861788617883, 'number': 256}
- - Quantity: {'precision': 0.5411184210526315, 'recall': 0.6047794117647058, 'f1': 0.5711805555555554, 'number': 544}
- - Role: {'precision': 0.4329896907216495, 'recall': 0.48554913294797686, 'f1': 0.45776566757493187, 'number': 519}
- - Transportation: {'precision': 0.4880952380952381, 'recall': 0.5942028985507246, 'f1': 0.5359477124183006, 'number': 138}
- - Overall Precision: 0.6293
- - Overall Recall: 0.6846
- - Overall F1: 0.6558
- - Overall Accuracy: 0.8864
 
  ## Model description
 
@@ -57,17 +57,14 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: cosine
- - num_epochs: 10

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Age | Datetime | Disease | Event | Gender | Law | Location | Organization | Person | Phone | Product | Quantity | Role | Transportation | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:------:|:-----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.3253 | 1.9991 | 2313 | 0.3476 | {'precision': 0.8611111111111112, 'recall': 0.9393939393939394, 'f1': 0.8985507246376813, 'number': 132} | {'precision': 0.7020057306590258, 'recall': 0.7469512195121951, 'f1': 0.723781388478582, 'number': 984} | {'precision': 0.6902985074626866, 'recall': 0.6537102473498233, 'f1': 0.6715063520871144, 'number': 283} | {'precision': 0.3137931034482759, 'recall': 0.3446969696969697, 'f1': 0.3285198555956678, 'number': 264} | {'precision': 0.6870229007633588, 'recall': 0.7894736842105263, 'f1': 0.7346938775510204, 'number': 114} | {'precision': 0.5159235668789809, 'recall': 0.6403162055335968, 'f1': 0.5714285714285714, 'number': 253} | {'precision': 0.6962690488702049, 'recall': 0.7244395844723893, 'f1': 0.710075026795284, 'number': 1829} | {'precision': 0.6082603254067585, 'recall': 0.6913229018492176, 'f1': 0.6471371504660454, 'number': 1406} | {'precision': 0.7044444444444444, 'recall': 0.7123595505617978, 'f1': 0.7083798882681565, 'number': 1335} | {'precision': 0.8045977011494253, 'recall': 0.8974358974358975, 'f1': 0.8484848484848485, 'number': 78} | {'precision': 0.37790697674418605, 'recall': 0.25390625, 'f1': 0.3037383177570093, 'number': 256} | {'precision': 0.549645390070922, 'recall': 0.5698529411764706, 'f1': 0.5595667870036101, 'number': 544} | {'precision': 0.4404332129963899, 'recall': 0.4701348747591522, 'f1': 0.45479962721342027, 'number': 519} | {'precision': 0.5060240963855421, 'recall': 0.6086956521739131, 'f1': 0.5526315789473684, 'number': 138} | 0.6297 | 0.6648 | 0.6468 | 0.8909 |
- | 0.2531 | 3.9983 | 4626 | 0.3606 | {'precision': 0.8378378378378378, 'recall': 0.9393939393939394, 'f1': 0.8857142857142858, 'number': 132} | {'precision': 0.6869728209934396, 'recall': 0.7449186991869918, 'f1': 0.7147732813261823, 'number': 984} | {'precision': 0.6821428571428572, 'recall': 0.6749116607773852, 'f1': 0.6785079928952042, 'number': 283} | {'precision': 0.3225806451612903, 'recall': 0.3787878787878788, 'f1': 0.3484320557491289, 'number': 264} | {'precision': 0.7652173913043478, 'recall': 0.7719298245614035, 'f1': 0.7685589519650655, 'number': 114} | {'precision': 0.5160349854227405, 'recall': 0.6996047430830039, 'f1': 0.593959731543624, 'number': 253} | {'precision': 0.7030271398747391, 'recall': 0.7364680153089119, 'f1': 0.7193591455273699, 'number': 1829} | {'precision': 0.6185252894576477, 'recall': 0.7219061166429588, 'f1': 0.6662290777814243, 'number': 1406} | {'precision': 0.6882649388048956, 'recall': 0.7161048689138577, 'f1': 0.7019089574155654, 'number': 1335} | {'precision': 0.8780487804878049, 'recall': 0.9230769230769231, 'f1': 0.9, 'number': 78} | {'precision': 0.36, 'recall': 0.31640625, 'f1': 0.3367983367983368, 'number': 256} | {'precision': 0.5583333333333333, 'recall': 0.6158088235294118, 'f1': 0.5856643356643356, 'number': 544} | {'precision': 0.4087237479806139, 'recall': 0.48747591522157996, 'f1': 0.4446397188049209, 'number': 519} | {'precision': 0.5369127516778524, 'recall': 0.5797101449275363, 'f1': 0.5574912891986062, 'number': 138} | 0.6249 | 0.6825 | 0.6524 | 0.8891 |
- | 0.1949 | 5.9974 | 6939 | 0.3906 | {'precision': 0.8551724137931035, 'recall': 0.9393939393939394, 'f1': 0.8953068592057762, 'number': 132} | {'precision': 0.7107039537126326, 'recall': 0.7489837398373984, 'f1': 0.7293419099455716, 'number': 984} | {'precision': 0.7193675889328063, 'recall': 0.6431095406360424, 'f1': 0.6791044776119403, 'number': 283} | {'precision': 0.3208955223880597, 'recall': 0.32575757575757575, 'f1': 0.3233082706766917, 'number': 264} | {'precision': 0.75, 'recall': 0.8157894736842105, 'f1': 0.7815126050420167, 'number': 114} | {'precision': 0.502906976744186, 'recall': 0.6837944664031621, 'f1': 0.5795644891122279, 'number': 253} | {'precision': 0.6926167754897037, 'recall': 0.7539639147074905, 'f1': 0.7219895287958116, 'number': 1829} | {'precision': 0.6372299872935197, 'recall': 0.713371266002845, 'f1': 0.6731543624161074, 'number': 1406} | {'precision': 0.6635198921105866, 'recall': 0.7370786516853932, 'f1': 0.6983676366217175, 'number': 1335} | {'precision': 0.8522727272727273, 'recall': 0.9615384615384616, 'f1': 0.9036144578313254, 'number': 78} | {'precision': 0.34901960784313724, 'recall': 0.34765625, 'f1': 0.34833659491193736, 'number': 256} | {'precision': 0.5363489499192245, 'recall': 0.6102941176470589, 'f1': 0.5709372312983664, 'number': 544} | {'precision': 0.4283276450511945, 'recall': 0.4836223506743738, 'f1': 0.45429864253393665, 'number': 519} | {'precision': 0.5185185185185185, 'recall': 0.6086956521739131, 'f1': 0.5599999999999999, 'number': 138} | 0.6263 | 0.6874 | 0.6554 | 0.8879 |
- | 0.1471 | 7.9965 | 9252 | 0.4196 | {'precision': 0.8551724137931035, 'recall': 0.9393939393939394, 'f1': 0.8953068592057762, 'number': 132} | {'precision': 0.697282099343955, 'recall': 0.7560975609756098, 'f1': 0.7254997562164799, 'number': 984} | {'precision': 0.6859205776173285, 'recall': 0.6713780918727915, 'f1': 0.6785714285714286, 'number': 283} | {'precision': 0.29941860465116277, 'recall': 0.39015151515151514, 'f1': 0.3388157894736842, 'number': 264} | {'precision': 0.736, 'recall': 0.8070175438596491, 'f1': 0.7698744769874476, 'number': 114} | {'precision': 0.5545454545454546, 'recall': 0.7233201581027668, 'f1': 0.6277873070325901, 'number': 253} | {'precision': 0.695852534562212, 'recall': 0.7430289775833789, 'f1': 0.7186673717609731, 'number': 1829} | {'precision': 0.645618556701031, 'recall': 0.7126600284495022, 'f1': 0.6774847870182555, 'number': 1406} | {'precision': 0.6722106722106722, 'recall': 0.7265917602996255, 'f1': 0.6983441324694024, 'number': 1335} | {'precision': 0.8409090909090909, 'recall': 0.9487179487179487, 'f1': 0.891566265060241, 'number': 78} | {'precision': 0.3826086956521739, 'recall': 0.34375, 'f1': 0.36213991769547327, 'number': 256} | {'precision': 0.5398373983739837, 'recall': 0.6102941176470589, 'f1': 0.5729076790336497, 'number': 544} | {'precision': 0.43791946308724833, 'recall': 0.5028901734104047, 'f1': 0.46816143497757845, 'number': 519} | {'precision': 0.5, 'recall': 0.6086956521739131, 'f1': 0.5490196078431373, 'number': 138} | 0.6276 | 0.6891 | 0.6569 | 0.8858 |
- | 0.1387 | 9.9957 | 11565 | 0.4278 | {'precision': 0.8680555555555556, 'recall': 0.946969696969697, 'f1': 0.9057971014492754, 'number': 132} | {'precision': 0.7073403241182078, 'recall': 0.7540650406504065, 'f1': 0.7299557304476143, 'number': 984} | {'precision': 0.684981684981685, 'recall': 0.6607773851590106, 'f1': 0.6726618705035972, 'number': 283} | {'precision': 0.30670926517571884, 'recall': 0.36363636363636365, 'f1': 0.3327556325823223, 'number': 264} | {'precision': 0.7583333333333333, 'recall': 0.7982456140350878, 'f1': 0.7777777777777778, 'number': 114} | {'precision': 0.5238095238095238, 'recall': 0.6956521739130435, 'f1': 0.597623089983022, 'number': 253} | {'precision': 0.6987080103359173, 'recall': 0.7392017495899399, 'f1': 0.7183846971307121, 'number': 1829} | {'precision': 0.6421254801536491, 'recall': 0.713371266002845, 'f1': 0.6758760107816711, 'number': 1406} | {'precision': 0.6771269177126917, 'recall': 0.7273408239700374, 'f1': 0.701336222462983, 'number': 1335} | {'precision': 0.8314606741573034, 'recall': 0.9487179487179487, 'f1': 0.8862275449101796, 'number': 78} | {'precision': 0.3771186440677966, 'recall': 0.34765625, 'f1': 0.36178861788617883, 'number': 256} | {'precision': 0.5411184210526315, 'recall': 0.6047794117647058, 'f1': 0.5711805555555554, 'number': 544} | {'precision': 0.4329896907216495, 'recall': 0.48554913294797686, 'f1': 0.45776566757493187, 'number': 519} | {'precision': 0.4880952380952381, 'recall': 0.5942028985507246, 'f1': 0.5359477124183006, 'number': 138} | 0.6293 | 0.6846 | 0.6558 | 0.8864 |
 
 
  ### Framework versions
 
  ---
+ base_model: lilyyellow/my_awesome_ner-token_classification_v1.0.7-5
  tags:
  - generated_from_trainer
  model-index:
 

  # my_awesome_ner-token_classification_v1.0.7-5

+ This model is a fine-tuned version of [lilyyellow/my_awesome_ner-token_classification_v1.0.7-5](https://huggingface.co/lilyyellow/my_awesome_ner-token_classification_v1.0.7-5) on an unknown dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.5208
+ - Age: {'precision': 0.8493150684931506, 'recall': 0.9393939393939394, 'f1': 0.8920863309352518, 'number': 132}
+ - Datetime: {'precision': 0.7049180327868853, 'recall': 0.7428861788617886, 'f1': 0.723404255319149, 'number': 984}
+ - Disease: {'precision': 0.6953405017921147, 'recall': 0.6855123674911661, 'f1': 0.6903914590747331, 'number': 283}
+ - Event: {'precision': 0.30033003300330036, 'recall': 0.3446969696969697, 'f1': 0.3209876543209877, 'number': 264}
+ - Gender: {'precision': 0.7647058823529411, 'recall': 0.7982456140350878, 'f1': 0.7811158798283262, 'number': 114}
+ - Law: {'precision': 0.5303514376996805, 'recall': 0.6561264822134387, 'f1': 0.5865724381625441, 'number': 253}
+ - Location: {'precision': 0.7111228255139694, 'recall': 0.7375615090213231, 'f1': 0.7241009125067096, 'number': 1829}
+ - Organization: {'precision': 0.6420640104506858, 'recall': 0.6991465149359887, 'f1': 0.6693905345590739, 'number': 1406}
+ - Person: {'precision': 0.6987087517934003, 'recall': 0.7295880149812735, 'f1': 0.7138145840967388, 'number': 1335}
+ - Phone: {'precision': 0.8522727272727273, 'recall': 0.9615384615384616, 'f1': 0.9036144578313254, 'number': 78}
+ - Product: {'precision': 0.4, 'recall': 0.3828125, 'f1': 0.3912175648702595, 'number': 256}
+ - Quantity: {'precision': 0.5313001605136437, 'recall': 0.6084558823529411, 'f1': 0.567266495287061, 'number': 544}
+ - Role: {'precision': 0.4302721088435374, 'recall': 0.48747591522157996, 'f1': 0.45709123757904246, 'number': 519}
+ - Transportation: {'precision': 0.5, 'recall': 0.6231884057971014, 'f1': 0.5548387096774193, 'number': 138}
+ - Overall Precision: 0.6349
+ - Overall Recall: 0.6817
+ - Overall F1: 0.6575
+ - Overall Accuracy: 0.8878
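
As a minimal usage sketch (not taken from the committed card; the example sentence and the `aggregation_strategy` choice are assumptions for illustration), the checkpoint named above can be loaded with the `transformers` token-classification pipeline:

```python
# Hedged sketch: tag a sentence with the fine-tuned NER checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="lilyyellow/my_awesome_ner-token_classification_v1.0.7-5",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

text = "Ông Nguyễn Văn A sống tại Hà Nội."  # made-up example sentence
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```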
 
  ## Model description
 
 
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: cosine
+ - num_epochs: 5
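
A minimal sketch of how the values listed above might map onto `transformers.TrainingArguments` (not the author's actual training script; `output_dir` is a placeholder, and the learning rate and batch sizes elided from this hunk are left unset):

```python
# Sketch only: fills in just the hyperparameters listed in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_awesome_ner-token_classification_v1.0.7-5",  # placeholder
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    num_train_epochs=5,
)
```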
 
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Age | Datetime | Disease | Event | Gender | Law | Location | Organization | Person | Phone | Product | Quantity | Role | Transportation | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:------:|:----:|:---------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 0.1172 | 1.9991 | 2313 | 0.4711 | {'precision': 0.8620689655172413, 'recall': 0.946969696969697, 'f1': 0.9025270758122743, 'number': 132} | {'precision': 0.6928909952606636, 'recall': 0.7428861788617886, 'f1': 0.7170181461500736, 'number': 984} | {'precision': 0.7089552238805971, 'recall': 0.6713780918727915, 'f1': 0.689655172413793, 'number': 283} | {'precision': 0.3090909090909091, 'recall': 0.32196969696969696, 'f1': 0.3153988868274582, 'number': 264} | {'precision': 0.7520661157024794, 'recall': 0.7982456140350878, 'f1': 0.7744680851063831, 'number': 114} | {'precision': 0.5795053003533569, 'recall': 0.6482213438735178, 'f1': 0.6119402985074627, 'number': 253} | {'precision': 0.7174721189591078, 'recall': 0.7386550027337343, 'f1': 0.7279094827586208, 'number': 1829} | {'precision': 0.6510554089709762, 'recall': 0.7019914651493598, 'f1': 0.675564681724846, 'number': 1406} | {'precision': 0.720666161998486, 'recall': 0.7131086142322097, 'f1': 0.716867469879518, 'number': 1335} | {'precision': 0.7816091954022989, 'recall': 0.8717948717948718, 'f1': 0.8242424242424243, 'number': 78} | {'precision': 0.38288288288288286, 'recall': 0.33203125, 'f1': 0.35564853556485354, 'number': 256} | {'precision': 0.5684575389948007, 'recall': 0.6029411764705882, 'f1': 0.5851917930419268, 'number': 544} | {'precision': 0.4645390070921986, 'recall': 0.5048169556840078, 'f1': 0.4838411819021238, 'number': 519} | {'precision': 0.47368421052631576, 'recall': 0.5869565217391305, 'f1': 0.5242718446601942, 'number': 138} | 0.6480 | 0.6761 | 0.6617 | 0.8890 |
+ | 0.0813 | 3.9983 | 4626 | 0.5208 | {'precision': 0.8493150684931506, 'recall': 0.9393939393939394, 'f1': 0.8920863309352518, 'number': 132} | {'precision': 0.7049180327868853, 'recall': 0.7428861788617886, 'f1': 0.723404255319149, 'number': 984} | {'precision': 0.6953405017921147, 'recall': 0.6855123674911661, 'f1': 0.6903914590747331, 'number': 283} | {'precision': 0.30033003300330036, 'recall': 0.3446969696969697, 'f1': 0.3209876543209877, 'number': 264} | {'precision': 0.7647058823529411, 'recall': 0.7982456140350878, 'f1': 0.7811158798283262, 'number': 114} | {'precision': 0.5303514376996805, 'recall': 0.6561264822134387, 'f1': 0.5865724381625441, 'number': 253} | {'precision': 0.7111228255139694, 'recall': 0.7375615090213231, 'f1': 0.7241009125067096, 'number': 1829} | {'precision': 0.6420640104506858, 'recall': 0.6991465149359887, 'f1': 0.6693905345590739, 'number': 1406} | {'precision': 0.6987087517934003, 'recall': 0.7295880149812735, 'f1': 0.7138145840967388, 'number': 1335} | {'precision': 0.8522727272727273, 'recall': 0.9615384615384616, 'f1': 0.9036144578313254, 'number': 78} | {'precision': 0.4, 'recall': 0.3828125, 'f1': 0.3912175648702595, 'number': 256} | {'precision': 0.5313001605136437, 'recall': 0.6084558823529411, 'f1': 0.567266495287061, 'number': 544} | {'precision': 0.4302721088435374, 'recall': 0.48747591522157996, 'f1': 0.45709123757904246, 'number': 519} | {'precision': 0.5, 'recall': 0.6231884057971014, 'f1': 0.5548387096774193, 'number': 138} | 0.6349 | 0.6817 | 0.6575 | 0.8878 |
 
 
 
  ### Framework versions
config.json CHANGED
@@ -1,5 +1,5 @@
  {
- "_name_or_path": "NlpHUST/ner-vietnamese-electra-base",
  "architectures": [
  "ElectraForTokenClassification"
  ],
 
  {
+ "_name_or_path": "lilyyellow/my_awesome_ner-token_classification_v1.0.7-5",
  "architectures": [
  "ElectraForTokenClassification"
  ],
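
The only change to `config.json` is the `_name_or_path` field, which records the checkpoint the configuration was loaded from when this run started (now the previous `v1.0.7-5` upload rather than the `NlpHUST` base). A small sketch of reading the raw file to inspect it (an illustration, not part of the commit):

```python
# Download the raw config.json from the Hub and read the recorded source checkpoint.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="lilyyellow/my_awesome_ner-token_classification_v1.0.7-5",
    filename="config.json",
)
with open(path) as f:
    config = json.load(f)

print(config["_name_or_path"])   # source checkpoint recorded at save time
print(config["architectures"])   # ["ElectraForTokenClassification"]
```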
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0f519492d5be7eb441ef2b187306e8a70916993e5b7bd43d0c4947912090102f
  size 532380148
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:2a7dcd8bb58efc27954bbb08041f58d3af30c94a6cf60194df25acc30e310bbb
  size 532380148
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b00b5adb2452372f7b274c8f6fe5d8693f9dc9c2cd2a1fb14c986c4ce906c9f8
  size 5112
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:ba4cc3b9994e50a8607f06cd4430123c5d7dd6857e10a067b7ba55a230862558
  size 5112
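
Both `model.safetensors` and `training_args.bin` are stored as Git LFS pointer files (a `version` line, an `oid sha256:` line, and a `size` line); in this commit only the oids change, so the files were replaced with new contents of identical byte size. A hedged sketch of checking a locally downloaded file against the oid in such a pointer (the file paths are assumptions):

```python
# Verify that a downloaded file's sha256 matches the oid recorded in a Git LFS pointer.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def oid_from_pointer(pointer_path: str) -> str:
    # Pointer format, as shown above: "oid sha256:<64 hex chars>"
    with open(pointer_path) as f:
        for line in f:
            if line.startswith("oid sha256:"):
                return line.split("oid sha256:", 1)[1].strip()
    raise ValueError("no oid line found in pointer file")

# Hypothetical local paths:
# assert sha256_of("model.safetensors") == oid_from_pointer("model.safetensors.pointer")
```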