---
base_model: NlpHUST/ner-vietnamese-electra-base
tags:
- generated_from_trainer
model-index:
- name: my_awesome_ner-token_classification_v1.0.7-5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_ner-token_classification_v1.0.7-5
This model is a fine-tuned version of [NlpHUST/ner-vietnamese-electra-base](https://huggingface.co/NlpHUST/ner-vietnamese-electra-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3789

| Entity         | Precision | Recall | F1     | Support |
|:---------------|:---------:|:------:|:------:|--------:|
| Age            | 0.8503    | 0.9470 | 0.8961 | 132     |
| Datetime       | 0.6935    | 0.7429 | 0.7174 | 984     |
| Disease        | 0.6895    | 0.6749 | 0.6821 | 283     |
| Event          | 0.3211    | 0.3636 | 0.3410 | 264     |
| Gender         | 0.7705    | 0.8246 | 0.7966 | 114     |
| Law            | 0.5617    | 0.7194 | 0.6308 | 253     |
| Location       | 0.6985    | 0.7436 | 0.7203 | 1829    |
| Organization   | 0.6406    | 0.7212 | 0.6785 | 1406    |
| Person         | 0.7024    | 0.7408 | 0.7211 | 1335    |
| Phone          | 0.8706    | 0.9487 | 0.9080 | 78      |
| Product        | 0.3686    | 0.3672 | 0.3679 | 256     |
| Quantity       | 0.5567    | 0.6232 | 0.5880 | 544     |
| Role           | 0.4343    | 0.4836 | 0.4576 | 519     |
| Transportation | 0.4912    | 0.6087 | 0.5437 | 138     |

- Overall Precision: 0.6348
- Overall Recall: 0.6913
- Overall F1: 0.6619
- Overall Accuracy: 0.8912
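The overall F1 is the harmonic mean of overall precision and recall. Recomputing it from the rounded figures above reproduces the reported value up to rounding:

```python
# Overall F1 as the harmonic mean of the reported overall
# precision and recall (values copied from this card).
precision, recall = 0.6348, 0.6913
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # close to the reported 0.6619 (inputs are rounded)
```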
## Model description
More information needed
## Intended uses & limitations
More information needed
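While usage details are not documented, the model is a token classifier, so its raw output is one BIO tag per (sub)token which must be merged into entity spans (a `token-classification` pipeline with an aggregation strategy does this automatically). A minimal sketch of that merging step; the tag names (`B-PERSON`, `I-LOCATION`, ...) are assumptions inferred from the entity types reported in this card, not confirmed label names from the checkpoint:

```python
# Sketch: merging per-token BIO tags into entity spans.
# Label names here are assumed from the entity types in this card.

def merge_bio_spans(tokens, tags):
    """Collect (entity_type, text) spans from parallel token/tag lists."""
    spans = []
    current_type, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):  # a new entity begins
            if current_type is not None:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current_tokens.append(token)  # continue the open entity
        else:  # "O" tag or an I- tag that does not match the open entity
            if current_type is not None:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_type is not None:
        spans.append((current_type, " ".join(current_tokens)))
    return spans

tokens = ["Nguyen", "Van", "A", "song", "o", "Ha", "Noi"]
tags = ["B-PERSON", "I-PERSON", "I-PERSON", "O", "O", "B-LOCATION", "I-LOCATION"]
print(merge_bio_spans(tokens, tags))
# [('PERSON', 'Nguyen Van A'), ('LOCATION', 'Ha Noi')]
```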
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 5
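With the `cosine` scheduler and no warmup steps listed, the learning rate decays from 5e-05 toward 0 over training. A minimal sketch of that decay curve, following the standard cosine-annealing formula with zero warmup (the total step count below is illustrative, derived from the ~1157 steps per epoch implied by the training log):

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-5):
    """Cosine decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 5785  # illustrative: ~1157 steps/epoch * 5 epochs
print(cosine_lr(0, total))            # 5e-05 at the start of training
print(cosine_lr(total // 2, total))   # roughly half the base rate at midpoint
print(cosine_lr(total, total))        # decays to ~0 at the final step
```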
### Training results
Per-entity cells list precision / recall / F1; entity supports are given in parentheses in the header.

| Training Loss | Epoch | Step | Validation Loss | Age (132) | Datetime (984) | Disease (283) | Event (264) | Gender (114) | Law (253) | Location (1829) | Organization (1406) | Person (1335) | Phone (78) | Product (256) | Quantity (544) | Role (519) | Transportation (138) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:------:|:----:|:---------------:|:---------:|:--------------:|:-------------:|:-----------:|:------------:|:---------:|:---------------:|:-------------------:|:-------------:|:----------:|:-------------:|:--------------:|:----------:|:--------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.29          | 1.9991 | 2313 | 0.3353          | 0.8562/0.9470/0.8993 | 0.7076/0.7429/0.7248 | 0.6947/0.6431/0.6679 | 0.3419/0.3523/0.3470 | 0.7561/0.8158/0.7848 | 0.5385/0.6364/0.5833 | 0.7157/0.7365/0.7259 | 0.6326/0.7006/0.6649 | 0.7298/0.7041/0.7167 | 0.8072/0.8590/0.8323 | 0.4250/0.2656/0.3269 | 0.5797/0.5882/0.5839 | 0.4549/0.4470/0.4509 | 0.5195/0.5797/0.5479 | 0.6518 | 0.6667 | 0.6592 | 0.8937 |
| 0.1806        | 3.9983 | 4626 | 0.3789          | 0.8503/0.9470/0.8961 | 0.6935/0.7429/0.7174 | 0.6895/0.6749/0.6821 | 0.3211/0.3636/0.3410 | 0.7705/0.8246/0.7966 | 0.5617/0.7194/0.6308 | 0.6985/0.7436/0.7203 | 0.6406/0.7212/0.6785 | 0.7024/0.7408/0.7211 | 0.8706/0.9487/0.9080 | 0.3686/0.3672/0.3679 | 0.5567/0.6232/0.5880 | 0.4343/0.4836/0.4576 | 0.4912/0.6087/0.5437 | 0.6348 | 0.6913 | 0.6619 | 0.8912 |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1