roberta-large-ner-ghtk-cs-7-label-old-data-3090-15Aug-3
This model is a fine-tuned version of FacebookAI/xlm-roberta-large on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.1819
- Overall Precision: 0.9039
- Overall Recall: 0.9332
- Overall F1: 0.9183
- Overall Accuracy: 0.9698

Per-label results on the evaluation set:

Label | Precision | Recall | F1 | Support
---|---|---|---|---
Tk | 0.8571 | 0.7759 | 0.8145 | 116
Gày | 0.7297 | 0.8182 | 0.7714 | 33
Gày trừu tượng | 0.9229 | 0.9229 | 0.9229 | 467
Iờ | 0.6750 | 0.7105 | 0.6923 | 38
Ã đơn | 0.8683 | 0.8945 | 0.8812 | 199
Đt | 0.9345 | 0.9909 | 0.9619 | 878
Đt trừu tượng | 0.8610 | 0.8972 | 0.8787 | 214
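For reference, a minimal inference sketch using the transformers token-classification pipeline. The repo id is this model's Hub path; the aggregation strategy and the example sentence are illustrative assumptions, not part of the original card:

```python
from transformers import pipeline

# Sketch only: load this checkpoint from the Hub for token classification.
# aggregation_strategy="simple" (an assumption) merges B-/I- word pieces
# into whole entity spans.
ner = pipeline(
    "token-classification",
    model="Kudod/roberta-large-ner-ghtk-cs-7-label-old-data-3090-15Aug-3",
    aggregation_strategy="simple",
)

# Illustrative Vietnamese customer-service sentence; the output is a list of
# dicts with entity_group, score, word, start, and end.
print(ner("Đơn hàng của tôi giao lúc 9 giờ sáng ngày 15/08."))
```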
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
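A minimal sketch of how the hyperparameters above map onto transformers TrainingArguments; output_dir is a placeholder, and the model, datasets, and data collator are omitted:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="roberta-large-ner-ghtk-cs-7-label",  # placeholder path
    learning_rate=2.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,          # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```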
Training results
Per-label cells show precision / recall / F1; label supports are Tk 116, Gày 33, Gày trừu tượng 467, Iờ 38, Ã đơn 199, Đt 878, Đt trừu tượng 214.

Training Loss | Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
No log | 1.0 | 232 | 0.1472 | 0.6512 / 0.4828 / 0.5545 | 0.5490 / 0.8485 / 0.6667 | 0.8314 / 0.9079 / 0.8680 | 0.5926 / 0.8421 / 0.6957 | 0.8263 / 0.7889 / 0.8072 | 0.8941 / 0.9909 / 0.9400 | 0.9000 / 0.2944 / 0.4437 | 0.8428 | 0.8380 | 0.8404 | 0.9505
No log | 2.0 | 464 | 0.1220 | 0.7305 / 0.8879 / 0.8016 | 0.6216 / 0.6970 / 0.6571 | 0.8627 / 0.9015 / 0.8817 | 0.5333 / 0.8421 / 0.6531 | 0.8200 / 0.8241 / 0.8221 | 0.9516 / 0.9852 / 0.9681 | 0.6864 / 0.9206 / 0.7864 | 0.8506 | 0.9280 | 0.8876 | 0.9571
0.1527 | 3.0 | 696 | 0.1208 | 1.0000 / 0.2759 / 0.4324 | 0.7059 / 0.7273 / 0.7164 | 0.9050 / 0.8565 / 0.8801 | 0.6786 / 0.5000 / 0.5758 | 0.9112 / 0.7739 / 0.8370 | 0.9048 / 0.9954 / 0.9479 | 0.7904 / 0.8458 / 0.8172 | 0.8863 | 0.8658 | 0.8759 | 0.9612
0.1527 | 4.0 | 928 | 0.1300 | 0.8841 / 0.5259 / 0.6595 | 0.6042 / 0.8788 / 0.7160 | 0.8778 / 0.9229 / 0.8998 | 0.6000 / 0.7895 / 0.6818 | 0.9000 / 0.8141 / 0.8549 | 0.9164 / 0.9989 / 0.9559 | 0.5959 / 0.9579 / 0.7348 | 0.8392 | 0.9229 | 0.8790 | 0.9601
0.0581 | 5.0 | 1160 | 0.1442 | 0.9118 / 0.8017 / 0.8532 | 0.6757 / 0.7576 / 0.7143 | 0.9275 / 0.9036 / 0.9154 | 0.7692 / 0.5263 / 0.6250 | 0.9032 / 0.8442 / 0.8727 | 0.9252 / 0.9863 / 0.9548 | 0.8274 / 0.8738 / 0.8500 | 0.9050 | 0.9157 | 0.9103 | 0.9685
0.0581 | 6.0 | 1392 | 0.1308 | 0.9528 / 0.8707 / 0.9099 | 0.7500 / 0.8182 / 0.7826 | 0.9172 / 0.9015 / 0.9093 | 0.7429 / 0.6842 / 0.7123 | 0.9005 / 0.8643 / 0.8821 | 0.9382 / 0.9863 / 0.9617 | 0.8393 / 0.8785 / 0.8584 | 0.9124 | 0.9260 | 0.9191 | 0.9696
0.0272 | 7.0 | 1624 | 0.1657 | 0.8721 / 0.6466 / 0.7426 | 0.7000 / 0.8485 / 0.7671 | 0.9114 / 0.9036 / 0.9075 | 0.5800 / 0.7632 / 0.6591 | 0.8918 / 0.8693 / 0.8804 | 0.9236 / 0.9909 / 0.9560 | 0.8451 / 0.8925 / 0.8682 | 0.8936 | 0.9193 | 0.9062 | 0.9669
0.0272 | 8.0 | 1856 | 0.1579 | 0.8991 / 0.8448 / 0.8711 | 0.7297 / 0.8182 / 0.7714 | 0.9163 / 0.9143 / 0.9153 | 0.6750 / 0.7105 / 0.6923 | 0.8579 / 0.8492 / 0.8535 | 0.9397 / 0.9943 / 0.9662 | 0.8626 / 0.8505 / 0.8565 | 0.9065 | 0.9270 | 0.9166 | 0.9683
0.0111 | 9.0 | 2088 | 0.1776 | 0.8611 / 0.8017 / 0.8304 | 0.7105 / 0.8182 / 0.7606 | 0.9252 / 0.9272 / 0.9262 | 0.6829 / 0.7368 / 0.7089 | 0.8578 / 0.8794 / 0.8685 | 0.9384 / 0.9886 / 0.9628 | 0.7835 / 0.9299 / 0.8504 | 0.8945 | 0.9373 | 0.9154 | 0.9683
0.0111 | 10.0 | 2320 | 0.1819 | 0.8571 / 0.7759 / 0.8145 | 0.7297 / 0.8182 / 0.7714 | 0.9229 / 0.9229 / 0.9229 | 0.6750 / 0.7105 / 0.6923 | 0.8683 / 0.8945 / 0.8812 | 0.9345 / 0.9909 / 0.9619 | 0.8610 / 0.8972 / 0.8787 | 0.9039 | 0.9332 | 0.9183 | 0.9698
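The per-label dictionaries reported above match the output format of the seqeval metric as wrapped by the evaluate library, which was likely used here; a minimal sketch with toy BIO-tagged sequences (the label strings are illustrative):

```python
import evaluate

# Sketch only: toy BIO sequences; the real label set comes from the dataset.
seqeval = evaluate.load("seqeval")
references = [["B-Đt", "I-Đt", "O", "B-Tk"]]
predictions = [["B-Đt", "I-Đt", "O", "O"]]

# Returns one {'precision', 'recall', 'f1', 'number'} dict per label, plus
# overall_precision/recall/f1/accuracy, matching the table cells above.
print(seqeval.compute(predictions=predictions, references=references))
```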
Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1