---
license: mit
base_model: microsoft/deberta-v3-large
tags:
- generated_from_trainer
model-index:
- name: bbc-ner-deberta-large_baseline2_dims
  results: []
---
# bbc-ner-deberta-large_baseline2_dims
This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1481
- Cargo Dimension Precision: 0.7606
- Cargo Dimension Recall: 0.9474
- Cargo Dimension F1: 0.8437
- Cargo Dimension Number: 114
- Cargo Quantity Precision: 0.8079
- Cargo Quantity Recall: 0.8937
- Cargo Quantity F1: 0.8486
- Cargo Quantity Number: 207
- Cargo Requirements Precision: 0.4962
- Cargo Requirements Recall: 0.6535
- Cargo Requirements F1: 0.5641
- Cargo Requirements Number: 202
- Cargo Stowage Factor Precision: 0.8226
- Cargo Stowage Factor Recall: 0.8361
- Cargo Stowage Factor F1: 0.8293
- Cargo Stowage Factor Number: 122
- Cargo Type Precision: 0.7885
- Cargo Type Recall: 0.8183
- Cargo Type F1: 0.8031
- Cargo Type Number: 688
- Cargo Weigh Volume Precision: 0.8528
- Cargo Weigh Volume Recall: 0.9026
- Cargo Weigh Volume F1: 0.8770
- Cargo Weigh Volume Number: 719
- Commission Rate Precision: 0.7955
- Commission Rate Recall: 0.8452
- Commission Rate F1: 0.8196
- Commission Rate Number: 336
- Discharging Port Precision: 0.8706
- Discharging Port Recall: 0.9015
- Discharging Port F1: 0.8858
- Discharging Port Number: 843
- Laycan Date Precision: 0.8260
- Laycan Date Recall: 0.8710
- Laycan Date F1: 0.8479
- Laycan Date Number: 496
- Loading Discharging Terms Precision: 0.7211
- Loading Discharging Terms Recall: 0.7975
- Loading Discharging Terms F1: 0.7574
- Loading Discharging Terms Number: 321
- Loading Port Precision: 0.8906
- Loading Port Recall: 0.9232
- Loading Port F1: 0.9066
- Loading Port Number: 899
- Shipment Terms Precision: 0.6780
- Shipment Terms Recall: 0.6780
- Shipment Terms F1: 0.6780
- Shipment Terms Number: 118
- Vessel Requirements Precision: 0.3786
- Vessel Requirements Recall: 0.5132
- Vessel Requirements F1: 0.4358
- Vessel Requirements Number: 76
- Overall Precision: 0.8041
- Overall Recall: 0.8598
- Overall F1: 0.8310
- Overall Accuracy: 0.9688
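The per-entity Precision/Recall/F1/Number layout above matches the output format of the `seqeval` metric, which `generated_from_trainer` token-classification runs typically report. A minimal sketch of computing it, using toy IOB2 sequences whose tag names (e.g. `LOADING_PORT`) are assumptions inferred from the entity list above, not the dataset's actual label set:

```python
import evaluate

seqeval = evaluate.load("seqeval")

# Toy IOB2 sequences; tag names are assumptions inferred from the entity
# list above.
references  = [["O", "B-LOADING_PORT", "I-LOADING_PORT", "O", "B-CARGO_TYPE"]]
predictions = [["O", "B-LOADING_PORT", "I-LOADING_PORT", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["LOADING_PORT"])      # {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 1}
print(results["overall_accuracy"])  # token-level accuracy, as in "Overall Accuracy" above
```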
## Model description
More information needed
## Intended uses & limitations
More information needed
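The card does not document an inference recipe, but as a token-classification checkpoint the model should load with the standard `transformers` pipeline. A usage sketch; the repo id and example sentence below are illustrative assumptions:

```python
from transformers import pipeline

# Hypothetical checkpoint path / repo id; substitute the actual location.
ner = pipeline(
    "token-classification",
    model="bbc-ner-deberta-large_baseline2_dims",
    aggregation_strategy="simple",  # merge subword pieces into whole entity spans
)

# Invented example in the style suggested by the entity list above.
text = "6000 mt bulk wheat, loading Odessa, discharging Alexandria, laycan 10-15 May, 2.5% commission."
for ent in ner(text):
    print(ent["entity_group"], repr(ent["word"]), round(float(ent["score"]), 3))
```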
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10.0
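For reference, a sketch of how these settings map onto `transformers.TrainingArguments`; `output_dir` and anything not listed above are assumptions:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bbc-ner-deberta-large_baseline2_dims",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    gradient_accumulation_steps=2,   # effective train batch size: 16
    num_train_epochs=10.0,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```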
### Training results
Training Loss | Epoch | Step | Validation Loss | Cargo Dimension Precision | Cargo Dimension Recall | Cargo Dimension F1 | Cargo Dimension Number | Cargo Quantity Precision | Cargo Quantity Recall | Cargo Quantity F1 | Cargo Quantity Number | Cargo Requirements Precision | Cargo Requirements Recall | Cargo Requirements F1 | Cargo Requirements Number | Cargo Stowage Factor Precision | Cargo Stowage Factor Recall | Cargo Stowage Factor F1 | Cargo Stowage Factor Number | Cargo Type Precision | Cargo Type Recall | Cargo Type F1 | Cargo Type Number | Cargo Weigh Volume Precision | Cargo Weigh Volume Recall | Cargo Weigh Volume F1 | Cargo Weigh Volume Number | Commission Rate Precision | Commission Rate Recall | Commission Rate F1 | Commission Rate Number | Discharging Port Precision | Discharging Port Recall | Discharging Port F1 | Discharging Port Number | Laycan Date Precision | Laycan Date Recall | Laycan Date F1 | Laycan Date Number | Loading Discharging Terms Precision | Loading Discharging Terms Recall | Loading Discharging Terms F1 | Loading Discharging Terms Number | Loading Port Precision | Loading Port Recall | Loading Port F1 | Loading Port Number | Shipment Terms Precision | Shipment Terms Recall | Shipment Terms F1 | Shipment Terms Number | Vessel Requirements Precision | Vessel Requirements Recall | Vessel Requirements F1 | Vessel Requirements Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.4476 | 1.0 | 119 | 0.1523 | 0.6169 | 0.8333 | 0.7090 | 114 | 0.7087 | 0.8696 | 0.7809 | 207 | 0.3007 | 0.4257 | 0.3525 | 202 | 0.6284 | 0.7623 | 0.6889 | 122 | 0.6478 | 0.6308 | 0.6392 | 688 | 0.6983 | 0.8178 | 0.7534 | 719 | 0.7099 | 0.7649 | 0.7364 | 336 | 0.7212 | 0.8683 | 0.7879 | 843 | 0.7482 | 0.8266 | 0.7854 | 496 | 0.5669 | 0.6729 | 0.6154 | 321 | 0.8280 | 0.8565 | 0.8420 | 899 | 0.6556 | 0.5 | 0.5673 | 118 | 0.0658 | 0.0658 | 0.0658 | 76 | 0.6819 | 0.7635 | 0.7204 | 0.9581 |
0.132 | 2.0 | 239 | 0.1438 | 0.6471 | 0.8684 | 0.7416 | 114 | 0.7212 | 0.9372 | 0.8151 | 207 | 0.2857 | 0.4455 | 0.3482 | 202 | 0.7083 | 0.8361 | 0.7669 | 122 | 0.6586 | 0.8270 | 0.7332 | 688 | 0.7191 | 0.8901 | 0.7955 | 719 | 0.7863 | 0.8542 | 0.8188 | 336 | 0.7639 | 0.8944 | 0.8240 | 843 | 0.7370 | 0.8589 | 0.7933 | 496 | 0.5497 | 0.7757 | 0.6434 | 321 | 0.8533 | 0.8932 | 0.8728 | 899 | 0.5669 | 0.6102 | 0.5878 | 118 | 0.216 | 0.3553 | 0.2687 | 76 | 0.6943 | 0.8387 | 0.7597 | 0.9579 |
0.1022 | 3.0 | 358 | 0.1217 | 0.7574 | 0.9035 | 0.8240 | 114 | 0.8087 | 0.8986 | 0.8513 | 207 | 0.4081 | 0.5495 | 0.4684 | 202 | 0.75 | 0.8361 | 0.7907 | 122 | 0.7301 | 0.8140 | 0.7698 | 688 | 0.8132 | 0.8720 | 0.8416 | 719 | 0.7983 | 0.8482 | 0.8225 | 336 | 0.8123 | 0.9087 | 0.8578 | 843 | 0.7901 | 0.8730 | 0.8295 | 496 | 0.7583 | 0.7819 | 0.7699 | 321 | 0.8783 | 0.9232 | 0.9002 | 899 | 0.7596 | 0.6695 | 0.7117 | 118 | 0.3529 | 0.4737 | 0.4045 | 76 | 0.7744 | 0.8498 | 0.8103 | 0.9677 |
0.0802 | 4.0 | 478 | 0.1291 | 0.7589 | 0.9386 | 0.8392 | 114 | 0.8097 | 0.8841 | 0.8453 | 207 | 0.4566 | 0.5990 | 0.5182 | 202 | 0.8347 | 0.8279 | 0.8313 | 122 | 0.7710 | 0.7733 | 0.7721 | 688 | 0.8015 | 0.8818 | 0.8397 | 719 | 0.8203 | 0.8423 | 0.8311 | 336 | 0.8449 | 0.9110 | 0.8767 | 843 | 0.8252 | 0.8851 | 0.8541 | 496 | 0.7202 | 0.7539 | 0.7367 | 321 | 0.8637 | 0.9232 | 0.8925 | 899 | 0.7404 | 0.6525 | 0.6937 | 118 | 0.4359 | 0.4474 | 0.4416 | 76 | 0.7912 | 0.8463 | 0.8179 | 0.9683 |
0.0733 | 5.0 | 597 | 0.1269 | 0.7347 | 0.9474 | 0.8276 | 114 | 0.7975 | 0.9130 | 0.8514 | 207 | 0.4306 | 0.6139 | 0.5061 | 202 | 0.8062 | 0.8525 | 0.8287 | 122 | 0.7564 | 0.8169 | 0.7855 | 688 | 0.8142 | 0.8901 | 0.8505 | 719 | 0.8184 | 0.8452 | 0.8316 | 336 | 0.8760 | 0.9051 | 0.8903 | 843 | 0.8180 | 0.8790 | 0.8474 | 496 | 0.7303 | 0.8100 | 0.7681 | 321 | 0.8747 | 0.9321 | 0.9025 | 899 | 0.6231 | 0.6864 | 0.6532 | 118 | 0.3407 | 0.4079 | 0.3713 | 76 | 0.7870 | 0.8598 | 0.8218 | 0.9681 |
0.0541 | 6.0 | 717 | 0.1299 | 0.7379 | 0.9386 | 0.8263 | 114 | 0.7899 | 0.9082 | 0.8449 | 207 | 0.4330 | 0.6238 | 0.5112 | 202 | 0.8095 | 0.8361 | 0.8226 | 122 | 0.7681 | 0.8183 | 0.7924 | 688 | 0.7916 | 0.8929 | 0.8392 | 719 | 0.8182 | 0.8571 | 0.8372 | 336 | 0.8491 | 0.9075 | 0.8773 | 843 | 0.8056 | 0.8690 | 0.8361 | 496 | 0.6981 | 0.7850 | 0.7390 | 321 | 0.8828 | 0.9388 | 0.9100 | 899 | 0.6991 | 0.6695 | 0.6840 | 118 | 0.38 | 0.5 | 0.4318 | 76 | 0.7815 | 0.8607 | 0.8192 | 0.9676 |
0.0463 | 7.0 | 836 | 0.1311 | 0.7626 | 0.9298 | 0.8379 | 114 | 0.8243 | 0.8841 | 0.8531 | 207 | 0.4731 | 0.6089 | 0.5325 | 202 | 0.7895 | 0.8607 | 0.8235 | 122 | 0.7875 | 0.7863 | 0.7869 | 688 | 0.8366 | 0.8901 | 0.8625 | 719 | 0.8067 | 0.8571 | 0.8312 | 336 | 0.8928 | 0.8992 | 0.8960 | 843 | 0.8314 | 0.8649 | 0.8478 | 496 | 0.7672 | 0.8006 | 0.7835 | 321 | 0.9013 | 0.9143 | 0.9078 | 899 | 0.6529 | 0.6695 | 0.6611 | 118 | 0.3608 | 0.4605 | 0.4046 | 76 | 0.8096 | 0.8493 | 0.8289 | 0.9686 |
0.0434 | 8.0 | 956 | 0.1430 | 0.7448 | 0.9474 | 0.8340 | 114 | 0.7957 | 0.9034 | 0.8462 | 207 | 0.4765 | 0.6535 | 0.5511 | 202 | 0.7863 | 0.8443 | 0.8142 | 122 | 0.7916 | 0.8227 | 0.8068 | 688 | 0.8274 | 0.8999 | 0.8621 | 719 | 0.8141 | 0.8601 | 0.8365 | 336 | 0.8727 | 0.9027 | 0.8875 | 843 | 0.8241 | 0.8690 | 0.8459 | 496 | 0.7298 | 0.8162 | 0.7706 | 321 | 0.8943 | 0.9132 | 0.9037 | 899 | 0.5797 | 0.6780 | 0.6250 | 118 | 0.3491 | 0.4868 | 0.4066 | 76 | 0.7963 | 0.8605 | 0.8271 | 0.9681 |
0.0373 | 9.0 | 1075 | 0.1435 | 0.75 | 0.9474 | 0.8372 | 114 | 0.8017 | 0.8986 | 0.8474 | 207 | 0.4815 | 0.6436 | 0.5508 | 202 | 0.8254 | 0.8525 | 0.8387 | 122 | 0.7762 | 0.8169 | 0.7960 | 688 | 0.8409 | 0.9040 | 0.8713 | 719 | 0.8011 | 0.8512 | 0.8254 | 336 | 0.8648 | 0.8956 | 0.8800 | 843 | 0.8267 | 0.875 | 0.8501 | 496 | 0.7227 | 0.8037 | 0.7611 | 321 | 0.8874 | 0.9288 | 0.9076 | 899 | 0.6349 | 0.6780 | 0.6557 | 118 | 0.3645 | 0.5132 | 0.4262 | 76 | 0.7969 | 0.8611 | 0.8278 | 0.9686 |
0.0348 | 9.96 | 1190 | 0.1481 | 0.7606 | 0.9474 | 0.8437 | 114 | 0.8079 | 0.8937 | 0.8486 | 207 | 0.4962 | 0.6535 | 0.5641 | 202 | 0.8226 | 0.8361 | 0.8293 | 122 | 0.7885 | 0.8183 | 0.8031 | 688 | 0.8528 | 0.9026 | 0.8770 | 719 | 0.7955 | 0.8452 | 0.8196 | 336 | 0.8706 | 0.9015 | 0.8858 | 843 | 0.8260 | 0.8710 | 0.8479 | 496 | 0.7211 | 0.7975 | 0.7574 | 321 | 0.8906 | 0.9232 | 0.9066 | 899 | 0.6780 | 0.6780 | 0.6780 | 118 | 0.3786 | 0.5132 | 0.4358 | 76 | 0.8041 | 0.8598 | 0.8310 | 0.9688 |
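The per-epoch columns above look like a flattened version of seqeval's nested output, as a `Trainer` `compute_metrics` hook would produce. A sketch of that flattening, assuming the standard token-classification recipe; the `label_list` below is a truncated stand-in for the dataset's real tag set:

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
# Stand-in for the dataset's id -> IOB2 tag mapping (truncated; the real
# list would cover all thirteen entity types reported above).
label_list = ["O", "B-CARGO_TYPE", "I-CARGO_TYPE", "B-LOADING_PORT", "I-LOADING_PORT"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Drop special tokens (label -100) and map ids back to tag strings.
    true_labels = [[label_list[l] for l in row if l != -100] for row in labels]
    true_preds = [
        [label_list[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(preds, labels)
    ]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    flat = {}
    for key, value in results.items():
        if isinstance(value, dict):           # per-entity block, e.g. "LOADING_PORT"
            for metric, v in value.items():   # precision / recall / f1 / number
                flat[f"{key}_{metric}"] = v
        else:                                 # overall_precision ... overall_accuracy
            flat[key] = value
    return flat
```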
### Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.14.5
- Tokenizers 0.15.0