ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k14_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6658
  • Qwk: 0.4692
  • Mse: 0.6658
  • Rmse: 0.8160
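The reported metrics are standard for ordinal scoring: Qwk is quadratic weighted kappa, and Rmse is the square root of Mse (0.8160 ≈ √0.6658). As a minimal sketch of how these could be recomputed from predictions (the score values below are illustrative, not from this model):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer class labels."""
    n = len(y_true)
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    row = [sum(obs[i]) for i in range(n_classes)]
    col = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic weight
            num += w * obs[i][j]                       # observed disagreement
            den += w * row[i] * col[j] / n             # chance disagreement
    return 1.0 - num / den

y_true = [0, 1, 2, 2, 3]   # illustrative gold scores
y_pred = [0, 1, 1, 2, 3]   # illustrative predictions
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
print(round(quadratic_weighted_kappa(y_true, y_pred, 4), 4))  # 0.9057
print(round(mse, 4), round(math.sqrt(mse), 4))                # 0.2 0.4472
```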

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
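With a linear scheduler, the learning rate decays from 2e-05 to 0 over the run. Warmup settings are not listed, so zero warmup is an assumption; per the training log, one epoch is 36 optimizer steps, so 100 epochs is 3600 steps. A minimal sketch of the decay:

```python
# Hedged sketch of a linear LR schedule with zero warmup (assumption:
# no warmup steps were used, since none are listed above).
def linear_lr(step, total_steps, base_lr=2e-05):
    return base_lr * max(0.0, (total_steps - step) / total_steps)

total = 100 * 36                         # num_epochs * steps_per_epoch
print(linear_lr(0, total))               # 2e-05 at the start
print(linear_lr(total // 2, total))      # 1e-05 halfway through
print(linear_lr(total, total))           # 0.0 at the end
```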

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0556 2 3.8567 -0.0047 3.8567 1.9639
No log 0.1111 4 2.0105 0.0727 2.0105 1.4179
No log 0.1667 6 1.2549 -0.0148 1.2549 1.1202
No log 0.2222 8 1.3524 -0.0245 1.3524 1.1629
No log 0.2778 10 2.0829 0.0342 2.0829 1.4432
No log 0.3333 12 1.9277 0.0733 1.9277 1.3884
No log 0.3889 14 1.3579 -0.0212 1.3579 1.1653
No log 0.4444 16 1.1292 0.2068 1.1292 1.0626
No log 0.5 18 1.0421 0.2140 1.0421 1.0209
No log 0.5556 20 1.0086 0.1504 1.0086 1.0043
No log 0.6111 22 1.0858 0.2873 1.0858 1.0420
No log 0.6667 24 1.0611 0.4051 1.0611 1.0301
No log 0.7222 26 1.0456 0.3014 1.0456 1.0226
No log 0.7778 28 1.1047 0.2004 1.1047 1.0510
No log 0.8333 30 1.1097 0.2030 1.1097 1.0534
No log 0.8889 32 1.0770 0.2100 1.0770 1.0378
No log 0.9444 34 1.0185 0.3540 1.0185 1.0092
No log 1.0 36 1.0004 0.2416 1.0004 1.0002
No log 1.0556 38 1.0030 0.2978 1.0030 1.0015
No log 1.1111 40 0.9584 0.2214 0.9584 0.9790
No log 1.1667 42 1.0020 0.2611 1.0020 1.0010
No log 1.2222 44 0.9943 0.2196 0.9943 0.9971
No log 1.2778 46 0.9484 0.2114 0.9484 0.9738
No log 1.3333 48 0.9981 0.2077 0.9981 0.9991
No log 1.3889 50 1.1638 0.2188 1.1638 1.0788
No log 1.4444 52 1.0401 0.2604 1.0401 1.0199
No log 1.5 54 0.8369 0.4000 0.8369 0.9148
No log 1.5556 56 0.9467 0.3541 0.9467 0.9730
No log 1.6111 58 0.9212 0.3666 0.9212 0.9598
No log 1.6667 60 0.8263 0.3876 0.8263 0.9090
No log 1.7222 62 0.8329 0.3961 0.8329 0.9126
No log 1.7778 64 0.9074 0.4025 0.9074 0.9526
No log 1.8333 66 0.9100 0.4273 0.9100 0.9539
No log 1.8889 68 0.7956 0.4516 0.7956 0.8920
No log 1.9444 70 0.7761 0.4019 0.7761 0.8810
No log 2.0 72 0.7771 0.4086 0.7771 0.8816
No log 2.0556 74 0.8137 0.3071 0.8137 0.9021
No log 2.1111 76 0.8307 0.3476 0.8307 0.9114
No log 2.1667 78 0.8999 0.3569 0.8999 0.9487
No log 2.2222 80 0.9733 0.3308 0.9733 0.9865
No log 2.2778 82 0.9681 0.4533 0.9681 0.9839
No log 2.3333 84 0.8659 0.5176 0.8659 0.9305
No log 2.3889 86 0.8541 0.3721 0.8541 0.9242
No log 2.4444 88 0.9335 0.3654 0.9335 0.9662
No log 2.5 90 1.0241 0.2794 1.0241 1.0120
No log 2.5556 92 0.9529 0.2618 0.9529 0.9762
No log 2.6111 94 1.0164 0.2171 1.0164 1.0081
No log 2.6667 96 1.0313 0.2465 1.0313 1.0155
No log 2.7222 98 0.8804 0.3414 0.8804 0.9383
No log 2.7778 100 0.8379 0.4301 0.8379 0.9154
No log 2.8333 102 0.9872 0.3400 0.9872 0.9936
No log 2.8889 104 1.0138 0.3663 1.0138 1.0069
No log 2.9444 106 1.1835 0.3293 1.1835 1.0879
No log 3.0 108 1.0381 0.3744 1.0381 1.0189
No log 3.0556 110 0.7990 0.5680 0.7990 0.8938
No log 3.1111 112 0.8047 0.4734 0.8047 0.8970
No log 3.1667 114 0.7846 0.4644 0.7846 0.8858
No log 3.2222 116 0.8866 0.2873 0.8866 0.9416
No log 3.2778 118 0.8347 0.3658 0.8347 0.9136
No log 3.3333 120 0.7583 0.4822 0.7583 0.8708
No log 3.3889 122 0.7847 0.4472 0.7847 0.8859
No log 3.4444 124 0.7473 0.4914 0.7473 0.8645
No log 3.5 126 0.8041 0.4697 0.8041 0.8967
No log 3.5556 128 0.7461 0.5572 0.7461 0.8638
No log 3.6111 130 0.6856 0.5302 0.6856 0.8280
No log 3.6667 132 0.6942 0.5428 0.6942 0.8332
No log 3.7222 134 0.6679 0.5635 0.6679 0.8172
No log 3.7778 136 0.6672 0.5635 0.6672 0.8168
No log 3.8333 138 0.6548 0.5635 0.6548 0.8092
No log 3.8889 140 0.6604 0.6001 0.6604 0.8126
No log 3.9444 142 0.6668 0.5302 0.6668 0.8166
No log 4.0 144 0.8030 0.5359 0.8030 0.8961
No log 4.0556 146 0.9382 0.4764 0.9382 0.9686
No log 4.1111 148 0.8548 0.4752 0.8548 0.9246
No log 4.1667 150 0.6831 0.4960 0.6831 0.8265
No log 4.2222 152 0.7376 0.4560 0.7376 0.8589
No log 4.2778 154 0.7601 0.4162 0.7601 0.8719
No log 4.3333 156 0.7126 0.5108 0.7126 0.8442
No log 4.3889 158 0.6749 0.6127 0.6749 0.8215
No log 4.4444 160 0.7123 0.5083 0.7123 0.8440
No log 4.5 162 0.9315 0.3847 0.9315 0.9651
No log 4.5556 164 1.0390 0.3744 1.0390 1.0193
No log 4.6111 166 0.8702 0.4284 0.8702 0.9329
No log 4.6667 168 0.7008 0.5712 0.7008 0.8372
No log 4.7222 170 0.7060 0.5202 0.7060 0.8402
No log 4.7778 172 0.6833 0.5060 0.6833 0.8266
No log 4.8333 174 0.7466 0.4714 0.7466 0.8641
No log 4.8889 176 0.8303 0.3864 0.8303 0.9112
No log 4.9444 178 0.8768 0.4407 0.8768 0.9364
No log 5.0 180 0.7330 0.5400 0.7330 0.8562
No log 5.0556 182 0.6685 0.5432 0.6685 0.8176
No log 5.1111 184 0.6778 0.5441 0.6778 0.8233
No log 5.1667 186 0.6483 0.6475 0.6483 0.8052
No log 5.2222 188 0.6542 0.6465 0.6542 0.8089
No log 5.2778 190 0.6345 0.5432 0.6345 0.7966
No log 5.3333 192 0.7223 0.4893 0.7223 0.8499
No log 5.3889 194 0.8536 0.4841 0.8536 0.9239
No log 5.4444 196 0.7374 0.5459 0.7374 0.8587
No log 5.5 198 0.6579 0.5386 0.6579 0.8111
No log 5.5556 200 0.7636 0.5356 0.7636 0.8739
No log 5.6111 202 0.8157 0.4681 0.8157 0.9032
No log 5.6667 204 0.7589 0.4352 0.7589 0.8712
No log 5.7222 206 0.7304 0.5171 0.7304 0.8547
No log 5.7778 208 0.8545 0.4578 0.8545 0.9244
No log 5.8333 210 0.8673 0.4240 0.8673 0.9313
No log 5.8889 212 0.7512 0.4998 0.7512 0.8667
No log 5.9444 214 0.6793 0.5156 0.6793 0.8242
No log 6.0 216 0.7309 0.5433 0.7309 0.8549
No log 6.0556 218 0.7435 0.5618 0.7435 0.8622
No log 6.1111 220 0.6781 0.5605 0.6781 0.8235
No log 6.1667 222 0.6499 0.5523 0.6499 0.8062
No log 6.2222 224 0.6521 0.5647 0.6521 0.8075
No log 6.2778 226 0.6465 0.5626 0.6465 0.8041
No log 6.3333 228 0.6457 0.5274 0.6457 0.8036
No log 6.3889 230 0.6560 0.6073 0.6560 0.8099
No log 6.4444 232 0.6533 0.5505 0.6533 0.8083
No log 6.5 234 0.6527 0.5523 0.6527 0.8079
No log 6.5556 236 0.6646 0.6154 0.6646 0.8153
No log 6.6111 238 0.6942 0.6415 0.6942 0.8332
No log 6.6667 240 0.6928 0.6051 0.6928 0.8323
No log 6.7222 242 0.6591 0.6311 0.6591 0.8118
No log 6.7778 244 0.6321 0.6076 0.6321 0.7951
No log 6.8333 246 0.6480 0.5830 0.6480 0.8050
No log 6.8889 248 0.6462 0.6144 0.6462 0.8039
No log 6.9444 250 0.6119 0.6046 0.6119 0.7823
No log 7.0 252 0.6487 0.6479 0.6487 0.8054
No log 7.0556 254 0.7613 0.5120 0.7613 0.8725
No log 7.1111 256 0.7467 0.5120 0.7467 0.8641
No log 7.1667 258 0.6497 0.5774 0.6497 0.8060
No log 7.2222 260 0.6021 0.6606 0.6021 0.7760
No log 7.2778 262 0.6476 0.5536 0.6476 0.8047
No log 7.3333 264 0.6541 0.5635 0.6541 0.8088
No log 7.3889 266 0.6790 0.5558 0.6790 0.8240
No log 7.4444 268 0.7201 0.5400 0.7201 0.8486
No log 7.5 270 0.7019 0.5688 0.7019 0.8378
No log 7.5556 272 0.6896 0.5415 0.6896 0.8304
No log 7.6111 274 0.7182 0.5312 0.7182 0.8475
No log 7.6667 276 0.7048 0.5300 0.7048 0.8395
No log 7.7222 278 0.7254 0.5748 0.7254 0.8517
No log 7.7778 280 0.7108 0.6035 0.7108 0.8431
No log 7.8333 282 0.7227 0.5748 0.7227 0.8501
No log 7.8889 284 0.6794 0.5936 0.6794 0.8242
No log 7.9444 286 0.6504 0.5887 0.6504 0.8065
No log 8.0 288 0.6483 0.6154 0.6483 0.8052
No log 8.0556 290 0.6534 0.6262 0.6534 0.8083
No log 8.1111 292 0.6549 0.5932 0.6549 0.8093
No log 8.1667 294 0.6574 0.6325 0.6574 0.8108
No log 8.2222 296 0.6522 0.6113 0.6522 0.8076
No log 8.2778 298 0.6485 0.6335 0.6485 0.8053
No log 8.3333 300 0.6521 0.5722 0.6521 0.8075
No log 8.3889 302 0.6817 0.5640 0.6817 0.8256
No log 8.4444 304 0.6939 0.4473 0.6939 0.8330
No log 8.5 306 0.6712 0.5432 0.6712 0.8193
No log 8.5556 308 0.6979 0.4510 0.6979 0.8354
No log 8.6111 310 0.6809 0.4868 0.6809 0.8252
No log 8.6667 312 0.6536 0.5856 0.6536 0.8084
No log 8.7222 314 0.6560 0.6165 0.6560 0.8100
No log 8.7778 316 0.6687 0.6043 0.6687 0.8177
No log 8.8333 318 0.6792 0.5375 0.6792 0.8241
No log 8.8889 320 0.6884 0.4778 0.6884 0.8297
No log 8.9444 322 0.7004 0.4888 0.7004 0.8369
No log 9.0 324 0.7251 0.4981 0.7251 0.8515
No log 9.0556 326 0.7389 0.5446 0.7389 0.8596
No log 9.1111 328 0.7136 0.4858 0.7136 0.8447
No log 9.1667 330 0.6789 0.4995 0.6789 0.8240
No log 9.2222 332 0.6778 0.4858 0.6778 0.8233
No log 9.2778 334 0.7040 0.4966 0.7040 0.8391
No log 9.3333 336 0.7311 0.5385 0.7311 0.8550
No log 9.3889 338 0.7625 0.5672 0.7625 0.8732
No log 9.4444 340 0.7041 0.5292 0.7041 0.8391
No log 9.5 342 0.6118 0.6096 0.6118 0.7822
No log 9.5556 344 0.5940 0.6096 0.5940 0.7707
No log 9.6111 346 0.6146 0.5712 0.6146 0.7840
No log 9.6667 348 0.6602 0.6053 0.6602 0.8125
No log 9.7222 350 0.6863 0.5905 0.6863 0.8284
No log 9.7778 352 0.6276 0.5932 0.6276 0.7922
No log 9.8333 354 0.6096 0.5647 0.6096 0.7808
No log 9.8889 356 0.6545 0.5242 0.6545 0.8090
No log 9.9444 358 0.6652 0.5242 0.6652 0.8156
No log 10.0 360 0.6239 0.5199 0.6239 0.7898
No log 10.0556 362 0.6163 0.6096 0.6163 0.7851
No log 10.1111 364 0.6531 0.6054 0.6531 0.8082
No log 10.1667 366 0.6405 0.6065 0.6405 0.8003
No log 10.2222 368 0.6097 0.5724 0.6097 0.7808
No log 10.2778 370 0.6081 0.5505 0.6081 0.7798
No log 10.3333 372 0.6033 0.5724 0.6033 0.7767
No log 10.3889 374 0.5995 0.6065 0.5995 0.7743
No log 10.4444 376 0.6005 0.6065 0.6005 0.7749
No log 10.5 378 0.6027 0.6407 0.6027 0.7763
No log 10.5556 380 0.5776 0.6186 0.5776 0.7600
No log 10.6111 382 0.5751 0.5988 0.5751 0.7583
No log 10.6667 384 0.5997 0.6119 0.5997 0.7744
No log 10.7222 386 0.5899 0.6119 0.5899 0.7680
No log 10.7778 388 0.5658 0.6796 0.5658 0.7522
No log 10.8333 390 0.5950 0.6597 0.5950 0.7713
No log 10.8889 392 0.5903 0.6639 0.5903 0.7683
No log 10.9444 394 0.5907 0.5882 0.5907 0.7685
No log 11.0 396 0.5926 0.5659 0.5926 0.7698
No log 11.0556 398 0.5859 0.6427 0.5859 0.7655
No log 11.1111 400 0.6109 0.6479 0.6109 0.7816
No log 11.1667 402 0.6601 0.5846 0.6601 0.8124
No log 11.2222 404 0.6575 0.5521 0.6575 0.8108
No log 11.2778 406 0.6268 0.5640 0.6268 0.7917
No log 11.3333 408 0.6207 0.5644 0.6207 0.7878
No log 11.3889 410 0.6283 0.4554 0.6283 0.7927
No log 11.4444 412 0.6256 0.4554 0.6256 0.7910
No log 11.5 414 0.6096 0.5288 0.6096 0.7808
No log 11.5556 416 0.5969 0.5939 0.5969 0.7726
No log 11.6111 418 0.5971 0.6133 0.5971 0.7727
No log 11.6667 420 0.6110 0.6500 0.6110 0.7817
No log 11.7222 422 0.6229 0.6564 0.6229 0.7892
No log 11.7778 424 0.6050 0.6133 0.6050 0.7778
No log 11.8333 426 0.6056 0.5505 0.6056 0.7782
No log 11.8889 428 0.6218 0.5428 0.6218 0.7885
No log 11.9444 430 0.6260 0.5555 0.6260 0.7912
No log 12.0 432 0.6217 0.6013 0.6217 0.7885
No log 12.0556 434 0.6157 0.6219 0.6157 0.7846
No log 12.1111 436 0.6028 0.6606 0.6028 0.7764
No log 12.1667 438 0.6124 0.6632 0.6124 0.7825
No log 12.2222 440 0.7129 0.5572 0.7129 0.8443
No log 12.2778 442 0.7703 0.5417 0.7703 0.8777
No log 12.3333 444 0.7334 0.5543 0.7334 0.8564
No log 12.3889 446 0.6798 0.5888 0.6798 0.8245
No log 12.4444 448 0.6413 0.5949 0.6413 0.8008
No log 12.5 450 0.6256 0.5784 0.6256 0.7910
No log 12.5556 452 0.6255 0.5784 0.6255 0.7909
No log 12.6111 454 0.6456 0.5823 0.6456 0.8035
No log 12.6667 456 0.6693 0.5833 0.6693 0.8181
No log 12.7222 458 0.6561 0.6177 0.6561 0.8100
No log 12.7778 460 0.6258 0.6335 0.6258 0.7911
No log 12.8333 462 0.6302 0.6133 0.6302 0.7939
No log 12.8889 464 0.6594 0.6133 0.6594 0.8120
No log 12.9444 466 0.6565 0.6133 0.6565 0.8102
No log 13.0 468 0.6298 0.5712 0.6298 0.7936
No log 13.0556 470 0.6144 0.5498 0.6144 0.7839
No log 13.1111 472 0.6108 0.5939 0.6108 0.7815
No log 13.1667 474 0.6064 0.5939 0.6064 0.7787
No log 13.2222 476 0.6052 0.5939 0.6052 0.7779
No log 13.2778 478 0.6047 0.5939 0.6047 0.7776
No log 13.3333 480 0.6035 0.5939 0.6035 0.7769
No log 13.3889 482 0.6271 0.5329 0.6271 0.7919
No log 13.4444 484 0.6543 0.5343 0.6543 0.8089
No log 13.5 486 0.6666 0.5228 0.6666 0.8164
No log 13.5556 488 0.6569 0.5228 0.6569 0.8105
No log 13.6111 490 0.6600 0.5112 0.6600 0.8124
No log 13.6667 492 0.6355 0.5328 0.6355 0.7972
No log 13.7222 494 0.6101 0.5626 0.6101 0.7811
No log 13.7778 496 0.6057 0.5939 0.6057 0.7783
No log 13.8333 498 0.6061 0.6133 0.6061 0.7785
0.2855 13.8889 500 0.6159 0.5412 0.6159 0.7848
0.2855 13.9444 502 0.6816 0.5112 0.6816 0.8256
0.2855 14.0 504 0.8116 0.4686 0.8116 0.9009
0.2855 14.0556 506 0.8213 0.4670 0.8213 0.9063
0.2855 14.1111 508 0.7411 0.4755 0.7411 0.8609
0.2855 14.1667 510 0.6658 0.4692 0.6658 0.8160
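The card does not state the task head, but the Qwk/Mse/Rmse metrics suggest ordinal scoring of Arabic text. A minimal loading sketch, assuming a sequence-classification head and network access to the Hugging Face Hub (the example sentence is illustrative):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k14_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Score one (illustrative) Arabic sentence.
inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```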

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
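To reproduce this environment, the versions above can be pinned at install time (a sketch; the `+cu118` PyTorch build comes from the CUDA 11.8 wheel index rather than PyPI):

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```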