ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k15_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6886
  • Qwk: 0.7347
  • Mse: 0.6886
  • Rmse: 0.8298
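For reference, the three reported metrics can be computed from predicted and gold scores as below. This is a pure-Python sketch for illustration, not the card's actual evaluation code; the function names are illustrative. Note that the Loss and Mse values are identical, which suggests the model was trained with an MSE objective on numeric scores, and that RMSE is simply the square root of MSE (0.8298² ≈ 0.6886).

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic Weighted Kappa (QWK): chance-corrected agreement
    between two integer ratings, penalizing large disagreements more."""
    n = len(y_true)
    # Observed rating matrix
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms for the expected (chance) matrix
    hist_t = [y_true.count(c) for c in range(n_classes)]
    hist_p = [y_pred.count(c) for c in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            expected = hist_t[i] * hist_p[j] / n
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error between gold and predicted scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """RMSE is just the square root of MSE."""
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement gives a QWK of 1.0; a QWK of 0.7347 indicates substantial agreement between predicted and human scores.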

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
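With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward zero over the total number of scheduled training steps. The sketch below mirrors the behavior of transformers' linear schedule assuming no warmup (the card does not list warmup settings, so that is an assumption), with total_steps inferred from the log (70 steps per epoch × 100 epochs = 7000):

```python
def linear_lr(step, base_lr=2e-05, total_steps=7000, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule:
    ramp up over warmup_steps (0 here, assumed), then decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

Because the schedule is defined over all 7000 planned steps, stopping early (as the log below suggests happened around step 522) means training used only the upper portion of the decay ramp.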

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0286 2 7.0024 -0.0056 7.0024 2.6462
No log 0.0571 4 5.1780 -0.0079 5.1780 2.2755
No log 0.0857 6 4.0001 -0.0784 4.0001 2.0000
No log 0.1143 8 2.4915 0.1399 2.4915 1.5785
No log 0.1429 10 2.0463 0.2047 2.0463 1.4305
No log 0.1714 12 1.7302 0.1802 1.7302 1.3154
No log 0.2 14 1.6767 0.2407 1.6767 1.2949
No log 0.2286 16 1.5575 0.2569 1.5575 1.2480
No log 0.2571 18 1.7990 0.3566 1.7990 1.3413
No log 0.2857 20 2.8821 0.1081 2.8821 1.6977
No log 0.3143 22 2.6672 0.1685 2.6672 1.6332
No log 0.3429 24 1.9365 0.3862 1.9365 1.3916
No log 0.3714 26 1.5747 0.2857 1.5747 1.2549
No log 0.4 28 1.4914 0.2881 1.4914 1.2212
No log 0.4286 30 1.3951 0.2679 1.3951 1.1811
No log 0.4571 32 1.3621 0.3833 1.3621 1.1671
No log 0.4857 34 1.7485 0.3407 1.7485 1.3223
No log 0.5143 36 2.5358 0.2333 2.5358 1.5924
No log 0.5429 38 3.3410 0.2613 3.3410 1.8278
No log 0.5714 40 2.7741 0.3178 2.7741 1.6656
No log 0.6 42 2.1858 0.4175 2.1858 1.4785
No log 0.6286 44 1.4147 0.5517 1.4147 1.1894
No log 0.6571 46 0.9817 0.7089 0.9817 0.9908
No log 0.6857 48 0.9036 0.6757 0.9036 0.9506
No log 0.7143 50 0.9898 0.6569 0.9898 0.9949
No log 0.7429 52 0.9905 0.6187 0.9905 0.9952
No log 0.7714 54 1.1301 0.6184 1.1301 1.0631
No log 0.8 56 1.3984 0.5478 1.3984 1.1826
No log 0.8286 58 1.2728 0.5395 1.2728 1.1282
No log 0.8571 60 1.0622 0.6056 1.0622 1.0306
No log 0.8857 62 0.9881 0.6528 0.9881 0.9940
No log 0.9143 64 0.8393 0.7273 0.8393 0.9161
No log 0.9429 66 0.7963 0.7297 0.7963 0.8923
No log 0.9714 68 0.8146 0.7027 0.8146 0.9026
No log 1.0 70 1.1647 0.6115 1.1647 1.0792
No log 1.0286 72 1.4834 0.5366 1.4834 1.2179
No log 1.0571 74 1.2058 0.5535 1.2058 1.0981
No log 1.0857 76 0.9245 0.6531 0.9245 0.9615
No log 1.1143 78 0.8027 0.6765 0.8027 0.8960
No log 1.1429 80 0.8018 0.6765 0.8018 0.8955
No log 1.1714 82 0.9466 0.6533 0.9466 0.9729
No log 1.2 84 0.8516 0.6974 0.8516 0.9228
No log 1.2286 86 0.7435 0.7320 0.7435 0.8623
No log 1.2571 88 0.7816 0.7320 0.7816 0.8841
No log 1.2857 90 0.8902 0.6622 0.8902 0.9435
No log 1.3143 92 1.0705 0.6182 1.0705 1.0346
No log 1.3429 94 1.4062 0.5978 1.4062 1.1858
No log 1.3714 96 1.4508 0.5829 1.4508 1.2045
No log 1.4 98 1.2164 0.5854 1.2164 1.1029
No log 1.4286 100 0.9879 0.6795 0.9879 0.9939
No log 1.4571 102 0.7723 0.7383 0.7723 0.8788
No log 1.4857 104 0.7093 0.7949 0.7093 0.8422
No log 1.5143 106 0.6659 0.7975 0.6659 0.8160
No log 1.5429 108 0.6533 0.8050 0.6533 0.8083
No log 1.5714 110 0.7013 0.75 0.7013 0.8374
No log 1.6 112 0.6999 0.75 0.6999 0.8366
No log 1.6286 114 0.8096 0.7329 0.8096 0.8998
No log 1.6571 116 0.9308 0.7059 0.9308 0.9648
No log 1.6857 118 0.8529 0.7219 0.8529 0.9235
No log 1.7143 120 0.7000 0.7284 0.7000 0.8366
No log 1.7429 122 0.7040 0.7485 0.7040 0.8391
No log 1.7714 124 0.7204 0.7561 0.7204 0.8488
No log 1.8 126 0.6865 0.7692 0.6865 0.8285
No log 1.8286 128 0.7013 0.7467 0.7013 0.8375
No log 1.8571 130 0.6752 0.7467 0.6752 0.8217
No log 1.8857 132 0.7899 0.7211 0.7899 0.8888
No log 1.9143 134 0.8598 0.75 0.8598 0.9273
No log 1.9429 136 0.7182 0.7517 0.7182 0.8475
No log 1.9714 138 0.6643 0.7517 0.6643 0.8151
No log 2.0 140 0.6306 0.7947 0.6306 0.7941
No log 2.0286 142 0.7043 0.7412 0.7043 0.8392
No log 2.0571 144 0.8603 0.7283 0.8603 0.9275
No log 2.0857 146 0.8142 0.7326 0.8142 0.9023
No log 2.1143 148 0.7914 0.7349 0.7914 0.8896
No log 2.1429 150 0.8418 0.7262 0.8418 0.9175
No log 2.1714 152 0.8696 0.7262 0.8696 0.9325
No log 2.2 154 0.6940 0.7205 0.6940 0.8331
No log 2.2286 156 0.6549 0.7625 0.6549 0.8092
No log 2.2571 158 0.7052 0.7089 0.7052 0.8398
No log 2.2857 160 0.8223 0.7125 0.8223 0.9068
No log 2.3143 162 0.8829 0.7037 0.8829 0.9396
No log 2.3429 164 0.7530 0.7162 0.7530 0.8678
No log 2.3714 166 0.6450 0.7586 0.6450 0.8031
No log 2.4 168 0.7012 0.7183 0.7012 0.8374
No log 2.4286 170 0.7899 0.6667 0.7899 0.8888
No log 2.4571 172 1.0808 0.6503 1.0808 1.0396
No log 2.4857 174 1.1299 0.6474 1.1299 1.0630
No log 2.5143 176 0.8905 0.6826 0.8905 0.9437
No log 2.5429 178 0.8697 0.6746 0.8697 0.9326
No log 2.5714 180 0.9917 0.6966 0.9917 0.9958
No log 2.6 182 1.1368 0.6630 1.1368 1.0662
No log 2.6286 184 1.0475 0.6552 1.0475 1.0235
No log 2.6571 186 1.0556 0.6467 1.0556 1.0274
No log 2.6857 188 1.0912 0.6182 1.0912 1.0446
No log 2.7143 190 1.1109 0.6115 1.1109 1.0540
No log 2.7429 192 0.9344 0.6533 0.9344 0.9666
No log 2.7714 194 0.8154 0.6933 0.8154 0.9030
No log 2.8 196 0.9884 0.6460 0.9884 0.9942
No log 2.8286 198 0.9994 0.6506 0.9994 0.9997
No log 2.8571 200 0.7430 0.7237 0.7430 0.8620
No log 2.8857 202 0.6831 0.7785 0.6831 0.8265
No log 2.9143 204 0.7326 0.7532 0.7326 0.8559
No log 2.9429 206 0.9074 0.6667 0.9074 0.9526
No log 2.9714 208 0.9040 0.6486 0.9040 0.9508
No log 3.0 210 0.7631 0.7034 0.7631 0.8736
No log 3.0286 212 0.6742 0.7619 0.6742 0.8211
No log 3.0571 214 0.6207 0.7867 0.6207 0.7878
No log 3.0857 216 0.5986 0.8101 0.5986 0.7737
No log 3.1143 218 0.6010 0.8050 0.6010 0.7752
No log 3.1429 220 0.6380 0.7771 0.6380 0.7988
No log 3.1714 222 0.8098 0.7456 0.8098 0.8999
No log 3.2 224 0.7177 0.7665 0.7177 0.8472
No log 3.2286 226 0.6731 0.7590 0.6731 0.8204
No log 3.2571 228 0.7315 0.7470 0.7315 0.8553
No log 3.2857 230 0.9299 0.6994 0.9299 0.9643
No log 3.3143 232 1.0666 0.6341 1.0666 1.0328
No log 3.3429 234 1.0844 0.5926 1.0844 1.0413
No log 3.3714 236 0.9004 0.6577 0.9004 0.9489
No log 3.4 238 0.7594 0.7273 0.7594 0.8715
No log 3.4286 240 0.7309 0.7383 0.7309 0.8549
No log 3.4571 242 0.7592 0.7125 0.7592 0.8713
No log 3.4857 244 0.8205 0.7305 0.8205 0.9058
No log 3.5143 246 0.8643 0.7108 0.8643 0.9297
No log 3.5429 248 0.7742 0.7381 0.7742 0.8799
No log 3.5714 250 0.6716 0.7456 0.6716 0.8195
No log 3.6 252 0.7561 0.7229 0.7561 0.8696
No log 3.6286 254 0.8671 0.7011 0.8671 0.9312
No log 3.6571 256 0.8868 0.6748 0.8868 0.9417
No log 3.6857 258 0.6890 0.7260 0.6890 0.8301
No log 3.7143 260 0.6479 0.7313 0.6479 0.8049
No log 3.7429 262 0.7348 0.7206 0.7348 0.8572
No log 3.7714 264 0.6978 0.7206 0.6978 0.8354
No log 3.8 266 0.6263 0.7724 0.6263 0.7914
No log 3.8286 268 0.6601 0.7843 0.6601 0.8125
No log 3.8571 270 0.6492 0.7843 0.6492 0.8057
No log 3.8857 272 0.6664 0.7838 0.6664 0.8163
No log 3.9143 274 0.6952 0.7724 0.6952 0.8338
No log 3.9429 276 0.6869 0.7552 0.6869 0.8288
No log 3.9714 278 0.6881 0.7763 0.6881 0.8295
No log 4.0 280 0.7432 0.7162 0.7432 0.8621
No log 4.0286 282 0.7804 0.6806 0.7804 0.8834
No log 4.0571 284 0.8592 0.6620 0.8592 0.9269
No log 4.0857 286 0.8150 0.6525 0.8150 0.9028
No log 4.1143 288 0.7322 0.7234 0.7322 0.8557
No log 4.1429 290 0.6974 0.7310 0.6974 0.8351
No log 4.1714 292 0.7461 0.6939 0.7461 0.8638
No log 4.2 294 0.8205 0.6581 0.8205 0.9058
No log 4.2286 296 0.8738 0.6494 0.8738 0.9348
No log 4.2571 298 0.8245 0.6711 0.8245 0.9080
No log 4.2857 300 0.7546 0.7432 0.7546 0.8687
No log 4.3143 302 0.7457 0.7451 0.7457 0.8636
No log 4.3429 304 0.8255 0.7421 0.8255 0.9086
No log 4.3714 306 0.8731 0.7456 0.8731 0.9344
No log 4.4 308 0.8232 0.7456 0.8232 0.9073
No log 4.4286 310 0.7224 0.7421 0.7224 0.8500
No log 4.4571 312 0.6073 0.7619 0.6073 0.7793
No log 4.4857 314 0.5954 0.7571 0.5954 0.7716
No log 4.5143 316 0.6596 0.7338 0.6596 0.8122
No log 4.5429 318 0.6876 0.7111 0.6876 0.8292
No log 4.5714 320 0.6776 0.7111 0.6776 0.8232
No log 4.6 322 0.7374 0.7133 0.7374 0.8587
No log 4.6286 324 0.7473 0.6986 0.7473 0.8645
No log 4.6571 326 0.7768 0.6759 0.7768 0.8813
No log 4.6857 328 0.6953 0.7517 0.6953 0.8339
No log 4.7143 330 0.6544 0.7517 0.6544 0.8089
No log 4.7429 332 0.6472 0.7848 0.6472 0.8045
No log 4.7714 334 0.6375 0.7702 0.6375 0.7984
No log 4.8 336 0.6012 0.7643 0.6012 0.7754
No log 4.8286 338 0.6194 0.7702 0.6194 0.7870
No log 4.8571 340 0.6931 0.7636 0.6931 0.8325
No log 4.8857 342 0.6856 0.7692 0.6856 0.8280
No log 4.9143 344 0.6243 0.7651 0.6243 0.7901
No log 4.9429 346 0.5785 0.8026 0.5785 0.7606
No log 4.9714 348 0.5595 0.8050 0.5595 0.7480
No log 5.0 350 0.7007 0.7849 0.7007 0.8371
No log 5.0286 352 0.7554 0.7701 0.7554 0.8691
No log 5.0571 354 0.6024 0.8066 0.6024 0.7761
No log 5.0857 356 0.4968 0.8125 0.4968 0.7048
No log 5.1143 358 0.7060 0.7517 0.7060 0.8402
No log 5.1429 360 0.7117 0.7397 0.7117 0.8436
No log 5.1714 362 0.5448 0.8 0.5448 0.7381
No log 5.2 364 0.5509 0.7898 0.5509 0.7422
No log 5.2286 366 0.6423 0.8025 0.6423 0.8014
No log 5.2571 368 0.5814 0.8272 0.5814 0.7625
No log 5.2857 370 0.5326 0.7949 0.5326 0.7298
No log 5.3143 372 0.5593 0.7949 0.5593 0.7479
No log 5.3429 374 0.5977 0.7975 0.5977 0.7731
No log 5.3714 376 0.6544 0.7975 0.6544 0.8089
No log 5.4 378 0.6728 0.7771 0.6728 0.8202
No log 5.4286 380 0.6340 0.7582 0.6340 0.7962
No log 5.4571 382 0.5780 0.7763 0.5780 0.7602
No log 5.4857 384 0.6022 0.7871 0.6022 0.7760
No log 5.5143 386 0.6778 0.7662 0.6778 0.8233
No log 5.5429 388 0.7628 0.6753 0.7628 0.8734
No log 5.5714 390 0.8416 0.6447 0.8416 0.9174
No log 5.6 392 0.7643 0.6667 0.7643 0.8742
No log 5.6286 394 0.7216 0.7448 0.7216 0.8494
No log 5.6571 396 0.6691 0.7733 0.6691 0.8180
No log 5.6857 398 0.6799 0.7805 0.6799 0.8245
No log 5.7143 400 0.7568 0.7283 0.7568 0.8699
No log 5.7429 402 0.8073 0.7191 0.8073 0.8985
No log 5.7714 404 0.7185 0.7614 0.7185 0.8476
No log 5.8 406 0.6101 0.8205 0.6101 0.7811
No log 5.8286 408 0.6861 0.7286 0.6861 0.8283
No log 5.8571 410 0.7583 0.7338 0.7583 0.8708
No log 5.8857 412 0.7571 0.7338 0.7571 0.8701
No log 5.9143 414 0.7099 0.7429 0.7099 0.8425
No log 5.9429 416 0.7587 0.6897 0.7587 0.8710
No log 5.9714 418 0.8702 0.6887 0.8702 0.9329
No log 6.0 420 0.8389 0.6832 0.8389 0.9159
No log 6.0286 422 0.6974 0.7771 0.6974 0.8351
No log 6.0571 424 0.5809 0.7815 0.5809 0.7622
No log 6.0857 426 0.6146 0.7692 0.6146 0.7840
No log 6.1143 428 0.6754 0.7299 0.6754 0.8218
No log 6.1429 430 0.7157 0.7353 0.7157 0.8460
No log 6.1714 432 0.7740 0.7034 0.7740 0.8798
No log 6.2 434 0.7423 0.7383 0.7423 0.8616
No log 6.2286 436 0.6443 0.7763 0.6443 0.8027
No log 6.2571 438 0.6010 0.8026 0.6010 0.7752
No log 6.2857 440 0.5973 0.7922 0.5973 0.7728
No log 6.3143 442 0.5843 0.8026 0.5843 0.7644
No log 6.3429 444 0.6174 0.7947 0.6174 0.7858
No log 6.3714 446 0.6793 0.7662 0.6793 0.8242
No log 6.4 448 0.6811 0.76 0.6811 0.8253
No log 6.4286 450 0.6409 0.7733 0.6409 0.8006
No log 6.4571 452 0.6133 0.7947 0.6133 0.7831
No log 6.4857 454 0.6104 0.7867 0.6104 0.7813
No log 6.5143 456 0.6390 0.7651 0.6390 0.7994
No log 6.5429 458 0.7097 0.75 0.7097 0.8424
No log 6.5714 460 0.7363 0.7742 0.7363 0.8581
No log 6.6 462 0.7342 0.7417 0.7342 0.8569
No log 6.6286 464 0.6794 0.7467 0.6794 0.8243
No log 6.6571 466 0.6565 0.7586 0.6565 0.8103
No log 6.6857 468 0.6634 0.7619 0.6634 0.8145
No log 6.7143 470 0.6661 0.7619 0.6661 0.8162
No log 6.7429 472 0.7111 0.7550 0.7111 0.8433
No log 6.7714 474 0.7352 0.7368 0.7352 0.8574
No log 6.8 476 0.7759 0.7582 0.7759 0.8809
No log 6.8286 478 0.7569 0.7582 0.7569 0.8700
No log 6.8571 480 0.7218 0.7368 0.7218 0.8496
No log 6.8857 482 0.7113 0.7662 0.7113 0.8434
No log 6.9143 484 0.6569 0.7308 0.6569 0.8105
No log 6.9429 486 0.6394 0.7901 0.6394 0.7996
No log 6.9714 488 0.6180 0.7654 0.6180 0.7862
No log 7.0 490 0.6376 0.7758 0.6376 0.7985
No log 7.0286 492 0.6527 0.7805 0.6527 0.8079
No log 7.0571 494 0.6656 0.7799 0.6656 0.8159
No log 7.0857 496 0.7415 0.7625 0.7415 0.8611
No log 7.1143 498 0.7630 0.7547 0.7630 0.8735
0.3648 7.1429 500 0.7161 0.7582 0.7161 0.8462
0.3648 7.1714 502 0.6264 0.7815 0.6264 0.7914
0.3648 7.2 504 0.5807 0.7639 0.5807 0.7621
0.3648 7.2286 506 0.5862 0.7724 0.5862 0.7656
0.3648 7.2571 508 0.5708 0.7843 0.5708 0.7555
0.3648 7.2857 510 0.5794 0.8025 0.5794 0.7612
0.3648 7.3143 512 0.6556 0.7831 0.6556 0.8097
0.3648 7.3429 514 0.7783 0.7394 0.7783 0.8822
0.3648 7.3714 516 0.8181 0.7081 0.8181 0.9045
0.3648 7.4 518 0.7448 0.7403 0.7448 0.8630
0.3648 7.4286 520 0.7253 0.7517 0.7253 0.8516
0.3648 7.4571 522 0.6886 0.7347 0.6886 0.8298
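In the table above, the Epoch column is simply Step divided by the number of optimizer steps per epoch, which is 70 here (epoch 1.0 falls at step 70). With train_batch_size: 8 that implies roughly 560 training examples; this is an inference from the log, not stated in the card. Evaluation runs every 2 steps, training loss is only logged from step 500 onward (hence "No log" before it), and training stopped at step 522 (epoch ≈ 7.46), well short of the configured 100 epochs, presumably via early stopping:

```python
STEPS_PER_EPOCH = 70  # inferred: the log shows epoch 1.0 at step 70

def epoch_at(step, steps_per_epoch=STEPS_PER_EPOCH):
    """Fractional epoch for a given optimizer step, rounded as in the log."""
    return round(step / steps_per_epoch, 4)
```

For example, the first logged evaluation at step 2 corresponds to epoch 2/70 ≈ 0.0286, matching the first row of the table.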

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1