ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k6_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9690
  • Qwk (quadratic weighted kappa): 0.5756
  • Mse (mean squared error): 0.9690
  • Rmse (root mean squared error): 0.9844
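The reported Loss and Mse are identical, which suggests the model was trained as a regressor with an MSE objective; Qwk (quadratic weighted kappa) is the standard agreement metric for essay scoring. A minimal sketch of how these three metrics can be computed with scikit-learn and NumPy (the toy labels and the 0–4 score scale below are assumptions, not taken from this card):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Toy gold vs. predicted organization scores (the actual label scale
# is not documented in this card; a 0-4 range is assumed here).
y_true = np.array([0, 1, 2, 3, 4, 2, 3, 1])
y_pred = np.array([0, 1, 2, 2, 4, 3, 3, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```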

Model description

More information needed

Intended uses & limitations

More information needed
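The card does not document the task head. Because the validation Loss equals the Mse column throughout the results table, the most likely setup is a single-output regression head (num_labels=1) trained with MSE loss; that is an inference, not something the card states. The sketch below builds a tiny, randomly initialized stand-in to show the expected input/output shape. To score real essays you would instead load the fine-tuned checkpoint, e.g. `AutoModelForSequenceClassification.from_pretrained("MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k6_task1_organization")` together with the matching tokenizer.

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Tiny random-weight stand-in (assumed head: one regression output).
config = BertConfig(
    vocab_size=100, hidden_size=32, num_hidden_layers=2,
    num_attention_heads=2, intermediate_size=64,
    num_labels=1,                 # single scalar = organization score
    problem_type="regression",
)
model = BertForSequenceClassification(config).eval()

input_ids = torch.tensor([[2, 5, 7, 9, 3]])  # dummy token ids
with torch.no_grad():
    score = model(input_ids).logits.squeeze(-1)
print(score.shape)  # one scalar score per input sequence
```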

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
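The settings above correspond to the following plain-PyTorch sketch (not the actual training script). The 32 steps per epoch is inferred from the results table, where step 2 corresponds to epoch 0.0625, implying roughly 256 training examples at train_batch_size=8:

```python
import torch

torch.manual_seed(42)          # seed: 42
model = torch.nn.Linear(4, 1)  # stand-in for the AraBERT regressor

# Adam with lr=2e-05, betas=(0.9, 0.999), epsilon=1e-08
optimizer = torch.optim.Adam(
    model.parameters(), lr=2e-5, betas=(0.9, 0.999), eps=1e-8)

# Linear schedule: lr decays from 2e-5 toward 0 over all planned steps.
# steps_per_epoch=32 is inferred from the table (step 2 = epoch 0.0625).
num_epochs, steps_per_epoch = 100, 32
total_steps = num_epochs * steps_per_epoch
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: max(0.0, 1.0 - step / total_steps))
```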

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0625 2 5.2504 -0.0262 5.2504 2.2914
No log 0.125 4 3.0341 0.0803 3.0341 1.7419
No log 0.1875 6 2.1265 -0.0043 2.1265 1.4583
No log 0.25 8 1.4556 0.1204 1.4556 1.2065
No log 0.3125 10 1.3315 0.1426 1.3315 1.1539
No log 0.375 12 1.2638 0.2416 1.2638 1.1242
No log 0.4375 14 1.2780 0.2229 1.2780 1.1305
No log 0.5 16 1.2747 0.2197 1.2747 1.1290
No log 0.5625 18 1.1507 0.2650 1.1507 1.0727
No log 0.625 20 1.1452 0.2516 1.1452 1.0701
No log 0.6875 22 1.2347 0.1518 1.2347 1.1111
No log 0.75 24 1.4015 0.1468 1.4015 1.1839
No log 0.8125 26 1.2423 0.2451 1.2423 1.1146
No log 0.875 28 1.1999 0.3135 1.1999 1.0954
No log 0.9375 30 1.1230 0.2934 1.1230 1.0597
No log 1.0 32 1.0930 0.2753 1.0930 1.0454
No log 1.0625 34 1.2485 0.0918 1.2485 1.1174
No log 1.125 36 1.2669 0.1061 1.2669 1.1256
No log 1.1875 38 1.1984 0.0738 1.1984 1.0947
No log 1.25 40 1.1514 0.1234 1.1514 1.0730
No log 1.3125 42 1.1936 0.1791 1.1936 1.0925
No log 1.375 44 1.3786 0.2673 1.3786 1.1741
No log 1.4375 46 1.2895 0.2812 1.2895 1.1356
No log 1.5 48 1.1012 0.2592 1.1012 1.0494
No log 1.5625 50 1.0963 0.1707 1.0963 1.0470
No log 1.625 52 1.2641 0.0715 1.2641 1.1243
No log 1.6875 54 1.5478 0.0103 1.5478 1.2441
No log 1.75 56 1.3973 0.0551 1.3973 1.1821
No log 1.8125 58 1.1468 0.0918 1.1468 1.0709
No log 1.875 60 1.1260 0.1858 1.1260 1.0611
No log 1.9375 62 1.1387 0.2411 1.1387 1.0671
No log 2.0 64 1.2650 0.2307 1.2650 1.1247
No log 2.0625 66 1.2583 0.2017 1.2583 1.1217
No log 2.125 68 0.9972 0.3546 0.9972 0.9986
No log 2.1875 70 0.9039 0.3810 0.9039 0.9507
No log 2.25 72 1.1132 0.4256 1.1132 1.0551
No log 2.3125 74 1.2133 0.3891 1.2133 1.1015
No log 2.375 76 1.0962 0.3831 1.0962 1.0470
No log 2.4375 78 0.8882 0.5024 0.8882 0.9425
No log 2.5 80 0.8624 0.5083 0.8624 0.9286
No log 2.5625 82 0.8654 0.4733 0.8654 0.9303
No log 2.625 84 0.9599 0.4336 0.9599 0.9797
No log 2.6875 86 1.0908 0.3984 1.0908 1.0444
No log 2.75 88 1.1695 0.4583 1.1695 1.0814
No log 2.8125 90 1.0267 0.5168 1.0267 1.0133
No log 2.875 92 0.9007 0.5917 0.9007 0.9490
No log 2.9375 94 0.9205 0.6062 0.9205 0.9594
No log 3.0 96 0.9059 0.6164 0.9059 0.9518
No log 3.0625 98 0.9110 0.6203 0.9110 0.9544
No log 3.125 100 0.9776 0.6047 0.9776 0.9887
No log 3.1875 102 1.1392 0.5218 1.1392 1.0673
No log 3.25 104 1.3512 0.4522 1.3512 1.1624
No log 3.3125 106 1.4579 0.3899 1.4579 1.2074
No log 3.375 108 1.4760 0.3898 1.4760 1.2149
No log 3.4375 110 1.3639 0.4115 1.3639 1.1679
No log 3.5 112 1.1171 0.5030 1.1171 1.0569
No log 3.5625 114 0.9771 0.5453 0.9771 0.9885
No log 3.625 116 0.9250 0.6348 0.9250 0.9618
No log 3.6875 118 0.9747 0.6203 0.9747 0.9873
No log 3.75 120 1.1216 0.5489 1.1216 1.0590
No log 3.8125 122 1.1125 0.5669 1.1125 1.0548
No log 3.875 124 0.8804 0.6422 0.8804 0.9383
No log 3.9375 126 0.7324 0.6345 0.7324 0.8558
No log 4.0 128 0.7301 0.5846 0.7301 0.8545
No log 4.0625 130 0.7601 0.4972 0.7601 0.8718
No log 4.125 132 0.7585 0.5302 0.7585 0.8709
No log 4.1875 134 0.7856 0.5660 0.7856 0.8864
No log 4.25 136 0.8435 0.5785 0.8435 0.9184
No log 4.3125 138 0.9188 0.6036 0.9188 0.9585
No log 4.375 140 1.0166 0.5319 1.0166 1.0083
No log 4.4375 142 1.1238 0.4863 1.1238 1.0601
No log 4.5 144 1.2495 0.4421 1.2495 1.1178
No log 4.5625 146 1.1812 0.4520 1.1812 1.0868
No log 4.625 148 1.0234 0.4999 1.0234 1.0116
No log 4.6875 150 0.8395 0.5921 0.8395 0.9163
No log 4.75 152 0.7787 0.6487 0.7787 0.8824
No log 4.8125 154 0.7712 0.6594 0.7712 0.8782
No log 4.875 156 0.7710 0.6309 0.7710 0.8781
No log 4.9375 158 0.8857 0.5993 0.8857 0.9411
No log 5.0 160 1.0305 0.5469 1.0305 1.0151
No log 5.0625 162 1.0783 0.5079 1.0783 1.0384
No log 5.125 164 1.0111 0.5229 1.0111 1.0055
No log 5.1875 166 0.9472 0.5828 0.9472 0.9732
No log 5.25 168 0.8551 0.6268 0.8551 0.9247
No log 5.3125 170 0.9052 0.6105 0.9052 0.9514
No log 5.375 172 1.0375 0.5270 1.0375 1.0186
No log 5.4375 174 0.9493 0.5756 0.9493 0.9743
No log 5.5 176 0.8239 0.6279 0.8239 0.9077
No log 5.5625 178 0.7596 0.6073 0.7596 0.8715
No log 5.625 180 0.7561 0.6313 0.7561 0.8695
No log 5.6875 182 0.7717 0.6695 0.7717 0.8785
No log 5.75 184 0.9234 0.6219 0.9234 0.9609
No log 5.8125 186 1.0330 0.5481 1.0330 1.0164
No log 5.875 188 0.9558 0.5935 0.9558 0.9776
No log 5.9375 190 0.8249 0.6318 0.8249 0.9083
No log 6.0 192 0.7502 0.6298 0.7502 0.8661
No log 6.0625 194 0.7504 0.6400 0.7504 0.8662
No log 6.125 196 0.8782 0.6020 0.8782 0.9371
No log 6.1875 198 0.9453 0.5796 0.9453 0.9723
No log 6.25 200 0.8863 0.5838 0.8863 0.9414
No log 6.3125 202 0.9256 0.5811 0.9256 0.9621
No log 6.375 204 0.9733 0.5607 0.9733 0.9865
No log 6.4375 206 1.0979 0.5593 1.0979 1.0478
No log 6.5 208 1.1859 0.5418 1.1859 1.0890
No log 6.5625 210 0.9862 0.6111 0.9862 0.9931
No log 6.625 212 0.7689 0.6456 0.7689 0.8769
No log 6.6875 214 0.7493 0.6448 0.7493 0.8656
No log 6.75 216 0.8005 0.5993 0.8005 0.8947
No log 6.8125 218 0.8981 0.5551 0.8981 0.9477
No log 6.875 220 1.0091 0.5415 1.0091 1.0045
No log 6.9375 222 0.9346 0.5471 0.9346 0.9668
No log 7.0 224 0.7817 0.6146 0.7817 0.8841
No log 7.0625 226 0.7202 0.6717 0.7202 0.8487
No log 7.125 228 0.7275 0.6642 0.7275 0.8530
No log 7.1875 230 0.8599 0.6251 0.8599 0.9273
No log 7.25 232 1.1184 0.5390 1.1184 1.0575
No log 7.3125 234 1.2822 0.5169 1.2822 1.1323
No log 7.375 236 1.1806 0.5436 1.1806 1.0865
No log 7.4375 238 0.9255 0.5905 0.9255 0.9620
No log 7.5 240 0.7369 0.6736 0.7369 0.8584
No log 7.5625 242 0.7068 0.6507 0.7068 0.8407
No log 7.625 244 0.7112 0.6454 0.7112 0.8433
No log 7.6875 246 0.7471 0.6103 0.7471 0.8643
No log 7.75 248 0.8207 0.5991 0.8207 0.9059
No log 7.8125 250 0.9000 0.5861 0.9000 0.9487
No log 7.875 252 0.8725 0.5946 0.8725 0.9341
No log 7.9375 254 0.7750 0.6573 0.7750 0.8804
No log 8.0 256 0.6990 0.6705 0.6990 0.8361
No log 8.0625 258 0.6833 0.6652 0.6833 0.8266
No log 8.125 260 0.7333 0.6655 0.7333 0.8564
No log 8.1875 262 0.8180 0.6291 0.8181 0.9045
No log 8.25 264 0.9165 0.6079 0.9165 0.9574
No log 8.3125 266 0.9287 0.6179 0.9287 0.9637
No log 8.375 268 0.8924 0.6440 0.8924 0.9447
No log 8.4375 270 0.8078 0.6697 0.8078 0.8988
No log 8.5 272 0.7831 0.6841 0.7831 0.8849
No log 8.5625 274 0.7563 0.6782 0.7563 0.8696
No log 8.625 276 0.7837 0.6534 0.7837 0.8853
No log 8.6875 278 0.7507 0.6525 0.7507 0.8664
No log 8.75 280 0.7546 0.6210 0.7546 0.8686
No log 8.8125 282 0.7037 0.6598 0.7037 0.8389
No log 8.875 284 0.6752 0.6792 0.6752 0.8217
No log 8.9375 286 0.6842 0.6684 0.6842 0.8272
No log 9.0 288 0.8128 0.6200 0.8128 0.9016
No log 9.0625 290 0.9516 0.5902 0.9516 0.9755
No log 9.125 292 0.9467 0.5767 0.9467 0.9730
No log 9.1875 294 0.8841 0.6309 0.8841 0.9403
No log 9.25 296 0.8533 0.6292 0.8533 0.9237
No log 9.3125 298 0.8004 0.6742 0.8004 0.8947
No log 9.375 300 0.7949 0.6629 0.7949 0.8916
No log 9.4375 302 0.9052 0.6165 0.9052 0.9514
No log 9.5 304 0.9455 0.5548 0.9455 0.9724
No log 9.5625 306 0.8704 0.6218 0.8704 0.9330
No log 9.625 308 0.7810 0.6365 0.7810 0.8837
No log 9.6875 310 0.7092 0.6625 0.7092 0.8421
No log 9.75 312 0.7340 0.6578 0.7340 0.8567
No log 9.8125 314 0.7947 0.6778 0.7947 0.8915
No log 9.875 316 0.8763 0.6578 0.8763 0.9361
No log 9.9375 318 1.0022 0.6215 1.0022 1.0011
No log 10.0 320 1.2468 0.5703 1.2468 1.1166
No log 10.0625 322 1.2134 0.5343 1.2134 1.1016
No log 10.125 324 1.1350 0.5327 1.1350 1.0654
No log 10.1875 326 0.9990 0.5506 0.9990 0.9995
No log 10.25 328 0.9445 0.5336 0.9445 0.9718
No log 10.3125 330 0.8514 0.5607 0.8514 0.9227
No log 10.375 332 0.7951 0.6250 0.7951 0.8917
No log 10.4375 334 0.7590 0.6562 0.7590 0.8712
No log 10.5 336 0.7210 0.6880 0.7210 0.8491
No log 10.5625 338 0.8356 0.6877 0.8356 0.9141
No log 10.625 340 0.9057 0.6321 0.9057 0.9517
No log 10.6875 342 0.8237 0.6769 0.8237 0.9076
No log 10.75 344 0.7208 0.6862 0.7208 0.8490
No log 10.8125 346 0.7430 0.6805 0.7430 0.8620
No log 10.875 348 0.8767 0.5742 0.8767 0.9363
No log 10.9375 350 1.1227 0.5127 1.1227 1.0596
No log 11.0 352 1.1998 0.4965 1.1998 1.0953
No log 11.0625 354 1.0690 0.5438 1.0690 1.0339
No log 11.125 356 0.7916 0.6768 0.7916 0.8897
No log 11.1875 358 0.6944 0.6597 0.6944 0.8333
No log 11.25 360 0.6974 0.6597 0.6974 0.8351
No log 11.3125 362 0.7215 0.6892 0.7215 0.8494
No log 11.375 364 0.8892 0.5639 0.8892 0.9430
No log 11.4375 366 1.3077 0.4629 1.3077 1.1435
No log 11.5 368 1.5623 0.4740 1.5623 1.2499
No log 11.5625 370 1.5154 0.4756 1.5154 1.2310
No log 11.625 372 1.2883 0.4590 1.2883 1.1351
No log 11.6875 374 1.0172 0.5324 1.0172 1.0085
No log 11.75 376 0.8585 0.6170 0.8585 0.9265
No log 11.8125 378 0.7911 0.6646 0.7911 0.8894
No log 11.875 380 0.7552 0.6707 0.7552 0.8690
No log 11.9375 382 0.7000 0.6895 0.7000 0.8367
No log 12.0 384 0.7353 0.6582 0.7353 0.8575
No log 12.0625 386 0.9256 0.6024 0.9256 0.9621
No log 12.125 388 1.0966 0.6082 1.0966 1.0472
No log 12.1875 390 1.0075 0.6234 1.0075 1.0038
No log 12.25 392 0.7830 0.6358 0.7830 0.8849
No log 12.3125 394 0.6941 0.6856 0.6941 0.8331
No log 12.375 396 0.6898 0.6856 0.6898 0.8306
No log 12.4375 398 0.7624 0.6636 0.7624 0.8732
No log 12.5 400 0.8093 0.6619 0.8093 0.8996
No log 12.5625 402 0.8886 0.6181 0.8886 0.9427
No log 12.625 404 0.8798 0.6508 0.8798 0.9380
No log 12.6875 406 0.7730 0.6727 0.7730 0.8792
No log 12.75 408 0.7439 0.6739 0.7439 0.8625
No log 12.8125 410 0.7368 0.6750 0.7368 0.8584
No log 12.875 412 0.7232 0.6708 0.7232 0.8504
No log 12.9375 414 0.7813 0.6263 0.7813 0.8839
No log 13.0 416 0.8858 0.5861 0.8858 0.9412
No log 13.0625 418 0.9662 0.5695 0.9662 0.9830
No log 13.125 420 0.9364 0.5855 0.9364 0.9677
No log 13.1875 422 0.8247 0.6387 0.8247 0.9081
No log 13.25 424 0.7576 0.6687 0.7576 0.8704
No log 13.3125 426 0.6977 0.6880 0.6977 0.8353
No log 13.375 428 0.6680 0.6902 0.6680 0.8173
No log 13.4375 430 0.6961 0.6520 0.6961 0.8343
No log 13.5 432 0.8465 0.6434 0.8465 0.9200
No log 13.5625 434 1.0821 0.5744 1.0821 1.0402
No log 13.625 436 1.1444 0.5734 1.1444 1.0698
No log 13.6875 438 1.0246 0.6020 1.0246 1.0122
No log 13.75 440 0.8416 0.6175 0.8416 0.9174
No log 13.8125 442 0.7324 0.6604 0.7324 0.8558
No log 13.875 444 0.6904 0.6570 0.6904 0.8309
No log 13.9375 446 0.7063 0.6550 0.7063 0.8404
No log 14.0 448 0.8081 0.5933 0.8081 0.8990
No log 14.0625 450 1.0121 0.6053 1.0121 1.0060
No log 14.125 452 1.0915 0.5712 1.0915 1.0447
No log 14.1875 454 0.9847 0.6159 0.9847 0.9923
No log 14.25 456 0.7941 0.6337 0.7941 0.8911
No log 14.3125 458 0.6805 0.6835 0.6805 0.8249
No log 14.375 460 0.6730 0.7021 0.6730 0.8204
No log 14.4375 462 0.7679 0.6787 0.7679 0.8763
No log 14.5 464 0.9665 0.5887 0.9665 0.9831
No log 14.5625 466 1.0664 0.5798 1.0664 1.0327
No log 14.625 468 1.0579 0.5621 1.0579 1.0286
No log 14.6875 470 1.0002 0.5440 1.0002 1.0001
No log 14.75 472 0.9267 0.5991 0.9267 0.9626
No log 14.8125 474 0.8938 0.6045 0.8938 0.9454
No log 14.875 476 0.9331 0.5918 0.9331 0.9660
No log 14.9375 478 0.9431 0.5806 0.9431 0.9711
No log 15.0 480 0.9279 0.6128 0.9279 0.9633
No log 15.0625 482 0.8093 0.6383 0.8093 0.8996
No log 15.125 484 0.6933 0.7066 0.6933 0.8326
No log 15.1875 486 0.6587 0.7137 0.6587 0.8116
No log 15.25 488 0.6772 0.6828 0.6772 0.8229
No log 15.3125 490 0.7576 0.6586 0.7576 0.8704
No log 15.375 492 0.8761 0.6156 0.8761 0.9360
No log 15.4375 494 0.9683 0.5493 0.9683 0.9840
No log 15.5 496 1.1461 0.5422 1.1461 1.0706
No log 15.5625 498 1.2053 0.5631 1.2053 1.0979
0.4452 15.625 500 1.0568 0.5940 1.0568 1.0280
0.4452 15.6875 502 0.8872 0.6388 0.8872 0.9419
0.4452 15.75 504 0.7808 0.6704 0.7808 0.8836
0.4452 15.8125 506 0.7210 0.6990 0.7210 0.8491
0.4452 15.875 508 0.7298 0.6767 0.7298 0.8543
0.4452 15.9375 510 0.8020 0.6512 0.8020 0.8956
0.4452 16.0 512 0.8665 0.5867 0.8665 0.9308
0.4452 16.0625 514 0.9371 0.5518 0.9371 0.9680
0.4452 16.125 516 0.9441 0.5571 0.9441 0.9717
0.4452 16.1875 518 0.8666 0.5993 0.8666 0.9309
0.4452 16.25 520 0.7820 0.6418 0.7820 0.8843
0.4452 16.3125 522 0.7527 0.7036 0.7527 0.8676
0.4452 16.375 524 0.7677 0.7064 0.7677 0.8762
0.4452 16.4375 526 0.7712 0.7071 0.7712 0.8782
0.4452 16.5 528 0.7584 0.6948 0.7584 0.8709
0.4452 16.5625 530 0.7525 0.6907 0.7525 0.8675
0.4452 16.625 532 0.8101 0.6588 0.8101 0.9000
0.4452 16.6875 534 0.7760 0.6567 0.7760 0.8809
0.4452 16.75 536 0.7756 0.6188 0.7756 0.8807
0.4452 16.8125 538 0.8617 0.6126 0.8617 0.9283
0.4452 16.875 540 0.8931 0.6126 0.8931 0.9450
0.4452 16.9375 542 0.8767 0.6172 0.8767 0.9363
0.4452 17.0 544 0.9234 0.5946 0.9234 0.9609
0.4452 17.0625 546 0.9311 0.5816 0.9311 0.9649
0.4452 17.125 548 0.9303 0.5889 0.9303 0.9645
0.4452 17.1875 550 0.9373 0.5732 0.9373 0.9681
0.4452 17.25 552 0.8936 0.6006 0.8936 0.9453
0.4452 17.3125 554 0.8158 0.6452 0.8158 0.9032
0.4452 17.375 556 0.7637 0.6404 0.7637 0.8739
0.4452 17.4375 558 0.7955 0.6401 0.7955 0.8919
0.4452 17.5 560 0.8614 0.6184 0.8614 0.9281
0.4452 17.5625 562 0.9137 0.5995 0.9137 0.9559
0.4452 17.625 564 0.8720 0.6312 0.8720 0.9338
0.4452 17.6875 566 0.8117 0.6393 0.8117 0.9009
0.4452 17.75 568 0.7086 0.6738 0.7086 0.8418
0.4452 17.8125 570 0.6734 0.6812 0.6734 0.8206
0.4452 17.875 572 0.6754 0.6762 0.6754 0.8218
0.4452 17.9375 574 0.7248 0.6458 0.7248 0.8513
0.4452 18.0 576 0.7771 0.6311 0.7771 0.8815
0.4452 18.0625 578 0.7990 0.6104 0.7990 0.8939
0.4452 18.125 580 0.8287 0.6207 0.8287 0.9103
0.4452 18.1875 582 0.8997 0.6132 0.8997 0.9485
0.4452 18.25 584 0.9575 0.5855 0.9575 0.9785
0.4452 18.3125 586 0.8920 0.5869 0.8920 0.9444
0.4452 18.375 588 0.7823 0.6108 0.7823 0.8845
0.4452 18.4375 590 0.7184 0.6998 0.7184 0.8476
0.4452 18.5 592 0.7256 0.6911 0.7256 0.8518
0.4452 18.5625 594 0.7796 0.6849 0.7796 0.8830
0.4452 18.625 596 0.8275 0.6601 0.8275 0.9097
0.4452 18.6875 598 0.8073 0.6735 0.8073 0.8985
0.4452 18.75 600 0.7527 0.6808 0.7527 0.8676
0.4452 18.8125 602 0.7853 0.6646 0.7853 0.8862
0.4452 18.875 604 0.8199 0.6221 0.8199 0.9055
0.4452 18.9375 606 0.8694 0.6021 0.8694 0.9324
0.4452 19.0 608 0.9046 0.5786 0.9046 0.9511
0.4452 19.0625 610 0.9334 0.5786 0.9334 0.9661
0.4452 19.125 612 0.9524 0.5705 0.9524 0.9759
0.4452 19.1875 614 0.9804 0.5631 0.9804 0.9901
0.4452 19.25 616 0.9690 0.5756 0.9690 0.9844

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors checkpoint: 135M params, F32 tensors.

Model tree for MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k6_task1_organization

Finetuned from aubmindlab/bert-base-arabertv02