ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k13_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.7276
  • Qwk: 0.2529
  • Mse: 0.7276
  • Rmse: 0.8530
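
Here, Qwk is Cohen's quadratic weighted kappa, and Mse/Rmse are the (root) mean squared error between predicted and gold scores. As a minimal sketch, these metrics can be computed with scikit-learn; the label arrays below are hypothetical stand-ins, not data from this model:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and model predictions on an ordinal scale.
y_true = np.array([2, 3, 1, 4, 2, 3])
y_pred = np.array([2, 2, 1, 3, 3, 3])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"Qwk: {qwk:.4f}, Mse: {mse:.4f}, Rmse: {rmse:.4f}")
```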

Model description

More information needed

Intended uses & limitations

More information needed
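
Pending proper documentation, the sketch below shows one plausible way to load the checkpoint with transformers. The use of AutoModelForSequenceClassification (with a regression-style head, suggested by the Mse/Rmse metrics above) is an assumption, not a documented fact about this model:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k13_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)  # assumed head type
model.eval()

# Hypothetical input: an Arabic essay whose organization is to be scored.
essay = "..."  # replace with the essay text
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits
print(score)
```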

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
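
A minimal sketch of the corresponding TrainingArguments, assuming the stock Hugging Face Trainer; the output_dir is hypothetical, and all other values mirror the list above:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```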

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|---|---|---|---|---|---|---|
| No log | 0.0290 | 2 | 4.2902 | -0.0127 | 4.2902 | 2.0713 |
| No log | 0.0580 | 4 | 2.4642 | 0.0376 | 2.4642 | 1.5698 |
| No log | 0.0870 | 6 | 1.5471 | 0.0254 | 1.5471 | 1.2438 |
| No log | 0.1159 | 8 | 1.1052 | -0.0217 | 1.1052 | 1.0513 |
| No log | 0.1449 | 10 | 0.8869 | 0.1209 | 0.8869 | 0.9418 |
| No log | 0.1739 | 12 | 0.7941 | 0.2285 | 0.7941 | 0.8911 |
| No log | 0.2029 | 14 | 0.7901 | 0.1737 | 0.7901 | 0.8889 |
| No log | 0.2319 | 16 | 1.0357 | 0.0544 | 1.0357 | 1.0177 |
| No log | 0.2609 | 18 | 2.3912 | 0.0062 | 2.3912 | 1.5464 |
| No log | 0.2899 | 20 | 2.7194 | -0.0087 | 2.7194 | 1.6491 |
| No log | 0.3188 | 22 | 2.0563 | 0.1170 | 2.0563 | 1.4340 |
| No log | 0.3478 | 24 | 1.3040 | 0.0196 | 1.3040 | 1.1419 |
| No log | 0.3768 | 26 | 1.1000 | 0.0561 | 1.1000 | 1.0488 |
| No log | 0.4058 | 28 | 0.9403 | 0.1597 | 0.9403 | 0.9697 |
| No log | 0.4348 | 30 | 0.7837 | 0.2502 | 0.7837 | 0.8853 |
| No log | 0.4638 | 32 | 0.7575 | 0.2452 | 0.7575 | 0.8704 |
| No log | 0.4928 | 34 | 0.7615 | 0.2164 | 0.7615 | 0.8726 |
| No log | 0.5217 | 36 | 0.7983 | 0.2097 | 0.7983 | 0.8935 |
| No log | 0.5507 | 38 | 0.8087 | 0.2339 | 0.8087 | 0.8993 |
| No log | 0.5797 | 40 | 0.8112 | 0.2034 | 0.8112 | 0.9007 |
| No log | 0.6087 | 42 | 0.8158 | 0.2148 | 0.8158 | 0.9032 |
| No log | 0.6377 | 44 | 0.7815 | 0.1900 | 0.7815 | 0.8840 |
| No log | 0.6667 | 46 | 0.7156 | 0.2855 | 0.7156 | 0.8459 |
| No log | 0.6957 | 48 | 0.7029 | 0.2900 | 0.7029 | 0.8384 |
| No log | 0.7246 | 50 | 0.7163 | 0.2855 | 0.7163 | 0.8464 |
| No log | 0.7536 | 52 | 0.7587 | 0.1856 | 0.7587 | 0.8711 |
| No log | 0.7826 | 54 | 0.8042 | 0.1804 | 0.8042 | 0.8968 |
| No log | 0.8116 | 56 | 0.8352 | 0.1092 | 0.8352 | 0.9139 |
| No log | 0.8406 | 58 | 0.8102 | 0.1770 | 0.8102 | 0.9001 |
| No log | 0.8696 | 60 | 0.7439 | 0.2670 | 0.7439 | 0.8625 |
| No log | 0.8986 | 62 | 0.7100 | 0.2575 | 0.7100 | 0.8426 |
| No log | 0.9275 | 64 | 0.6868 | 0.2882 | 0.6868 | 0.8287 |
| No log | 0.9565 | 66 | 0.6815 | 0.3937 | 0.6815 | 0.8255 |
| No log | 0.9855 | 68 | 0.7689 | 0.2061 | 0.7689 | 0.8769 |
| No log | 1.0145 | 70 | 0.7380 | 0.3437 | 0.7380 | 0.8591 |
| No log | 1.0435 | 72 | 0.7077 | 0.4365 | 0.7077 | 0.8412 |
| No log | 1.0725 | 74 | 0.7109 | 0.3866 | 0.7109 | 0.8432 |
| No log | 1.1014 | 76 | 0.7270 | 0.3789 | 0.7270 | 0.8526 |
| No log | 1.1304 | 78 | 0.7541 | 0.3510 | 0.7541 | 0.8684 |
| No log | 1.1594 | 80 | 0.7780 | 0.2445 | 0.7780 | 0.8820 |
| No log | 1.1884 | 82 | 0.7540 | 0.3506 | 0.7540 | 0.8684 |
| No log | 1.2174 | 84 | 0.7658 | 0.4024 | 0.7658 | 0.8751 |
| No log | 1.2464 | 86 | 0.8211 | 0.3631 | 0.8211 | 0.9062 |
| No log | 1.2754 | 88 | 1.0347 | 0.2985 | 1.0347 | 1.0172 |
| No log | 1.3043 | 90 | 0.8765 | 0.3658 | 0.8765 | 0.9362 |
| No log | 1.3333 | 92 | 0.7981 | 0.3798 | 0.7981 | 0.8934 |
| No log | 1.3623 | 94 | 0.9491 | 0.3086 | 0.9491 | 0.9742 |
| No log | 1.3913 | 96 | 0.9798 | 0.3328 | 0.9798 | 0.9898 |
| No log | 1.4203 | 98 | 0.8541 | 0.3330 | 0.8541 | 0.9242 |
| No log | 1.4493 | 100 | 0.7282 | 0.4335 | 0.7282 | 0.8533 |
| No log | 1.4783 | 102 | 0.7211 | 0.4361 | 0.7211 | 0.8492 |
| No log | 1.5072 | 104 | 0.7571 | 0.3635 | 0.7571 | 0.8701 |
| No log | 1.5362 | 106 | 0.8384 | 0.3701 | 0.8384 | 0.9156 |
| No log | 1.5652 | 108 | 0.9115 | 0.3422 | 0.9115 | 0.9547 |
| No log | 1.5942 | 110 | 0.8372 | 0.2395 | 0.8372 | 0.9150 |
| No log | 1.6232 | 112 | 0.8039 | 0.2853 | 0.8039 | 0.8966 |
| No log | 1.6522 | 114 | 0.7950 | 0.3861 | 0.7950 | 0.8916 |
| No log | 1.6812 | 116 | 0.8581 | 0.3964 | 0.8581 | 0.9263 |
| No log | 1.7101 | 118 | 0.8338 | 0.3817 | 0.8338 | 0.9131 |
| No log | 1.7391 | 120 | 0.8272 | 0.3375 | 0.8272 | 0.9095 |
| No log | 1.7681 | 122 | 0.9330 | 0.3394 | 0.9330 | 0.9659 |
| No log | 1.7971 | 124 | 0.9539 | 0.2999 | 0.9539 | 0.9767 |
| No log | 1.8261 | 126 | 0.8230 | 0.2142 | 0.8230 | 0.9072 |
| No log | 1.8551 | 128 | 0.8733 | 0.3181 | 0.8733 | 0.9345 |
| No log | 1.8841 | 130 | 0.8632 | 0.3680 | 0.8632 | 0.9291 |
| No log | 1.9130 | 132 | 0.7598 | 0.3765 | 0.7598 | 0.8717 |
| No log | 1.9420 | 134 | 0.7521 | 0.3519 | 0.7521 | 0.8672 |
| No log | 1.9710 | 136 | 0.7650 | 0.3548 | 0.7650 | 0.8747 |
| No log | 2.0 | 138 | 0.7285 | 0.3353 | 0.7285 | 0.8535 |
| No log | 2.0290 | 140 | 0.7483 | 0.3581 | 0.7483 | 0.8650 |
| No log | 2.0580 | 142 | 0.7241 | 0.2738 | 0.7241 | 0.8510 |
| No log | 2.0870 | 144 | 0.7548 | 0.3156 | 0.7548 | 0.8688 |
| No log | 2.1159 | 146 | 1.0231 | 0.3241 | 1.0231 | 1.0115 |
| No log | 2.1449 | 148 | 1.2026 | 0.2954 | 1.2026 | 1.0966 |
| No log | 2.1739 | 150 | 1.0539 | 0.3387 | 1.0539 | 1.0266 |
| No log | 2.2029 | 152 | 0.7789 | 0.2938 | 0.7789 | 0.8825 |
| No log | 2.2319 | 154 | 0.7741 | 0.3350 | 0.7741 | 0.8798 |
| No log | 2.2609 | 156 | 0.7530 | 0.3566 | 0.7530 | 0.8678 |
| No log | 2.2899 | 158 | 0.7635 | 0.4047 | 0.7635 | 0.8738 |
| No log | 2.3188 | 160 | 0.8937 | 0.3835 | 0.8937 | 0.9454 |
| No log | 2.3478 | 162 | 0.9094 | 0.3459 | 0.9094 | 0.9536 |
| No log | 2.3768 | 164 | 0.8250 | 0.3184 | 0.8250 | 0.9083 |
| No log | 2.4058 | 166 | 0.7809 | 0.3782 | 0.7809 | 0.8837 |
| No log | 2.4348 | 168 | 0.7497 | 0.3508 | 0.7497 | 0.8658 |
| No log | 2.4638 | 170 | 0.7396 | 0.3546 | 0.7396 | 0.8600 |
| No log | 2.4928 | 172 | 0.7271 | 0.3656 | 0.7271 | 0.8527 |
| No log | 2.5217 | 174 | 0.7298 | 0.3463 | 0.7298 | 0.8543 |
| No log | 2.5507 | 176 | 0.7099 | 0.2948 | 0.7099 | 0.8425 |
| No log | 2.5797 | 178 | 0.6830 | 0.3715 | 0.6830 | 0.8264 |
| No log | 2.6087 | 180 | 0.6880 | 0.3475 | 0.6880 | 0.8295 |
| No log | 2.6377 | 182 | 0.7012 | 0.3091 | 0.7012 | 0.8374 |
| No log | 2.6667 | 184 | 0.7323 | 0.3428 | 0.7323 | 0.8558 |
| No log | 2.6957 | 186 | 0.7626 | 0.3954 | 0.7626 | 0.8733 |
| No log | 2.7246 | 188 | 0.8121 | 0.3546 | 0.8121 | 0.9011 |
| No log | 2.7536 | 190 | 0.7620 | 0.4537 | 0.7620 | 0.8729 |
| No log | 2.7826 | 192 | 0.7578 | 0.3663 | 0.7578 | 0.8705 |
| No log | 2.8116 | 194 | 0.7398 | 0.3860 | 0.7398 | 0.8601 |
| No log | 2.8406 | 196 | 0.7389 | 0.4020 | 0.7389 | 0.8596 |
| No log | 2.8696 | 198 | 0.7579 | 0.3972 | 0.7579 | 0.8706 |
| No log | 2.8986 | 200 | 0.7904 | 0.3814 | 0.7904 | 0.8891 |
| No log | 2.9275 | 202 | 0.7024 | 0.4061 | 0.7024 | 0.8381 |
| No log | 2.9565 | 204 | 0.7002 | 0.3682 | 0.7002 | 0.8368 |
| No log | 2.9855 | 206 | 0.6912 | 0.4128 | 0.6912 | 0.8314 |
| No log | 3.0145 | 208 | 0.6984 | 0.4537 | 0.6984 | 0.8357 |
| No log | 3.0435 | 210 | 0.7104 | 0.4673 | 0.7104 | 0.8428 |
| No log | 3.0725 | 212 | 0.7085 | 0.4436 | 0.7085 | 0.8417 |
| No log | 3.1014 | 214 | 0.7189 | 0.4201 | 0.7189 | 0.8479 |
| No log | 3.1304 | 216 | 0.7020 | 0.4176 | 0.7020 | 0.8379 |
| No log | 3.1594 | 218 | 0.7080 | 0.3844 | 0.7080 | 0.8414 |
| No log | 3.1884 | 220 | 0.7711 | 0.3778 | 0.7711 | 0.8781 |
| No log | 3.2174 | 222 | 0.8281 | 0.4175 | 0.8281 | 0.9100 |
| No log | 3.2464 | 224 | 0.7329 | 0.4091 | 0.7329 | 0.8561 |
| No log | 3.2754 | 226 | 0.6976 | 0.4779 | 0.6976 | 0.8352 |
| No log | 3.3043 | 228 | 0.6936 | 0.4698 | 0.6936 | 0.8328 |
| No log | 3.3333 | 230 | 0.7019 | 0.4818 | 0.7019 | 0.8378 |
| No log | 3.3623 | 232 | 0.7324 | 0.4395 | 0.7324 | 0.8558 |
| No log | 3.3913 | 234 | 0.7283 | 0.4210 | 0.7283 | 0.8534 |
| No log | 3.4203 | 236 | 0.6895 | 0.4114 | 0.6895 | 0.8304 |
| No log | 3.4493 | 238 | 0.6712 | 0.3123 | 0.6712 | 0.8193 |
| No log | 3.4783 | 240 | 0.7007 | 0.2819 | 0.7007 | 0.8371 |
| No log | 3.5072 | 242 | 0.7220 | 0.3466 | 0.7220 | 0.8497 |
| No log | 3.5362 | 244 | 0.7029 | 0.3279 | 0.7029 | 0.8384 |
| No log | 3.5652 | 246 | 0.6480 | 0.3905 | 0.6480 | 0.8050 |
| No log | 3.5942 | 248 | 0.7243 | 0.4422 | 0.7243 | 0.8511 |
| No log | 3.6232 | 250 | 0.8856 | 0.3806 | 0.8856 | 0.9410 |
| No log | 3.6522 | 252 | 0.8054 | 0.4604 | 0.8054 | 0.8974 |
| No log | 3.6812 | 254 | 0.6648 | 0.3787 | 0.6648 | 0.8154 |
| No log | 3.7101 | 256 | 0.6595 | 0.3198 | 0.6595 | 0.8121 |
| No log | 3.7391 | 258 | 0.6909 | 0.3262 | 0.6909 | 0.8312 |
| No log | 3.7681 | 260 | 0.6806 | 0.4180 | 0.6806 | 0.8250 |
| No log | 3.7971 | 262 | 0.7538 | 0.4432 | 0.7538 | 0.8682 |
| No log | 3.8261 | 264 | 0.7959 | 0.3955 | 0.7959 | 0.8922 |
| No log | 3.8551 | 266 | 0.7079 | 0.4284 | 0.7079 | 0.8414 |
| No log | 3.8841 | 268 | 0.7226 | 0.3796 | 0.7226 | 0.8501 |
| No log | 3.9130 | 270 | 0.7079 | 0.3861 | 0.7079 | 0.8414 |
| No log | 3.9420 | 272 | 0.7029 | 0.4027 | 0.7029 | 0.8384 |
| No log | 3.9710 | 274 | 0.7051 | 0.3688 | 0.7051 | 0.8397 |
| No log | 4.0 | 276 | 0.7125 | 0.3782 | 0.7125 | 0.8441 |
| No log | 4.0290 | 278 | 0.7405 | 0.4114 | 0.7405 | 0.8605 |
| No log | 4.0580 | 280 | 0.7595 | 0.3847 | 0.7595 | 0.8715 |
| No log | 4.0870 | 282 | 0.7334 | 0.3681 | 0.7334 | 0.8564 |
| No log | 4.1159 | 284 | 0.6808 | 0.3026 | 0.6808 | 0.8251 |
| No log | 4.1449 | 286 | 0.7041 | 0.3432 | 0.7041 | 0.8391 |
| No log | 4.1739 | 288 | 0.6954 | 0.3752 | 0.6954 | 0.8339 |
| No log | 4.2029 | 290 | 0.6789 | 0.2918 | 0.6789 | 0.8239 |
| No log | 4.2319 | 292 | 0.7203 | 0.3577 | 0.7203 | 0.8487 |
| No log | 4.2609 | 294 | 0.7592 | 0.4219 | 0.7592 | 0.8713 |
| No log | 4.2899 | 296 | 0.7528 | 0.4064 | 0.7528 | 0.8676 |
| No log | 4.3188 | 298 | 0.7342 | 0.3492 | 0.7342 | 0.8569 |
| No log | 4.3478 | 300 | 0.7252 | 0.4224 | 0.7252 | 0.8516 |
| No log | 4.3768 | 302 | 0.7218 | 0.3878 | 0.7218 | 0.8496 |
| No log | 4.4058 | 304 | 0.7160 | 0.3905 | 0.7160 | 0.8462 |
| No log | 4.4348 | 306 | 0.7273 | 0.2562 | 0.7273 | 0.8528 |
| No log | 4.4638 | 308 | 0.6956 | 0.2600 | 0.6956 | 0.8340 |
| No log | 4.4928 | 310 | 0.7151 | 0.3243 | 0.7151 | 0.8456 |
| No log | 4.5217 | 312 | 0.7137 | 0.3450 | 0.7137 | 0.8448 |
| No log | 4.5507 | 314 | 0.6858 | 0.3273 | 0.6858 | 0.8281 |
| No log | 4.5797 | 316 | 0.7426 | 0.2934 | 0.7426 | 0.8617 |
| No log | 4.6087 | 318 | 0.7541 | 0.2660 | 0.7541 | 0.8684 |
| No log | 4.6377 | 320 | 0.7281 | 0.3470 | 0.7281 | 0.8533 |
| No log | 4.6667 | 322 | 0.7452 | 0.3600 | 0.7452 | 0.8632 |
| No log | 4.6957 | 324 | 0.7764 | 0.3593 | 0.7764 | 0.8811 |
| No log | 4.7246 | 326 | 0.7899 | 0.3655 | 0.7899 | 0.8888 |
| No log | 4.7536 | 328 | 0.8841 | 0.3277 | 0.8841 | 0.9403 |
| No log | 4.7826 | 330 | 0.9610 | 0.3178 | 0.9610 | 0.9803 |
| No log | 4.8116 | 332 | 1.0017 | 0.3276 | 1.0017 | 1.0008 |
| No log | 4.8406 | 334 | 0.9481 | 0.2519 | 0.9481 | 0.9737 |
| No log | 4.8696 | 336 | 0.8821 | 0.2689 | 0.8821 | 0.9392 |
| No log | 4.8986 | 338 | 0.7970 | 0.2881 | 0.7970 | 0.8927 |
| No log | 4.9275 | 340 | 0.7441 | 0.3344 | 0.7441 | 0.8626 |
| No log | 4.9565 | 342 | 0.7559 | 0.3688 | 0.7559 | 0.8694 |
| No log | 4.9855 | 344 | 0.7835 | 0.3865 | 0.7835 | 0.8852 |
| No log | 5.0145 | 346 | 0.9424 | 0.3585 | 0.9424 | 0.9708 |
| No log | 5.0435 | 348 | 1.0728 | 0.3677 | 1.0728 | 1.0358 |
| No log | 5.0725 | 350 | 0.9937 | 0.3352 | 0.9937 | 0.9969 |
| No log | 5.1014 | 352 | 0.8609 | 0.3805 | 0.8609 | 0.9278 |
| No log | 5.1304 | 354 | 0.7387 | 0.4173 | 0.7387 | 0.8595 |
| No log | 5.1594 | 356 | 0.7361 | 0.4028 | 0.7361 | 0.8580 |
| No log | 5.1884 | 358 | 0.8001 | 0.3439 | 0.8001 | 0.8945 |
| No log | 5.2174 | 360 | 0.8382 | 0.3642 | 0.8382 | 0.9155 |
| No log | 5.2464 | 362 | 0.7500 | 0.3145 | 0.7500 | 0.8661 |
| No log | 5.2754 | 364 | 0.7177 | 0.3853 | 0.7177 | 0.8472 |
| No log | 5.3043 | 366 | 0.7078 | 0.3817 | 0.7078 | 0.8413 |
| No log | 5.3333 | 368 | 0.7513 | 0.4143 | 0.7513 | 0.8668 |
| No log | 5.3623 | 370 | 0.8831 | 0.3399 | 0.8831 | 0.9398 |
| No log | 5.3913 | 372 | 1.0511 | 0.3509 | 1.0511 | 1.0252 |
| No log | 5.4203 | 374 | 0.9822 | 0.3705 | 0.9822 | 0.9911 |
| No log | 5.4493 | 376 | 0.7901 | 0.3382 | 0.7901 | 0.8889 |
| No log | 5.4783 | 378 | 0.6999 | 0.3626 | 0.6999 | 0.8366 |
| No log | 5.5072 | 380 | 0.6947 | 0.3642 | 0.6947 | 0.8335 |
| No log | 5.5362 | 382 | 0.7287 | 0.3855 | 0.7287 | 0.8536 |
| No log | 5.5652 | 384 | 0.8318 | 0.3681 | 0.8318 | 0.9120 |
| No log | 5.5942 | 386 | 0.8297 | 0.3539 | 0.8297 | 0.9109 |
| No log | 5.6232 | 388 | 0.7720 | 0.4199 | 0.7720 | 0.8786 |
| No log | 5.6522 | 390 | 0.7426 | 0.3753 | 0.7426 | 0.8617 |
| No log | 5.6812 | 392 | 0.7541 | 0.4196 | 0.7541 | 0.8684 |
| No log | 5.7101 | 394 | 0.7541 | 0.3935 | 0.7541 | 0.8684 |
| No log | 5.7391 | 396 | 0.6949 | 0.4104 | 0.6949 | 0.8336 |
| No log | 5.7681 | 398 | 0.7402 | 0.3259 | 0.7402 | 0.8603 |
| No log | 5.7971 | 400 | 0.7566 | 0.2907 | 0.7566 | 0.8698 |
| No log | 5.8261 | 402 | 0.6974 | 0.2653 | 0.6974 | 0.8351 |
| No log | 5.8551 | 404 | 0.6676 | 0.3679 | 0.6676 | 0.8171 |
| No log | 5.8841 | 406 | 0.7848 | 0.3560 | 0.7848 | 0.8859 |
| No log | 5.9130 | 408 | 0.8264 | 0.3715 | 0.8264 | 0.9091 |
| No log | 5.9420 | 410 | 0.7384 | 0.3746 | 0.7384 | 0.8593 |
| No log | 5.9710 | 412 | 0.7467 | 0.4037 | 0.7467 | 0.8641 |
| No log | 6.0 | 414 | 0.8645 | 0.3274 | 0.8645 | 0.9298 |
| No log | 6.0290 | 416 | 0.8761 | 0.3337 | 0.8761 | 0.9360 |
| No log | 6.0580 | 418 | 0.8262 | 0.3064 | 0.8262 | 0.9090 |
| No log | 6.0870 | 420 | 0.7532 | 0.2693 | 0.7532 | 0.8678 |
| No log | 6.1159 | 422 | 0.7274 | 0.2813 | 0.7274 | 0.8529 |
| No log | 6.1449 | 424 | 0.7368 | 0.2607 | 0.7368 | 0.8584 |
| No log | 6.1739 | 426 | 0.7688 | 0.2852 | 0.7688 | 0.8768 |
| No log | 6.2029 | 428 | 0.8366 | 0.3136 | 0.8366 | 0.9147 |
| No log | 6.2319 | 430 | 0.8326 | 0.3504 | 0.8326 | 0.9125 |
| No log | 6.2609 | 432 | 0.7645 | 0.3125 | 0.7645 | 0.8744 |
| No log | 6.2899 | 434 | 0.7481 | 0.3164 | 0.7481 | 0.8649 |
| No log | 6.3188 | 436 | 0.8031 | 0.3709 | 0.8031 | 0.8961 |
| No log | 6.3478 | 438 | 0.8331 | 0.3851 | 0.8331 | 0.9128 |
| No log | 6.3768 | 440 | 0.8656 | 0.3624 | 0.8656 | 0.9304 |
| No log | 6.4058 | 442 | 0.7998 | 0.4077 | 0.7998 | 0.8943 |
| No log | 6.4348 | 444 | 0.7140 | 0.4166 | 0.7140 | 0.8450 |
| No log | 6.4638 | 446 | 0.7030 | 0.3429 | 0.7030 | 0.8384 |
| No log | 6.4928 | 448 | 0.7072 | 0.3748 | 0.7072 | 0.8409 |
| No log | 6.5217 | 450 | 0.7178 | 0.3664 | 0.7178 | 0.8473 |
| No log | 6.5507 | 452 | 0.7428 | 0.4407 | 0.7428 | 0.8619 |
| No log | 6.5797 | 454 | 0.7834 | 0.4615 | 0.7834 | 0.8851 |
| No log | 6.6087 | 456 | 0.7497 | 0.4582 | 0.7497 | 0.8659 |
| No log | 6.6377 | 458 | 0.7290 | 0.4458 | 0.7290 | 0.8538 |
| No log | 6.6667 | 460 | 0.7298 | 0.4479 | 0.7298 | 0.8543 |
| No log | 6.6957 | 462 | 0.7183 | 0.4351 | 0.7183 | 0.8475 |
| No log | 6.7246 | 464 | 0.7109 | 0.3616 | 0.7109 | 0.8431 |
| No log | 6.7536 | 466 | 0.7206 | 0.3745 | 0.7206 | 0.8489 |
| No log | 6.7826 | 468 | 0.7009 | 0.3761 | 0.7009 | 0.8372 |
| No log | 6.8116 | 470 | 0.6879 | 0.3285 | 0.6879 | 0.8294 |
| No log | 6.8406 | 472 | 0.6998 | 0.2616 | 0.6998 | 0.8365 |
| No log | 6.8696 | 474 | 0.7009 | 0.2900 | 0.7009 | 0.8372 |
| No log | 6.8986 | 476 | 0.7037 | 0.3595 | 0.7037 | 0.8388 |
| No log | 6.9275 | 478 | 0.7371 | 0.4710 | 0.7371 | 0.8585 |
| No log | 6.9565 | 480 | 0.7821 | 0.4494 | 0.7821 | 0.8843 |
| No log | 6.9855 | 482 | 0.7756 | 0.4758 | 0.7756 | 0.8807 |
| No log | 7.0145 | 484 | 0.7633 | 0.4551 | 0.7633 | 0.8737 |
| No log | 7.0435 | 486 | 0.7221 | 0.4293 | 0.7221 | 0.8498 |
| No log | 7.0725 | 488 | 0.7004 | 0.4183 | 0.7004 | 0.8369 |
| No log | 7.1014 | 490 | 0.6743 | 0.4035 | 0.6743 | 0.8211 |
| No log | 7.1304 | 492 | 0.6696 | 0.3592 | 0.6696 | 0.8183 |
| No log | 7.1594 | 494 | 0.6797 | 0.3490 | 0.6797 | 0.8244 |
| No log | 7.1884 | 496 | 0.7383 | 0.3376 | 0.7383 | 0.8592 |
| No log | 7.2174 | 498 | 0.7489 | 0.3376 | 0.7489 | 0.8654 |
| 0.402 | 7.2464 | 500 | 0.6812 | 0.3623 | 0.6812 | 0.8254 |
| 0.402 | 7.2754 | 502 | 0.6533 | 0.3867 | 0.6533 | 0.8083 |
| 0.402 | 7.3043 | 504 | 0.6570 | 0.3922 | 0.6570 | 0.8106 |
| 0.402 | 7.3333 | 506 | 0.6510 | 0.3988 | 0.6510 | 0.8069 |
| 0.402 | 7.3623 | 508 | 0.6382 | 0.3724 | 0.6382 | 0.7989 |
| 0.402 | 7.3913 | 510 | 0.6516 | 0.3595 | 0.6516 | 0.8072 |
| 0.402 | 7.4203 | 512 | 0.6678 | 0.3880 | 0.6678 | 0.8172 |
| 0.402 | 7.4493 | 514 | 0.6471 | 0.3896 | 0.6471 | 0.8044 |
| 0.402 | 7.4783 | 516 | 0.6457 | 0.4363 | 0.6457 | 0.8035 |
| 0.402 | 7.5072 | 518 | 0.6447 | 0.4205 | 0.6447 | 0.8029 |
| 0.402 | 7.5362 | 520 | 0.6440 | 0.4124 | 0.6440 | 0.8025 |
| 0.402 | 7.5652 | 522 | 0.6410 | 0.3488 | 0.6410 | 0.8006 |
| 0.402 | 7.5942 | 524 | 0.6465 | 0.3774 | 0.6465 | 0.8040 |
| 0.402 | 7.6232 | 526 | 0.6523 | 0.3100 | 0.6523 | 0.8076 |
| 0.402 | 7.6522 | 528 | 0.6699 | 0.3003 | 0.6699 | 0.8185 |
| 0.402 | 7.6812 | 530 | 0.6827 | 0.3040 | 0.6827 | 0.8263 |
| 0.402 | 7.7101 | 532 | 0.6922 | 0.3528 | 0.6922 | 0.8320 |
| 0.402 | 7.7391 | 534 | 0.7201 | 0.4194 | 0.7201 | 0.8486 |
| 0.402 | 7.7681 | 536 | 0.7396 | 0.4146 | 0.7396 | 0.8600 |
| 0.402 | 7.7971 | 538 | 0.8020 | 0.3990 | 0.8020 | 0.8955 |
| 0.402 | 7.8261 | 540 | 0.8001 | 0.3905 | 0.8001 | 0.8945 |
| 0.402 | 7.8551 | 542 | 0.7636 | 0.3590 | 0.7636 | 0.8738 |
| 0.402 | 7.8841 | 544 | 0.7457 | 0.2857 | 0.7457 | 0.8636 |
| 0.402 | 7.9130 | 546 | 0.7685 | 0.2732 | 0.7685 | 0.8766 |
| 0.402 | 7.9420 | 548 | 0.8351 | 0.3741 | 0.8351 | 0.9138 |
| 0.402 | 7.9710 | 550 | 0.8165 | 0.2938 | 0.8165 | 0.9036 |
| 0.402 | 8.0 | 552 | 0.7466 | 0.2866 | 0.7466 | 0.8641 |
| 0.402 | 8.0290 | 554 | 0.7461 | 0.2699 | 0.7461 | 0.8638 |
| 0.402 | 8.0580 | 556 | 0.7723 | 0.2630 | 0.7723 | 0.8788 |
| 0.402 | 8.0870 | 558 | 0.7276 | 0.2529 | 0.7276 | 0.8530 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1