ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card does not name it). It achieves the following results on the evaluation set (a sketch for reproducing these metrics follows the list):

  • Loss: 0.6691
  • Qwk (Quadratic Weighted Kappa): 0.3803
  • Mse (Mean Squared Error): 0.6691
  • Rmse (Root Mean Squared Error): 0.8180
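
These metrics can be reproduced with standard implementations. The sketch below is illustrative only: it assumes scikit-learn and NumPy, and the `y_true`/`y_pred` arrays are hypothetical integer organization scores (the actual label scale is not documented in this card). Note that Loss equals Mse here, which is consistent with an MSE training objective on a regression-style head.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold and predicted organization scores; the real label
# scale for this task is not documented in the card.
y_true = np.array([2, 3, 1, 4, 2])
y_pred = np.array([2, 2, 1, 3, 3])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse = sqrt(Mse)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```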

Model description

More information needed. What can be stated from the checkpoint itself: it is a 135M-parameter model (F32 safetensors) fine-tuned from aubmindlab/bert-base-arabertv02, and the repository name suggests it scores the organization quality of Arabic essays.
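
Since usage is not documented, here is a minimal loading sketch. It assumes the checkpoint carries a sequence-classification head whose output is a single organization score (the MSE/RMSE metrics point to a regression setup); treat this as an assumption rather than a documented API of this model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "..."  # an Arabic essay to score; placeholder, no example text is provided in the card
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # assumed: a single regression-style score
print(logits)
```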

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a minimal reconstruction with the Hugging Face Trainer is sketched after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
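
A rough reconstruction of this configuration with the Hugging Face Trainer is sketched below. Only the hyperparameters listed above come from the card; the output path and everything else (logging cadence, evaluation strategy, datasets) are assumptions.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert_task2_organization",   # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                            # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```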

Training results

Validation metrics were logged every two steps. "No log" in the first column means no training loss had been reported yet; it is logged every 500 steps, so the first (and only) value, 0.3837, appears at step 500. Although training was configured for 100 epochs, the log ends at epoch 9.44 (step 510), suggesting training was stopped early; the final row matches the evaluation results reported at the top of this card.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0370 2 4.4196 -0.0391 4.4196 2.1023
No log 0.0741 4 2.5472 0.0297 2.5472 1.5960
No log 0.1111 6 1.7700 -0.0716 1.7700 1.3304
No log 0.1481 8 1.3133 -0.0013 1.3133 1.1460
No log 0.1852 10 1.3150 -0.0919 1.3150 1.1467
No log 0.2222 12 1.0669 -0.0699 1.0669 1.0329
No log 0.2593 14 1.0201 -0.0305 1.0201 1.0100
No log 0.2963 16 1.1213 -0.0786 1.1213 1.0589
No log 0.3333 18 1.0155 -0.0410 1.0155 1.0077
No log 0.3704 20 1.0073 -0.0238 1.0073 1.0036
No log 0.4074 22 0.9898 0.0093 0.9898 0.9949
No log 0.4444 24 1.0333 0.0398 1.0333 1.0165
No log 0.4815 26 1.0520 0.0084 1.0520 1.0257
No log 0.5185 28 1.1549 -0.0108 1.1549 1.0747
No log 0.5556 30 1.1271 0.0 1.1271 1.0617
No log 0.5926 32 1.0711 -0.0065 1.0711 1.0349
No log 0.6296 34 1.0726 0.0148 1.0726 1.0357
No log 0.6667 36 0.9865 0.0572 0.9865 0.9932
No log 0.7037 38 0.8557 0.0363 0.8557 0.9251
No log 0.7407 40 0.8165 0.1504 0.8165 0.9036
No log 0.7778 42 0.8230 0.1044 0.8230 0.9072
No log 0.8148 44 0.9432 0.1190 0.9432 0.9712
No log 0.8519 46 1.0958 0.0902 1.0958 1.0468
No log 0.8889 48 1.1681 0.0632 1.1681 1.0808
No log 0.9259 50 1.0321 0.0848 1.0321 1.0159
No log 0.9630 52 0.7772 0.3009 0.7772 0.8816
No log 1.0 54 0.7036 0.1842 0.7036 0.8388
No log 1.0370 56 0.8786 0.0903 0.8786 0.9374
No log 1.0741 58 0.9606 0.0554 0.9606 0.9801
No log 1.1111 60 0.8669 0.0364 0.8669 0.9311
No log 1.1481 62 0.7814 0.0725 0.7814 0.8840
No log 1.1852 64 0.7543 0.1225 0.7543 0.8685
No log 1.2222 66 0.7619 0.0869 0.7619 0.8729
No log 1.2593 68 0.7536 0.1505 0.7536 0.8681
No log 1.2963 70 0.7398 0.1696 0.7398 0.8601
No log 1.3333 72 0.7359 0.2562 0.7359 0.8579
No log 1.3704 74 0.7360 0.3021 0.7360 0.8579
No log 1.4074 76 0.7463 0.2111 0.7463 0.8639
No log 1.4444 78 0.7836 0.1934 0.7836 0.8852
No log 1.4815 80 0.8714 0.2630 0.8714 0.9335
No log 1.5185 82 0.8230 0.2213 0.8230 0.9072
No log 1.5556 84 0.7624 0.2752 0.7624 0.8732
No log 1.5926 86 0.7558 0.2825 0.7558 0.8694
No log 1.6296 88 0.7436 0.3233 0.7436 0.8623
No log 1.6667 90 0.7220 0.3167 0.7220 0.8497
No log 1.7037 92 0.7408 0.3659 0.7408 0.8607
No log 1.7407 94 0.7695 0.3768 0.7695 0.8772
No log 1.7778 96 1.0137 0.3038 1.0137 1.0068
No log 1.8148 98 0.9244 0.3491 0.9244 0.9614
No log 1.8519 100 0.8097 0.3792 0.8097 0.8999
No log 1.8889 102 0.8251 0.3820 0.8251 0.9084
No log 1.9259 104 0.8874 0.3716 0.8874 0.9420
No log 1.9630 106 1.2678 0.2776 1.2678 1.1260
No log 2.0 108 1.2110 0.2626 1.2110 1.1005
No log 2.0370 110 0.9264 0.4062 0.9264 0.9625
No log 2.0741 112 0.9667 0.3781 0.9667 0.9832
No log 2.1111 114 1.3896 0.2422 1.3896 1.1788
No log 2.1481 116 1.6851 0.2039 1.6851 1.2981
No log 2.1852 118 1.9148 0.1499 1.9148 1.3838
No log 2.2222 120 1.7409 0.1741 1.7409 1.3194
No log 2.2593 122 1.1403 0.2969 1.1403 1.0679
No log 2.2963 124 0.8535 0.2854 0.8535 0.9239
No log 2.3333 126 0.8213 0.2905 0.8213 0.9063
No log 2.3704 128 0.8349 0.2978 0.8349 0.9137
No log 2.4074 130 0.8544 0.2570 0.8544 0.9243
No log 2.4444 132 0.9064 0.3157 0.9064 0.9520
No log 2.4815 134 0.9548 0.3506 0.9548 0.9772
No log 2.5185 136 0.7993 0.3260 0.7993 0.8941
No log 2.5556 138 0.7876 0.3618 0.7876 0.8875
No log 2.5926 140 0.8047 0.4118 0.8047 0.8970
No log 2.6296 142 0.7211 0.3835 0.7211 0.8492
No log 2.6667 144 0.7517 0.3719 0.7517 0.8670
No log 2.7037 146 0.9495 0.3033 0.9495 0.9744
No log 2.7407 148 0.9005 0.2943 0.9005 0.9489
No log 2.7778 150 0.7465 0.3679 0.7465 0.8640
No log 2.8148 152 0.7141 0.2499 0.7141 0.8451
No log 2.8519 154 0.7917 0.2479 0.7917 0.8898
No log 2.8889 156 0.7767 0.2414 0.7767 0.8813
No log 2.9259 158 0.7194 0.2945 0.7194 0.8481
No log 2.9630 160 0.7175 0.3649 0.7175 0.8471
No log 3.0 162 0.7400 0.4109 0.7400 0.8602
No log 3.0370 164 0.7447 0.4407 0.7447 0.8630
No log 3.0741 166 0.7085 0.3746 0.7085 0.8417
No log 3.1111 168 0.7708 0.3453 0.7708 0.8780
No log 3.1481 170 0.9067 0.3562 0.9067 0.9522
No log 3.1852 172 0.8317 0.3810 0.8317 0.9120
No log 3.2222 174 0.7168 0.3365 0.7168 0.8467
No log 3.2593 176 0.7303 0.3200 0.7303 0.8546
No log 3.2963 178 0.7085 0.4006 0.7085 0.8417
No log 3.3333 180 0.7284 0.3601 0.7284 0.8534
No log 3.3704 182 0.7710 0.3396 0.7710 0.8781
No log 3.4074 184 0.7431 0.3469 0.7431 0.8620
No log 3.4444 186 0.7462 0.3669 0.7462 0.8638
No log 3.4815 188 0.7417 0.3725 0.7417 0.8612
No log 3.5185 190 0.7468 0.4254 0.7468 0.8642
No log 3.5556 192 0.8687 0.3218 0.8687 0.9320
No log 3.5926 194 0.9680 0.3119 0.9680 0.9839
No log 3.6296 196 0.9828 0.3169 0.9828 0.9913
No log 3.6667 198 0.8420 0.3034 0.8420 0.9176
No log 3.7037 200 0.7013 0.3745 0.7013 0.8374
No log 3.7407 202 0.6893 0.3830 0.6893 0.8302
No log 3.7778 204 0.7027 0.3814 0.7027 0.8383
No log 3.8148 206 0.7004 0.4102 0.7004 0.8369
No log 3.8519 208 0.6936 0.4096 0.6936 0.8328
No log 3.8889 210 0.6993 0.4154 0.6993 0.8362
No log 3.9259 212 0.6983 0.4055 0.6983 0.8357
No log 3.9630 214 0.6906 0.4157 0.6906 0.8310
No log 4.0 216 0.6818 0.4217 0.6818 0.8257
No log 4.0370 218 0.6645 0.4260 0.6645 0.8151
No log 4.0741 220 0.7026 0.3668 0.7026 0.8382
No log 4.1111 222 0.7205 0.4043 0.7205 0.8488
No log 4.1481 224 0.6613 0.4220 0.6613 0.8132
No log 4.1852 226 0.6609 0.4277 0.6609 0.8129
No log 4.2222 228 0.7113 0.4062 0.7113 0.8434
No log 4.2593 230 0.9779 0.3313 0.9779 0.9889
No log 4.2963 232 0.9386 0.3695 0.9386 0.9688
No log 4.3333 234 0.7036 0.4277 0.7036 0.8388
No log 4.3704 236 0.6590 0.4438 0.6590 0.8118
No log 4.4074 238 0.6672 0.4407 0.6672 0.8168
No log 4.4444 240 0.6466 0.3878 0.6466 0.8041
No log 4.4815 242 0.6928 0.4802 0.6928 0.8324
No log 4.5185 244 0.6692 0.4489 0.6692 0.8180
No log 4.5556 246 0.6343 0.3947 0.6343 0.7964
No log 4.5926 248 0.6338 0.4053 0.6338 0.7961
No log 4.6296 250 0.6637 0.3879 0.6637 0.8147
No log 4.6667 252 0.6756 0.3842 0.6756 0.8219
No log 4.7037 254 0.6372 0.4029 0.6372 0.7982
No log 4.7407 256 0.6463 0.4302 0.6463 0.8039
No log 4.7778 258 0.6712 0.4366 0.6712 0.8193
No log 4.8148 260 0.6880 0.4234 0.6880 0.8294
No log 4.8519 262 0.6819 0.4688 0.6819 0.8258
No log 4.8889 264 0.6653 0.4152 0.6653 0.8157
No log 4.9259 266 0.6592 0.4330 0.6592 0.8119
No log 4.9630 268 0.6559 0.4424 0.6559 0.8099
No log 5.0 270 0.6504 0.4094 0.6504 0.8065
No log 5.0370 272 0.6529 0.3833 0.6529 0.8080
No log 5.0741 274 0.6735 0.4033 0.6735 0.8207
No log 5.1111 276 0.7186 0.3892 0.7186 0.8477
No log 5.1481 278 0.6773 0.4102 0.6773 0.8230
No log 5.1852 280 0.6296 0.3754 0.6296 0.7935
No log 5.2222 282 0.6201 0.4080 0.6201 0.7874
No log 5.2593 284 0.6259 0.4408 0.6259 0.7911
No log 5.2963 286 0.6719 0.4255 0.6719 0.8197
No log 5.3333 288 0.7516 0.4274 0.7516 0.8670
No log 5.3704 290 0.7350 0.4382 0.7350 0.8573
No log 5.4074 292 0.6629 0.4903 0.6629 0.8142
No log 5.4444 294 0.6475 0.4804 0.6475 0.8047
No log 5.4815 296 0.6420 0.5019 0.6420 0.8013
No log 5.5185 298 0.6332 0.4905 0.6332 0.7958
No log 5.5556 300 0.6139 0.4580 0.6139 0.7835
No log 5.5926 302 0.6381 0.4397 0.6381 0.7988
No log 5.6296 304 0.6287 0.4551 0.6287 0.7929
No log 5.6667 306 0.6269 0.4724 0.6269 0.7918
No log 5.7037 308 0.6466 0.5005 0.6466 0.8041
No log 5.7407 310 0.6481 0.4841 0.6481 0.8050
No log 5.7778 312 0.6912 0.4611 0.6912 0.8314
No log 5.8148 314 0.8633 0.4564 0.8633 0.9291
No log 5.8519 316 0.8832 0.4757 0.8832 0.9398
No log 5.8889 318 0.7348 0.4503 0.7348 0.8572
No log 5.9259 320 0.6672 0.4624 0.6672 0.8168
No log 5.9630 322 0.6901 0.4424 0.6901 0.8307
No log 6.0 324 0.8197 0.4566 0.8197 0.9054
No log 6.0370 326 0.9653 0.4398 0.9653 0.9825
No log 6.0741 328 1.0157 0.4161 1.0157 1.0078
No log 6.1111 330 0.8492 0.4191 0.8492 0.9215
No log 6.1481 332 0.6802 0.4496 0.6802 0.8248
No log 6.1852 334 0.6395 0.4512 0.6395 0.7997
No log 6.2222 336 0.6257 0.4346 0.6257 0.7910
No log 6.2593 338 0.6178 0.5071 0.6178 0.7860
No log 6.2963 340 0.6214 0.5257 0.6214 0.7883
No log 6.3333 342 0.6241 0.4171 0.6241 0.7900
No log 6.3704 344 0.6244 0.4552 0.6244 0.7902
No log 6.4074 346 0.6803 0.4478 0.6803 0.8248
No log 6.4444 348 0.7646 0.4479 0.7646 0.8744
No log 6.4815 350 0.7379 0.4252 0.7379 0.8590
No log 6.5185 352 0.6972 0.4207 0.6972 0.8350
No log 6.5556 354 0.6397 0.4222 0.6397 0.7998
No log 6.5926 356 0.6381 0.3850 0.6381 0.7988
No log 6.6296 358 0.6454 0.4207 0.6454 0.8033
No log 6.6667 360 0.6459 0.4457 0.6459 0.8037
No log 6.7037 362 0.7246 0.4157 0.7246 0.8512
No log 6.7407 364 0.7588 0.4357 0.7588 0.8711
No log 6.7778 366 0.7020 0.4408 0.7020 0.8378
No log 6.8148 368 0.6753 0.4294 0.6753 0.8218
No log 6.8519 370 0.8218 0.4332 0.8218 0.9065
No log 6.8889 372 0.9294 0.3805 0.9294 0.9641
No log 6.9259 374 0.8722 0.3648 0.8722 0.9339
No log 6.9630 376 0.7289 0.4029 0.7289 0.8538
No log 7.0 378 0.6802 0.3941 0.6802 0.8247
No log 7.0370 380 0.7089 0.3573 0.7089 0.8419
No log 7.0741 382 0.7054 0.3528 0.7054 0.8399
No log 7.1111 384 0.6780 0.3481 0.6780 0.8234
No log 7.1481 386 0.6792 0.3810 0.6792 0.8242
No log 7.1852 388 0.7001 0.3311 0.7001 0.8367
No log 7.2222 390 0.6872 0.3615 0.6872 0.8290
No log 7.2593 392 0.6694 0.3580 0.6694 0.8182
No log 7.2963 394 0.6737 0.3580 0.6737 0.8208
No log 7.3333 396 0.6894 0.3660 0.6894 0.8303
No log 7.3704 398 0.7122 0.3868 0.7122 0.8439
No log 7.4074 400 0.7319 0.3910 0.7319 0.8555
No log 7.4444 402 0.7919 0.4346 0.7919 0.8899
No log 7.4815 404 0.7478 0.4399 0.7478 0.8648
No log 7.5185 406 0.6782 0.4265 0.6782 0.8235
No log 7.5556 408 0.6531 0.3974 0.6531 0.8081
No log 7.5926 410 0.6450 0.4003 0.6450 0.8031
No log 7.6296 412 0.6548 0.4823 0.6548 0.8092
No log 7.6667 414 0.6478 0.4648 0.6478 0.8049
No log 7.7037 416 0.6327 0.4234 0.6327 0.7954
No log 7.7407 418 0.6861 0.4285 0.6861 0.8283
No log 7.7778 420 0.7817 0.3908 0.7817 0.8842
No log 7.8148 422 0.7929 0.3908 0.7929 0.8904
No log 7.8519 424 0.7003 0.4688 0.7003 0.8368
No log 7.8889 426 0.6402 0.4272 0.6402 0.8001
No log 7.9259 428 0.7397 0.4949 0.7397 0.8601
No log 7.9630 430 0.8326 0.4747 0.8326 0.9125
No log 8.0 432 0.8049 0.4663 0.8049 0.8971
No log 8.0370 434 0.6930 0.5085 0.6930 0.8324
No log 8.0741 436 0.6632 0.4410 0.6632 0.8144
No log 8.1111 438 0.6653 0.4576 0.6653 0.8156
No log 8.1481 440 0.6553 0.4460 0.6553 0.8095
No log 8.1852 442 0.6545 0.4252 0.6545 0.8090
No log 8.2222 444 0.6691 0.3982 0.6691 0.8180
No log 8.2593 446 0.6902 0.4087 0.6902 0.8308
No log 8.2963 448 0.6820 0.4055 0.6820 0.8258
No log 8.3333 450 0.6586 0.4015 0.6586 0.8116
No log 8.3704 452 0.6473 0.3839 0.6473 0.8046
No log 8.4074 454 0.6473 0.3857 0.6473 0.8045
No log 8.4444 456 0.6485 0.4001 0.6485 0.8053
No log 8.4815 458 0.6488 0.3475 0.6488 0.8055
No log 8.5185 460 0.6683 0.3615 0.6683 0.8175
No log 8.5556 462 0.7027 0.4422 0.7027 0.8383
No log 8.5926 464 0.7134 0.4377 0.7134 0.8446
No log 8.6296 466 0.6776 0.3886 0.6776 0.8232
No log 8.6667 468 0.6603 0.4482 0.6603 0.8126
No log 8.7037 470 0.7048 0.4158 0.7048 0.8395
No log 8.7407 472 0.7287 0.4089 0.7287 0.8536
No log 8.7778 474 0.7058 0.4425 0.7058 0.8401
No log 8.8148 476 0.6795 0.4607 0.6795 0.8243
No log 8.8519 478 0.6779 0.4528 0.6779 0.8233
No log 8.8889 480 0.6746 0.4528 0.6746 0.8213
No log 8.9259 482 0.6621 0.4422 0.6621 0.8137
No log 8.9630 484 0.6731 0.4452 0.6731 0.8204
No log 9.0 486 0.6881 0.4376 0.6881 0.8295
No log 9.0370 488 0.6797 0.4682 0.6797 0.8244
No log 9.0741 490 0.6599 0.4552 0.6599 0.8124
No log 9.1111 492 0.6989 0.4085 0.6989 0.8360
No log 9.1481 494 0.7294 0.3793 0.7294 0.8541
No log 9.1852 496 0.6979 0.3935 0.6979 0.8354
No log 9.2222 498 0.6612 0.4188 0.6612 0.8132
0.3837 9.2593 500 0.6662 0.4812 0.6662 0.8162
0.3837 9.2963 502 0.6520 0.4514 0.6520 0.8074
0.3837 9.3333 504 0.6312 0.4270 0.6312 0.7945
0.3837 9.3704 506 0.6341 0.3153 0.6341 0.7963
0.3837 9.4074 508 0.6601 0.3555 0.6601 0.8125
0.3837 9.4444 510 0.6691 0.3803 0.6691 0.8180

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1