ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k18_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6184
  • Qwk (Quadratic Weighted Kappa): 0.5294
  • Mse (Mean Squared Error): 0.6184
  • Rmse (Root Mean Squared Error): 0.7864
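
These metrics point to an ordinal scoring task (the model name suggests essay organization scoring) evaluated with Quadratic Weighted Kappa alongside regression error. The exact evaluation code is not published; the following is a minimal sketch of how these three metrics are commonly computed with scikit-learn, with `y_true`/`y_pred` as placeholder arrays:

```python
# Hedged sketch: QWK, MSE, and RMSE as commonly computed with scikit-learn.
# y_true / y_pred are hypothetical gold and predicted ordinal scores;
# the model's actual evaluation pipeline is not documented here.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])  # placeholder gold scores
y_pred = np.array([2, 2, 1, 4, 3])  # placeholder predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))  # note: Rmse in this card equals sqrt(Mse)
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```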

Model description

More information needed

Intended uses & limitations

More information needed
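
Pending fuller documentation, the sketch below shows one way to load the checkpoint for inference. It assumes a single-output regression head (consistent with Loss and Mse being identical above, which suggests an MSE training objective); the task and label semantics are otherwise undocumented.

```python
# Hedged sketch: loading this checkpoint for inference.
# The regression-head assumption is inferred from the reported metrics
# and is not confirmed by the card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k18_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "..."  # an Arabic essay to score (placeholder)
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```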

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
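
These values map directly onto the transformers Trainer API. A hedged reconstruction of the corresponding TrainingArguments is shown below; only the listed hyperparameters come from this card, and everything else (output path, evaluation cadence) is an illustrative assumption:

```python
# Hedged sketch: TrainingArguments matching the hyperparameters above.
# output_dir, eval_strategy, eval_steps, and logging_steps are assumptions;
# the Adam betas/epsilon listed above are the Trainer defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",        # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",         # assumption, inferred from the results table
    eval_steps=2,                  # the table logs validation every 2 steps
    logging_steps=500,             # training loss first appears at step 500
)
```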

Training results

The model was evaluated every 2 steps; the training-loss column reads "No log" until the first logging step (500).

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0211 2 4.1288 -0.0315 4.1288 2.0319
No log 0.0421 4 2.5185 0.0426 2.5185 1.5870
No log 0.0632 6 1.4666 0.1104 1.4666 1.2110
No log 0.0842 8 1.0851 -0.0178 1.0851 1.0417
No log 0.1053 10 0.8797 0.0974 0.8797 0.9379
No log 0.1263 12 0.9451 0.0093 0.9451 0.9722
No log 0.1474 14 1.4534 0.0302 1.4534 1.2056
No log 0.1684 16 2.0478 0.0673 2.0478 1.4310
No log 0.1895 18 1.7795 0.0425 1.7795 1.3340
No log 0.2105 20 1.2998 0.0208 1.2998 1.1401
No log 0.2316 22 0.9965 0.0836 0.9965 0.9983
No log 0.2526 24 0.9233 0.1067 0.9233 0.9609
No log 0.2737 26 0.9016 0.0955 0.9016 0.9495
No log 0.2947 28 0.7589 0.2702 0.7589 0.8711
No log 0.3158 30 0.7519 0.2637 0.7519 0.8671
No log 0.3368 32 0.8252 0.1152 0.8252 0.9084
No log 0.3579 34 0.8629 0.1225 0.8629 0.9289
No log 0.3789 36 0.7928 0.1505 0.7928 0.8904
No log 0.4 38 0.7374 0.2376 0.7374 0.8587
No log 0.4211 40 0.8696 0.0914 0.8696 0.9325
No log 0.4421 42 0.8435 0.1149 0.8435 0.9184
No log 0.4632 44 0.8571 0.1489 0.8571 0.9258
No log 0.4842 46 0.9362 0.0903 0.9362 0.9676
No log 0.5053 48 1.1060 0.0718 1.1060 1.0517
No log 0.5263 50 1.0754 0.0615 1.0754 1.0370
No log 0.5474 52 1.0345 0.0769 1.0345 1.0171
No log 0.5684 54 1.0350 0.0769 1.0350 1.0173
No log 0.5895 56 0.9969 0.1118 0.9969 0.9984
No log 0.6105 58 0.9859 0.1416 0.9859 0.9929
No log 0.6316 60 0.9304 0.1334 0.9304 0.9646
No log 0.6526 62 0.7553 0.2849 0.7553 0.8691
No log 0.6737 64 0.7851 0.1643 0.7851 0.8860
No log 0.6947 66 0.8097 0.1643 0.8097 0.8999
No log 0.7158 68 0.7818 0.1904 0.7818 0.8842
No log 0.7368 70 0.7878 0.2766 0.7878 0.8876
No log 0.7579 72 0.8239 0.2505 0.8239 0.9077
No log 0.7789 74 0.8717 0.1621 0.8717 0.9336
No log 0.8 76 0.7942 0.3226 0.7942 0.8912
No log 0.8211 78 0.7783 0.3385 0.7783 0.8822
No log 0.8421 80 0.7546 0.2920 0.7546 0.8687
No log 0.8632 82 0.7792 0.2545 0.7792 0.8827
No log 0.8842 84 0.7480 0.3243 0.7480 0.8649
No log 0.9053 86 0.7029 0.2658 0.7029 0.8384
No log 0.9263 88 0.7026 0.3783 0.7026 0.8382
No log 0.9474 90 0.6902 0.3564 0.6902 0.8308
No log 0.9684 92 0.6650 0.3636 0.6650 0.8155
No log 0.9895 94 0.7505 0.2769 0.7505 0.8663
No log 1.0105 96 0.9071 0.1912 0.9071 0.9524
No log 1.0316 98 0.8257 0.1937 0.8257 0.9087
No log 1.0526 100 0.6927 0.3466 0.6927 0.8323
No log 1.0737 102 0.6683 0.3612 0.6683 0.8175
No log 1.0947 104 0.7382 0.3380 0.7382 0.8592
No log 1.1158 106 0.9529 0.2659 0.9529 0.9762
No log 1.1368 108 0.9635 0.3044 0.9635 0.9816
No log 1.1579 110 0.7591 0.3529 0.7591 0.8713
No log 1.1789 112 0.6922 0.3792 0.6922 0.8320
No log 1.2 114 0.7296 0.3937 0.7296 0.8542
No log 1.2211 116 0.7899 0.3672 0.7899 0.8888
No log 1.2421 118 0.8106 0.3884 0.8106 0.9003
No log 1.2632 120 0.7301 0.4256 0.7301 0.8545
No log 1.2842 122 0.8867 0.4489 0.8867 0.9417
No log 1.3053 124 0.9356 0.4196 0.9356 0.9673
No log 1.3263 126 0.8363 0.4475 0.8363 0.9145
No log 1.3474 128 0.6997 0.3396 0.6997 0.8365
No log 1.3684 130 0.7186 0.3959 0.7186 0.8477
No log 1.3895 132 0.7385 0.3838 0.7385 0.8594
No log 1.4105 134 0.6888 0.4018 0.6888 0.8299
No log 1.4316 136 0.7473 0.4035 0.7473 0.8644
No log 1.4526 138 0.9286 0.3883 0.9286 0.9636
No log 1.4737 140 1.0840 0.2961 1.0840 1.0412
No log 1.4947 142 0.8336 0.3932 0.8336 0.9130
No log 1.5158 144 0.7058 0.4382 0.7058 0.8401
No log 1.5368 146 0.7232 0.4489 0.7232 0.8504
No log 1.5579 148 0.8145 0.3962 0.8145 0.9025
No log 1.5789 150 0.8053 0.4102 0.8053 0.8974
No log 1.6 152 0.7190 0.4421 0.7190 0.8479
No log 1.6211 154 0.7069 0.4567 0.7069 0.8408
No log 1.6421 156 0.7192 0.4628 0.7192 0.8481
No log 1.6632 158 0.9127 0.3297 0.9127 0.9554
No log 1.6842 160 1.3862 0.2134 1.3862 1.1774
No log 1.7053 162 1.4692 0.2036 1.4692 1.2121
No log 1.7263 164 1.0294 0.2771 1.0294 1.0146
No log 1.7474 166 0.6834 0.4687 0.6834 0.8267
No log 1.7684 168 0.6539 0.4906 0.6539 0.8086
No log 1.7895 170 0.6982 0.4477 0.6982 0.8356
No log 1.8105 172 0.7451 0.4087 0.7451 0.8632
No log 1.8316 174 0.8036 0.4019 0.8036 0.8965
No log 1.8526 176 0.8099 0.3935 0.8099 0.8999
No log 1.8737 178 0.8201 0.4115 0.8201 0.9056
No log 1.8947 180 0.8681 0.4417 0.8681 0.9317
No log 1.9158 182 0.8429 0.4423 0.8429 0.9181
No log 1.9368 184 0.7964 0.4683 0.7964 0.8924
No log 1.9579 186 0.7728 0.4223 0.7728 0.8791
No log 1.9789 188 0.8025 0.4409 0.8025 0.8958
No log 2.0 190 0.8573 0.4339 0.8573 0.9259
No log 2.0211 192 0.8898 0.4230 0.8898 0.9433
No log 2.0421 194 0.8453 0.4237 0.8453 0.9194
No log 2.0632 196 0.7927 0.4640 0.7927 0.8904
No log 2.0842 198 0.6814 0.4616 0.6814 0.8255
No log 2.1053 200 0.6805 0.3876 0.6805 0.8249
No log 2.1263 202 0.7932 0.4137 0.7932 0.8906
No log 2.1474 204 0.8210 0.3537 0.8210 0.9061
No log 2.1684 206 0.6529 0.4375 0.6529 0.8080
No log 2.1895 208 0.5770 0.4339 0.5770 0.7596
No log 2.2105 210 0.5688 0.4584 0.5688 0.7542
No log 2.2316 212 0.5937 0.4436 0.5937 0.7705
No log 2.2526 214 0.6245 0.4381 0.6245 0.7903
No log 2.2737 216 0.6395 0.4528 0.6395 0.7997
No log 2.2947 218 0.6386 0.4830 0.6386 0.7991
No log 2.3158 220 0.6937 0.4787 0.6937 0.8329
No log 2.3368 222 0.7709 0.4437 0.7709 0.8780
No log 2.3579 224 0.7213 0.3433 0.7213 0.8493
No log 2.3789 226 0.6798 0.3799 0.6798 0.8245
No log 2.4 228 0.6615 0.4124 0.6615 0.8133
No log 2.4211 230 0.6727 0.4200 0.6727 0.8202
No log 2.4421 232 0.7324 0.3683 0.7324 0.8558
No log 2.4632 234 0.6935 0.3867 0.6935 0.8327
No log 2.4842 236 0.6711 0.3579 0.6711 0.8192
No log 2.5053 238 0.6323 0.4102 0.6323 0.7952
No log 2.5263 240 0.6303 0.4339 0.6303 0.7939
No log 2.5474 242 0.6294 0.4273 0.6294 0.7933
No log 2.5684 244 0.6716 0.3922 0.6716 0.8195
No log 2.5895 246 0.8463 0.3883 0.8463 0.9199
No log 2.6105 248 0.9069 0.3402 0.9069 0.9523
No log 2.6316 250 0.7418 0.4259 0.7418 0.8613
No log 2.6526 252 0.6080 0.5125 0.6080 0.7798
No log 2.6737 254 0.5552 0.4465 0.5552 0.7451
No log 2.6947 256 0.5867 0.4622 0.5867 0.7660
No log 2.7158 258 0.6339 0.4706 0.6339 0.7962
No log 2.7368 260 0.6690 0.5044 0.6690 0.8179
No log 2.7579 262 0.6825 0.5167 0.6825 0.8261
No log 2.7789 264 0.6800 0.5143 0.6800 0.8246
No log 2.8 266 0.6771 0.5095 0.6771 0.8229
No log 2.8211 268 0.6573 0.4622 0.6573 0.8107
No log 2.8421 270 0.6588 0.4508 0.6588 0.8116
No log 2.8632 272 0.6727 0.5050 0.6727 0.8202
No log 2.8842 274 0.6300 0.5274 0.6300 0.7937
No log 2.9053 276 0.6290 0.5245 0.6290 0.7931
No log 2.9263 278 0.6731 0.5580 0.6731 0.8204
No log 2.9474 280 0.7603 0.4575 0.7603 0.8720
No log 2.9684 282 0.7838 0.4346 0.7838 0.8853
No log 2.9895 284 0.8229 0.4382 0.8229 0.9072
No log 3.0105 286 0.9095 0.4388 0.9095 0.9537
No log 3.0316 288 0.7786 0.4496 0.7786 0.8824
No log 3.0526 290 0.6497 0.5162 0.6497 0.8060
No log 3.0737 292 0.5631 0.4890 0.5631 0.7504
No log 3.0947 294 0.6116 0.4394 0.6116 0.7820
No log 3.1158 296 0.6838 0.4867 0.6838 0.8269
No log 3.1368 298 0.8232 0.4285 0.8232 0.9073
No log 3.1579 300 0.8734 0.4269 0.8734 0.9346
No log 3.1789 302 0.8150 0.4712 0.8150 0.9028
No log 3.2 304 0.7405 0.5294 0.7405 0.8605
No log 3.2211 306 0.6413 0.5176 0.6413 0.8008
No log 3.2421 308 0.5913 0.5348 0.5913 0.7689
No log 3.2632 310 0.5907 0.5172 0.5907 0.7686
No log 3.2842 312 0.6288 0.5611 0.6288 0.7930
No log 3.3053 314 0.6425 0.5611 0.6425 0.8015
No log 3.3263 316 0.6212 0.5010 0.6212 0.7882
No log 3.3474 318 0.5935 0.4746 0.5935 0.7704
No log 3.3684 320 0.5832 0.4311 0.5832 0.7637
No log 3.3895 322 0.5889 0.4639 0.5889 0.7674
No log 3.4105 324 0.6058 0.4984 0.6058 0.7783
No log 3.4316 326 0.6198 0.5541 0.6198 0.7873
No log 3.4526 328 0.6251 0.5567 0.6251 0.7906
No log 3.4737 330 0.6397 0.5262 0.6397 0.7998
No log 3.4947 332 0.6237 0.5229 0.6237 0.7897
No log 3.5158 334 0.6148 0.5372 0.6148 0.7841
No log 3.5368 336 0.5905 0.5330 0.5905 0.7685
No log 3.5579 338 0.5872 0.5450 0.5872 0.7663
No log 3.5789 340 0.5686 0.5144 0.5686 0.7541
No log 3.6 342 0.5612 0.5164 0.5612 0.7492
No log 3.6211 344 0.5621 0.5117 0.5621 0.7497
No log 3.6421 346 0.5519 0.5587 0.5519 0.7429
No log 3.6632 348 0.5641 0.5630 0.5641 0.7511
No log 3.6842 350 0.5928 0.6066 0.5928 0.7699
No log 3.7053 352 0.6458 0.5173 0.6458 0.8036
No log 3.7263 354 0.7049 0.5306 0.7049 0.8396
No log 3.7474 356 0.8000 0.4673 0.8000 0.8944
No log 3.7684 358 0.8399 0.4433 0.8399 0.9165
No log 3.7895 360 0.7574 0.4544 0.7574 0.8703
No log 3.8105 362 0.6814 0.5425 0.6814 0.8255
No log 3.8316 364 0.6036 0.5189 0.6036 0.7769
No log 3.8526 366 0.5701 0.5702 0.5701 0.7551
No log 3.8737 368 0.5851 0.5704 0.5851 0.7649
No log 3.8947 370 0.6450 0.5595 0.6450 0.8031
No log 3.9158 372 0.7828 0.4601 0.7828 0.8848
No log 3.9368 374 0.9965 0.3715 0.9965 0.9983
No log 3.9579 376 1.1143 0.2836 1.1143 1.0556
No log 3.9789 378 0.9718 0.3717 0.9718 0.9858
No log 4.0 380 0.7108 0.5453 0.7108 0.8431
No log 4.0211 382 0.5774 0.6060 0.5774 0.7599
No log 4.0421 384 0.5570 0.5410 0.5570 0.7463
No log 4.0632 386 0.5892 0.5333 0.5892 0.7676
No log 4.0842 388 0.6194 0.5241 0.6194 0.7870
No log 4.1053 390 0.5920 0.5445 0.5920 0.7694
No log 4.1263 392 0.5359 0.4936 0.5359 0.7321
No log 4.1474 394 0.5107 0.5042 0.5107 0.7146
No log 4.1684 396 0.5163 0.5408 0.5163 0.7185
No log 4.1895 398 0.5313 0.5989 0.5313 0.7289
No log 4.2105 400 0.5463 0.5777 0.5463 0.7391
No log 4.2316 402 0.5665 0.5727 0.5665 0.7527
No log 4.2526 404 0.6014 0.5768 0.6014 0.7755
No log 4.2737 406 0.6611 0.5654 0.6611 0.8131
No log 4.2947 408 0.7154 0.5310 0.7154 0.8458
No log 4.3158 410 0.7825 0.5006 0.7825 0.8846
No log 4.3368 412 0.8225 0.5012 0.8225 0.9069
No log 4.3579 414 0.8351 0.4843 0.8351 0.9139
No log 4.3789 416 0.8116 0.4926 0.8116 0.9009
No log 4.4 418 0.7569 0.4809 0.7569 0.8700
No log 4.4211 420 0.6994 0.4849 0.6994 0.8363
No log 4.4421 422 0.6854 0.4819 0.6854 0.8279
No log 4.4632 424 0.6971 0.5304 0.6971 0.8349
No log 4.4842 426 0.7128 0.5651 0.7128 0.8442
No log 4.5053 428 0.7769 0.4844 0.7769 0.8814
No log 4.5263 430 0.7936 0.4804 0.7936 0.8909
No log 4.5474 432 0.6638 0.4739 0.6638 0.8147
No log 4.5684 434 0.6668 0.5030 0.6668 0.8166
No log 4.5895 436 0.7087 0.5215 0.7087 0.8418
No log 4.6105 438 0.7407 0.5156 0.7407 0.8606
No log 4.6316 440 0.7848 0.5012 0.7848 0.8859
No log 4.6526 442 0.8081 0.5111 0.8081 0.8989
No log 4.6737 444 0.7567 0.5024 0.7567 0.8699
No log 4.6947 446 0.6946 0.5198 0.6946 0.8334
No log 4.7158 448 0.6577 0.5325 0.6577 0.8110
No log 4.7368 450 0.6399 0.5614 0.6399 0.7999
No log 4.7579 452 0.6201 0.5072 0.6201 0.7875
No log 4.7789 454 0.6461 0.5535 0.6461 0.8038
No log 4.8 456 0.6800 0.5525 0.6800 0.8246
No log 4.8211 458 0.7090 0.5176 0.7090 0.8420
No log 4.8421 460 0.7283 0.5149 0.7283 0.8534
No log 4.8632 462 0.7139 0.5349 0.7139 0.8449
No log 4.8842 464 0.6622 0.5543 0.6622 0.8138
No log 4.9053 466 0.6279 0.5405 0.6279 0.7924
No log 4.9263 468 0.6448 0.5362 0.6448 0.8030
No log 4.9474 470 0.6874 0.5685 0.6874 0.8291
No log 4.9684 472 0.7100 0.5745 0.7100 0.8426
No log 4.9895 474 0.7547 0.5324 0.7547 0.8688
No log 5.0105 476 0.8040 0.5036 0.8040 0.8967
No log 5.0316 478 0.8140 0.4906 0.8140 0.9022
No log 5.0526 480 0.7026 0.5452 0.7026 0.8382
No log 5.0737 482 0.5995 0.5783 0.5995 0.7743
No log 5.0947 484 0.5586 0.5276 0.5586 0.7474
No log 5.1158 486 0.5841 0.5174 0.5841 0.7643
No log 5.1368 488 0.6294 0.5559 0.6294 0.7933
No log 5.1579 490 0.6434 0.5541 0.6434 0.8021
No log 5.1789 492 0.6261 0.5497 0.6261 0.7913
No log 5.2 494 0.6209 0.5519 0.6209 0.7880
No log 5.2211 496 0.6343 0.5296 0.6343 0.7964
No log 5.2421 498 0.6715 0.5134 0.6715 0.8195
0.4009 5.2632 500 0.7082 0.5482 0.7082 0.8416
0.4009 5.2842 502 0.7507 0.5557 0.7507 0.8664
0.4009 5.3053 504 0.7417 0.5503 0.7417 0.8612
0.4009 5.3263 506 0.6898 0.5538 0.6898 0.8306
0.4009 5.3474 508 0.5998 0.5470 0.5998 0.7745
0.4009 5.3684 510 0.5250 0.5638 0.5250 0.7246
0.4009 5.3895 512 0.5114 0.5531 0.5114 0.7151
0.4009 5.4105 514 0.5244 0.5550 0.5244 0.7241
0.4009 5.4316 516 0.5475 0.5461 0.5475 0.7399
0.4009 5.4526 518 0.5598 0.5269 0.5598 0.7482
0.4009 5.4737 520 0.5705 0.4984 0.5705 0.7553
0.4009 5.4947 522 0.5705 0.5026 0.5705 0.7553
0.4009 5.5158 524 0.6184 0.5294 0.6184 0.7864

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1