ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8617
  • Qwk (quadratic weighted kappa): 0.4220
  • Mse (mean squared error): 0.8617
  • Rmse (root mean squared error): 0.9283
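
For reference, a minimal sketch of how these metrics can be computed from eval-set predictions, using scikit-learn (the labels and predictions below are placeholders; QWK assumes predictions rounded to the discrete score scale):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Placeholder gold scores and model predictions, for illustration only.
y_true = np.array([3, 2, 4, 1, 3, 0])
y_pred = np.array([2, 2, 3, 1, 4, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse = sqrt(Mse), as in the results above
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```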

Model description

More information needed

Intended uses & limitations

More information needed
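
Pending details from the author, a minimal loading sketch is shown below. It assumes the checkpoint carries a single-logit regression head (num_labels=1), which the MSE-based evaluation (and the Loss matching the Mse) suggests, so the raw logit is read as the predicted organization score:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder input: an Arabic essay to be scored for organization.
inputs = tokenizer("نص المقال هنا", truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```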

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
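
As a sketch, these settings map onto transformers TrainingArguments as follows (the output directory is hypothetical, and the model/dataset wiring is omitted because the card does not specify it):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,               # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```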

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse ("No log" means the training loss had not yet been reported; its first logged value, 0.367, appears at step 500)
No log 0.0392 2 4.6805 -0.0132 4.6805 2.1635
No log 0.0784 4 2.6738 -0.0084 2.6738 1.6352
No log 0.1176 6 2.4271 -0.1115 2.4271 1.5579
No log 0.1569 8 2.0600 -0.0017 2.0600 1.4353
No log 0.1961 10 1.3955 0.0661 1.3955 1.1813
No log 0.2353 12 1.2844 0.0561 1.2844 1.1333
No log 0.2745 14 1.3536 0.0488 1.3536 1.1634
No log 0.3137 16 1.4246 0.0488 1.4246 1.1935
No log 0.3529 18 1.5773 0.0 1.5773 1.2559
No log 0.3922 20 1.6455 0.0169 1.6455 1.2828
No log 0.4314 22 1.3891 0.0038 1.3891 1.1786
No log 0.4706 24 1.3973 -0.0132 1.3973 1.1821
No log 0.5098 26 1.4647 -0.0132 1.4647 1.2102
No log 0.5490 28 1.2120 0.2245 1.2120 1.1009
No log 0.5882 30 1.1904 0.2342 1.1904 1.0911
No log 0.6275 32 1.3581 0.0600 1.3581 1.1654
No log 0.6667 34 1.3790 0.0600 1.3790 1.1743
No log 0.7059 36 1.3767 0.0750 1.3767 1.1733
No log 0.7451 38 1.4080 0.0750 1.4080 1.1866
No log 0.7843 40 1.3509 0.1495 1.3509 1.1623
No log 0.8235 42 1.2776 0.2108 1.2776 1.1303
No log 0.8627 44 1.2322 0.1904 1.2322 1.1100
No log 0.9020 46 1.3957 0.0955 1.3957 1.1814
No log 0.9412 48 1.5660 0.0310 1.5660 1.2514
No log 0.9804 50 1.4056 0.1842 1.4056 1.1856
No log 1.0196 52 1.3778 0.2377 1.3778 1.1738
No log 1.0588 54 1.3796 0.2544 1.3796 1.1745
No log 1.0980 56 1.2416 0.2149 1.2416 1.1143
No log 1.1373 58 1.1237 0.2537 1.1237 1.0601
No log 1.1765 60 1.1783 0.2374 1.1783 1.0855
No log 1.2157 62 1.2600 0.2283 1.2600 1.1225
No log 1.2549 64 1.2494 0.1638 1.2494 1.1178
No log 1.2941 66 1.1574 0.2167 1.1574 1.0758
No log 1.3333 68 1.1305 0.2167 1.1305 1.0632
No log 1.3725 70 1.1658 0.2532 1.1658 1.0797
No log 1.4118 72 1.3829 0.1738 1.3829 1.1760
No log 1.4510 74 1.5409 0.1457 1.5409 1.2413
No log 1.4902 76 1.7079 0.0800 1.7079 1.3069
No log 1.5294 78 1.6392 0.0568 1.6392 1.2803
No log 1.5686 80 1.3314 0.1438 1.3314 1.1539
No log 1.6078 82 1.1382 0.2589 1.1382 1.0669
No log 1.6471 84 1.0333 0.3195 1.0333 1.0165
No log 1.6863 86 1.0151 0.3404 1.0151 1.0075
No log 1.7255 88 1.0314 0.2938 1.0314 1.0156
No log 1.7647 90 1.1807 0.2095 1.1807 1.0866
No log 1.8039 92 1.6365 0.1117 1.6365 1.2793
No log 1.8431 94 2.2214 0.1489 2.2214 1.4904
No log 1.8824 96 2.3773 0.1258 2.3773 1.5419
No log 1.9216 98 2.1821 0.2008 2.1821 1.4772
No log 1.9608 100 1.8052 0.2012 1.8052 1.3436
No log 2.0 102 1.3041 0.1535 1.3041 1.1420
No log 2.0392 104 1.0108 0.2738 1.0108 1.0054
No log 2.0784 106 0.9505 0.4033 0.9505 0.9749
No log 2.1176 108 0.8971 0.4645 0.8971 0.9471
No log 2.1569 110 0.8907 0.4645 0.8907 0.9438
No log 2.1961 112 0.9243 0.4197 0.9243 0.9614
No log 2.2353 114 1.0148 0.3613 1.0148 1.0073
No log 2.2745 116 1.0794 0.3484 1.0794 1.0390
No log 2.3137 118 1.0855 0.3584 1.0855 1.0419
No log 2.3529 120 1.0386 0.3798 1.0386 1.0191
No log 2.3922 122 0.9869 0.3837 0.9869 0.9934
No log 2.4314 124 1.0951 0.3363 1.0951 1.0465
No log 2.4706 126 1.0979 0.2734 1.0979 1.0478
No log 2.5098 128 1.1317 0.3416 1.1317 1.0638
No log 2.5490 130 1.1018 0.2734 1.1018 1.0496
No log 2.5882 132 1.0157 0.3117 1.0157 1.0078
No log 2.6275 134 0.9381 0.4549 0.9381 0.9686
No log 2.6667 136 0.9465 0.4549 0.9465 0.9729
No log 2.7059 138 0.9773 0.3841 0.9773 0.9886
No log 2.7451 140 0.9099 0.4368 0.9099 0.9539
No log 2.7843 142 0.8638 0.4527 0.8638 0.9294
No log 2.8235 144 0.8544 0.4637 0.8544 0.9244
No log 2.8627 146 0.9392 0.4703 0.9392 0.9691
No log 2.9020 148 0.9266 0.5333 0.9266 0.9626
No log 2.9412 150 0.8093 0.6167 0.8093 0.8996
No log 2.9804 152 0.8601 0.4175 0.8601 0.9274
No log 3.0196 154 0.9404 0.3287 0.9404 0.9698
No log 3.0588 156 0.9617 0.3149 0.9617 0.9807
No log 3.0980 158 0.9668 0.4591 0.9668 0.9833
No log 3.1373 160 0.9711 0.4691 0.9711 0.9855
No log 3.1765 162 0.9248 0.4681 0.9248 0.9617
No log 3.2157 164 0.8787 0.5458 0.8787 0.9374
No log 3.2549 166 0.8352 0.5526 0.8352 0.9139
No log 3.2941 168 0.8331 0.5756 0.8331 0.9128
No log 3.3333 170 0.9259 0.5147 0.9259 0.9623
No log 3.3725 172 0.8950 0.5154 0.8950 0.9461
No log 3.4118 174 1.0169 0.3385 1.0169 1.0084
No log 3.4510 176 1.3691 0.3001 1.3691 1.1701
No log 3.4902 178 1.2906 0.3702 1.2906 1.1361
No log 3.5294 180 1.0114 0.3521 1.0114 1.0057
No log 3.5686 182 0.8751 0.4626 0.8751 0.9355
No log 3.6078 184 0.8740 0.5329 0.8740 0.9349
No log 3.6471 186 0.8692 0.5155 0.8692 0.9323
No log 3.6863 188 0.9598 0.3149 0.9598 0.9797
No log 3.7255 190 1.0935 0.3410 1.0936 1.0457
No log 3.7647 192 1.0933 0.4267 1.0933 1.0456
No log 3.8039 194 0.9682 0.4303 0.9682 0.9840
No log 3.8431 196 0.9472 0.4626 0.9472 0.9733
No log 3.8824 198 0.9848 0.3374 0.9848 0.9924
No log 3.9216 200 1.0061 0.3758 1.0061 1.0030
No log 3.9608 202 1.0043 0.4699 1.0043 1.0022
No log 4.0 204 1.0780 0.4119 1.0780 1.0382
No log 4.0392 206 1.0010 0.4556 1.0010 1.0005
No log 4.0784 208 0.9168 0.4826 0.9168 0.9575
No log 4.1176 210 0.9659 0.4334 0.9659 0.9828
No log 4.1569 212 0.9399 0.4775 0.9399 0.9695
No log 4.1961 214 0.8783 0.4879 0.8783 0.9372
No log 4.2353 216 0.8624 0.4930 0.8624 0.9287
No log 4.2745 218 0.8706 0.5283 0.8706 0.9331
No log 4.3137 220 0.8888 0.5450 0.8888 0.9428
No log 4.3529 222 1.0378 0.3990 1.0378 1.0187
No log 4.3922 224 1.1014 0.3673 1.1014 1.0495
No log 4.4314 226 1.0202 0.4286 1.0202 1.0101
No log 4.4706 228 0.8880 0.5450 0.8880 0.9423
No log 4.5098 230 0.8636 0.5058 0.8636 0.9293
No log 4.5490 232 0.8895 0.5291 0.8895 0.9431
No log 4.5882 234 0.8942 0.5291 0.8942 0.9456
No log 4.6275 236 0.8422 0.4789 0.8422 0.9177
No log 4.6667 238 0.8856 0.5259 0.8856 0.9411
No log 4.7059 240 0.9355 0.4662 0.9355 0.9672
No log 4.7451 242 0.9534 0.4346 0.9534 0.9764
No log 4.7843 244 0.9926 0.3945 0.9926 0.9963
No log 4.8235 246 1.0758 0.3918 1.0758 1.0372
No log 4.8627 248 1.0341 0.3465 1.0341 1.0169
No log 4.9020 250 1.0061 0.4444 1.0061 1.0030
No log 4.9412 252 1.0022 0.4256 1.0022 1.0011
No log 4.9804 254 1.0061 0.4450 1.0061 1.0030
No log 5.0196 256 1.0082 0.3794 1.0082 1.0041
No log 5.0588 258 1.0095 0.3794 1.0095 1.0048
No log 5.0980 260 1.0098 0.4346 1.0098 1.0049
No log 5.1373 262 1.0904 0.4050 1.0904 1.0442
No log 5.1765 264 1.1586 0.3826 1.1586 1.0764
No log 5.2157 266 1.1098 0.3933 1.1098 1.0535
No log 5.2549 268 1.0723 0.3708 1.0723 1.0355
No log 5.2941 270 1.0672 0.3326 1.0672 1.0331
No log 5.3333 272 1.0333 0.3998 1.0333 1.0165
No log 5.3725 274 1.0074 0.3155 1.0074 1.0037
No log 5.4118 276 0.9641 0.3938 0.9641 0.9819
No log 5.4510 278 0.9424 0.4369 0.9424 0.9708
No log 5.4902 280 0.9287 0.4527 0.9287 0.9637
No log 5.5294 282 0.9250 0.4374 0.9250 0.9618
No log 5.5686 284 0.9248 0.4125 0.9248 0.9617
No log 5.6078 286 0.9373 0.3957 0.9373 0.9681
No log 5.6471 288 0.9111 0.4964 0.9111 0.9545
No log 5.6863 290 0.9393 0.4176 0.9393 0.9692
No log 5.7255 292 0.9243 0.4176 0.9243 0.9614
No log 5.7647 294 0.8929 0.4256 0.8929 0.9449
No log 5.8039 296 0.9299 0.4579 0.9299 0.9643
No log 5.8431 298 0.9399 0.4785 0.9399 0.9695
No log 5.8824 300 0.9113 0.4593 0.9113 0.9546
No log 5.9216 302 0.9508 0.5253 0.9508 0.9751
No log 5.9608 304 0.9091 0.4514 0.9091 0.9535
No log 6.0 306 0.9063 0.5029 0.9063 0.9520
No log 6.0392 308 0.9690 0.4510 0.9690 0.9844
No log 6.0784 310 0.9085 0.5073 0.9085 0.9531
No log 6.1176 312 0.8337 0.4801 0.8337 0.9131
No log 6.1569 314 0.8375 0.4801 0.8375 0.9151
No log 6.1961 316 0.8534 0.4705 0.8534 0.9238
No log 6.2353 318 0.8730 0.4705 0.8730 0.9344
No log 6.2745 320 0.9101 0.4256 0.9101 0.9540
No log 6.3137 322 0.9366 0.3788 0.9366 0.9678
No log 6.3529 324 0.9344 0.3788 0.9344 0.9667
No log 6.3922 326 0.9331 0.3987 0.9331 0.9659
No log 6.4314 328 0.9874 0.4202 0.9874 0.9937
No log 6.4706 330 0.9828 0.4439 0.9828 0.9914
No log 6.5098 332 0.9404 0.2694 0.9404 0.9697
No log 6.5490 334 0.9505 0.3263 0.9505 0.9750
No log 6.5882 336 0.9790 0.3411 0.9790 0.9894
No log 6.6275 338 0.9523 0.3263 0.9523 0.9758
No log 6.6667 340 0.9440 0.3596 0.9440 0.9716
No log 6.7059 342 0.9433 0.3210 0.9433 0.9713
No log 6.7451 344 0.9198 0.3066 0.9198 0.9591
No log 6.7843 346 0.9323 0.3660 0.9323 0.9656
No log 6.8235 348 0.9351 0.3457 0.9351 0.9670
No log 6.8627 350 0.9322 0.3373 0.9322 0.9655
No log 6.9020 352 0.8893 0.3671 0.8893 0.9430
No log 6.9412 354 0.9043 0.4142 0.9043 0.9509
No log 6.9804 356 0.9357 0.4568 0.9357 0.9673
No log 7.0196 358 0.9564 0.3493 0.9564 0.9779
No log 7.0588 360 0.9921 0.3346 0.9921 0.9960
No log 7.0980 362 0.9973 0.3205 0.9973 0.9987
No log 7.1373 364 0.9874 0.3210 0.9874 0.9937
No log 7.1765 366 0.9682 0.3089 0.9682 0.9840
No log 7.2157 368 0.9548 0.3608 0.9548 0.9771
No log 7.2549 370 0.9605 0.3527 0.9605 0.9801
No log 7.2941 372 0.9659 0.3527 0.9659 0.9828
No log 7.3333 374 0.9913 0.3457 0.9913 0.9957
No log 7.3725 376 0.9668 0.3513 0.9668 0.9833
No log 7.4118 378 0.9586 0.3804 0.9586 0.9791
No log 7.4510 380 1.0024 0.4264 1.0024 1.0012
No log 7.4902 382 0.9607 0.3804 0.9607 0.9802
No log 7.5294 384 0.9158 0.3145 0.9158 0.9570
No log 7.5686 386 0.9633 0.3590 0.9633 0.9815
No log 7.6078 388 1.0783 0.3523 1.0783 1.0384
No log 7.6471 390 1.0783 0.3784 1.0783 1.0384
No log 7.6863 392 0.9584 0.4175 0.9584 0.9790
No log 7.7255 394 0.8756 0.4120 0.8756 0.9357
No log 7.7647 396 0.8678 0.4045 0.8678 0.9316
No log 7.8039 398 0.8712 0.3896 0.8712 0.9334
No log 7.8431 400 0.8655 0.3685 0.8655 0.9303
No log 7.8824 402 0.8747 0.3263 0.8747 0.9352
No log 7.9216 404 0.9072 0.3502 0.9072 0.9525
No log 7.9608 406 0.8933 0.3602 0.8933 0.9452
No log 8.0 408 0.8742 0.3263 0.8742 0.9350
No log 8.0392 410 0.8725 0.3674 0.8725 0.9341
No log 8.0784 412 0.8732 0.3570 0.8732 0.9345
No log 8.1176 414 0.8779 0.3263 0.8779 0.9370
No log 8.1569 416 0.8899 0.3062 0.8899 0.9433
No log 8.1961 418 0.8990 0.3263 0.8990 0.9481
No log 8.2353 420 0.9068 0.3216 0.9068 0.9523
No log 8.2745 422 0.9033 0.3318 0.9033 0.9504
No log 8.3137 424 0.8974 0.3216 0.8974 0.9473
No log 8.3529 426 0.9018 0.3216 0.9018 0.9496
No log 8.3922 428 0.9219 0.3163 0.9219 0.9602
No log 8.4314 430 0.9715 0.3657 0.9715 0.9856
No log 8.4706 432 1.0289 0.3699 1.0289 1.0143
No log 8.5098 434 0.9977 0.2963 0.9977 0.9989
No log 8.5490 436 0.9753 0.3066 0.9753 0.9876
No log 8.5882 438 0.9870 0.3119 0.9870 0.9935
No log 8.6275 440 0.9835 0.2038 0.9835 0.9917
No log 8.6667 442 0.9567 0.3271 0.9567 0.9781
No log 8.7059 444 0.9574 0.3747 0.9574 0.9785
No log 8.7451 446 1.0043 0.3646 1.0043 1.0021
No log 8.7843 448 0.9759 0.3646 0.9759 0.9879
No log 8.8235 450 0.9136 0.3855 0.9136 0.9558
No log 8.8627 452 0.8976 0.4505 0.8976 0.9474
No log 8.9020 454 0.9082 0.4275 0.9082 0.9530
No log 8.9412 456 0.8806 0.4278 0.8806 0.9384
No log 8.9804 458 0.8760 0.4661 0.8759 0.9359
No log 9.0196 460 0.8798 0.4563 0.8798 0.9380
No log 9.0588 462 0.8626 0.4726 0.8626 0.9288
No log 9.0980 464 0.8729 0.4413 0.8729 0.9343
No log 9.1373 466 0.8975 0.3908 0.8975 0.9474
No log 9.1765 468 0.9058 0.3908 0.9058 0.9518
No log 9.2157 470 0.8940 0.3908 0.8940 0.9455
No log 9.2549 472 0.8958 0.3908 0.8958 0.9465
No log 9.2941 474 0.8909 0.4181 0.8909 0.9439
No log 9.3333 476 0.9024 0.4181 0.9024 0.9499
No log 9.3725 478 0.9124 0.4181 0.9124 0.9552
No log 9.4118 480 0.9377 0.4045 0.9377 0.9683
No log 9.4510 482 0.9634 0.4002 0.9634 0.9815
No log 9.4902 484 0.9615 0.3243 0.9615 0.9806
No log 9.5294 486 0.9603 0.3629 0.9603 0.9799
No log 9.5686 488 0.9495 0.3629 0.9495 0.9744
No log 9.6078 490 0.9442 0.3685 0.9442 0.9717
No log 9.6471 492 0.9334 0.3728 0.9334 0.9661
No log 9.6863 494 0.9382 0.3571 0.9382 0.9686
No log 9.7255 496 0.9345 0.4059 0.9345 0.9667
No log 9.7647 498 0.9132 0.4294 0.9132 0.9556
0.367 9.8039 500 0.8974 0.4637 0.8974 0.9473
0.367 9.8431 502 0.8946 0.5062 0.8946 0.9458
0.367 9.8824 504 0.8746 0.4745 0.8746 0.9352
0.367 9.9216 506 0.8645 0.5112 0.8645 0.9298
0.367 9.9608 508 0.8611 0.4428 0.8611 0.9279
0.367 10.0 510 0.8719 0.4077 0.8719 0.9337
0.367 10.0392 512 0.8851 0.4533 0.8851 0.9408
0.367 10.0784 514 0.8804 0.4428 0.8804 0.9383
0.367 10.1176 516 0.8617 0.4220 0.8617 0.9283

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1