ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k13_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9002
  • Qwk: 0.4144
  • Mse: 0.9002
  • Rmse: 0.9488
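Here Qwk denotes quadratic weighted kappa, the standard agreement metric for ordinal scoring tasks such as essay grading, and Mse/Rmse are (root) mean squared error. As a rough illustration only (a minimal pure-Python sketch, not the evaluation code used for this model), quadratic weighted kappa can be computed as:

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer labels in [0, n_classes).

    kappa = 1 - sum(W * O) / sum(W * E), where O is the observed
    confusion matrix, E the expected matrix under chance agreement,
    and W the quadratic disagreement weights (i - j)^2 / (N - 1)^2.
    """
    # observed confusion matrix
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # marginal histograms of true and predicted labels
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            e = hist_t[i] * hist_p[j] / n  # expected count under independence
            num += w * O[i][j]
            den += w * e
    return 1.0 - num / den
```

Perfect agreement yields 1.0, chance-level agreement 0.0; the 0.4144 reported above indicates moderate agreement between predicted and gold scores.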

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
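With lr_scheduler_type set to linear, the learning rate decays linearly from the initial learning_rate toward zero over the course of training. A minimal sketch of that schedule (assuming no warmup steps, since none are listed among the hyperparameters):

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-5):
    """Learning rate at a given optimizer step under a linear decay
    schedule with no warmup: base_lr at step 0, 0.0 at total_steps."""
    if step >= total_steps:
        return 0.0
    return base_lr * (1.0 - step / total_steps)
```

In the Transformers Trainer, total_steps would be num_epochs times the number of optimizer steps per epoch, so with num_epochs set to 100 the decay is very gradual early in training.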

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0426 2 4.8122 -0.0075 4.8122 2.1937
No log 0.0851 4 3.0681 -0.0352 3.0681 1.7516
No log 0.1277 6 1.8318 0.0198 1.8318 1.3534
No log 0.1702 8 1.7537 0.0062 1.7537 1.3243
No log 0.2128 10 2.0217 -0.0925 2.0217 1.4219
No log 0.2553 12 1.8901 -0.1251 1.8901 1.3748
No log 0.2979 14 1.6340 0.0376 1.6340 1.2783
No log 0.3404 16 1.3868 0.0421 1.3868 1.1776
No log 0.3830 18 1.3010 0.0269 1.3010 1.1406
No log 0.4255 20 1.3824 0.0845 1.3824 1.1757
No log 0.4681 22 1.3951 0.1604 1.3951 1.1812
No log 0.5106 24 1.2458 0.1505 1.2458 1.1162
No log 0.5532 26 1.3000 0.1021 1.3000 1.1402
No log 0.5957 28 1.7388 0.0599 1.7388 1.3186
No log 0.6383 30 1.8963 0.1439 1.8963 1.3770
No log 0.6809 32 1.4723 0.2201 1.4723 1.2134
No log 0.7234 34 1.1328 0.3263 1.1328 1.0643
No log 0.7660 36 1.1589 0.3660 1.1589 1.0765
No log 0.8085 38 1.1587 0.3011 1.1587 1.0764
No log 0.8511 40 1.3795 0.1080 1.3795 1.1745
No log 0.8936 42 1.6398 0.0667 1.6398 1.2805
No log 0.9362 44 1.7265 0.0363 1.7265 1.3140
No log 0.9787 46 1.5623 0.0766 1.5623 1.2499
No log 1.0213 48 1.4806 0.0622 1.4806 1.2168
No log 1.0638 50 1.4171 0.1659 1.4171 1.1904
No log 1.1064 52 1.3689 0.1226 1.3689 1.1700
No log 1.1489 54 1.1988 0.2622 1.1988 1.0949
No log 1.1915 56 1.1914 0.2476 1.1914 1.0915
No log 1.2340 58 1.2191 0.2085 1.2191 1.1041
No log 1.2766 60 1.6236 0.1038 1.6236 1.2742
No log 1.3191 62 2.0890 0.1509 2.0890 1.4453
No log 1.3617 64 1.7839 0.1430 1.7839 1.3356
No log 1.4043 66 1.5015 0.0905 1.5015 1.2254
No log 1.4468 68 1.1914 0.1848 1.1914 1.0915
No log 1.4894 70 1.0982 0.3189 1.0982 1.0479
No log 1.5319 72 1.0573 0.3347 1.0573 1.0282
No log 1.5745 74 1.1340 0.2439 1.1340 1.0649
No log 1.6170 76 1.4298 0.0972 1.4298 1.1957
No log 1.6596 78 1.6487 0.1508 1.6487 1.2840
No log 1.7021 80 1.5029 0.1195 1.5029 1.2259
No log 1.7447 82 1.3308 0.1395 1.3308 1.1536
No log 1.7872 84 1.2693 0.1659 1.2693 1.1266
No log 1.8298 86 1.1930 0.2390 1.1930 1.0922
No log 1.8723 88 0.9546 0.3753 0.9546 0.9771
No log 1.9149 90 0.9237 0.4180 0.9237 0.9611
No log 1.9574 92 0.9298 0.4401 0.9298 0.9643
No log 2.0 94 0.9353 0.4534 0.9353 0.9671
No log 2.0426 96 0.9114 0.4461 0.9114 0.9547
No log 2.0851 98 0.9261 0.4963 0.9261 0.9623
No log 2.1277 100 0.9183 0.4377 0.9183 0.9583
No log 2.1702 102 0.9270 0.4498 0.9270 0.9628
No log 2.2128 104 0.9341 0.4077 0.9341 0.9665
No log 2.2553 106 1.0545 0.3691 1.0545 1.0269
No log 2.2979 108 1.0953 0.3193 1.0953 1.0465
No log 2.3404 110 0.9748 0.4420 0.9748 0.9873
No log 2.3830 112 0.9710 0.3684 0.9710 0.9854
No log 2.4255 114 1.0235 0.4331 1.0235 1.0117
No log 2.4681 116 0.9750 0.3938 0.9750 0.9874
No log 2.5106 118 1.0305 0.4516 1.0305 1.0151
No log 2.5532 120 0.9758 0.3596 0.9758 0.9878
No log 2.5957 122 1.0048 0.4582 1.0048 1.0024
No log 2.6383 124 1.0677 0.3850 1.0677 1.0333
No log 2.6809 126 0.9952 0.4084 0.9952 0.9976
No log 2.7234 128 0.9634 0.4746 0.9634 0.9815
No log 2.7660 130 0.9927 0.3960 0.9927 0.9964
No log 2.8085 132 1.0217 0.3544 1.0217 1.0108
No log 2.8511 134 1.0128 0.3728 1.0128 1.0064
No log 2.8936 136 0.9970 0.4385 0.9970 0.9985
No log 2.9362 138 1.0160 0.3833 1.0160 1.0080
No log 2.9787 140 1.0121 0.3961 1.0121 1.0060
No log 3.0213 142 1.0128 0.4233 1.0128 1.0064
No log 3.0638 144 1.0413 0.3908 1.0413 1.0205
No log 3.1064 146 1.0194 0.3590 1.0194 1.0097
No log 3.1489 148 1.0187 0.4053 1.0187 1.0093
No log 3.1915 150 1.0415 0.2969 1.0415 1.0205
No log 3.2340 152 1.0455 0.3050 1.0455 1.0225
No log 3.2766 154 1.0094 0.3998 1.0094 1.0047
No log 3.3191 156 1.0209 0.4304 1.0209 1.0104
No log 3.3617 158 1.0673 0.5029 1.0673 1.0331
No log 3.4043 160 1.0393 0.4546 1.0393 1.0195
No log 3.4468 162 1.0719 0.4286 1.0719 1.0353
No log 3.4894 164 1.0844 0.4080 1.0844 1.0413
No log 3.5319 166 1.1426 0.3966 1.1426 1.0689
No log 3.5745 168 0.9936 0.4146 0.9936 0.9968
No log 3.6170 170 0.9810 0.4435 0.9810 0.9905
No log 3.6596 172 1.0874 0.4233 1.0874 1.0428
No log 3.7021 174 1.2268 0.4251 1.2268 1.1076
No log 3.7447 176 1.1650 0.4062 1.1650 1.0794
No log 3.7872 178 1.1476 0.2728 1.1476 1.0713
No log 3.8298 180 1.3088 0.3410 1.3088 1.1440
No log 3.8723 182 1.2345 0.2851 1.2345 1.1111
No log 3.9149 184 1.2082 0.3125 1.2082 1.0992
No log 3.9574 186 1.2015 0.2647 1.2015 1.0961
No log 4.0 188 1.2311 0.2403 1.2311 1.1096
No log 4.0426 190 1.2074 0.2514 1.2074 1.0988
No log 4.0851 192 1.2136 0.3460 1.2136 1.1016
No log 4.1277 194 1.2702 0.3000 1.2702 1.1270
No log 4.1702 196 1.1295 0.4503 1.1295 1.0628
No log 4.2128 198 1.0513 0.4032 1.0513 1.0253
No log 4.2553 200 1.0389 0.2759 1.0389 1.0193
No log 4.2979 202 1.0161 0.3436 1.0161 1.0080
No log 4.3404 204 0.9792 0.3935 0.9792 0.9895
No log 4.3830 206 0.9587 0.4474 0.9587 0.9791
No log 4.4255 208 0.9676 0.3081 0.9676 0.9837
No log 4.4681 210 0.9403 0.3243 0.9403 0.9697
No log 4.5106 212 0.9305 0.4746 0.9305 0.9646
No log 4.5532 214 0.9286 0.4196 0.9286 0.9636
No log 4.5957 216 0.9092 0.4626 0.9092 0.9535
No log 4.6383 218 0.9292 0.4366 0.9292 0.9639
No log 4.6809 220 1.0650 0.4677 1.0650 1.0320
No log 4.7234 222 1.0483 0.3836 1.0483 1.0239
No log 4.7660 224 0.9540 0.3868 0.9540 0.9767
No log 4.8085 226 0.9212 0.3796 0.9212 0.9598
No log 4.8511 228 0.9229 0.3337 0.9229 0.9607
No log 4.8936 230 0.9541 0.3290 0.9541 0.9768
No log 4.9362 232 0.9711 0.3094 0.9711 0.9854
No log 4.9787 234 0.9449 0.3290 0.9449 0.9721
No log 5.0213 236 0.9448 0.3337 0.9448 0.9720
No log 5.0638 238 0.9545 0.3108 0.9545 0.9770
No log 5.1064 240 0.9610 0.3090 0.9610 0.9803
No log 5.1489 242 1.0240 0.3966 1.0240 1.0119
No log 5.1915 244 0.9936 0.3724 0.9936 0.9968
No log 5.2340 246 0.9853 0.3108 0.9853 0.9926
No log 5.2766 248 0.9760 0.3472 0.9760 0.9879
No log 5.3191 250 1.0198 0.4579 1.0198 1.0098
No log 5.3617 252 0.9965 0.4145 0.9965 0.9983
No log 5.4043 254 0.9631 0.3174 0.9631 0.9814
No log 5.4468 256 1.0175 0.3532 1.0175 1.0087
No log 5.4894 258 0.9924 0.3614 0.9924 0.9962
No log 5.5319 260 0.9456 0.3525 0.9456 0.9724
No log 5.5745 262 0.9582 0.3724 0.9582 0.9789
No log 5.6170 264 0.9379 0.3724 0.9379 0.9685
No log 5.6596 266 0.9141 0.4084 0.9141 0.9561
No log 5.7021 268 0.8953 0.4004 0.8953 0.9462
No log 5.7447 270 0.8896 0.4138 0.8896 0.9432
No log 5.7872 272 0.8912 0.4965 0.8912 0.9440
No log 5.8298 274 0.9515 0.4477 0.9515 0.9755
No log 5.8723 276 1.0620 0.3151 1.0620 1.0305
No log 5.9149 278 1.1349 0.2958 1.1349 1.0653
No log 5.9574 280 1.0495 0.3119 1.0495 1.0244
No log 6.0 282 0.9618 0.3728 0.9618 0.9807
No log 6.0426 284 0.9813 0.3328 0.9813 0.9906
No log 6.0851 286 1.0063 0.3728 1.0063 1.0031
No log 6.1277 288 1.0605 0.2865 1.0605 1.0298
No log 6.1702 290 1.0486 0.3884 1.0486 1.0240
No log 6.2128 292 1.0270 0.3728 1.0270 1.0134
No log 6.2553 294 1.0537 0.3365 1.0537 1.0265
No log 6.2979 296 1.0899 0.2984 1.0899 1.0440
No log 6.3404 298 1.0007 0.3798 1.0007 1.0003
No log 6.3830 300 0.9685 0.3525 0.9685 0.9841
No log 6.4255 302 1.0883 0.2590 1.0883 1.0432
No log 6.4681 304 1.1411 0.3197 1.1411 1.0682
No log 6.5106 306 1.0022 0.3530 1.0022 1.0011
No log 6.5532 308 0.8851 0.4081 0.8851 0.9408
No log 6.5957 310 0.9013 0.4159 0.9013 0.9493
No log 6.6383 312 0.9013 0.4081 0.9013 0.9494
No log 6.6809 314 0.9667 0.3535 0.9667 0.9832
No log 6.7234 316 1.0521 0.2621 1.0521 1.0257
No log 6.7660 318 1.0349 0.2813 1.0349 1.0173
No log 6.8085 320 1.0566 0.3157 1.0566 1.0279
No log 6.8511 322 1.1358 0.3262 1.1358 1.0657
No log 6.8936 324 1.1170 0.3409 1.1170 1.0569
No log 6.9362 326 1.0586 0.2864 1.0586 1.0289
No log 6.9787 328 1.0410 0.3271 1.0410 1.0203
No log 7.0213 330 1.0320 0.2965 1.0320 1.0159
No log 7.0638 332 1.0221 0.3086 1.0221 1.0110
No log 7.1064 334 1.0111 0.4079 1.0111 1.0056
No log 7.1489 336 1.0076 0.3715 1.0076 1.0038
No log 7.1915 338 0.9910 0.3819 0.9910 0.9955
No log 7.2340 340 0.9663 0.4181 0.9663 0.9830
No log 7.2766 342 0.9660 0.3522 0.9660 0.9828
No log 7.3191 344 0.9250 0.4879 0.9250 0.9617
No log 7.3617 346 0.8838 0.4359 0.8838 0.9401
No log 7.4043 348 0.8791 0.3472 0.8791 0.9376
No log 7.4468 350 0.8777 0.4555 0.8777 0.9369
No log 7.4894 352 0.9044 0.4620 0.9044 0.9510
No log 7.5319 354 0.9123 0.4611 0.9123 0.9552
No log 7.5745 356 0.9339 0.3880 0.9339 0.9664
No log 7.6170 358 1.0084 0.3154 1.0084 1.0042
No log 7.6596 360 0.9489 0.4405 0.9489 0.9741
No log 7.7021 362 0.9001 0.3996 0.9001 0.9487
No log 7.7447 364 1.0527 0.3782 1.0527 1.0260
No log 7.7872 366 1.0485 0.3782 1.0485 1.0239
No log 7.8298 368 0.9314 0.3996 0.9314 0.9651
No log 7.8723 370 0.9201 0.3421 0.9201 0.9592
No log 7.9149 372 0.9135 0.3728 0.9135 0.9557
No log 7.9574 374 0.9592 0.4450 0.9592 0.9794
No log 8.0 376 0.9563 0.4115 0.9563 0.9779
No log 8.0426 378 0.8871 0.4016 0.8871 0.9419
No log 8.0851 380 0.9254 0.3621 0.9254 0.9620
No log 8.1277 382 0.9975 0.3864 0.9975 0.9987
No log 8.1702 384 0.9820 0.4386 0.9820 0.9909
No log 8.2128 386 0.9066 0.3380 0.9066 0.9522
No log 8.2553 388 0.8911 0.4527 0.8911 0.9440
No log 8.2979 390 0.9078 0.4235 0.9078 0.9528
No log 8.3404 392 0.9014 0.3979 0.9014 0.9494
No log 8.3830 394 0.9402 0.3237 0.9402 0.9697
No log 8.4255 396 0.9647 0.3777 0.9647 0.9822
No log 8.4681 398 0.9024 0.4142 0.9024 0.9500
No log 8.5106 400 0.8656 0.4646 0.8656 0.9304
No log 8.5532 402 0.8747 0.4563 0.8747 0.9352
No log 8.5957 404 0.8875 0.4429 0.8875 0.9421
No log 8.6383 406 0.9024 0.4397 0.9024 0.9500
No log 8.6809 408 0.9119 0.4119 0.9119 0.9549
No log 8.7234 410 0.9247 0.4119 0.9247 0.9616
No log 8.7660 412 0.9588 0.3408 0.9588 0.9792
No log 8.8085 414 1.0353 0.2233 1.0353 1.0175
No log 8.8511 416 1.1303 0.1896 1.1303 1.0632
No log 8.8936 418 1.1172 0.2439 1.1172 1.0570
No log 8.9362 420 1.0152 0.3243 1.0152 1.0076
No log 8.9787 422 0.9568 0.3090 0.9568 0.9782
No log 9.0213 424 0.9590 0.3771 0.9590 0.9793
No log 9.0638 426 0.9528 0.3771 0.9528 0.9761
No log 9.1064 428 0.9725 0.3819 0.9725 0.9862
No log 9.1489 430 0.9967 0.3449 0.9967 0.9984
No log 9.1915 432 1.0234 0.3165 1.0234 1.0116
No log 9.2340 434 1.0156 0.3249 1.0156 1.0078
No log 9.2766 436 0.9665 0.3482 0.9665 0.9831
No log 9.3191 438 0.9527 0.3987 0.9527 0.9761
No log 9.3617 440 0.9565 0.3920 0.9565 0.9780
No log 9.4043 442 0.9587 0.4570 0.9587 0.9792
No log 9.4468 444 0.9393 0.4294 0.9393 0.9692
No log 9.4894 446 0.9155 0.4388 0.9155 0.9568
No log 9.5319 448 0.9139 0.3987 0.9139 0.9560
No log 9.5745 450 0.9193 0.4084 0.9193 0.9588
No log 9.6170 452 0.9239 0.4521 0.9239 0.9612
No log 9.6596 454 0.9928 0.3602 0.9928 0.9964
No log 9.7021 456 1.0517 0.3218 1.0517 1.0255
No log 9.7447 458 1.0206 0.3548 1.0206 1.0103
No log 9.7872 460 0.9278 0.4696 0.9278 0.9632
No log 9.8298 462 0.9051 0.4313 0.9051 0.9514
No log 9.8723 464 0.9253 0.4273 0.9253 0.9619
No log 9.9149 466 0.9060 0.4181 0.9060 0.9519
No log 9.9574 468 0.9226 0.4794 0.9226 0.9605
No log 10.0 470 0.9506 0.4741 0.9506 0.9750
No log 10.0426 472 0.9352 0.4393 0.9352 0.9670
No log 10.0851 474 0.9377 0.4218 0.9377 0.9684
No log 10.1277 476 0.9348 0.4218 0.9348 0.9669
No log 10.1702 478 0.9387 0.4218 0.9387 0.9689
No log 10.2128 480 0.9448 0.4084 0.9448 0.9720
No log 10.2553 482 0.9508 0.4393 0.9508 0.9751
No log 10.2979 484 0.9548 0.4434 0.9548 0.9771
No log 10.3404 486 0.9490 0.3511 0.9490 0.9741
No log 10.3830 488 0.9407 0.3608 0.9407 0.9699
No log 10.4255 490 0.9312 0.3608 0.9312 0.9650
No log 10.4681 492 0.9125 0.4084 0.9125 0.9552
No log 10.5106 494 0.8901 0.4352 0.8901 0.9435
No log 10.5532 496 0.8825 0.4346 0.8825 0.9394
No log 10.5957 498 0.8902 0.4527 0.8902 0.9435
0.3308 10.6383 500 0.8867 0.4540 0.8867 0.9416
0.3308 10.6809 502 0.8795 0.4429 0.8795 0.9378
0.3308 10.7234 504 0.8847 0.4681 0.8847 0.9406
0.3308 10.7660 506 0.8818 0.4626 0.8818 0.9390
0.3308 10.8085 508 0.8752 0.4626 0.8752 0.9355
0.3308 10.8511 510 0.8616 0.4393 0.8616 0.9283
0.3308 10.8936 512 0.8578 0.4450 0.8578 0.9262
0.3308 10.9362 514 0.8730 0.4409 0.8730 0.9343
0.3308 10.9787 516 0.8715 0.4181 0.8715 0.9335
0.3308 11.0213 518 0.9002 0.4144 0.9002 0.9488
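The Mse and Rmse columns are redundant by construction, since RMSE is the square root of MSE. A quick sanity check on the final evaluation row above:

```python
import math

final_mse = 0.9002   # final validation MSE from the table above
final_rmse = 0.9488  # final validation RMSE from the table above

# the two reported values should agree to the table's four-decimal rounding
assert abs(math.sqrt(final_mse) - final_rmse) < 5e-4
```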

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1