ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k2_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a short sketch of how these metrics can be computed follows the list):

  • Loss: 0.9995
  • Qwk: 0.6029
  • Mse: 0.9995
  • Rmse: 0.9997
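
These metrics can be reproduced with standard tooling. The snippet below is a minimal sketch (not taken from the card) that computes Qwk, Mse, and Rmse with scikit-learn on hypothetical gold scores and predictions; Qwk is Cohen's kappa with quadratic weights, and Rmse is the square root of Mse.

```python
# Minimal sketch of the reported metrics on hypothetical data (not from the card).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3, 2])  # hypothetical gold organization scores
y_pred = np.array([3, 3, 4, 2, 2, 2])  # hypothetical model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = float(np.sqrt(mse))                                    # Rmse
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```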

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
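
These settings map onto Hugging Face TrainingArguments. The sketch below is a hedged reconstruction rather than the card's actual script: output_dir is a placeholder, and the evaluation/logging cadence (validation every 2 steps, training loss logged every 500 steps) is inferred from the results table rather than stated in the card.

```python
# Hedged reconstruction of the listed hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task1-organization",  # placeholder, not from the card
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    eval_strategy="steps",  # inferred: the results table reports validation metrics every 2 steps
    eval_steps=2,
    logging_steps=500,      # inferred: training loss first appears at step 500
)
```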

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1333 2 6.9762 0.0116 6.9762 2.6412
No log 0.2667 4 4.5336 0.0562 4.5336 2.1292
No log 0.4 6 3.0297 0.0988 3.0297 1.7406
No log 0.5333 8 2.2848 0.0671 2.2848 1.5116
No log 0.6667 10 2.1711 0.1504 2.1711 1.4735
No log 0.8 12 1.7811 0.1308 1.7811 1.3346
No log 0.9333 14 1.6031 0.1714 1.6031 1.2662
No log 1.0667 16 1.5111 0.2545 1.5111 1.2293
No log 1.2 18 1.4676 0.2202 1.4676 1.2115
No log 1.3333 20 1.3815 0.2593 1.3815 1.1754
No log 1.4667 22 1.4641 0.3091 1.4641 1.2100
No log 1.6 24 1.4323 0.3793 1.4323 1.1968
No log 1.7333 26 1.3071 0.4310 1.3071 1.1433
No log 1.8667 28 1.3527 0.3604 1.3527 1.1631
No log 2.0 30 1.2310 0.4655 1.2310 1.1095
No log 2.1333 32 1.1542 0.4915 1.1542 1.0743
No log 2.2667 34 1.1464 0.5691 1.1464 1.0707
No log 2.4 36 1.1208 0.5920 1.1208 1.0587
No log 2.5333 38 1.0833 0.5984 1.0833 1.0408
No log 2.6667 40 1.1861 0.5736 1.1861 1.0891
No log 2.8 42 1.3626 0.4275 1.3626 1.1673
No log 2.9333 44 1.4226 0.4179 1.4226 1.1927
No log 3.0667 46 1.4246 0.4094 1.4246 1.1936
No log 3.2 48 1.4806 0.3548 1.4806 1.2168
No log 3.3333 50 1.3050 0.5512 1.3050 1.1424
No log 3.4667 52 1.1071 0.5802 1.1071 1.0522
No log 3.6 54 1.0959 0.5079 1.0959 1.0468
No log 3.7333 56 1.0586 0.5385 1.0586 1.0289
No log 3.8667 58 1.0955 0.5303 1.0955 1.0467
No log 4.0 60 1.0429 0.6061 1.0429 1.0212
No log 4.1333 62 1.0570 0.5625 1.0570 1.0281
No log 4.2667 64 1.1941 0.5496 1.1941 1.0927
No log 4.4 66 1.2302 0.4923 1.2302 1.1092
No log 4.5333 68 1.2277 0.48 1.2277 1.1080
No log 4.6667 70 1.1289 0.5581 1.1289 1.0625
No log 4.8 72 1.0033 0.6107 1.0033 1.0016
No log 4.9333 74 0.9515 0.6418 0.9515 0.9754
No log 5.0667 76 1.0305 0.6269 1.0305 1.0151
No log 5.2 78 1.2575 0.5255 1.2575 1.1214
No log 5.3333 80 1.3976 0.4317 1.3976 1.1822
No log 5.4667 82 1.3370 0.5 1.3370 1.1563
No log 5.6 84 1.1708 0.5867 1.1708 1.0820
No log 5.7333 86 1.1220 0.6194 1.1220 1.0593
No log 5.8667 88 1.1572 0.6391 1.1572 1.0757
No log 6.0 90 1.1967 0.6587 1.1967 1.0939
No log 6.1333 92 1.0404 0.5833 1.0404 1.0200
No log 6.2667 94 0.8309 0.6861 0.8309 0.9116
No log 6.4 96 0.9682 0.6260 0.9682 0.9840
No log 6.5333 98 1.0802 0.5469 1.0802 1.0393
No log 6.6667 100 1.0284 0.6165 1.0284 1.0141
No log 6.8 102 1.2650 0.5850 1.2650 1.1247
No log 6.9333 104 1.2493 0.5946 1.2493 1.1177
No log 7.0667 106 1.0268 0.5882 1.0268 1.0133
No log 7.2 108 1.0016 0.6154 1.0016 1.0008
No log 7.3333 110 1.1036 0.5532 1.1036 1.0505
No log 7.4667 112 1.2791 0.5037 1.2791 1.1310
No log 7.6 114 1.4619 0.52 1.4619 1.2091
No log 7.7333 116 1.5598 0.4474 1.5598 1.2489
No log 7.8667 118 1.3194 0.5180 1.3194 1.1487
No log 8.0 120 1.1818 0.5630 1.1818 1.0871
No log 8.1333 122 1.1127 0.5942 1.1127 1.0548
No log 8.2667 124 1.0250 0.6131 1.0250 1.0124
No log 8.4 126 0.9507 0.6269 0.9507 0.9750
No log 8.5333 128 0.9369 0.6370 0.9369 0.9679
No log 8.6667 130 0.9462 0.6519 0.9462 0.9727
No log 8.8 132 0.9783 0.6389 0.9783 0.9891
No log 8.9333 134 1.2009 0.5890 1.2009 1.0958
No log 9.0667 136 1.2988 0.5912 1.2988 1.1397
No log 9.2 138 1.1619 0.5772 1.1619 1.0779
No log 9.3333 140 0.9321 0.6575 0.9321 0.9655
No log 9.4667 142 0.8416 0.6522 0.8416 0.9174
No log 9.6 144 0.7632 0.6809 0.7632 0.8736
No log 9.7333 146 0.7995 0.6901 0.7995 0.8941
No log 9.8667 148 0.9441 0.6277 0.9441 0.9716
No log 10.0 150 1.0413 0.5455 1.0413 1.0204
No log 10.1333 152 1.0956 0.5692 1.0956 1.0467
No log 10.2667 154 1.1166 0.5649 1.1166 1.0567
No log 10.4 156 1.2046 0.5755 1.2046 1.0975
No log 10.5333 158 1.1852 0.5915 1.1852 1.0887
No log 10.6667 160 1.0701 0.5974 1.0701 1.0345
No log 10.8 162 0.9173 0.6323 0.9173 0.9578
No log 10.9333 164 0.7214 0.7347 0.7214 0.8494
No log 11.0667 166 0.6953 0.7534 0.6953 0.8338
No log 11.2 168 0.7027 0.7413 0.7027 0.8383
No log 11.3333 170 0.7657 0.6901 0.7657 0.8750
No log 11.4667 172 0.9875 0.6438 0.9875 0.9937
No log 11.6 174 1.0440 0.6027 1.0440 1.0218
No log 11.7333 176 0.9659 0.6131 0.9659 0.9828
No log 11.8667 178 0.9263 0.6522 0.9263 0.9625
No log 12.0 180 0.9092 0.6423 0.9092 0.9535
No log 12.1333 182 0.9064 0.6277 0.9064 0.9520
No log 12.2667 184 0.9109 0.6176 0.9109 0.9544
No log 12.4 186 0.9422 0.6029 0.9422 0.9707
No log 12.5333 188 1.0177 0.6241 1.0177 1.0088
No log 12.6667 190 1.1664 0.6154 1.1664 1.0800
No log 12.8 192 1.2425 0.5972 1.2425 1.1147
No log 12.9333 194 1.0991 0.6 1.0991 1.0484
No log 13.0667 196 0.9600 0.6029 0.9600 0.9798
No log 13.2 198 0.8865 0.6087 0.8865 0.9415
No log 13.3333 200 0.8820 0.6277 0.8820 0.9392
No log 13.4667 202 0.9883 0.6143 0.9883 0.9941
No log 13.6 204 1.0696 0.6099 1.0696 1.0342
No log 13.7333 206 1.0808 0.6143 1.0808 1.0396
No log 13.8667 208 1.1522 0.6 1.1522 1.0734
No log 14.0 210 1.1571 0.5839 1.1571 1.0757
No log 14.1333 212 1.1449 0.6087 1.1449 1.0700
No log 14.2667 214 1.1036 0.6131 1.1036 1.0505
No log 14.4 216 0.9916 0.5985 0.9916 0.9958
No log 14.5333 218 0.9330 0.5926 0.9330 0.9659
No log 14.6667 220 0.9423 0.6187 0.9423 0.9707
No log 14.8 222 0.8956 0.6176 0.8956 0.9463
No log 14.9333 224 0.9036 0.6715 0.9036 0.9506
No log 15.0667 226 0.9851 0.6074 0.9851 0.9925
No log 15.2 228 1.0505 0.6043 1.0505 1.0249
No log 15.3333 230 0.9727 0.6119 0.9727 0.9862
No log 15.4667 232 0.9387 0.6222 0.9387 0.9688
No log 15.6 234 0.9847 0.6483 0.9847 0.9923
No log 15.7333 236 1.1289 0.6415 1.1289 1.0625
No log 15.8667 238 1.1370 0.6329 1.1370 1.0663
No log 16.0 240 1.1060 0.6164 1.1060 1.0517
No log 16.1333 242 1.1443 0.6241 1.1443 1.0697
No log 16.2667 244 1.1473 0.6241 1.1473 1.0711
No log 16.4 246 1.1726 0.6338 1.1726 1.0829
No log 16.5333 248 1.1472 0.6099 1.1472 1.0711
No log 16.6667 250 1.0388 0.6029 1.0388 1.0192
No log 16.8 252 0.9390 0.6331 0.9390 0.9690
No log 16.9333 254 0.9287 0.6434 0.9287 0.9637
No log 17.0667 256 0.9144 0.6957 0.9144 0.9563
No log 17.2 258 0.9830 0.6626 0.9830 0.9915
No log 17.3333 260 1.1315 0.6627 1.1315 1.0637
No log 17.4667 262 1.0734 0.6626 1.0734 1.0360
No log 17.6 264 0.9525 0.7190 0.9525 0.9759
No log 17.7333 266 0.9009 0.6763 0.9009 0.9492
No log 17.8667 268 0.9482 0.6466 0.9482 0.9738
No log 18.0 270 1.0518 0.6061 1.0518 1.0256
No log 18.1333 272 1.0363 0.6061 1.0363 1.0180
No log 18.2667 274 0.9828 0.6260 0.9828 0.9914
No log 18.4 276 0.9671 0.6212 0.9671 0.9834
No log 18.5333 278 0.9856 0.6222 0.9856 0.9927
No log 18.6667 280 0.9763 0.6620 0.9763 0.9881
No log 18.8 282 1.0212 0.6620 1.0212 1.0105
No log 18.9333 284 1.0155 0.6620 1.0155 1.0077
No log 19.0667 286 1.0708 0.6667 1.0708 1.0348
No log 19.2 288 1.1283 0.6582 1.1283 1.0622
No log 19.3333 290 1.1923 0.6503 1.1923 1.0919
No log 19.4667 292 1.0601 0.6667 1.0601 1.0296
No log 19.6 294 1.0263 0.6795 1.0263 1.0131
No log 19.7333 296 1.0832 0.6624 1.0832 1.0408
No log 19.8667 298 1.0412 0.6790 1.0412 1.0204
No log 20.0 300 0.9402 0.6667 0.9402 0.9696
No log 20.1333 302 0.8753 0.6806 0.8753 0.9355
No log 20.2667 304 0.8673 0.6667 0.8673 0.9313
No log 20.4 306 0.9088 0.6667 0.9088 0.9533
No log 20.5333 308 0.9834 0.6797 0.9834 0.9917
No log 20.6667 310 1.0086 0.6582 1.0086 1.0043
No log 20.8 312 0.9966 0.6585 0.9966 0.9983
No log 20.9333 314 1.0463 0.6707 1.0463 1.0229
No log 21.0667 316 1.0816 0.6667 1.0816 1.0400
No log 21.2 318 1.1098 0.6242 1.1098 1.0535
No log 21.3333 320 1.0380 0.6154 1.0380 1.0188
No log 21.4667 322 0.9753 0.6176 0.9753 0.9876
No log 21.6 324 1.0447 0.6286 1.0447 1.0221
No log 21.7333 326 1.1366 0.6338 1.1366 1.0661
No log 21.8667 328 1.0398 0.6143 1.0398 1.0197
No log 22.0 330 1.0080 0.6143 1.0080 1.0040
No log 22.1333 332 0.9419 0.6429 0.9419 0.9705
No log 22.2667 334 0.8671 0.6573 0.8671 0.9312
No log 22.4 336 0.9215 0.6803 0.9215 0.9600
No log 22.5333 338 1.0440 0.6225 1.0440 1.0217
No log 22.6667 340 1.0417 0.6575 1.0417 1.0206
No log 22.8 342 0.9889 0.6286 0.9889 0.9944
No log 22.9333 344 0.9988 0.6571 0.9988 0.9994
No log 23.0667 346 0.9783 0.6619 0.9783 0.9891
No log 23.2 348 0.9708 0.6434 0.9708 0.9853
No log 23.3333 350 1.0487 0.6316 1.0487 1.0241
No log 23.4667 352 1.0396 0.6301 1.0396 1.0196
No log 23.6 354 0.9583 0.6043 0.9583 0.9789
No log 23.7333 356 0.9187 0.6324 0.9187 0.9585
No log 23.8667 358 0.9547 0.6043 0.9547 0.9771
No log 24.0 360 1.1132 0.6443 1.1132 1.0551
No log 24.1333 362 1.1777 0.6093 1.1777 1.0852
No log 24.2667 364 1.1012 0.6131 1.1012 1.0494
No log 24.4 366 0.9899 0.6377 0.9899 0.9949
No log 24.5333 368 0.9070 0.6324 0.9070 0.9524
No log 24.6667 370 0.9125 0.6569 0.9125 0.9553
No log 24.8 372 0.9867 0.6423 0.9867 0.9933
No log 24.9333 374 1.0075 0.6423 1.0075 1.0037
No log 25.0667 376 0.9666 0.6522 0.9666 0.9831
No log 25.2 378 0.9033 0.6466 0.9033 0.9504
No log 25.3333 380 0.9248 0.6763 0.9248 0.9617
No log 25.4667 382 1.0007 0.6301 1.0007 1.0004
No log 25.6 384 0.9922 0.6351 0.9922 0.9961
No log 25.7333 386 0.9648 0.6443 0.9648 0.9823
No log 25.8667 388 0.8456 0.6906 0.8456 0.9195
No log 26.0 390 0.8162 0.7042 0.8162 0.9035
No log 26.1333 392 0.8276 0.6950 0.8276 0.9097
No log 26.2667 394 0.9064 0.7034 0.9064 0.9521
No log 26.4 396 0.9804 0.6525 0.9804 0.9902
No log 26.5333 398 1.0055 0.6377 1.0055 1.0027
No log 26.6667 400 0.9419 0.6567 0.9419 0.9705
No log 26.8 402 0.9435 0.6767 0.9435 0.9714
No log 26.9333 404 1.0097 0.6212 1.0097 1.0049
No log 27.0667 406 1.0988 0.5942 1.0988 1.0482
No log 27.2 408 1.1606 0.5946 1.1606 1.0773
No log 27.3333 410 1.0539 0.6216 1.0539 1.0266
No log 27.4667 412 0.8727 0.6667 0.8727 0.9342
No log 27.6 414 0.7833 0.7050 0.7833 0.8850
No log 27.7333 416 0.7628 0.7050 0.7628 0.8734
No log 27.8667 418 0.7717 0.6714 0.7717 0.8785
No log 28.0 420 0.8107 0.6712 0.8107 0.9004
No log 28.1333 422 0.8331 0.6531 0.8331 0.9127
No log 28.2667 424 0.8927 0.6438 0.8927 0.9448
No log 28.4 426 1.0182 0.6133 1.0182 1.0091
No log 28.5333 428 1.0942 0.6358 1.0942 1.0460
No log 28.6667 430 1.0785 0.5899 1.0785 1.0385
No log 28.8 432 1.0396 0.5899 1.0396 1.0196
No log 28.9333 434 0.9691 0.6370 0.9691 0.9844
No log 29.0667 436 0.9503 0.6618 0.9503 0.9748
No log 29.2 438 0.9568 0.6519 0.9568 0.9781
No log 29.3333 440 1.0303 0.6154 1.0303 1.0150
No log 29.4667 442 1.0985 0.6301 1.0985 1.0481
No log 29.6 444 1.0627 0.625 1.0627 1.0309
No log 29.7333 446 1.0104 0.6277 1.0104 1.0052
No log 29.8667 448 0.9952 0.6519 0.9952 0.9976
No log 30.0 450 0.9886 0.6324 0.9886 0.9943
No log 30.1333 452 0.9949 0.6423 0.9949 0.9974
No log 30.2667 454 1.0294 0.6259 1.0294 1.0146
No log 30.4 456 0.9364 0.6624 0.9364 0.9677
No log 30.5333 458 0.8712 0.6909 0.8712 0.9334
No log 30.6667 460 0.7814 0.7273 0.7814 0.8840
No log 30.8 462 0.7434 0.7226 0.7434 0.8622
No log 30.9333 464 0.7723 0.7226 0.7723 0.8788
No log 31.0667 466 0.8276 0.6620 0.8276 0.9097
No log 31.2 468 0.8275 0.6667 0.8275 0.9096
No log 31.3333 470 0.8978 0.6316 0.8978 0.9475
No log 31.4667 472 0.9547 0.6316 0.9547 0.9771
No log 31.6 474 1.0380 0.6131 1.0380 1.0188
No log 31.7333 476 1.1674 0.6029 1.1674 1.0805
No log 31.8667 478 1.2389 0.5507 1.2389 1.1131
No log 32.0 480 1.1844 0.5630 1.1844 1.0883
No log 32.1333 482 1.1086 0.6061 1.1086 1.0529
No log 32.2667 484 1.0927 0.5938 1.0927 1.0453
No log 32.4 486 1.0898 0.5669 1.0898 1.0439
No log 32.5333 488 1.1151 0.5538 1.1151 1.0560
No log 32.6667 490 1.1816 0.5481 1.1816 1.0870
No log 32.8 492 1.2220 0.5734 1.2220 1.1055
No log 32.9333 494 1.2775 0.5960 1.2775 1.1303
No log 33.0667 496 1.1949 0.6040 1.1949 1.0931
No log 33.2 498 1.0213 0.6277 1.0213 1.0106
0.3115 33.3333 500 0.9042 0.6316 0.9042 0.9509
0.3115 33.4667 502 0.8744 0.6519 0.8744 0.9351
0.3115 33.6 504 0.8831 0.6418 0.8831 0.9397
0.3115 33.7333 506 0.9224 0.6377 0.9224 0.9604
0.3115 33.8667 508 0.9732 0.6443 0.9732 0.9865
0.3115 34.0 510 1.0563 0.6582 1.0563 1.0278
0.3115 34.1333 512 1.0273 0.6790 1.0273 1.0136
0.3115 34.2667 514 0.9763 0.6533 0.9763 0.9881
0.3115 34.4 516 0.8881 0.6974 0.8881 0.9424
0.3115 34.5333 518 0.8023 0.7034 0.8023 0.8957
0.3115 34.6667 520 0.7705 0.7075 0.7705 0.8778
0.3115 34.8 522 0.7883 0.7034 0.7883 0.8879
0.3115 34.9333 524 0.8400 0.7083 0.8400 0.9165
0.3115 35.0667 526 0.9255 0.6533 0.9255 0.9620
0.3115 35.2 528 0.9700 0.6443 0.9700 0.9849
0.3115 35.3333 530 0.9377 0.6533 0.9377 0.9683
0.3115 35.4667 532 0.8611 0.6571 0.8611 0.9280
0.3115 35.6 534 0.8192 0.6571 0.8192 0.9051
0.3115 35.7333 536 0.7956 0.6853 0.7956 0.8920
0.3115 35.8667 538 0.8034 0.6853 0.8034 0.8963
0.3115 36.0 540 0.8044 0.6974 0.8044 0.8969
0.3115 36.1333 542 0.7973 0.7089 0.7973 0.8929
0.3115 36.2667 544 0.7906 0.6883 0.7906 0.8892
0.3115 36.4 546 0.7587 0.6986 0.7587 0.8711
0.3115 36.5333 548 0.7762 0.7222 0.7762 0.8810
0.3115 36.6667 550 0.8119 0.6901 0.8119 0.9011
0.3115 36.8 552 0.8977 0.6475 0.8977 0.9475
0.3115 36.9333 554 1.0032 0.6187 1.0032 1.0016
0.3115 37.0667 556 1.0809 0.5942 1.0809 1.0397
0.3115 37.2 558 1.0841 0.5942 1.0841 1.0412
0.3115 37.3333 560 1.0333 0.6029 1.0333 1.0165
0.3115 37.4667 562 0.9995 0.6029 0.9995 0.9997
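
For completeness, here is a hedged sketch of a compute_metrics callback that could produce the per-step Qwk/Mse/Rmse columns above when passed to a transformers Trainer. The card does not include the training script, so the single regression logit per essay and the rounding to integer scores for Qwk are assumptions, and the model, arguments, and dataset names in the commented wiring are placeholders.

```python
# Hedged sketch of a metrics callback matching the table's columns (assumptions noted above).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    preds = np.squeeze(predictions)            # assumed single regression logit per essay
    mse = mean_squared_error(labels, preds)
    return {
        "qwk": cohen_kappa_score(labels.astype(int), np.rint(preds).astype(int),
                                 weights="quadratic"),
        "mse": mse,
        "rmse": float(np.sqrt(mse)),
    }

# Hypothetical wiring; model, training_args, and the datasets are placeholders.
# trainer = Trainer(model=model, args=training_args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```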

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
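
Since the card does not document intended uses, the snippet below is only a minimal loading sketch using the library versions listed above. It assumes the checkpoint exposes a standard sequence-classification (or single-logit regression) head, which the card does not confirm; the example sentence is a placeholder.

```python
# Minimal, hedged inference sketch; the task head and score scale are assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k2_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "هذا مثال قصير لمقال عربي."  # placeholder Arabic input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # interpretation depends on the (undocumented) label/score setup
```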