ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1214
  • QWK: 0.6143
  • MSE: 1.1214
  • RMSE: 1.0590
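
These metrics suit an ordinal essay-scoring task: QWK (quadratic weighted kappa) rewards predictions that land close to the true score, while MSE/RMSE measure squared error on the score scale. The card does not include the evaluation code, so the following is a minimal pure-Python sketch of how these three numbers are typically computed, on illustrative data:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Observed agreement (confusion) matrix.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Quadratic penalty weights: disagreements far apart cost more.
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Expected matrix under chance, from the marginal histograms.
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative scores, not from this model's eval set.
y_true = [0, 1, 2, 3, 3, 2]
y_pred = [0, 1, 2, 2, 3, 2]
print(round(quadratic_weighted_kappa(y_true, y_pred, 4), 4))  # → 0.9189
print(round(mse(y_true, y_pred), 4))                          # → 0.1667
print(round(math.sqrt(mse(y_true, y_pred)), 4))               # → 0.4082
```

The actual training script may instead use sklearn's `cohen_kappa_score(..., weights="quadratic")`, which computes the same quantity.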

Model description

More information needed

Intended uses & limitations

More information needed
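
Since usage is not documented, here is a hedged inference sketch. Two assumptions the card does not confirm: the checkpoint exposes a standard sequence-classification head, and the predicted organization score is the argmax class index. The network-dependent part is gated behind an environment variable so the scoring helper can be inspected offline:

```python
import os

def argmax_score(logits):
    # Assumed mapping from one row of logits to a predicted score:
    # the class index with the highest logit. Not documented in the card.
    return max(range(len(logits)), key=lambda i: logits[i])

if os.environ.get("RUN_INFERENCE"):  # requires network, torch, transformers
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    model_id = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
                "FineTuningAraBERT_run1_AugV5_k19_task1_organization")
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # "the essay text here"
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    print("predicted organization score:", argmax_score(logits))
```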

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
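
With lr_scheduler_type linear and no warmup steps listed, the learning rate decays linearly from 2e-05 to 0 over the scheduled steps. A small sketch of that schedule (the total step count below is illustrative only; the real value depends on the dataset size):

```python
# Linear LR decay without warmup, matching learning_rate=2e-05 and
# lr_scheduler_type=linear from the hyperparameters above.
def linear_lr(step, total_steps, base_lr=2e-05):
    # base_lr at step 0, decaying linearly to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / total_steps)

total_steps = 1000  # hypothetical; actually steps_per_epoch * num_epochs
print(linear_lr(0, total_steps))     # 2e-05 at the start
print(linear_lr(500, total_steps))   # 1e-05 halfway through
print(linear_lr(1000, total_steps))  # 0.0 at the end
```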

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0141 2 6.7440 0.0242 6.7440 2.5969
No log 0.0282 4 5.2940 0.0220 5.2940 2.3009
No log 0.0423 6 3.3084 0.1290 3.3084 1.8189
No log 0.0563 8 2.2826 0.2286 2.2826 1.5108
No log 0.0704 10 1.8655 0.1250 1.8655 1.3658
No log 0.0845 12 1.7103 0.1132 1.7103 1.3078
No log 0.0986 14 1.6698 0.1308 1.6698 1.2922
No log 0.1127 16 1.6239 0.3471 1.6239 1.2743
No log 0.1268 18 1.6686 0.3200 1.6686 1.2917
No log 0.1408 20 1.9506 0.2609 1.9506 1.3966
No log 0.1549 22 1.3913 0.4375 1.3913 1.1795
No log 0.1690 24 1.6055 0.3817 1.6055 1.2671
No log 0.1831 26 1.9794 0.1077 1.9794 1.4069
No log 0.1972 28 1.7775 0.2258 1.7775 1.3332
No log 0.2113 30 1.3990 0.3793 1.3990 1.1828
No log 0.2254 32 1.3736 0.3423 1.3736 1.1720
No log 0.2394 34 1.3730 0.2617 1.3730 1.1718
No log 0.2535 36 1.3779 0.3273 1.3779 1.1739
No log 0.2676 38 1.4917 0.3652 1.4917 1.2214
No log 0.2817 40 1.6636 0.2689 1.6636 1.2898
No log 0.2958 42 1.7542 0.3125 1.7542 1.3245
No log 0.3099 44 1.3129 0.4769 1.3129 1.1458
No log 0.3239 46 1.0856 0.5692 1.0856 1.0419
No log 0.3380 48 0.8973 0.6618 0.8973 0.9473
No log 0.3521 50 0.8698 0.7000 0.8698 0.9326
No log 0.3662 52 0.9331 0.7190 0.9331 0.9659
No log 0.3803 54 0.8663 0.6933 0.8663 0.9307
No log 0.3944 56 1.1653 0.5077 1.1653 1.0795
No log 0.4085 58 1.9724 0.0672 1.9724 1.4044
No log 0.4225 60 1.9031 0.1406 1.9031 1.3795
No log 0.4366 62 1.3217 0.4769 1.3217 1.1497
No log 0.4507 64 1.0487 0.5692 1.0487 1.0241
No log 0.4648 66 1.2912 0.4923 1.2912 1.1363
No log 0.4789 68 1.0597 0.6142 1.0597 1.0294
No log 0.4930 70 1.0556 0.5827 1.0556 1.0274
No log 0.5070 72 1.1047 0.5344 1.1047 1.0510
No log 0.5211 74 1.0206 0.6212 1.0206 1.0102
No log 0.5352 76 0.8976 0.7059 0.8976 0.9474
No log 0.5493 78 0.9363 0.6618 0.9363 0.9676
No log 0.5634 80 1.2574 0.5224 1.2574 1.1213
No log 0.5775 82 1.0410 0.5672 1.0410 1.0203
No log 0.5915 84 0.8990 0.6806 0.8990 0.9481
No log 0.6056 86 1.1185 0.5850 1.1185 1.0576
No log 0.6197 88 0.8235 0.7260 0.8235 0.9075
No log 0.6338 90 0.8973 0.6331 0.8973 0.9473
No log 0.6479 92 1.2783 0.5147 1.2783 1.1306
No log 0.6620 94 1.2196 0.5401 1.2196 1.1044
No log 0.6761 96 0.8845 0.6074 0.8845 0.9405
No log 0.6901 98 0.8184 0.7353 0.8184 0.9046
No log 0.7042 100 0.8440 0.7164 0.8440 0.9187
No log 0.7183 102 0.8693 0.6324 0.8693 0.9324
No log 0.7324 104 0.8373 0.6324 0.8373 0.9150
No log 0.7465 106 0.9279 0.6074 0.9279 0.9633
No log 0.7606 108 1.1173 0.5714 1.1173 1.0570
No log 0.7746 110 1.3111 0.5175 1.3111 1.1451
No log 0.7887 112 1.3648 0.5000 1.3648 1.1682
No log 0.8028 114 1.2358 0.5811 1.2358 1.1117
No log 0.8169 116 0.8963 0.6309 0.8963 0.9467
No log 0.8310 118 0.8672 0.6316 0.8672 0.9313
No log 0.8451 120 1.0217 0.6225 1.0217 1.0108
No log 0.8592 122 1.0680 0.6420 1.0680 1.0335
No log 0.8732 124 0.8092 0.6622 0.8092 0.8996
No log 0.8873 126 0.6600 0.7432 0.6600 0.8124
No log 0.9014 128 0.6742 0.7808 0.6742 0.8211
No log 0.9155 130 1.0069 0.6389 1.0069 1.0035
No log 0.9296 132 1.5822 0.3724 1.5822 1.2578
No log 0.9437 134 1.4887 0.3857 1.4887 1.2201
No log 0.9577 136 0.9638 0.5865 0.9638 0.9817
No log 0.9718 138 0.7901 0.6970 0.7901 0.8889
No log 0.9859 140 0.8876 0.5865 0.8876 0.9421
No log 1.0 142 0.8067 0.6316 0.8067 0.8981
No log 1.0141 144 0.7887 0.7121 0.7887 0.8881
No log 1.0282 146 1.1604 0.5224 1.1604 1.0772
No log 1.0423 148 1.3343 0.5000 1.3343 1.1551
No log 1.0563 150 1.1131 0.5755 1.1131 1.0550
No log 1.0704 152 0.7185 0.7153 0.7185 0.8476
No log 1.0845 154 0.6178 0.7518 0.6178 0.7860
No log 1.0986 156 0.6170 0.7518 0.6170 0.7855
No log 1.1127 158 0.7039 0.7347 0.7039 0.8390
No log 1.1268 160 1.2500 0.5890 1.2500 1.1180
No log 1.1408 162 1.5417 0.5185 1.5417 1.2416
No log 1.1549 164 1.3428 0.5526 1.3428 1.1588
No log 1.1690 166 1.0460 0.6389 1.0460 1.0228
No log 1.1831 168 0.9130 0.6331 0.9130 0.9555
No log 1.1972 170 0.7866 0.6957 0.7866 0.8869
No log 1.2113 172 0.8143 0.6618 0.8143 0.9024
No log 1.2254 174 1.0080 0.5802 1.0080 1.0040
No log 1.2394 176 1.2833 0.5038 1.2833 1.1328
No log 1.2535 178 1.4008 0.4593 1.4008 1.1835
No log 1.2676 180 1.2106 0.5224 1.2106 1.1003
No log 1.2817 182 0.9220 0.6308 0.9220 0.9602
No log 1.2958 184 0.7785 0.7111 0.7785 0.8823
No log 1.3099 186 0.7528 0.7206 0.7528 0.8676
No log 1.3239 188 0.8576 0.6412 0.8576 0.9261
No log 1.3380 190 1.0571 0.6447 1.0571 1.0281
No log 1.3521 192 1.0745 0.6410 1.0745 1.0366
No log 1.3662 194 0.8194 0.6944 0.8194 0.9052
No log 1.3803 196 0.6416 0.7671 0.6416 0.8010
No log 1.3944 198 0.6060 0.8105 0.6060 0.7785
No log 1.4085 200 0.7386 0.7470 0.7386 0.8594
No log 1.4225 202 0.8371 0.7066 0.8371 0.9150
No log 1.4366 204 0.7535 0.7389 0.7535 0.8681
No log 1.4507 206 0.8178 0.6933 0.8178 0.9043
No log 1.4648 208 0.9892 0.6207 0.9892 0.9946
No log 1.4789 210 0.9484 0.6119 0.9484 0.9739
No log 1.4930 212 0.9129 0.6277 0.9129 0.9555
No log 1.5070 214 1.0244 0.6043 1.0244 1.0121
No log 1.5211 216 0.9973 0.6232 0.9973 0.9987
No log 1.5352 218 0.9717 0.6622 0.9717 0.9857
No log 1.5493 220 0.8828 0.6429 0.8828 0.9396
No log 1.5634 222 0.8831 0.6712 0.8831 0.9397
No log 1.5775 224 1.1185 0.5874 1.1185 1.0576
No log 1.5915 226 1.1407 0.5714 1.1407 1.0680
No log 1.6056 228 0.8499 0.6475 0.8499 0.9219
No log 1.6197 230 0.7666 0.6716 0.7666 0.8755
No log 1.6338 232 0.8144 0.6364 0.8144 0.9025
No log 1.6479 234 0.8424 0.6423 0.8424 0.9178
No log 1.6620 236 1.0570 0.6250 1.0570 1.0281
No log 1.6761 238 1.1823 0.6460 1.1823 1.0874
No log 1.6901 240 1.1059 0.6541 1.1059 1.0516
No log 1.7042 242 0.7642 0.7114 0.7642 0.8742
No log 1.7183 244 0.6791 0.7500 0.6791 0.8241
No log 1.7324 246 0.6777 0.7647 0.6777 0.8232
No log 1.7465 248 0.8198 0.6962 0.8198 0.9054
No log 1.7606 250 1.0871 0.6452 1.0871 1.0427
No log 1.7746 252 1.0364 0.6582 1.0364 1.0180
No log 1.7887 254 0.8287 0.6667 0.8287 0.9103
No log 1.8028 256 0.8374 0.6619 0.8374 0.9151
No log 1.8169 258 1.0605 0.6301 1.0605 1.0298
No log 1.8310 260 1.2231 0.6122 1.2231 1.1059
No log 1.8451 262 1.0880 0.6069 1.0880 1.0431
No log 1.8592 264 0.7917 0.6519 0.7917 0.8898
No log 1.8732 266 0.7044 0.6963 0.7044 0.8393
No log 1.8873 268 0.8074 0.6806 0.8074 0.8985
No log 1.9014 270 1.0232 0.6456 1.0232 1.0115
No log 1.9155 272 1.3892 0.5679 1.3892 1.1787
No log 1.9296 274 1.3891 0.5509 1.3891 1.1786
No log 1.9437 276 1.1799 0.6552 1.1799 1.0862
No log 1.9577 278 0.9384 0.6538 0.9384 0.9687
No log 1.9718 280 0.7552 0.7050 0.7552 0.8690
No log 1.9859 282 0.7083 0.7259 0.7083 0.8416
No log 2.0 284 0.7529 0.6870 0.7529 0.8677
No log 2.0141 286 0.8592 0.6618 0.8592 0.9269
No log 2.0282 288 0.9947 0.6377 0.9947 0.9974
No log 2.0423 290 0.9650 0.6222 0.9650 0.9823
No log 2.0563 292 0.8001 0.6406 0.8001 0.8945
No log 2.0704 294 0.7126 0.6815 0.7126 0.8442
No log 2.0845 296 0.7479 0.6980 0.7479 0.8648
No log 2.0986 298 1.0272 0.6708 1.0272 1.0135
No log 2.1127 300 1.1649 0.6173 1.1649 1.0793
No log 2.1268 302 1.0545 0.6531 1.0545 1.0269
No log 2.1408 304 0.8614 0.6667 0.8614 0.9281
No log 2.1549 306 0.8295 0.6769 0.8295 0.9107
No log 2.1690 308 0.8993 0.6418 0.8993 0.9483
No log 2.1831 310 1.1010 0.6282 1.1010 1.0493
No log 2.1972 312 1.1787 0.6135 1.1787 1.0857
No log 2.2113 314 1.3576 0.5647 1.3576 1.1651
No log 2.2254 316 1.2120 0.5478 1.2120 1.1009
No log 2.2394 318 1.0059 0.6533 1.0059 1.0030
No log 2.2535 320 0.9030 0.6370 0.9030 0.9503
No log 2.2676 322 0.8499 0.6812 0.8499 0.9219
No log 2.2817 324 0.9406 0.6579 0.9406 0.9698
No log 2.2958 326 0.9956 0.6667 0.9956 0.9978
No log 2.3099 328 0.9049 0.6579 0.9049 0.9513
No log 2.3239 330 0.8125 0.6809 0.8125 0.9014
No log 2.3380 332 0.7437 0.7286 0.7437 0.8624
No log 2.3521 334 0.8225 0.7285 0.8225 0.9069
No log 2.3662 336 1.0335 0.6463 1.0335 1.0166
No log 2.3803 338 1.1395 0.6000 1.1395 1.0675
No log 2.3944 340 0.9304 0.6710 0.9304 0.9646
No log 2.4085 342 0.7049 0.7391 0.7049 0.8396
No log 2.4225 344 0.6756 0.7832 0.6756 0.8219
No log 2.4366 346 0.6993 0.7391 0.6993 0.8363
No log 2.4507 348 0.9126 0.6713 0.9126 0.9553
No log 2.4648 350 1.1028 0.6250 1.1028 1.0502
No log 2.4789 352 1.0142 0.6383 1.0142 1.0071
No log 2.4930 354 0.8221 0.6667 0.8221 0.9067
No log 2.5070 356 0.6899 0.7571 0.6899 0.8306
No log 2.5211 358 0.6967 0.7338 0.6967 0.8347
No log 2.5352 360 0.8171 0.6939 0.8171 0.9039
No log 2.5493 362 1.2282 0.5839 1.2282 1.1083
No log 2.5634 364 1.4694 0.5116 1.4694 1.2122
No log 2.5775 366 1.3706 0.5065 1.3706 1.1707
No log 2.5915 368 1.0564 0.6286 1.0564 1.0278
No log 2.6056 370 0.7913 0.6870 0.7913 0.8896
No log 2.6197 372 0.7205 0.6917 0.7205 0.8488
No log 2.6338 374 0.7456 0.7164 0.7456 0.8635
No log 2.6479 376 0.9632 0.6622 0.9632 0.9814
No log 2.6620 378 1.3262 0.5488 1.3262 1.1516
No log 2.6761 380 1.3304 0.5967 1.3304 1.1535
No log 2.6901 382 1.0394 0.6433 1.0394 1.0195
No log 2.7042 384 0.7160 0.7500 0.7160 0.8461
No log 2.7183 386 0.6142 0.7919 0.6142 0.7837
No log 2.7324 388 0.6132 0.7919 0.6132 0.7831
No log 2.7465 390 0.7013 0.7702 0.7013 0.8374
No log 2.7606 392 0.9525 0.6627 0.9525 0.9759
No log 2.7746 394 1.1454 0.6076 1.1454 1.0702
No log 2.7887 396 1.0687 0.5915 1.0687 1.0338
No log 2.8028 398 0.8730 0.6370 0.8730 0.9344
No log 2.8169 400 0.7101 0.7164 0.7101 0.8427
No log 2.8310 402 0.6828 0.7407 0.6828 0.8263
No log 2.8451 404 0.7016 0.7299 0.7016 0.8376
No log 2.8592 406 0.7366 0.7015 0.7366 0.8583
No log 2.8732 408 0.7954 0.7297 0.7954 0.8919
No log 2.8873 410 0.8738 0.6536 0.8738 0.9348
No log 2.9014 412 1.0533 0.6667 1.0533 1.0263
No log 2.9155 414 1.1614 0.6480 1.1614 1.0777
No log 2.9296 416 0.9818 0.6667 0.9818 0.9909
No log 2.9437 418 0.7215 0.7006 0.7215 0.8494
No log 2.9577 420 0.6422 0.7692 0.6422 0.8014
No log 2.9718 422 0.6573 0.7692 0.6573 0.8107
No log 2.9859 424 0.6745 0.7770 0.6745 0.8213
No log 3.0 426 0.9489 0.6187 0.9489 0.9741
No log 3.0141 428 1.3060 0.5101 1.3060 1.1428
No log 3.0282 430 1.3793 0.5290 1.3793 1.1744
No log 3.0423 432 1.1668 0.5621 1.1668 1.0802
No log 3.0563 434 0.9193 0.6620 0.9193 0.9588
No log 3.0704 436 0.8717 0.6667 0.8717 0.9336
No log 3.0845 438 0.8284 0.6519 0.8284 0.9101
No log 3.0986 440 0.7904 0.6202 0.7904 0.8890
No log 3.1127 442 0.8700 0.6712 0.8700 0.9327
No log 3.1268 444 1.0220 0.6623 1.0220 1.0109
No log 3.1408 446 1.1568 0.6203 1.1568 1.0755
No log 3.1549 448 1.1233 0.6093 1.1233 1.0599
No log 3.1690 450 0.9876 0.6383 0.9876 0.9938
No log 3.1831 452 0.8126 0.6457 0.8126 0.9014
No log 3.1972 454 0.7353 0.6970 0.7353 0.8575
No log 3.2113 456 0.7074 0.7068 0.7074 0.8411
No log 3.2254 458 0.7119 0.7259 0.7119 0.8438
No log 3.2394 460 0.9340 0.6369 0.9340 0.9665
No log 3.2535 462 1.0673 0.6554 1.0673 1.0331
No log 3.2676 464 1.0102 0.6341 1.0102 1.0051
No log 3.2817 466 0.8466 0.6795 0.8466 0.9201
No log 3.2958 468 0.7423 0.7194 0.7423 0.8615
No log 3.3099 470 0.7499 0.7259 0.7499 0.8660
No log 3.3239 472 0.8142 0.6870 0.8142 0.9023
No log 3.3380 474 0.8775 0.6667 0.8775 0.9368
No log 3.3521 476 0.9104 0.6835 0.9104 0.9542
No log 3.3662 478 1.0112 0.6627 1.0112 1.0056
No log 3.3803 480 1.0261 0.6627 1.0261 1.0130
No log 3.3944 482 0.9375 0.6708 0.9375 0.9682
No log 3.4085 484 0.8628 0.6812 0.8628 0.9289
No log 3.4225 486 0.8253 0.6769 0.8253 0.9084
No log 3.4366 488 0.7833 0.6512 0.7833 0.8850
No log 3.4507 490 0.7329 0.7246 0.7329 0.8561
No log 3.4648 492 0.6724 0.7246 0.6724 0.8200
No log 3.4789 494 0.6104 0.7917 0.6104 0.7813
No log 3.4930 496 0.6171 0.7613 0.6171 0.7855
No log 3.5070 498 0.7663 0.7453 0.7663 0.8754
0.4825 3.5211 500 0.8382 0.7308 0.8382 0.9155
0.4825 3.5352 502 0.8454 0.7226 0.8454 0.9195
0.4825 3.5493 504 0.7908 0.6939 0.7908 0.8893
0.4825 3.5634 506 0.7962 0.6716 0.7962 0.8923
0.4825 3.5775 508 0.9049 0.6471 0.9049 0.9512
0.4825 3.5915 510 1.1214 0.6143 1.1214 1.0590

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task1_organization

Finetuned
(4205)
this model