ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (measured at the final logged step, epoch 7.5 / step 510):

  • Loss: 0.8098
  • QWK (quadratic weighted kappa): 0.7117
  • MSE: 0.8098 (equal to the loss)
  • RMSE: 0.8999 (√MSE)

Model description

More information needed

Intended uses & limitations

More information needed
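Pending fuller documentation, here is a minimal inference sketch. It assumes the checkpoint exposes a single-logit regression head for the organization score (suggested by the MSE/RMSE metrics, but not confirmed by the card), and it omits the AraBERT-specific text preprocessor for brevity:

```python
# Minimal sketch: load the checkpoint and score a single essay.
# Assumes a regression-style head (one logit = the organization score);
# adjust the post-processing if the head was trained on discrete classes.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k9_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # placeholder for an Arabic essay; AraBERT checkpoints usually
               # pair with the `arabert` preprocessor, omitted here for brevity
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.2f}")
```

If the scoring rubric uses integer scores, round the predicted value before computing QWK against gold labels.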

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
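For reproducibility, the list above maps onto Hugging Face `TrainingArguments` as in the following sketch. This assumes the standard `Trainer` API was used; `output_dir` and the dataset objects are hypothetical placeholders, since the card does not document them:

```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert_task1_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,     # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,  # and epsilon=1e-08
)
# The datasets are undocumented, so the training call is shown commented out:
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```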

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0294 2 7.0395 0.0179 7.0395 2.6532
No log 0.0588 4 5.1483 0.0606 5.1483 2.2690
No log 0.0882 6 3.0370 0.0848 3.0370 1.7427
No log 0.1176 8 2.7951 0.0662 2.7951 1.6719
No log 0.1471 10 2.2571 0.1702 2.2571 1.5024
No log 0.1765 12 1.8763 0.2982 1.8763 1.3698
No log 0.2059 14 1.9248 0.2075 1.9248 1.3874
No log 0.2353 16 1.7585 0.1143 1.7585 1.3261
No log 0.2647 18 1.8330 0.1524 1.8330 1.3539
No log 0.2941 20 2.1825 0.1094 2.1825 1.4773
No log 0.3235 22 2.2130 0.1185 2.2130 1.4876
No log 0.3529 24 1.8403 0.1951 1.8403 1.3566
No log 0.3824 26 1.3352 0.3604 1.3352 1.1555
No log 0.4118 28 1.1838 0.4874 1.1838 1.0880
No log 0.4412 30 1.1087 0.4793 1.1087 1.0530
No log 0.4706 32 1.0324 0.5528 1.0324 1.0161
No log 0.5 34 0.9736 0.5000 0.9736 0.9867
No log 0.5294 36 0.9341 0.5714 0.9341 0.9665
No log 0.5588 38 1.2072 0.5286 1.2072 1.0987
No log 0.5882 40 1.3235 0.5070 1.3235 1.1504
No log 0.6176 42 1.0669 0.5401 1.0669 1.0329
No log 0.6471 44 1.0482 0.5507 1.0482 1.0238
No log 0.6765 46 0.9463 0.6286 0.9463 0.9728
No log 0.7059 48 0.7672 0.7172 0.7672 0.8759
No log 0.7353 50 0.7751 0.6853 0.7751 0.8804
No log 0.7647 52 0.7610 0.7034 0.7610 0.8724
No log 0.7941 54 0.7514 0.6901 0.7514 0.8668
No log 0.8235 56 0.9088 0.6713 0.9088 0.9533
No log 0.8529 58 0.8455 0.6383 0.8455 0.9195
No log 0.8824 60 0.7790 0.7114 0.7790 0.8826
No log 0.9118 62 0.7595 0.6712 0.7595 0.8715
No log 0.9412 64 0.7645 0.6434 0.7645 0.8744
No log 0.9706 66 0.7649 0.6759 0.7649 0.8746
No log 1.0 68 0.7557 0.6974 0.7557 0.8693
No log 1.0294 70 0.7233 0.6923 0.7233 0.8505
No log 1.0588 72 0.7422 0.7205 0.7422 0.8615
No log 1.0882 74 0.7672 0.6923 0.7672 0.8759
No log 1.1176 76 0.7022 0.7261 0.7022 0.8379
No log 1.1471 78 0.7780 0.7261 0.7780 0.8821
No log 1.1765 80 0.7394 0.7273 0.7394 0.8599
No log 1.2059 82 0.6291 0.7285 0.6291 0.7931
No log 1.2353 84 0.6166 0.7702 0.6166 0.7852
No log 1.2647 86 0.5484 0.8049 0.5484 0.7405
No log 1.2941 88 0.5229 0.8235 0.5229 0.7231
No log 1.3235 90 0.6760 0.7935 0.6760 0.8222
No log 1.3529 92 0.7032 0.7766 0.7032 0.8386
No log 1.3824 94 0.7297 0.7853 0.7297 0.8542
No log 1.4118 96 0.6629 0.7935 0.6629 0.8142
No log 1.4412 98 0.5202 0.8421 0.5202 0.7213
No log 1.4706 100 0.5782 0.8144 0.5782 0.7604
No log 1.5 102 0.6131 0.7853 0.6131 0.7830
No log 1.5294 104 0.8951 0.7166 0.8951 0.9461
No log 1.5588 106 0.8565 0.7514 0.8565 0.9255
No log 1.5882 108 0.5821 0.7742 0.5821 0.7629
No log 1.6176 110 0.6736 0.7671 0.6736 0.8207
No log 1.6471 112 0.5505 0.8077 0.5505 0.7420
No log 1.6765 114 0.6847 0.7607 0.6847 0.8275
No log 1.7059 116 0.5990 0.7950 0.5990 0.7739
No log 1.7353 118 0.5772 0.8050 0.5772 0.7597
No log 1.7647 120 0.7171 0.76 0.7171 0.8468
No log 1.7941 122 0.5153 0.8372 0.5153 0.7178
No log 1.8235 124 0.7707 0.7668 0.7707 0.8779
No log 1.8529 126 1.0879 0.6424 1.0879 1.0430
No log 1.8824 128 1.1616 0.6104 1.1616 1.0778
No log 1.9118 130 0.9057 0.6914 0.9057 0.9517
No log 1.9412 132 0.5379 0.8284 0.5379 0.7334
No log 1.9706 134 0.4741 0.8415 0.4741 0.6886
No log 2.0 136 0.5138 0.8176 0.5138 0.7168
No log 2.0294 138 0.5889 0.7973 0.5889 0.7674
No log 2.0588 140 0.7011 0.6763 0.7011 0.8373
No log 2.0882 142 0.7792 0.6620 0.7792 0.8827
No log 2.1176 144 0.7033 0.7027 0.7033 0.8386
No log 2.1471 146 0.5803 0.7949 0.5803 0.7618
No log 2.1765 148 0.5704 0.7843 0.5704 0.7552
No log 2.2059 150 0.6696 0.7547 0.6696 0.8183
No log 2.2353 152 0.7605 0.7647 0.7605 0.8721
No log 2.2647 154 0.7006 0.7758 0.7006 0.8370
No log 2.2941 156 0.6253 0.7742 0.6253 0.7907
No log 2.3235 158 0.5613 0.7763 0.5613 0.7492
No log 2.3529 160 0.5788 0.7632 0.5788 0.7608
No log 2.3824 162 0.6759 0.7848 0.6759 0.8222
No log 2.4118 164 0.7429 0.7403 0.7429 0.8619
No log 2.4412 166 0.6738 0.7397 0.6738 0.8209
No log 2.4706 168 0.6230 0.7324 0.6230 0.7893
No log 2.5 170 0.6336 0.7273 0.6336 0.7960
No log 2.5294 172 0.7020 0.7564 0.7020 0.8379
No log 2.5588 174 0.9805 0.7135 0.9805 0.9902
No log 2.5882 176 0.9088 0.7143 0.9088 0.9533
No log 2.6176 178 0.7490 0.7374 0.7490 0.8655
No log 2.6471 180 0.7048 0.7362 0.7048 0.8395
No log 2.6765 182 0.7408 0.7226 0.7408 0.8607
No log 2.7059 184 0.8208 0.6887 0.8208 0.9060
No log 2.7353 186 0.8768 0.72 0.8768 0.9364
No log 2.7647 188 0.8265 0.7075 0.8265 0.9091
No log 2.7941 190 0.6941 0.7310 0.6941 0.8331
No log 2.8235 192 0.6287 0.7397 0.6287 0.7929
No log 2.8529 194 0.5668 0.7871 0.5668 0.7528
No log 2.8824 196 0.5075 0.8050 0.5075 0.7124
No log 2.9118 198 0.5244 0.8324 0.5244 0.7242
No log 2.9412 200 0.5201 0.8256 0.5201 0.7211
No log 2.9706 202 0.4614 0.8434 0.4614 0.6793
No log 3.0 204 0.5550 0.8242 0.5550 0.7450
No log 3.0294 206 0.6634 0.7889 0.6634 0.8145
No log 3.0588 208 0.6471 0.7636 0.6471 0.8044
No log 3.0882 210 0.5344 0.7875 0.5344 0.7310
No log 3.1176 212 0.5137 0.8129 0.5137 0.7168
No log 3.1471 214 0.5478 0.7875 0.5478 0.7401
No log 3.1765 216 0.5731 0.7875 0.5731 0.7570
No log 3.2059 218 0.6063 0.7625 0.6063 0.7787
No log 3.2353 220 0.6328 0.7730 0.6328 0.7955
No log 3.2647 222 0.7190 0.7760 0.7190 0.8480
No log 3.2941 224 0.7115 0.7760 0.7115 0.8435
No log 3.3235 226 0.5232 0.8313 0.5232 0.7234
No log 3.3529 228 0.4758 0.8293 0.4758 0.6898
No log 3.3824 230 0.5039 0.8272 0.5039 0.7099
No log 3.4118 232 0.5216 0.7919 0.5216 0.7223
No log 3.4412 234 0.6860 0.7436 0.6860 0.8282
No log 3.4706 236 0.8256 0.7425 0.8256 0.9086
No log 3.5 238 0.6921 0.7296 0.6921 0.8319
No log 3.5294 240 0.5410 0.8052 0.5410 0.7356
No log 3.5588 242 0.5334 0.8101 0.5334 0.7303
No log 3.5882 244 0.5598 0.8364 0.5598 0.7482
No log 3.6176 246 0.6231 0.7784 0.6231 0.7894
No log 3.6471 248 0.7352 0.7425 0.7352 0.8574
No log 3.6765 250 0.9671 0.6882 0.9671 0.9834
No log 3.7059 252 1.0205 0.6882 1.0205 1.0102
No log 3.7353 254 0.7955 0.7308 0.7955 0.8919
No log 3.7647 256 0.6052 0.7448 0.6052 0.7779
No log 3.7941 258 0.5792 0.7838 0.5792 0.7610
No log 3.8235 260 0.6250 0.7532 0.6250 0.7906
No log 3.8529 262 0.6739 0.7805 0.6739 0.8209
No log 3.8824 264 0.6612 0.7879 0.6612 0.8131
No log 3.9118 266 0.5659 0.8263 0.5659 0.7523
No log 3.9412 268 0.6186 0.8 0.6186 0.7865
No log 3.9706 270 0.5902 0.7952 0.5902 0.7682
No log 4.0 272 0.5388 0.8521 0.5388 0.7340
No log 4.0294 274 0.4640 0.8302 0.4640 0.6812
No log 4.0588 276 0.4772 0.8025 0.4772 0.6908
No log 4.0882 278 0.4724 0.8302 0.4724 0.6873
No log 4.1176 280 0.4595 0.8395 0.4595 0.6778
No log 4.1471 282 0.4621 0.8383 0.4621 0.6798
No log 4.1765 284 0.4570 0.8488 0.4570 0.6760
No log 4.2059 286 0.4711 0.8621 0.4711 0.6864
No log 4.2353 288 0.5541 0.8391 0.5541 0.7444
No log 4.2647 290 0.5272 0.8538 0.5272 0.7261
No log 4.2941 292 0.4590 0.8428 0.4590 0.6775
No log 4.3235 294 0.5050 0.8258 0.5050 0.7106
No log 4.3529 296 0.5188 0.8158 0.5188 0.7203
No log 4.3824 298 0.5158 0.8258 0.5158 0.7182
No log 4.4118 300 0.6153 0.7582 0.6153 0.7844
No log 4.4412 302 0.7424 0.7308 0.7424 0.8616
No log 4.4706 304 0.7378 0.7451 0.7378 0.8590
No log 4.5 306 0.7729 0.7152 0.7729 0.8792
No log 4.5294 308 0.9114 0.6962 0.9114 0.9547
No log 4.5588 310 0.9323 0.6918 0.9323 0.9655
No log 4.5882 312 0.8096 0.6962 0.8096 0.8998
No log 4.6176 314 0.6126 0.7651 0.6126 0.7827
No log 4.6471 316 0.5100 0.8125 0.5100 0.7141
No log 4.6765 318 0.4978 0.8324 0.4978 0.7055
No log 4.7059 320 0.6458 0.8247 0.6458 0.8036
No log 4.7353 322 0.8179 0.76 0.8179 0.9044
No log 4.7647 324 0.7867 0.7526 0.7867 0.8870
No log 4.7941 326 0.5448 0.8398 0.5448 0.7381
No log 4.8235 328 0.4714 0.8293 0.4714 0.6866
No log 4.8529 330 0.5428 0.7821 0.5428 0.7368
No log 4.8824 332 0.5169 0.8052 0.5169 0.7190
No log 4.9118 334 0.5558 0.8105 0.5558 0.7455
No log 4.9412 336 0.8226 0.7176 0.8226 0.9070
No log 4.9706 338 0.8923 0.6857 0.8923 0.9446
No log 5.0 340 0.7822 0.7117 0.7822 0.8844
No log 5.0294 342 0.7511 0.7172 0.7511 0.8667
No log 5.0588 344 0.7978 0.7 0.7978 0.8932
No log 5.0882 346 0.8963 0.6759 0.8963 0.9467
No log 5.1176 348 0.9122 0.6711 0.9122 0.9551
No log 5.1471 350 0.7902 0.6906 0.7902 0.8889
No log 5.1765 352 0.6873 0.7432 0.6873 0.8290
No log 5.2059 354 0.5766 0.7895 0.5766 0.7593
No log 5.2353 356 0.5382 0.8 0.5382 0.7336
No log 5.2647 358 0.5374 0.8101 0.5374 0.7331
No log 5.2941 360 0.5512 0.8101 0.5512 0.7425
No log 5.3235 362 0.6432 0.7811 0.6432 0.8020
No log 5.3529 364 0.7042 0.7647 0.7042 0.8392
No log 5.3824 366 0.7441 0.7470 0.7441 0.8626
No log 5.4118 368 0.7178 0.7261 0.7178 0.8472
No log 5.4412 370 0.7364 0.7222 0.7364 0.8581
No log 5.4706 372 0.7520 0.7101 0.7520 0.8672
No log 5.5 374 0.7868 0.6993 0.7868 0.8870
No log 5.5294 376 0.7768 0.7211 0.7768 0.8814
No log 5.5588 378 0.8252 0.7081 0.8252 0.9084
No log 5.5882 380 0.8301 0.7195 0.8301 0.9111
No log 5.6176 382 0.8481 0.7037 0.8481 0.9209
No log 5.6471 384 0.6918 0.7383 0.6918 0.8317
No log 5.6765 386 0.5629 0.7536 0.5629 0.7503
No log 5.7059 388 0.5389 0.7917 0.5389 0.7341
No log 5.7353 390 0.5165 0.8108 0.5165 0.7187
No log 5.7647 392 0.5137 0.8447 0.5137 0.7167
No log 5.7941 394 0.4778 0.8485 0.4778 0.6913
No log 5.8235 396 0.4482 0.8415 0.4482 0.6694
No log 5.8529 398 0.4501 0.8153 0.4501 0.6709
No log 5.8824 400 0.4594 0.8153 0.4594 0.6778
No log 5.9118 402 0.4939 0.8375 0.4939 0.7028
No log 5.9412 404 0.5036 0.8375 0.5036 0.7096
No log 5.9706 406 0.5284 0.8075 0.5284 0.7269
No log 6.0 408 0.5707 0.7949 0.5707 0.7554
No log 6.0294 410 0.6259 0.7568 0.6259 0.7911
No log 6.0588 412 0.6509 0.7568 0.6509 0.8068
No log 6.0882 414 0.6096 0.7397 0.6096 0.7808
No log 6.1176 416 0.5567 0.7895 0.5567 0.7461
No log 6.1471 418 0.5417 0.8153 0.5417 0.7360
No log 6.1765 420 0.5425 0.8 0.5425 0.7366
No log 6.2059 422 0.5758 0.8372 0.5758 0.7588
No log 6.2353 424 0.7540 0.7640 0.7540 0.8683
No log 6.2647 426 0.8285 0.7168 0.8285 0.9102
No log 6.2941 428 0.7435 0.7394 0.7435 0.8622
No log 6.3235 430 0.6127 0.7552 0.6127 0.7827
No log 6.3529 432 0.5628 0.7808 0.5628 0.7502
No log 6.3824 434 0.5624 0.7651 0.5624 0.7500
No log 6.4118 436 0.5868 0.7867 0.5868 0.7660
No log 6.4412 438 0.6609 0.7561 0.6609 0.8130
No log 6.4706 440 0.6473 0.7468 0.6473 0.8046
No log 6.5 442 0.6062 0.7568 0.6062 0.7786
No log 6.5294 444 0.5987 0.8133 0.5987 0.7738
No log 6.5588 446 0.5881 0.8289 0.5881 0.7669
No log 6.5882 448 0.5570 0.7867 0.5570 0.7464
No log 6.6176 450 0.6182 0.7857 0.6182 0.7863
No log 6.6471 452 0.7112 0.7342 0.7112 0.8433
No log 6.6765 454 0.6818 0.7383 0.6818 0.8257
No log 6.7059 456 0.7311 0.7222 0.7311 0.8550
No log 6.7353 458 0.7385 0.7273 0.7385 0.8594
No log 6.7647 460 0.7467 0.7190 0.7467 0.8641
No log 6.7941 462 0.7052 0.7320 0.7052 0.8398
No log 6.8235 464 0.6519 0.7436 0.6519 0.8074
No log 6.8529 466 0.6473 0.7547 0.6473 0.8045
No log 6.8824 468 0.5998 0.7730 0.5998 0.7745
No log 6.9118 470 0.5652 0.8 0.5652 0.7518
No log 6.9412 472 0.4801 0.8193 0.4801 0.6929
No log 6.9706 474 0.4435 0.8519 0.4435 0.6660
No log 7.0 476 0.4343 0.8434 0.4343 0.6590
No log 7.0294 478 0.5023 0.8177 0.5023 0.7087
No log 7.0588 480 0.6469 0.8 0.6469 0.8043
No log 7.0882 482 0.6603 0.7528 0.6603 0.8126
No log 7.1176 484 0.5575 0.8364 0.5575 0.7467
No log 7.1471 486 0.5195 0.8344 0.5195 0.7207
No log 7.1765 488 0.5460 0.8344 0.5460 0.7389
No log 7.2059 490 0.5852 0.8272 0.5852 0.7650
No log 7.2353 492 0.6445 0.7665 0.6445 0.8028
No log 7.2647 494 0.6599 0.7647 0.6599 0.8124
No log 7.2941 496 0.6938 0.7683 0.6938 0.8329
No log 7.3235 498 0.7081 0.7389 0.7081 0.8415
0.3819 7.3529 500 0.6216 0.7733 0.6216 0.7884
0.3819 7.3824 502 0.6040 0.75 0.6040 0.7772
0.3819 7.4118 504 0.5856 0.7671 0.5856 0.7652
0.3819 7.4412 506 0.5855 0.7671 0.5855 0.7652
0.3819 7.4706 508 0.6596 0.7578 0.6596 0.8121
0.3819 7.5 510 0.8098 0.7117 0.8098 0.8999
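The QWK/MSE/RMSE columns above are consistent with a `compute_metrics` hook of roughly the following shape. This is a sketch under the same single-logit regression assumption as the inference example, not the author's confirmed code:

```python
# Hypothetical compute_metrics hook producing the QWK / MSE / RMSE
# columns logged above (single-logit regression assumption).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    logits, labels = eval_pred              # numpy arrays from the Trainer
    preds = np.asarray(logits).squeeze(-1)  # one logit per essay
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),        # QWK needs discrete categories
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```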

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1