ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k1_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6777
  • Qwk: 0.5454
  • Mse: 0.6777
  • Rmse: 0.8232
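These metrics follow their standard definitions: Qwk is quadratic weighted kappa (Cohen's kappa with quadratic disagreement weights, common for ordinal essay-scoring tasks), and since the model is trained with a mean-squared-error loss, the Loss and Mse columns coincide and Rmse is simply the square root of Mse. A minimal, library-free sketch of the kappa computation (the function name and label handling here are illustrative, not taken from the training code):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, labels):
    """Cohen's kappa with quadratic weights, as commonly
    reported for ordinal scoring tasks."""
    n = len(y_true)
    k = len(labels)
    idx = {lab: i for i, lab in enumerate(labels)}
    # Observed confusion matrix.
    O = [[0.0] * k for _ in range(k)]
    for t, p in zip(y_true, y_pred):
        O[idx[t]][idx[p]] += 1
    # Expected matrix under independence of the two raters' marginals.
    row, col = Counter(y_true), Counter(y_pred)
    E = [[row[labels[i]] * col[labels[j]] / n for j in range(k)]
         for i in range(k)]
    # Quadratic disagreement weights: 0 on the diagonal, growing
    # with the squared distance between labels.
    w = [[((i - j) ** 2) / ((k - 1) ** 2) for j in range(k)]
         for i in range(k)]
    num = sum(w[i][j] * O[i][j] for i in range(k) for j in range(k))
    den = sum(w[i][j] * E[i][j] for i in range(k) for j in range(k))
    return 1.0 - num / den

# Rmse is the square root of Mse, which is why 0.8232 ≈ sqrt(0.6777).
print(round(math.sqrt(0.6777), 4))  # 0.8232
```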

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
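As a rough sketch, the hyperparameters above map onto transformers' TrainingArguments as follows; the output directory name is an assumption, and dataset loading, the model head, and the metric function are omitted:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the listed training configuration;
# output_dir is an assumed name, everything else mirrors the bullets above.
training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",   # linear decay from the initial LR to 0
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```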

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.2857 2 4.0350 -0.0199 4.0350 2.0087
No log 0.5714 4 2.6483 0.0629 2.6483 1.6274
No log 0.8571 6 1.2272 0.1011 1.2272 1.1078
No log 1.1429 8 0.9227 0.0975 0.9227 0.9606
No log 1.4286 10 0.8869 0.1064 0.8869 0.9417
No log 1.7143 12 0.8252 0.0867 0.8252 0.9084
No log 2.0 14 0.8259 0.1431 0.8259 0.9088
No log 2.2857 16 0.8247 0.1287 0.8247 0.9081
No log 2.5714 18 0.8072 0.1077 0.8072 0.8984
No log 2.8571 20 0.7810 0.1352 0.7810 0.8838
No log 3.1429 22 0.8198 0.1804 0.8198 0.9054
No log 3.4286 24 0.9293 0.1349 0.9293 0.9640
No log 3.7143 26 0.8607 0.1917 0.8607 0.9277
No log 4.0 28 0.6750 0.3817 0.6750 0.8216
No log 4.2857 30 0.6549 0.4247 0.6549 0.8093
No log 4.5714 32 0.6504 0.4083 0.6504 0.8065
No log 4.8571 34 0.6473 0.4489 0.6473 0.8045
No log 5.1429 36 0.6867 0.3615 0.6867 0.8287
No log 5.4286 38 0.6672 0.4061 0.6672 0.8168
No log 5.7143 40 0.6544 0.4689 0.6544 0.8090
No log 6.0 42 0.6700 0.5025 0.6700 0.8185
No log 6.2857 44 0.6731 0.5206 0.6731 0.8204
No log 6.5714 46 0.7865 0.4173 0.7865 0.8869
No log 6.8571 48 0.7939 0.4203 0.7939 0.8910
No log 7.1429 50 0.7918 0.4011 0.7918 0.8898
No log 7.4286 52 0.7593 0.4268 0.7593 0.8714
No log 7.7143 54 0.7475 0.4896 0.7475 0.8646
No log 8.0 56 0.6685 0.5415 0.6685 0.8176
No log 8.2857 58 0.6847 0.5437 0.6847 0.8275
No log 8.5714 60 0.7136 0.5452 0.7136 0.8447
No log 8.8571 62 0.7021 0.5665 0.7021 0.8379
No log 9.1429 64 0.7001 0.5147 0.7001 0.8367
No log 9.4286 66 0.6969 0.5156 0.6969 0.8348
No log 9.7143 68 0.6952 0.5246 0.6952 0.8338
No log 10.0 70 0.7560 0.4793 0.7560 0.8695
No log 10.2857 72 0.7128 0.4482 0.7128 0.8443
No log 10.5714 74 0.6367 0.5162 0.6367 0.7979
No log 10.8571 76 0.6295 0.5176 0.6295 0.7934
No log 11.1429 78 0.6454 0.5479 0.6454 0.8034
No log 11.4286 80 0.7350 0.4425 0.7350 0.8573
No log 11.7143 82 0.7963 0.4285 0.7963 0.8923
No log 12.0 84 0.6902 0.5688 0.6902 0.8308
No log 12.2857 86 0.6721 0.4832 0.6721 0.8198
No log 12.5714 88 0.6977 0.5151 0.6977 0.8353
No log 12.8571 90 0.7624 0.5542 0.7624 0.8731
No log 13.1429 92 0.7246 0.5591 0.7246 0.8513
No log 13.4286 94 0.6927 0.5189 0.6927 0.8323
No log 13.7143 96 0.7002 0.4795 0.7002 0.8368
No log 14.0 98 0.7070 0.4804 0.7070 0.8408
No log 14.2857 100 0.7122 0.4970 0.7122 0.8439
No log 14.5714 102 0.8013 0.5186 0.8013 0.8952
No log 14.8571 104 0.8327 0.5182 0.8327 0.9125
No log 15.1429 106 0.7526 0.4937 0.7526 0.8675
No log 15.4286 108 0.8110 0.4873 0.8110 0.9006
No log 15.7143 110 0.8082 0.4865 0.8082 0.8990
No log 16.0 112 0.7553 0.5138 0.7553 0.8691
No log 16.2857 114 0.8106 0.5003 0.8106 0.9004
No log 16.5714 116 0.9329 0.4327 0.9329 0.9659
No log 16.8571 118 0.8411 0.4934 0.8411 0.9171
No log 17.1429 120 0.7623 0.5189 0.7623 0.8731
No log 17.4286 122 0.7293 0.4649 0.7293 0.8540
No log 17.7143 124 0.7198 0.5015 0.7198 0.8484
No log 18.0 126 0.8276 0.4718 0.8276 0.9097
No log 18.2857 128 0.8099 0.4220 0.8099 0.9000
No log 18.5714 130 0.7223 0.5414 0.7223 0.8499
No log 18.8571 132 0.7156 0.4894 0.7156 0.8459
No log 19.1429 134 0.7489 0.5357 0.7489 0.8654
No log 19.4286 136 0.8579 0.4720 0.8579 0.9262
No log 19.7143 138 1.0977 0.4112 1.0977 1.0477
No log 20.0 140 1.1266 0.3788 1.1266 1.0614
No log 20.2857 142 0.9039 0.4443 0.9039 0.9507
No log 20.5714 144 0.7587 0.4550 0.7587 0.8710
No log 20.8571 146 0.9137 0.4550 0.9137 0.9559
No log 21.1429 148 0.9401 0.4345 0.9401 0.9696
No log 21.4286 150 0.7985 0.4989 0.7985 0.8936
No log 21.7143 152 0.7222 0.4958 0.7222 0.8498
No log 22.0 154 0.7947 0.4813 0.7947 0.8915
No log 22.2857 156 0.8960 0.4513 0.8960 0.9466
No log 22.5714 158 0.8604 0.4604 0.8604 0.9276
No log 22.8571 160 0.7566 0.4897 0.7566 0.8698
No log 23.1429 162 0.7432 0.4803 0.7432 0.8621
No log 23.4286 164 0.7385 0.4499 0.7385 0.8594
No log 23.7143 166 0.7221 0.4926 0.7221 0.8498
No log 24.0 168 0.6786 0.5037 0.6786 0.8238
No log 24.2857 170 0.6521 0.5213 0.6521 0.8075
No log 24.5714 172 0.6847 0.5480 0.6847 0.8275
No log 24.8571 174 0.7619 0.4958 0.7619 0.8729
No log 25.1429 176 0.7944 0.5110 0.7944 0.8913
No log 25.4286 178 0.7576 0.5152 0.7576 0.8704
No log 25.7143 180 0.7450 0.4970 0.7450 0.8631
No log 26.0 182 0.7315 0.5191 0.7315 0.8553
No log 26.2857 184 0.6731 0.4999 0.6731 0.8204
No log 26.5714 186 0.6323 0.5110 0.6323 0.7952
No log 26.8571 188 0.6099 0.5065 0.6099 0.7810
No log 27.1429 190 0.5921 0.5203 0.5921 0.7695
No log 27.4286 192 0.5912 0.5033 0.5912 0.7689
No log 27.7143 194 0.6056 0.5214 0.6056 0.7782
No log 28.0 196 0.6496 0.5309 0.6496 0.8060
No log 28.2857 198 0.6526 0.5411 0.6526 0.8078
No log 28.5714 200 0.6361 0.5494 0.6361 0.7975
No log 28.8571 202 0.6423 0.5689 0.6423 0.8014
No log 29.1429 204 0.6740 0.5319 0.6740 0.8210
No log 29.4286 206 0.6470 0.5815 0.6470 0.8044
No log 29.7143 208 0.6456 0.5699 0.6456 0.8035
No log 30.0 210 0.6484 0.5604 0.6484 0.8053
No log 30.2857 212 0.6736 0.5339 0.6736 0.8207
No log 30.5714 214 0.6727 0.5390 0.6727 0.8202
No log 30.8571 216 0.6614 0.5838 0.6614 0.8133
No log 31.1429 218 0.6471 0.5613 0.6471 0.8044
No log 31.4286 220 0.6592 0.5601 0.6592 0.8119
No log 31.7143 222 0.6697 0.5703 0.6697 0.8184
No log 32.0 224 0.6926 0.5712 0.6926 0.8322
No log 32.2857 226 0.6821 0.5712 0.6821 0.8259
No log 32.5714 228 0.6511 0.5754 0.6511 0.8069
No log 32.8571 230 0.6351 0.5753 0.6351 0.7969
No log 33.1429 232 0.6322 0.5920 0.6322 0.7951
No log 33.4286 234 0.6451 0.5586 0.6451 0.8032
No log 33.7143 236 0.6618 0.5589 0.6618 0.8135
No log 34.0 238 0.6601 0.5579 0.6601 0.8125
No log 34.2857 240 0.6363 0.5627 0.6363 0.7977
No log 34.5714 242 0.6240 0.5622 0.6240 0.7899
No log 34.8571 244 0.6247 0.5966 0.6247 0.7904
No log 35.1429 246 0.6523 0.5404 0.6523 0.8077
No log 35.4286 248 0.6558 0.5456 0.6558 0.8098
No log 35.7143 250 0.6322 0.5590 0.6322 0.7951
No log 36.0 252 0.6330 0.5569 0.6330 0.7956
No log 36.2857 254 0.6452 0.5404 0.6452 0.8032
No log 36.5714 256 0.6736 0.5384 0.6736 0.8207
No log 36.8571 258 0.7070 0.5098 0.7070 0.8408
No log 37.1429 260 0.7118 0.5324 0.7118 0.8437
No log 37.4286 262 0.7057 0.5078 0.7057 0.8401
No log 37.7143 264 0.6760 0.5550 0.6760 0.8222
No log 38.0 266 0.6459 0.5505 0.6459 0.8037
No log 38.2857 268 0.6560 0.5623 0.6560 0.8100
No log 38.5714 270 0.6932 0.5207 0.6932 0.8326
No log 38.8571 272 0.7327 0.5379 0.7327 0.8560
No log 39.1429 274 0.7813 0.5166 0.7813 0.8839
No log 39.4286 276 0.7910 0.5149 0.7910 0.8894
No log 39.7143 278 0.7479 0.5186 0.7479 0.8648
No log 40.0 280 0.6983 0.5227 0.6983 0.8357
No log 40.2857 282 0.6572 0.5307 0.6572 0.8107
No log 40.5714 284 0.6532 0.5604 0.6532 0.8082
No log 40.8571 286 0.6673 0.5176 0.6673 0.8169
No log 41.1429 288 0.7071 0.5088 0.7071 0.8409
No log 41.4286 290 0.7369 0.5176 0.7369 0.8584
No log 41.7143 292 0.7402 0.5487 0.7402 0.8603
No log 42.0 294 0.7310 0.5173 0.7310 0.8550
No log 42.2857 296 0.7470 0.5249 0.7470 0.8643
No log 42.5714 298 0.7522 0.5249 0.7522 0.8673
No log 42.8571 300 0.7203 0.5124 0.7203 0.8487
No log 43.1429 302 0.7119 0.5166 0.7119 0.8438
No log 43.4286 304 0.7256 0.5247 0.7256 0.8518
No log 43.7143 306 0.6882 0.5306 0.6882 0.8296
No log 44.0 308 0.6418 0.5495 0.6418 0.8011
No log 44.2857 310 0.6225 0.5432 0.6225 0.7890
No log 44.5714 312 0.6302 0.5155 0.6302 0.7938
No log 44.8571 314 0.6405 0.5076 0.6405 0.8003
No log 45.1429 316 0.6480 0.5609 0.6480 0.8050
No log 45.4286 318 0.6700 0.5596 0.6700 0.8185
No log 45.7143 320 0.7000 0.5435 0.7000 0.8367
No log 46.0 322 0.7163 0.5291 0.7163 0.8463
No log 46.2857 324 0.7169 0.5497 0.7169 0.8467
No log 46.5714 326 0.7161 0.5455 0.7161 0.8462
No log 46.8571 328 0.7148 0.5455 0.7148 0.8455
No log 47.1429 330 0.6896 0.5470 0.6896 0.8304
No log 47.4286 332 0.6641 0.5470 0.6641 0.8149
No log 47.7143 334 0.6599 0.5538 0.6599 0.8124
No log 48.0 336 0.6759 0.5411 0.6759 0.8221
No log 48.2857 338 0.6829 0.5451 0.6829 0.8264
No log 48.5714 340 0.7059 0.5504 0.7059 0.8402
No log 48.8571 342 0.7190 0.5397 0.7190 0.8479
No log 49.1429 344 0.7242 0.5552 0.7242 0.8510
No log 49.4286 346 0.7199 0.5274 0.7199 0.8484
No log 49.7143 348 0.7010 0.5376 0.7010 0.8373
No log 50.0 350 0.6517 0.5472 0.6517 0.8073
No log 50.2857 352 0.6156 0.5717 0.6156 0.7846
No log 50.5714 354 0.6031 0.5693 0.6031 0.7766
No log 50.8571 356 0.6089 0.5610 0.6089 0.7803
No log 51.1429 358 0.6393 0.5472 0.6393 0.7996
No log 51.4286 360 0.6559 0.5472 0.6559 0.8099
No log 51.7143 362 0.6479 0.5361 0.6479 0.8049
No log 52.0 364 0.6354 0.5184 0.6354 0.7971
No log 52.2857 366 0.6326 0.5432 0.6326 0.7954
No log 52.5714 368 0.6287 0.5448 0.6287 0.7929
No log 52.8571 370 0.6253 0.5375 0.6253 0.7907
No log 53.1429 372 0.6386 0.5223 0.6386 0.7991
No log 53.4286 374 0.6521 0.5211 0.6521 0.8075
No log 53.7143 376 0.6852 0.5145 0.6852 0.8278
No log 54.0 378 0.7146 0.4954 0.7146 0.8453
No log 54.2857 380 0.7076 0.5041 0.7076 0.8412
No log 54.5714 382 0.6714 0.5281 0.6714 0.8194
No log 54.8571 384 0.6628 0.5354 0.6628 0.8141
No log 55.1429 386 0.6930 0.5215 0.6930 0.8325
No log 55.4286 388 0.7187 0.5375 0.7187 0.8478
No log 55.7143 390 0.7287 0.5345 0.7287 0.8537
No log 56.0 392 0.7129 0.5292 0.7129 0.8443
No log 56.2857 394 0.6875 0.5285 0.6875 0.8292
No log 56.5714 396 0.6787 0.5294 0.6787 0.8239
No log 56.8571 398 0.6760 0.5049 0.6760 0.8222
No log 57.1429 400 0.6946 0.5048 0.6946 0.8335
No log 57.4286 402 0.7129 0.5328 0.7129 0.8443
No log 57.7143 404 0.7263 0.5145 0.7263 0.8522
No log 58.0 406 0.7396 0.5328 0.7396 0.8600
No log 58.2857 408 0.7696 0.5372 0.7696 0.8773
No log 58.5714 410 0.8131 0.5258 0.8131 0.9017
No log 58.8571 412 0.8491 0.5289 0.8491 0.9215
No log 59.1429 414 0.8707 0.5017 0.8707 0.9331
No log 59.4286 416 0.8414 0.5302 0.8414 0.9173
No log 59.7143 418 0.7996 0.5264 0.7996 0.8942
No log 60.0 420 0.7762 0.5434 0.7762 0.8810
No log 60.2857 422 0.7475 0.4849 0.7475 0.8646
No log 60.5714 424 0.7196 0.5050 0.7196 0.8483
No log 60.8571 426 0.7036 0.4583 0.7036 0.8388
No log 61.1429 428 0.7064 0.4898 0.7064 0.8405
No log 61.4286 430 0.7020 0.5300 0.7020 0.8379
No log 61.7143 432 0.7023 0.5427 0.7023 0.8380
No log 62.0 434 0.6878 0.5137 0.6878 0.8293
No log 62.2857 436 0.6777 0.5083 0.6777 0.8232
No log 62.5714 438 0.6602 0.5118 0.6602 0.8125
No log 62.8571 440 0.6440 0.5132 0.6440 0.8025
No log 63.1429 442 0.6366 0.5268 0.6366 0.7979
No log 63.4286 444 0.6385 0.5276 0.6385 0.7990
No log 63.7143 446 0.6485 0.4741 0.6485 0.8053
No log 64.0 448 0.6651 0.4912 0.6651 0.8155
No log 64.2857 450 0.6795 0.5076 0.6795 0.8243
No log 64.5714 452 0.6790 0.5076 0.6790 0.8240
No log 64.8571 454 0.6826 0.5048 0.6826 0.8262
No log 65.1429 456 0.6876 0.5119 0.6876 0.8292
No log 65.4286 458 0.6815 0.5143 0.6815 0.8255
No log 65.7143 460 0.6733 0.5215 0.6733 0.8205
No log 66.0 462 0.6692 0.5287 0.6692 0.8180
No log 66.2857 464 0.6648 0.5552 0.6648 0.8153
No log 66.5714 466 0.6596 0.5481 0.6596 0.8121
No log 66.8571 468 0.6518 0.5280 0.6518 0.8073
No log 67.1429 470 0.6491 0.5280 0.6491 0.8056
No log 67.4286 472 0.6454 0.5199 0.6454 0.8033
No log 67.7143 474 0.6435 0.5458 0.6435 0.8022
No log 68.0 476 0.6431 0.6018 0.6431 0.8019
No log 68.2857 478 0.6438 0.5574 0.6438 0.8024
No log 68.5714 480 0.6496 0.5405 0.6496 0.8060
No log 68.8571 482 0.6594 0.5405 0.6594 0.8120
No log 69.1429 484 0.6599 0.5843 0.6599 0.8123
No log 69.4286 486 0.6566 0.5505 0.6566 0.8103
No log 69.7143 488 0.6610 0.5394 0.6610 0.8130
No log 70.0 490 0.6736 0.5470 0.6736 0.8207
No log 70.2857 492 0.6762 0.5369 0.6762 0.8223
No log 70.5714 494 0.6642 0.5363 0.6642 0.8150
No log 70.8571 496 0.6484 0.5461 0.6484 0.8053
No log 71.1429 498 0.6364 0.5399 0.6364 0.7977
0.2277 71.4286 500 0.6365 0.5478 0.6365 0.7978
0.2277 71.7143 502 0.6505 0.5556 0.6505 0.8065
0.2277 72.0 504 0.6599 0.5538 0.6599 0.8123
0.2277 72.2857 506 0.6655 0.5538 0.6655 0.8158
0.2277 72.5714 508 0.6689 0.5415 0.6689 0.8179
0.2277 72.8571 510 0.6631 0.5069 0.6631 0.8143
0.2277 73.1429 512 0.6635 0.4959 0.6635 0.8145
0.2277 73.4286 514 0.6726 0.5257 0.6726 0.8201
0.2277 73.7143 516 0.6777 0.5454 0.6777 0.8232

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1