ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a loading/evaluation sketch follows the metrics list):

  • Loss: 0.6770
  • Qwk: 0.4036
  • Mse: 0.6770
  • Rmse: 0.8228
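
These figures correspond to the final logged step in the training results below (epoch 10.8333, step 520). For illustration only, here is a minimal loading-and-scoring sketch; it assumes the checkpoint exposes a single-output regression head and integer gold scores, neither of which this card confirms, and the texts/labels are placeholders.

```python
# Hedged sketch: load the checkpoint and reproduce Qwk / Mse / Rmse.
# Assumptions (not stated in the card): single-output regression head,
# integer gold scores; `texts` and `gold` below are placeholders.
import torch
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k9_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo).eval()

texts = ["..."]  # evaluation essays (placeholder)
gold = [2]       # gold organization scores (placeholder)

with torch.no_grad():
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    preds = model(**batch).logits.squeeze(-1)  # (batch,) for a 1-output head

# Qwk is quadratic weighted kappa; it needs discrete labels, so round predictions.
mse = mean_squared_error(gold, preds.numpy())
qwk = cohen_kappa_score(gold, preds.round().long().numpy(), weights="quadratic")
print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={mse ** 0.5:.4f}")
```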

Model description

More information needed. Judging by the model name alone, this appears to be an AraBERT model fine-tuned to score the organization trait (task 2) of Arabic essays; the card itself does not confirm this.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
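
As a rough reproduction guide (not taken verbatim from the card), these settings map onto transformers' TrainingArguments as sketched below; output_dir is hypothetical, and the logging/eval cadence is inferred from the results table ("No log" until step 500, evaluation every 2 steps).

```python
# Hedged sketch of the reported hyperparameters as TrainingArguments.
# output_dir is hypothetical; logging/eval cadence is inferred, not stated.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,        # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    logging_steps=500,     # matches the "No log" pattern in the results table
    eval_strategy="steps",
    eval_steps=2,          # the table logs validation every 2 steps
)
```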

Training results

Training loss was logged every 500 steps, so earlier rows show "No log". The final row (epoch 10.8333, step 520) matches the headline metrics above; for reference, the lowest validation loss is 0.6036 (epoch 6.5417) and the highest Qwk is 0.5323 (epoch 10.0). Although num_epochs was set to 100, the log ends at epoch 10.8333, suggesting training stopped early; the card does not say why.

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0417 2 4.3763 -0.0241 4.3763 2.0920
No log 0.0833 4 2.5518 -0.0214 2.5518 1.5974
No log 0.125 6 1.7974 -0.0379 1.7974 1.3407
No log 0.1667 8 1.6811 -0.0766 1.6811 1.2966
No log 0.2083 10 1.2895 0.0278 1.2895 1.1355
No log 0.25 12 0.9071 0.0152 0.9071 0.9524
No log 0.2917 14 0.7998 0.2023 0.7998 0.8943
No log 0.3333 16 0.8628 0.1547 0.8628 0.9289
No log 0.375 18 0.9525 0.0342 0.9525 0.9760
No log 0.4167 20 0.9968 -0.0459 0.9968 0.9984
No log 0.4583 22 1.0992 -0.0007 1.0992 1.0484
No log 0.5 24 0.9171 0.1080 0.9171 0.9577
No log 0.5417 26 0.8036 0.1629 0.8036 0.8965
No log 0.5833 28 0.8193 0.2164 0.8193 0.9052
No log 0.625 30 0.9284 0.1018 0.9284 0.9635
No log 0.6667 32 0.9444 0.1126 0.9444 0.9718
No log 0.7083 34 0.8649 0.1550 0.8649 0.9300
No log 0.75 36 0.7703 0.2954 0.7703 0.8777
No log 0.7917 38 0.7510 0.1926 0.7510 0.8666
No log 0.8333 40 0.7570 0.2261 0.7570 0.8701
No log 0.875 42 0.7597 0.2041 0.7597 0.8716
No log 0.9167 44 0.7696 0.2125 0.7696 0.8773
No log 0.9583 46 0.7838 0.2567 0.7838 0.8853
No log 1.0 48 0.7830 0.2769 0.7830 0.8849
No log 1.0417 50 0.8013 0.1694 0.8013 0.8951
No log 1.0833 52 0.7731 0.2691 0.7731 0.8793
No log 1.125 54 0.8291 0.1590 0.8291 0.9106
No log 1.1667 56 0.9941 0.2039 0.9941 0.9970
No log 1.2083 58 0.8577 0.2499 0.8577 0.9261
No log 1.25 60 0.6666 0.4292 0.6666 0.8164
No log 1.2917 62 0.6766 0.4153 0.6766 0.8225
No log 1.3333 64 0.6739 0.4345 0.6739 0.8209
No log 1.375 66 0.8257 0.3266 0.8257 0.9087
No log 1.4167 68 0.7408 0.3257 0.7408 0.8607
No log 1.4583 70 0.6725 0.4677 0.6725 0.8201
No log 1.5 72 0.8568 0.2649 0.8568 0.9256
No log 1.5417 74 0.8973 0.2767 0.8973 0.9472
No log 1.5833 76 0.6858 0.4710 0.6858 0.8281
No log 1.625 78 0.7562 0.4394 0.7562 0.8696
No log 1.6667 80 0.7523 0.4350 0.7523 0.8674
No log 1.7083 82 0.6980 0.4770 0.6980 0.8355
No log 1.75 84 0.7730 0.3553 0.7730 0.8792
No log 1.7917 86 0.7739 0.3240 0.7739 0.8797
No log 1.8333 88 0.6665 0.4467 0.6665 0.8164
No log 1.875 90 0.7671 0.4972 0.7671 0.8758
No log 1.9167 92 0.6957 0.4493 0.6957 0.8341
No log 1.9583 94 0.6830 0.5031 0.6830 0.8264
No log 2.0 96 0.6613 0.4708 0.6613 0.8132
No log 2.0417 98 0.7047 0.4664 0.7047 0.8395
No log 2.0833 100 0.7138 0.4500 0.7138 0.8449
No log 2.125 102 0.6829 0.4111 0.6829 0.8264
No log 2.1667 104 0.7300 0.3936 0.7300 0.8544
No log 2.2083 106 0.7319 0.3741 0.7319 0.8555
No log 2.25 108 0.7594 0.4012 0.7594 0.8715
No log 2.2917 110 0.8482 0.4625 0.8482 0.9210
No log 2.3333 112 0.7790 0.4267 0.7790 0.8826
No log 2.375 114 0.7420 0.4344 0.7420 0.8614
No log 2.4167 116 0.7450 0.4243 0.7450 0.8631
No log 2.4583 118 0.8100 0.5000 0.8100 0.9000
No log 2.5 120 0.7595 0.4268 0.7595 0.8715
No log 2.5417 122 0.7847 0.4336 0.7847 0.8858
No log 2.5833 124 0.8454 0.3721 0.8454 0.9195
No log 2.625 126 0.9016 0.3498 0.9016 0.9495
No log 2.6667 128 0.6546 0.4102 0.6546 0.8090
No log 2.7083 130 0.9167 0.4554 0.9167 0.9574
No log 2.75 132 0.9086 0.4568 0.9086 0.9532
No log 2.7917 134 0.6499 0.4209 0.6499 0.8061
No log 2.8333 136 0.6471 0.4006 0.6471 0.8044
No log 2.875 138 0.6925 0.4192 0.6925 0.8322
No log 2.9167 140 0.6870 0.4560 0.6870 0.8289
No log 2.9583 142 0.6520 0.4219 0.6520 0.8075
No log 3.0 144 0.6578 0.4705 0.6578 0.8111
No log 3.0417 146 0.6517 0.4483 0.6517 0.8073
No log 3.0833 148 0.6509 0.4652 0.6509 0.8068
No log 3.125 150 0.6599 0.4265 0.6599 0.8124
No log 3.1667 152 0.6971 0.4719 0.6971 0.8349
No log 3.2083 154 0.8255 0.3920 0.8255 0.9086
No log 3.25 156 0.6865 0.4837 0.6865 0.8286
No log 3.2917 158 0.6565 0.4615 0.6565 0.8103
No log 3.3333 160 0.6561 0.5159 0.6561 0.8100
No log 3.375 162 0.6347 0.5216 0.6347 0.7967
No log 3.4167 164 0.6688 0.3936 0.6688 0.8178
No log 3.4583 166 0.7227 0.4073 0.7227 0.8501
No log 3.5 168 0.6802 0.4346 0.6802 0.8247
No log 3.5417 170 0.6419 0.4757 0.6419 0.8012
No log 3.5833 172 0.6659 0.4606 0.6659 0.8160
No log 3.625 174 0.6612 0.4693 0.6612 0.8131
No log 3.6667 176 0.9147 0.4508 0.9147 0.9564
No log 3.7083 178 0.9688 0.4311 0.9688 0.9843
No log 3.75 180 0.8884 0.4685 0.8884 0.9425
No log 3.7917 182 0.6974 0.4672 0.6974 0.8351
No log 3.8333 184 0.6822 0.4710 0.6822 0.8260
No log 3.875 186 0.7263 0.4732 0.7263 0.8522
No log 3.9167 188 0.6827 0.4294 0.6827 0.8263
No log 3.9583 190 0.7398 0.4927 0.7398 0.8601
No log 4.0 192 0.7056 0.5028 0.7056 0.8400
No log 4.0417 194 0.6446 0.4303 0.6446 0.8029
No log 4.0833 196 0.6629 0.5071 0.6629 0.8142
No log 4.125 198 0.6690 0.4871 0.6690 0.8179
No log 4.1667 200 0.6647 0.4867 0.6647 0.8153
No log 4.2083 202 0.6469 0.4999 0.6469 0.8043
No log 4.25 204 0.6364 0.4590 0.6364 0.7977
No log 4.2917 206 0.6464 0.4948 0.6464 0.8040
No log 4.3333 208 0.6253 0.4820 0.6253 0.7908
No log 4.375 210 0.6426 0.3894 0.6426 0.8016
No log 4.4167 212 0.6661 0.5153 0.6661 0.8161
No log 4.4583 214 0.6897 0.4859 0.6897 0.8305
No log 4.5 216 0.6889 0.4816 0.6889 0.8300
No log 4.5417 218 0.6517 0.4779 0.6517 0.8073
No log 4.5833 220 0.6466 0.4228 0.6466 0.8041
No log 4.625 222 0.6280 0.4738 0.6280 0.7924
No log 4.6667 224 0.7118 0.4697 0.7118 0.8437
No log 4.7083 226 0.8949 0.3270 0.8949 0.9460
No log 4.75 228 0.8942 0.3277 0.8942 0.9456
No log 4.7917 230 0.7210 0.4266 0.7210 0.8491
No log 4.8333 232 0.6333 0.4399 0.6333 0.7958
No log 4.875 234 0.6414 0.4788 0.6414 0.8009
No log 4.9167 236 0.6446 0.4945 0.6446 0.8028
No log 4.9583 238 0.7178 0.4954 0.7178 0.8472
No log 5.0 240 0.6737 0.4959 0.6737 0.8208
No log 5.0417 242 0.6449 0.4269 0.6449 0.8031
No log 5.0833 244 0.6975 0.4186 0.6975 0.8352
No log 5.125 246 0.6599 0.4655 0.6599 0.8124
No log 5.1667 248 0.6665 0.4526 0.6665 0.8164
No log 5.2083 250 0.6515 0.4749 0.6515 0.8071
No log 5.25 252 0.6534 0.4643 0.6534 0.8083
No log 5.2917 254 0.6834 0.4926 0.6834 0.8267
No log 5.3333 256 0.6818 0.5133 0.6818 0.8257
No log 5.375 258 0.6414 0.3941 0.6414 0.8009
No log 5.4167 260 0.6617 0.4215 0.6617 0.8134
No log 5.4583 262 0.6756 0.4151 0.6756 0.8220
No log 5.5 264 0.6594 0.4224 0.6594 0.8120
No log 5.5417 266 0.6808 0.4721 0.6808 0.8251
No log 5.5833 268 0.6695 0.4217 0.6695 0.8182
No log 5.625 270 0.6828 0.4598 0.6828 0.8263
No log 5.6667 272 0.6745 0.4238 0.6745 0.8213
No log 5.7083 274 0.6502 0.3667 0.6502 0.8064
No log 5.75 276 0.6983 0.3714 0.6983 0.8356
No log 5.7917 278 0.7511 0.3900 0.7511 0.8667
No log 5.8333 280 0.6846 0.4455 0.6846 0.8274
No log 5.875 282 0.6444 0.3742 0.6444 0.8027
No log 5.9167 284 0.6587 0.3543 0.6587 0.8116
No log 5.9583 286 0.6750 0.4119 0.6750 0.8216
No log 6.0 288 0.6326 0.3724 0.6326 0.7954
No log 6.0417 290 0.6728 0.4488 0.6728 0.8203
No log 6.0833 292 0.8701 0.3875 0.8701 0.9328
No log 6.125 294 0.9125 0.4140 0.9125 0.9552
No log 6.1667 296 0.7822 0.4543 0.7822 0.8844
No log 6.2083 298 0.6478 0.4135 0.6478 0.8049
No log 6.25 300 0.6943 0.4541 0.6943 0.8333
No log 6.2917 302 0.7041 0.4339 0.7041 0.8391
No log 6.3333 304 0.6826 0.4577 0.6826 0.8262
No log 6.375 306 0.6791 0.4533 0.6791 0.8241
No log 6.4167 308 0.7367 0.4851 0.7367 0.8583
No log 6.4583 310 0.7199 0.4411 0.7199 0.8484
No log 6.5 312 0.6338 0.4476 0.6338 0.7961
No log 6.5417 314 0.6036 0.4083 0.6036 0.7769
No log 6.5833 316 0.6714 0.4948 0.6714 0.8194
No log 6.625 318 0.6745 0.5070 0.6745 0.8213
No log 6.6667 320 0.6225 0.4793 0.6225 0.7890
No log 6.7083 322 0.7083 0.5044 0.7083 0.8416
No log 6.75 324 0.8051 0.5148 0.8051 0.8973
No log 6.7917 326 0.7739 0.5025 0.7739 0.8797
No log 6.8333 328 0.6669 0.4882 0.6669 0.8166
No log 6.875 330 0.6424 0.4443 0.6424 0.8015
No log 6.9167 332 0.6510 0.4540 0.6510 0.8069
No log 6.9583 334 0.6745 0.4690 0.6745 0.8212
No log 7.0 336 0.7856 0.5038 0.7856 0.8863
No log 7.0417 338 0.8264 0.4427 0.8264 0.9091
No log 7.0833 340 0.7710 0.4316 0.7710 0.8780
No log 7.125 342 0.6960 0.3726 0.6960 0.8343
No log 7.1667 344 0.6677 0.3265 0.6677 0.8171
No log 7.2083 346 0.6800 0.3656 0.6800 0.8246
No log 7.25 348 0.7299 0.3907 0.7299 0.8543
No log 7.2917 350 0.7783 0.3898 0.7783 0.8822
No log 7.3333 352 0.7414 0.3500 0.7414 0.8610
No log 7.375 354 0.7091 0.3713 0.7091 0.8421
No log 7.4167 356 0.6792 0.3830 0.6792 0.8241
No log 7.4583 358 0.6772 0.3860 0.6772 0.8229
No log 7.5 360 0.6880 0.4109 0.6880 0.8295
No log 7.5417 362 0.6927 0.4296 0.6927 0.8323
No log 7.5833 364 0.6479 0.4191 0.6479 0.8049
No log 7.625 366 0.6354 0.4072 0.6354 0.7971
No log 7.6667 368 0.6292 0.3868 0.6292 0.7932
No log 7.7083 370 0.6481 0.4620 0.6481 0.8051
No log 7.75 372 0.7438 0.4594 0.7438 0.8624
No log 7.7917 374 0.7531 0.4363 0.7531 0.8678
No log 7.8333 376 0.6905 0.4803 0.6905 0.8310
No log 7.875 378 0.6215 0.4184 0.6215 0.7883
No log 7.9167 380 0.6153 0.3876 0.6153 0.7844
No log 7.9583 382 0.6151 0.3876 0.6151 0.7843
No log 8.0 384 0.6142 0.3736 0.6142 0.7837
No log 8.0417 386 0.6417 0.4630 0.6417 0.8010
No log 8.0833 388 0.7207 0.4555 0.7207 0.8489
No log 8.125 390 0.7218 0.4709 0.7218 0.8496
No log 8.1667 392 0.6508 0.5179 0.6508 0.8067
No log 8.2083 394 0.6181 0.4368 0.6181 0.7862
No log 8.25 396 0.6365 0.4714 0.6365 0.7978
No log 8.2917 398 0.6853 0.4217 0.6853 0.8278
No log 8.3333 400 0.6693 0.4404 0.6693 0.8181
No log 8.375 402 0.6208 0.4375 0.6208 0.7879
No log 8.4167 404 0.6278 0.4452 0.6278 0.7923
No log 8.4583 406 0.6528 0.4816 0.6528 0.8079
No log 8.5 408 0.6371 0.4364 0.6371 0.7982
No log 8.5417 410 0.6256 0.4262 0.6256 0.7909
No log 8.5833 412 0.6252 0.4231 0.6252 0.7907
No log 8.625 414 0.6253 0.4032 0.6253 0.7908
No log 8.6667 416 0.6402 0.4668 0.6402 0.8001
No log 8.7083 418 0.6651 0.4304 0.6651 0.8156
No log 8.75 420 0.6750 0.4382 0.6750 0.8216
No log 8.7917 422 0.6937 0.4206 0.6937 0.8329
No log 8.8333 424 0.6985 0.4206 0.6985 0.8357
No log 8.875 426 0.6869 0.4476 0.6869 0.8288
No log 8.9167 428 0.7012 0.5124 0.7012 0.8374
No log 8.9583 430 0.6786 0.4135 0.6786 0.8238
No log 9.0 432 0.6528 0.4564 0.6528 0.8080
No log 9.0417 434 0.6716 0.4268 0.6716 0.8195
No log 9.0833 436 0.6617 0.4242 0.6617 0.8134
No log 9.125 438 0.6337 0.4328 0.6337 0.7960
No log 9.1667 440 0.6261 0.4643 0.6261 0.7913
No log 9.2083 442 0.6307 0.4177 0.6307 0.7941
No log 9.25 444 0.6208 0.4653 0.6208 0.7879
No log 9.2917 446 0.6156 0.4576 0.6156 0.7846
No log 9.3333 448 0.6152 0.3830 0.6152 0.7844
No log 9.375 450 0.6229 0.3967 0.6229 0.7892
No log 9.4167 452 0.6081 0.4302 0.6081 0.7798
No log 9.4583 454 0.6170 0.4391 0.6170 0.7855
No log 9.5 456 0.6390 0.5144 0.6390 0.7994
No log 9.5417 458 0.6248 0.4416 0.6248 0.7904
No log 9.5833 460 0.6212 0.4434 0.6212 0.7882
No log 9.625 462 0.6288 0.4440 0.6288 0.7930
No log 9.6667 464 0.6385 0.4476 0.6385 0.7991
No log 9.7083 466 0.6408 0.4263 0.6408 0.8005
No log 9.75 468 0.6558 0.3755 0.6558 0.8098
No log 9.7917 470 0.6835 0.4341 0.6835 0.8267
No log 9.8333 472 0.6657 0.3587 0.6657 0.8159
No log 9.875 474 0.6531 0.4277 0.6531 0.8082
No log 9.9167 476 0.6569 0.4177 0.6569 0.8105
No log 9.9583 478 0.7014 0.4848 0.7014 0.8375
No log 10.0 480 0.7876 0.5323 0.7876 0.8874
No log 10.0417 482 0.8302 0.4908 0.8302 0.9111
No log 10.0833 484 0.7640 0.4862 0.7640 0.8741
No log 10.125 486 0.6811 0.4384 0.6811 0.8253
No log 10.1667 488 0.6621 0.3904 0.6621 0.8137
No log 10.2083 490 0.6894 0.4907 0.6894 0.8303
No log 10.25 492 0.7379 0.4874 0.7379 0.8590
No log 10.2917 494 0.8079 0.4792 0.8079 0.8988
No log 10.3333 496 0.7725 0.4699 0.7725 0.8789
No log 10.375 498 0.6927 0.4717 0.6927 0.8323
0.339 10.4167 500 0.6361 0.4312 0.6361 0.7976
0.339 10.4583 502 0.6318 0.4373 0.6318 0.7949
0.339 10.5 504 0.6471 0.4469 0.6471 0.8044
0.339 10.5417 506 0.6775 0.4854 0.6775 0.8231
0.339 10.5833 508 0.7012 0.4993 0.7012 0.8374
0.339 10.625 510 0.6993 0.4741 0.6993 0.8363
0.339 10.6667 512 0.6727 0.4289 0.6727 0.8202
0.339 10.7083 514 0.6612 0.4308 0.6612 0.8131
0.339 10.75 516 0.6520 0.3886 0.6520 0.8075
0.339 10.7917 518 0.6534 0.3922 0.6534 0.8083
0.339 10.8333 520 0.6770 0.4036 0.6770 0.8228

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

135M parameters (Safetensors, F32)