ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8056
  • Qwk (quadratic weighted kappa): 0.4824
  • Mse (mean squared error): 0.8056
  • Rmse (root mean squared error): 0.8976
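The evaluation metrics above can be recomputed from raw label/prediction pairs. The sketch below is an illustrative plain-Python implementation of quadratic weighted kappa and of MSE/RMSE (function names are my own, not from the training script); note that RMSE is just the square root of MSE, consistent with the reported 0.8976 ≈ √0.8056.

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement yields a kappa of 1.0, while systematic disagreement between distant labels drives it negative; the quadratic weighting penalizes large ordinal errors more than adjacent ones.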

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
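The hyperparameters above map directly onto a Hugging Face `TrainingArguments` configuration. This is an illustrative config fragment, not the exact training script used for this model; `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; "./results" is a placeholder path.
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```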

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0417 2 4.3322 -0.0048 4.3322 2.0814
No log 0.0833 4 2.5281 -0.0340 2.5281 1.5900
No log 0.125 6 1.4897 0.0185 1.4897 1.2205
No log 0.1667 8 1.1472 0.2343 1.1472 1.0711
No log 0.2083 10 1.4258 0.0343 1.4258 1.1941
No log 0.25 12 1.4676 0.0568 1.4676 1.2114
No log 0.2917 14 1.1440 0.1848 1.1440 1.0696
No log 0.3333 16 1.0720 0.1313 1.0720 1.0354
No log 0.375 18 1.0918 0.1864 1.0918 1.0449
No log 0.4167 20 1.0645 0.0422 1.0645 1.0318
No log 0.4583 22 1.1121 0.1076 1.1121 1.0546
No log 0.5 24 1.1023 0.0824 1.1023 1.0499
No log 0.5417 26 1.1543 0.1203 1.1543 1.0744
No log 0.5833 28 1.0460 0.1263 1.0460 1.0228
No log 0.625 30 0.9569 0.2865 0.9569 0.9782
No log 0.6667 32 0.9375 0.2865 0.9375 0.9682
No log 0.7083 34 0.9329 0.2671 0.9329 0.9659
No log 0.75 36 1.0419 0.1881 1.0419 1.0207
No log 0.7917 38 1.4336 -0.0270 1.4336 1.1973
No log 0.8333 40 1.3413 -0.0112 1.3413 1.1581
No log 0.875 42 0.8825 0.3221 0.8825 0.9394
No log 0.9167 44 0.9190 0.4406 0.9190 0.9587
No log 0.9583 46 0.9979 0.2956 0.9979 0.9989
No log 1.0 48 0.9045 0.4357 0.9045 0.9510
No log 1.0417 50 0.8200 0.4210 0.8200 0.9055
No log 1.0833 52 0.8961 0.3815 0.8961 0.9466
No log 1.125 54 0.9969 0.2956 0.9969 0.9984
No log 1.1667 56 1.1032 0.2038 1.1032 1.0503
No log 1.2083 58 1.1889 0.1426 1.1889 1.0904
No log 1.25 60 0.9552 0.3958 0.9552 0.9773
No log 1.2917 62 0.8357 0.5472 0.8357 0.9142
No log 1.3333 64 0.8712 0.5062 0.8712 0.9334
No log 1.375 66 0.8265 0.5195 0.8265 0.9091
No log 1.4167 68 0.7830 0.5107 0.7830 0.8849
No log 1.4583 70 0.7917 0.5528 0.7917 0.8898
No log 1.5 72 0.8603 0.5279 0.8603 0.9275
No log 1.5417 74 0.8340 0.5183 0.8340 0.9132
No log 1.5833 76 0.7311 0.5329 0.7311 0.8550
No log 1.625 78 0.7282 0.5748 0.7282 0.8533
No log 1.6667 80 0.7276 0.5650 0.7276 0.8530
No log 1.7083 82 0.7115 0.5797 0.7115 0.8435
No log 1.75 84 0.8166 0.4921 0.8166 0.9037
No log 1.7917 86 0.7273 0.5654 0.7272 0.8528
No log 1.8333 88 0.7993 0.5339 0.7993 0.8941
No log 1.875 90 1.4456 0.3099 1.4456 1.2023
No log 1.9167 92 1.4957 0.3138 1.4957 1.2230
No log 1.9583 94 1.1015 0.3539 1.1015 1.0495
No log 2.0 96 0.7592 0.4838 0.7592 0.8713
No log 2.0417 98 0.8232 0.4962 0.8232 0.9073
No log 2.0833 100 1.0543 0.3040 1.0543 1.0268
No log 2.125 102 1.0385 0.3424 1.0385 1.0191
No log 2.1667 104 0.8155 0.5245 0.8155 0.9031
No log 2.2083 106 0.7862 0.5575 0.7862 0.8867
No log 2.25 108 0.8147 0.4810 0.8147 0.9026
No log 2.2917 110 0.8344 0.3800 0.8344 0.9134
No log 2.3333 112 0.8142 0.5342 0.8142 0.9023
No log 2.375 114 0.9227 0.4482 0.9227 0.9606
No log 2.4167 116 0.9353 0.4577 0.9353 0.9671
No log 2.4583 118 0.8401 0.4966 0.8401 0.9166
No log 2.5 120 0.7965 0.5450 0.7965 0.8925
No log 2.5417 122 0.8057 0.4996 0.8057 0.8976
No log 2.5833 124 0.7873 0.5124 0.7873 0.8873
No log 2.625 126 0.8080 0.4964 0.8080 0.8989
No log 2.6667 128 0.8035 0.4969 0.8035 0.8964
No log 2.7083 130 0.7780 0.5451 0.7780 0.8821
No log 2.75 132 0.8098 0.5300 0.8098 0.8999
No log 2.7917 134 0.7938 0.5038 0.7938 0.8909
No log 2.8333 136 0.8025 0.5261 0.8025 0.8958
No log 2.875 138 0.8010 0.4903 0.8010 0.8950
No log 2.9167 140 0.8132 0.5463 0.8132 0.9018
No log 2.9583 142 0.9190 0.4521 0.9190 0.9587
No log 3.0 144 1.0141 0.3959 1.0141 1.0070
No log 3.0417 146 0.9964 0.4150 0.9964 0.9982
No log 3.0833 148 0.8910 0.3993 0.8910 0.9439
No log 3.125 150 0.8635 0.4910 0.8635 0.9293
No log 3.1667 152 0.8781 0.4956 0.8781 0.9371
No log 3.2083 154 0.8687 0.5002 0.8687 0.9320
No log 3.25 156 0.9480 0.4244 0.9480 0.9736
No log 3.2917 158 0.9392 0.4098 0.9392 0.9691
No log 3.3333 160 0.8499 0.5135 0.8499 0.9219
No log 3.375 162 0.8727 0.3908 0.8727 0.9342
No log 3.4167 164 0.8658 0.3908 0.8658 0.9305
No log 3.4583 166 0.8206 0.4676 0.8206 0.9059
No log 3.5 168 0.8540 0.4599 0.8540 0.9241
No log 3.5417 170 0.9222 0.4695 0.9222 0.9603
No log 3.5833 172 0.8504 0.4510 0.8504 0.9222
No log 3.625 174 0.7662 0.5621 0.7662 0.8753
No log 3.6667 176 0.8113 0.5098 0.8113 0.9007
No log 3.7083 178 0.7760 0.5253 0.7760 0.8809
No log 3.75 180 0.7376 0.5752 0.7376 0.8589
No log 3.7917 182 0.7214 0.5822 0.7214 0.8493
No log 3.8333 184 0.7139 0.5572 0.7139 0.8449
No log 3.875 186 0.6895 0.5247 0.6895 0.8304
No log 3.9167 188 0.6717 0.5747 0.6717 0.8196
No log 3.9583 190 0.6500 0.5871 0.6500 0.8062
No log 4.0 192 0.6237 0.6076 0.6237 0.7898
No log 4.0417 194 0.6414 0.6179 0.6414 0.8009
No log 4.0833 196 0.6532 0.6512 0.6532 0.8082
No log 4.125 198 0.6099 0.5742 0.6099 0.7810
No log 4.1667 200 0.6749 0.6082 0.6749 0.8215
No log 4.2083 202 0.7344 0.5857 0.7344 0.8570
No log 4.25 204 0.6795 0.5548 0.6795 0.8243
No log 4.2917 206 0.6474 0.5577 0.6474 0.8046
No log 4.3333 208 0.7071 0.5837 0.7071 0.8409
No log 4.375 210 0.7084 0.5746 0.7084 0.8416
No log 4.4167 212 0.6769 0.5391 0.6769 0.8227
No log 4.4583 214 0.6887 0.5214 0.6887 0.8299
No log 4.5 216 0.8266 0.5271 0.8266 0.9092
No log 4.5417 218 0.8964 0.5070 0.8964 0.9468
No log 4.5833 220 0.7725 0.5345 0.7725 0.8789
No log 4.625 222 0.6848 0.5396 0.6848 0.8275
No log 4.6667 224 0.7351 0.5585 0.7351 0.8574
No log 4.7083 226 0.7816 0.4560 0.7816 0.8841
No log 4.75 228 0.7741 0.3882 0.7741 0.8798
No log 4.7917 230 0.7753 0.4373 0.7753 0.8805
No log 4.8333 232 0.7593 0.4218 0.7593 0.8714
No log 4.875 234 0.7375 0.4124 0.7375 0.8588
No log 4.9167 236 0.7291 0.5025 0.7291 0.8539
No log 4.9583 238 0.7672 0.4973 0.7672 0.8759
No log 5.0 240 0.8069 0.4952 0.8069 0.8983
No log 5.0417 242 0.8570 0.5458 0.8570 0.9258
No log 5.0833 244 0.7863 0.4597 0.7863 0.8867
No log 5.125 246 0.7187 0.5734 0.7187 0.8478
No log 5.1667 248 0.7070 0.5171 0.7070 0.8408
No log 5.2083 250 0.8157 0.4578 0.8157 0.9032
No log 5.25 252 0.9051 0.4280 0.9051 0.9514
No log 5.2917 254 0.8427 0.4489 0.8427 0.9180
No log 5.3333 256 0.7294 0.5329 0.7294 0.8541
No log 5.375 258 0.7366 0.5232 0.7366 0.8582
No log 5.4167 260 0.8103 0.4697 0.8103 0.9002
No log 5.4583 262 0.7901 0.5088 0.7901 0.8889
No log 5.5 264 0.7732 0.5304 0.7732 0.8793
No log 5.5417 266 0.7664 0.5303 0.7664 0.8754
No log 5.5833 268 0.7688 0.4835 0.7688 0.8768
No log 5.625 270 0.8095 0.4513 0.8095 0.8997
No log 5.6667 272 0.7794 0.4069 0.7794 0.8828
No log 5.7083 274 0.8005 0.4714 0.8005 0.8947
No log 5.75 276 0.8894 0.4310 0.8894 0.9431
No log 5.7917 278 0.8908 0.4318 0.8908 0.9438
No log 5.8333 280 0.8475 0.4439 0.8475 0.9206
No log 5.875 282 0.8721 0.4310 0.8721 0.9339
No log 5.9167 284 0.8673 0.4310 0.8673 0.9313
No log 5.9583 286 0.8025 0.4714 0.8025 0.8958
No log 6.0 288 0.7557 0.4748 0.7557 0.8693
No log 6.0417 290 0.7620 0.4748 0.7620 0.8729
No log 6.0833 292 0.7999 0.4461 0.7999 0.8944
No log 6.125 294 0.8796 0.4054 0.8796 0.9379
No log 6.1667 296 0.9760 0.4197 0.9760 0.9879
No log 6.2083 298 0.9634 0.3953 0.9634 0.9815
No log 6.25 300 0.8784 0.4558 0.8784 0.9372
No log 6.2917 302 0.7759 0.5305 0.7759 0.8809
No log 6.3333 304 0.7060 0.4883 0.7060 0.8402
No log 6.375 306 0.6884 0.5142 0.6884 0.8297
No log 6.4167 308 0.6818 0.5129 0.6818 0.8257
No log 6.4583 310 0.7029 0.5676 0.7029 0.8384
No log 6.5 312 0.7450 0.5663 0.7450 0.8631
No log 6.5417 314 0.7159 0.5905 0.7159 0.8461
No log 6.5833 316 0.6941 0.5510 0.6941 0.8332
No log 6.625 318 0.7028 0.5950 0.7028 0.8383
No log 6.6667 320 0.6992 0.5510 0.6992 0.8362
No log 6.7083 322 0.7002 0.5510 0.7002 0.8368
No log 6.75 324 0.7063 0.5342 0.7063 0.8404
No log 6.7917 326 0.7439 0.5599 0.7439 0.8625
No log 6.8333 328 0.8410 0.5735 0.8410 0.9171
No log 6.875 330 0.8522 0.4794 0.8522 0.9232
No log 6.9167 332 0.7670 0.5964 0.7670 0.8758
No log 6.9583 334 0.7010 0.4898 0.7010 0.8372
No log 7.0 336 0.7317 0.5005 0.7317 0.8554
No log 7.0417 338 0.7231 0.4641 0.7231 0.8503
No log 7.0833 340 0.6862 0.5033 0.6862 0.8284
No log 7.125 342 0.7623 0.4824 0.7623 0.8731
No log 7.1667 344 0.8879 0.4216 0.8879 0.9423
No log 7.2083 346 0.9156 0.4216 0.9156 0.9569
No log 7.25 348 0.9001 0.3539 0.9001 0.9488
No log 7.2917 350 0.8835 0.3222 0.8835 0.9399
No log 7.3333 352 0.8397 0.3169 0.8397 0.9163
No log 7.375 354 0.7874 0.3902 0.7874 0.8873
No log 7.4167 356 0.7727 0.4180 0.7727 0.8790
No log 7.4583 358 0.7868 0.4911 0.7868 0.8870
No log 7.5 360 0.7883 0.5666 0.7883 0.8878
No log 7.5417 362 0.8288 0.5636 0.8288 0.9104
No log 7.5833 364 0.7960 0.6247 0.7960 0.8922
No log 7.625 366 0.7107 0.5923 0.7107 0.8430
No log 7.6667 368 0.6815 0.5678 0.6815 0.8255
No log 7.7083 370 0.6762 0.5833 0.6762 0.8223
No log 7.75 372 0.6870 0.4781 0.6870 0.8289
No log 7.7917 374 0.6921 0.5153 0.6921 0.8320
No log 7.8333 376 0.7100 0.5380 0.7100 0.8426
No log 7.875 378 0.7433 0.5601 0.7433 0.8622
No log 7.9167 380 0.7401 0.5733 0.7401 0.8603
No log 7.9583 382 0.7059 0.6082 0.7059 0.8402
No log 8.0 384 0.6679 0.5577 0.6679 0.8173
No log 8.0417 386 0.6745 0.5629 0.6745 0.8213
No log 8.0833 388 0.6799 0.5629 0.6799 0.8246
No log 8.125 390 0.6760 0.5629 0.6760 0.8222
No log 8.1667 392 0.6878 0.5304 0.6878 0.8293
No log 8.2083 394 0.6885 0.5442 0.6885 0.8298
No log 8.25 396 0.6829 0.5171 0.6829 0.8264
No log 8.2917 398 0.6945 0.5316 0.6945 0.8333
No log 8.3333 400 0.7035 0.4995 0.7035 0.8388
No log 8.375 402 0.7529 0.5429 0.7529 0.8677
No log 8.4167 404 0.7516 0.5429 0.7516 0.8669
No log 8.4583 406 0.7378 0.4987 0.7378 0.8590
No log 8.5 408 0.7400 0.4987 0.7400 0.8602
No log 8.5417 410 0.7282 0.5142 0.7282 0.8534
No log 8.5833 412 0.7281 0.4565 0.7281 0.8533
No log 8.625 414 0.7281 0.4547 0.7281 0.8533
No log 8.6667 416 0.7295 0.4882 0.7295 0.8541
No log 8.7083 418 0.7634 0.5540 0.7634 0.8737
No log 8.75 420 0.7658 0.5470 0.7658 0.8751
No log 8.7917 422 0.7648 0.5451 0.7648 0.8745
No log 8.8333 424 0.7353 0.5528 0.7353 0.8575
No log 8.875 426 0.7225 0.5247 0.7225 0.8500
No log 8.9167 428 0.7230 0.5002 0.7230 0.8503
No log 8.9583 430 0.7235 0.5017 0.7235 0.8506
No log 9.0 432 0.7351 0.4161 0.7351 0.8574
No log 9.0417 434 0.7200 0.4723 0.7200 0.8485
No log 9.0833 436 0.6980 0.5050 0.6980 0.8355
No log 9.125 438 0.6974 0.5587 0.6974 0.8351
No log 9.1667 440 0.7102 0.5731 0.7102 0.8427
No log 9.2083 442 0.7700 0.5658 0.7700 0.8775
No log 9.25 444 0.7801 0.5756 0.7801 0.8832
No log 9.2917 446 0.7199 0.6062 0.7199 0.8485
No log 9.3333 448 0.6807 0.5274 0.6807 0.8250
No log 9.375 450 0.7141 0.4640 0.7141 0.8451
No log 9.4167 452 0.7352 0.4545 0.7352 0.8575
No log 9.4583 454 0.7041 0.4393 0.7041 0.8391
No log 9.5 456 0.6733 0.5510 0.6733 0.8205
No log 9.5417 458 0.7006 0.5837 0.7006 0.8370
No log 9.5833 460 0.7936 0.5455 0.7936 0.8908
No log 9.625 462 0.8474 0.5428 0.8474 0.9206
No log 9.6667 464 0.8058 0.5455 0.8058 0.8977
No log 9.7083 466 0.7535 0.5777 0.7535 0.8680
No log 9.75 468 0.7313 0.6053 0.7313 0.8551
No log 9.7917 470 0.7335 0.5349 0.7335 0.8564
No log 9.8333 472 0.7405 0.5349 0.7405 0.8605
No log 9.875 474 0.7342 0.5244 0.7342 0.8569
No log 9.9167 476 0.7225 0.5380 0.7225 0.8500
No log 9.9583 478 0.7159 0.5274 0.7159 0.8461
No log 10.0 480 0.7134 0.5025 0.7134 0.8446
No log 10.0417 482 0.7214 0.5380 0.7214 0.8494
No log 10.0833 484 0.7597 0.5686 0.7597 0.8716
No log 10.125 486 0.8060 0.5131 0.8060 0.8978
No log 10.1667 488 0.7828 0.5756 0.7828 0.8848
No log 10.2083 490 0.7048 0.5787 0.7048 0.8395
No log 10.25 492 0.6723 0.5747 0.6723 0.8199
No log 10.2917 494 0.6885 0.4870 0.6885 0.8298
No log 10.3333 496 0.6814 0.5076 0.6814 0.8255
No log 10.375 498 0.6834 0.5247 0.6834 0.8267
0.3 10.4167 500 0.7437 0.5642 0.7437 0.8624
0.3 10.4583 502 0.7917 0.5254 0.7917 0.8898
0.3 10.5 504 0.7996 0.5266 0.7996 0.8942
0.3 10.5417 506 0.7948 0.4828 0.7948 0.8915
0.3 10.5833 508 0.8125 0.4180 0.8125 0.9014
0.3 10.625 510 0.8056 0.4824 0.8056 0.8976

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
