ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5669
  • Qwk: 0.4747
  • Mse: 0.5669
  • Rmse: 0.7529
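
The metric names above are not defined in the card. The sketch below shows one way they could be computed, assuming Qwk denotes quadratic weighted kappa over integer score labels and that continuous predictions are rounded before the kappa computation (both assumptions, not taken from the training script).

```python
# Minimal sketch of how the Qwk / Mse / Rmse values could be computed
# (assumed definitions -- the actual metric code is not part of this card).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def score_predictions(y_true, y_pred):
    """Return quadratic weighted kappa, MSE and RMSE for 1-D score arrays."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    # Kappa needs discrete labels; continuous predictions are rounded here
    # (assumption -- the actual rounding/clipping scheme is not documented).
    qwk = cohen_kappa_score(
        y_true.round().astype(int),
        y_pred.round().astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Example: score_predictions([0, 1, 2, 3], [0.2, 1.1, 1.8, 2.6])
```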

Model description

More information needed

Intended uses & limitations

More information needed
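
Since no usage guidance is provided, the following is a minimal, hypothetical inference sketch. It assumes the checkpoint loads as a standard AutoModelForSequenceClassification head; whether the output is a single regression score or class logits is not documented here.

```python
# Hypothetical inference sketch -- the head type and label meaning are
# not documented in this card, so treat this as an illustration only.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k18_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic text to be scored for the "organization" trait (assumed task)
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # a single regression score or class logits, depending on the saved head
```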

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
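
Below is a hedged sketch of an equivalent TrainingArguments configuration; output_dir and anything not listed above (dataset pipeline, preprocessing, metric function, logging cadence) are assumptions or omissions.

```python
# Hedged sketch of a TrainingArguments configuration matching the listed
# hyperparameters; values not listed in the card are marked as assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",            # assumed; not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

Judging from the results table below, validation was run every 2 steps and the training loss was only logged from step 500 onward, but those logging settings are not listed above.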

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0444 2 2.6703 -0.0262 2.6703 1.6341
No log 0.0889 4 1.4414 0.0511 1.4414 1.2006
No log 0.1333 6 1.2104 -0.1993 1.2104 1.1002
No log 0.1778 8 1.0885 -0.1095 1.0885 1.0433
No log 0.2222 10 1.1405 -0.2088 1.1405 1.0679
No log 0.2667 12 1.3054 -0.2026 1.3054 1.1425
No log 0.3111 14 1.1953 -0.2191 1.1953 1.0933
No log 0.3556 16 0.9715 0.0469 0.9715 0.9856
No log 0.4 18 0.8487 0.1184 0.8487 0.9212
No log 0.4444 20 0.7605 0.1139 0.7605 0.8720
No log 0.4889 22 0.7263 0.1863 0.7263 0.8522
No log 0.5333 24 0.7119 0.1863 0.7119 0.8437
No log 0.5778 26 0.7245 0.2206 0.7245 0.8512
No log 0.6222 28 0.7335 0.0846 0.7335 0.8565
No log 0.6667 30 0.7293 0.0481 0.7293 0.8540
No log 0.7111 32 0.7412 0.0 0.7412 0.8609
No log 0.7556 34 0.7349 0.0840 0.7349 0.8572
No log 0.8 36 0.7266 0.1617 0.7266 0.8524
No log 0.8444 38 0.7234 0.2270 0.7234 0.8506
No log 0.8889 40 0.7434 0.2206 0.7434 0.8622
No log 0.9333 42 0.7752 0.1807 0.7752 0.8804
No log 0.9778 44 0.7901 0.2046 0.7901 0.8889
No log 1.0222 46 0.8162 0.0851 0.8162 0.9034
No log 1.0667 48 0.8215 0.0053 0.8215 0.9064
No log 1.1111 50 0.8064 0.0509 0.8064 0.8980
No log 1.1556 52 0.7774 0.0053 0.7774 0.8817
No log 1.2 54 0.7464 0.1282 0.7464 0.8639
No log 1.2444 56 0.7439 0.1699 0.7439 0.8625
No log 1.2889 58 0.7539 0.1737 0.7539 0.8683
No log 1.3333 60 0.8049 0.1558 0.8049 0.8972
No log 1.3778 62 0.8533 0.1166 0.8533 0.9238
No log 1.4222 64 0.8114 0.2092 0.8114 0.9008
No log 1.4667 66 0.7791 0.3324 0.7791 0.8826
No log 1.5111 68 0.8180 0.2227 0.8180 0.9044
No log 1.5556 70 0.8138 0.2685 0.8138 0.9021
No log 1.6 72 0.7776 0.2685 0.7776 0.8818
No log 1.6444 74 0.7592 0.2685 0.7592 0.8713
No log 1.6889 76 0.7596 0.2652 0.7596 0.8716
No log 1.7333 78 0.7489 0.2652 0.7489 0.8654
No log 1.7778 80 0.7588 0.3594 0.7588 0.8711
No log 1.8222 82 0.7486 0.3594 0.7486 0.8652
No log 1.8667 84 0.7252 0.3594 0.7252 0.8516
No log 1.9111 86 0.7007 0.2285 0.7007 0.8371
No log 1.9556 88 0.7318 0.3868 0.7318 0.8554
No log 2.0 90 0.8332 0.3819 0.8332 0.9128
No log 2.0444 92 0.7886 0.3894 0.7886 0.8880
No log 2.0889 94 0.7137 0.4052 0.7137 0.8448
No log 2.1333 96 0.7068 0.4052 0.7068 0.8407
No log 2.1778 98 0.7616 0.3894 0.7616 0.8727
No log 2.2222 100 0.6923 0.4437 0.6923 0.8320
No log 2.2667 102 0.6812 0.3713 0.6812 0.8254
No log 2.3111 104 0.6885 0.3452 0.6885 0.8298
No log 2.3556 106 0.8211 0.3891 0.8211 0.9062
No log 2.4 108 0.9487 0.2626 0.9487 0.9740
No log 2.4444 110 0.8642 0.3019 0.8642 0.9296
No log 2.4889 112 0.7254 0.3937 0.7254 0.8517
No log 2.5333 114 0.7038 0.4234 0.7038 0.8389
No log 2.5778 116 0.8216 0.3731 0.8216 0.9064
No log 2.6222 118 1.0351 0.2968 1.0351 1.0174
No log 2.6667 120 0.9387 0.3503 0.9387 0.9689
No log 2.7111 122 0.9550 0.3608 0.9550 0.9772
No log 2.7556 124 0.8227 0.3747 0.8227 0.9070
No log 2.8 126 0.6298 0.4639 0.6298 0.7936
No log 2.8444 128 0.6946 0.2624 0.6946 0.8334
No log 2.8889 130 0.6680 0.3265 0.6680 0.8173
No log 2.9333 132 0.6431 0.4451 0.6431 0.8019
No log 2.9778 134 0.9337 0.3523 0.9337 0.9663
No log 3.0222 136 1.0354 0.3016 1.0354 1.0175
No log 3.0667 138 0.8585 0.3727 0.8585 0.9266
No log 3.1111 140 0.6917 0.4898 0.6917 0.8317
No log 3.1556 142 0.6502 0.4562 0.6502 0.8063
No log 3.2 144 0.6434 0.3836 0.6434 0.8021
No log 3.2444 146 0.6139 0.4222 0.6139 0.7835
No log 3.2889 148 0.6491 0.4582 0.6491 0.8057
No log 3.3333 150 0.6651 0.4089 0.6651 0.8155
No log 3.3778 152 0.6548 0.3590 0.6548 0.8092
No log 3.4222 154 0.7770 0.4562 0.7770 0.8815
No log 3.4667 156 0.7765 0.4562 0.7765 0.8812
No log 3.5111 158 0.6529 0.3843 0.6529 0.8080
No log 3.5556 160 0.5936 0.3862 0.5936 0.7704
No log 3.6 162 0.6188 0.4205 0.6188 0.7866
No log 3.6444 164 0.5839 0.3915 0.5839 0.7641
No log 3.6889 166 0.7057 0.4089 0.7057 0.8401
No log 3.7333 168 0.9339 0.3868 0.9339 0.9664
No log 3.7778 170 0.8321 0.4217 0.8321 0.9122
No log 3.8222 172 0.6602 0.3891 0.6602 0.8125
No log 3.8667 174 0.6310 0.3990 0.6310 0.7944
No log 3.9111 176 0.6660 0.3817 0.6660 0.8161
No log 3.9556 178 0.6909 0.4294 0.6909 0.8312
No log 4.0 180 0.6910 0.4190 0.6910 0.8313
No log 4.0444 182 0.7259 0.3918 0.7259 0.8520
No log 4.0889 184 0.7445 0.3891 0.7445 0.8629
No log 4.1333 186 0.7339 0.4190 0.7339 0.8567
No log 4.1778 188 0.6357 0.3408 0.6357 0.7973
No log 4.2222 190 0.6002 0.3703 0.6002 0.7747
No log 4.2667 192 0.5895 0.4847 0.5895 0.7678
No log 4.3111 194 0.6215 0.5034 0.6215 0.7883
No log 4.3556 196 0.6734 0.5063 0.6734 0.8206
No log 4.4 198 0.7204 0.4979 0.7204 0.8487
No log 4.4444 200 0.6559 0.4864 0.6559 0.8099
No log 4.4889 202 0.6598 0.4827 0.6598 0.8123
No log 4.5333 204 0.6011 0.4875 0.6011 0.7753
No log 4.5778 206 0.5779 0.5289 0.5779 0.7602
No log 4.6222 208 0.6283 0.5219 0.6283 0.7927
No log 4.6667 210 0.6272 0.5131 0.6272 0.7920
No log 4.7111 212 0.5655 0.5133 0.5655 0.7520
No log 4.7556 214 0.5567 0.5272 0.5567 0.7461
No log 4.8 216 0.5274 0.5640 0.5274 0.7262
No log 4.8444 218 0.5230 0.4678 0.5230 0.7232
No log 4.8889 220 0.5287 0.4918 0.5287 0.7271
No log 4.9333 222 0.5753 0.4881 0.5753 0.7585
No log 4.9778 224 0.5220 0.4703 0.5220 0.7225
No log 5.0222 226 0.6425 0.4482 0.6425 0.8016
No log 5.0667 228 0.8924 0.3945 0.8924 0.9447
No log 5.1111 230 0.8873 0.4033 0.8873 0.9420
No log 5.1556 232 0.6892 0.4366 0.6892 0.8302
No log 5.2 234 0.5602 0.4576 0.5602 0.7485
No log 5.2444 236 0.5459 0.4701 0.5459 0.7388
No log 5.2889 238 0.5722 0.4234 0.5722 0.7564
No log 5.3333 240 0.6691 0.4801 0.6691 0.8180
No log 5.3778 242 0.7954 0.4432 0.7954 0.8919
No log 5.4222 244 0.7479 0.4432 0.7479 0.8648
No log 5.4667 246 0.6297 0.4374 0.6297 0.7935
No log 5.5111 248 0.5944 0.5057 0.5944 0.7709
No log 5.5556 250 0.6060 0.4820 0.6060 0.7785
No log 5.6 252 0.6230 0.4467 0.6230 0.7893
No log 5.6444 254 0.7415 0.5263 0.7415 0.8611
No log 5.6889 256 0.8409 0.4455 0.8409 0.9170
No log 5.7333 258 0.7431 0.5263 0.7431 0.8620
No log 5.7778 260 0.6073 0.4622 0.6073 0.7793
No log 5.8222 262 0.5697 0.4337 0.5697 0.7548
No log 5.8667 264 0.5697 0.4082 0.5697 0.7548
No log 5.9111 266 0.5679 0.3625 0.5679 0.7536
No log 5.9556 268 0.5883 0.4179 0.5883 0.7670
No log 6.0 270 0.6174 0.4703 0.6174 0.7857
No log 6.0444 272 0.5970 0.4260 0.5970 0.7726
No log 6.0889 274 0.5822 0.3864 0.5822 0.7630
No log 6.1333 276 0.6323 0.5342 0.6323 0.7952
No log 6.1778 278 0.7465 0.5542 0.7465 0.8640
No log 6.2222 280 0.6913 0.5339 0.6913 0.8315
No log 6.2667 282 0.5920 0.4740 0.5920 0.7694
No log 6.3111 284 0.5802 0.3791 0.5802 0.7617
No log 6.3556 286 0.5895 0.3813 0.5895 0.7678
No log 6.4 288 0.5740 0.3643 0.5740 0.7576
No log 6.4444 290 0.6805 0.4666 0.6805 0.8249
No log 6.4889 292 0.8338 0.4511 0.8338 0.9131
No log 6.5333 294 0.8457 0.4511 0.8457 0.9196
No log 6.5778 296 0.7313 0.4247 0.7313 0.8552
No log 6.6222 298 0.6879 0.4741 0.6879 0.8294
No log 6.6667 300 0.6530 0.4759 0.6530 0.8081
No log 6.7111 302 0.5749 0.4985 0.5749 0.7582
No log 6.7556 304 0.5879 0.4857 0.5879 0.7668
No log 6.8 306 0.5884 0.4915 0.5884 0.7671
No log 6.8444 308 0.5737 0.5109 0.5737 0.7575
No log 6.8889 310 0.7074 0.4610 0.7074 0.8411
No log 6.9333 312 0.8606 0.4895 0.8606 0.9277
No log 6.9778 314 0.7863 0.4562 0.7863 0.8867
No log 7.0222 316 0.6188 0.3471 0.6188 0.7866
No log 7.0667 318 0.5413 0.3840 0.5413 0.7358
No log 7.1111 320 0.5301 0.3840 0.5301 0.7281
No log 7.1556 322 0.5453 0.3425 0.5453 0.7385
No log 7.2 324 0.5925 0.5140 0.5925 0.7697
No log 7.2444 326 0.6779 0.5457 0.6779 0.8233
No log 7.2889 328 0.6757 0.5827 0.6757 0.8220
No log 7.3333 330 0.6017 0.5616 0.6017 0.7757
No log 7.3778 332 0.6069 0.5542 0.6069 0.7791
No log 7.4222 334 0.5918 0.5171 0.5918 0.7693
No log 7.4667 336 0.5691 0.5771 0.5691 0.7544
No log 7.5111 338 0.5988 0.5884 0.5988 0.7738
No log 7.5556 340 0.6271 0.5259 0.6271 0.7919
No log 7.6 342 0.6283 0.5042 0.6283 0.7926
No log 7.6444 344 0.5967 0.4979 0.5967 0.7725
No log 7.6889 346 0.5818 0.4845 0.5818 0.7628
No log 7.7333 348 0.5703 0.4845 0.5703 0.7552
No log 7.7778 350 0.5367 0.4724 0.5367 0.7326
No log 7.8222 352 0.5415 0.4724 0.5415 0.7359
No log 7.8667 354 0.5574 0.5291 0.5574 0.7466
No log 7.9111 356 0.5282 0.5032 0.5282 0.7268
No log 7.9556 358 0.5154 0.5476 0.5154 0.7179
No log 8.0 360 0.5279 0.5640 0.5279 0.7265
No log 8.0444 362 0.5550 0.5447 0.5550 0.7450
No log 8.0889 364 0.6205 0.5673 0.6205 0.7877
No log 8.1333 366 0.6420 0.5645 0.6420 0.8013
No log 8.1778 368 0.5752 0.5934 0.5752 0.7584
No log 8.2222 370 0.5155 0.5444 0.5155 0.7180
No log 8.2667 372 0.5102 0.5218 0.5102 0.7143
No log 8.3111 374 0.5531 0.5237 0.5531 0.7437
No log 8.3556 376 0.6294 0.5394 0.6294 0.7934
No log 8.4 378 0.7291 0.4502 0.7291 0.8539
No log 8.4444 380 0.7177 0.4991 0.7177 0.8471
No log 8.4889 382 0.6533 0.5090 0.6533 0.8083
No log 8.5333 384 0.5558 0.4749 0.5558 0.7455
No log 8.5778 386 0.5130 0.4555 0.5130 0.7163
No log 8.6222 388 0.5037 0.4555 0.5037 0.7097
No log 8.6667 390 0.5409 0.5237 0.5409 0.7355
No log 8.7111 392 0.6461 0.5310 0.6461 0.8038
No log 8.7556 394 0.7054 0.5047 0.7054 0.8399
No log 8.8 396 0.7834 0.5031 0.7834 0.8851
No log 8.8444 398 0.7096 0.4978 0.7096 0.8424
No log 8.8889 400 0.5768 0.5712 0.5768 0.7594
No log 8.9333 402 0.5033 0.5485 0.5033 0.7094
No log 8.9778 404 0.4997 0.5577 0.4997 0.7069
No log 9.0222 406 0.5388 0.5016 0.5388 0.7340
No log 9.0667 408 0.6234 0.5107 0.6234 0.7895
No log 9.1111 410 0.6471 0.5227 0.6471 0.8044
No log 9.1556 412 0.6730 0.5147 0.6730 0.8204
No log 9.2 414 0.6028 0.5326 0.6028 0.7764
No log 9.2444 416 0.5143 0.5485 0.5143 0.7171
No log 9.2889 418 0.4959 0.4955 0.4959 0.7042
No log 9.3333 420 0.4924 0.4955 0.4924 0.7017
No log 9.3778 422 0.5037 0.5831 0.5037 0.7097
No log 9.4222 424 0.5085 0.5577 0.5085 0.7131
No log 9.4667 426 0.5207 0.5123 0.5207 0.7216
No log 9.5111 428 0.5305 0.4875 0.5305 0.7284
No log 9.5556 430 0.5251 0.4147 0.5251 0.7247
No log 9.6 432 0.5270 0.3889 0.5270 0.7260
No log 9.6444 434 0.5355 0.3661 0.5355 0.7318
No log 9.6889 436 0.5232 0.3889 0.5232 0.7233
No log 9.7333 438 0.5237 0.3788 0.5237 0.7237
No log 9.7778 440 0.5475 0.5237 0.5475 0.7399
No log 9.8222 442 0.6365 0.5465 0.6365 0.7978
No log 9.8667 444 0.7005 0.4648 0.7005 0.8369
No log 9.9111 446 0.6594 0.5227 0.6594 0.8120
No log 9.9556 448 0.5954 0.4845 0.5954 0.7716
No log 10.0 450 0.5947 0.5166 0.5947 0.7712
No log 10.0444 452 0.6388 0.4606 0.6388 0.7992
No log 10.0889 454 0.7601 0.4574 0.7601 0.8718
No log 10.1333 456 0.8772 0.3707 0.8772 0.9366
No log 10.1778 458 0.8593 0.4208 0.8593 0.9270
No log 10.2222 460 0.7747 0.4208 0.7747 0.8801
No log 10.2667 462 0.7031 0.4307 0.7031 0.8385
No log 10.3111 464 0.6634 0.4470 0.6634 0.8145
No log 10.3556 466 0.7032 0.4307 0.7032 0.8386
No log 10.4 468 0.7337 0.4203 0.7337 0.8566
No log 10.4444 470 0.6697 0.4203 0.6697 0.8183
No log 10.4889 472 0.6579 0.4666 0.6579 0.8111
No log 10.5333 474 0.7147 0.4203 0.7147 0.8454
No log 10.5778 476 0.7208 0.4133 0.7208 0.8490
No log 10.6222 478 0.6545 0.4880 0.6545 0.8090
No log 10.6667 480 0.5680 0.4705 0.5680 0.7536
No log 10.7111 482 0.5556 0.4124 0.5556 0.7454
No log 10.7556 484 0.5781 0.4437 0.5781 0.7603
No log 10.8 486 0.6621 0.4646 0.6621 0.8137
No log 10.8444 488 0.7085 0.4328 0.7085 0.8417
No log 10.8889 490 0.6749 0.4328 0.6749 0.8215
No log 10.9333 492 0.5898 0.5527 0.5898 0.7680
No log 10.9778 494 0.5399 0.4747 0.5399 0.7348
No log 11.0222 496 0.5343 0.3910 0.5343 0.7310
No log 11.0667 498 0.5400 0.3628 0.5400 0.7349
0.3551 11.1111 500 0.5628 0.4828 0.5628 0.7502
0.3551 11.1556 502 0.5764 0.5485 0.5764 0.7592
0.3551 11.2 504 0.5640 0.5485 0.5640 0.7510
0.3551 11.2444 506 0.5460 0.6334 0.5460 0.7389
0.3551 11.2889 508 0.5420 0.5578 0.5420 0.7362
0.3551 11.3333 510 0.5420 0.5285 0.5420 0.7362
0.3551 11.3778 512 0.5407 0.6021 0.5407 0.7353
0.3551 11.4222 514 0.5805 0.6150 0.5805 0.7619
0.3551 11.4667 516 0.6137 0.5625 0.6137 0.7834
0.3551 11.5111 518 0.5959 0.6325 0.5959 0.7719
0.3551 11.5556 520 0.5705 0.5813 0.5705 0.7553
0.3551 11.6 522 0.5811 0.5886 0.5811 0.7623
0.3551 11.6444 524 0.5889 0.5886 0.5889 0.7674
0.3551 11.6889 526 0.5795 0.5886 0.5795 0.7613
0.3551 11.7333 528 0.5682 0.5373 0.5682 0.7538
0.3551 11.7778 530 0.5675 0.5357 0.5675 0.7533
0.3551 11.8222 532 0.5940 0.5845 0.5940 0.7707
0.3551 11.8667 534 0.6309 0.5226 0.6309 0.7943
0.3551 11.9111 536 0.6616 0.4862 0.6616 0.8134
0.3551 11.9556 538 0.6485 0.5418 0.6485 0.8053
0.3551 12.0 540 0.6148 0.5373 0.6148 0.7841
0.3551 12.0444 542 0.6129 0.5373 0.6129 0.7828
0.3551 12.0889 544 0.6003 0.5418 0.6003 0.7748
0.3551 12.1333 546 0.5800 0.5557 0.5800 0.7616
0.3551 12.1778 548 0.5777 0.5913 0.5777 0.7601
0.3551 12.2222 550 0.5743 0.5913 0.5743 0.7578
0.3551 12.2667 552 0.5786 0.5594 0.5786 0.7606
0.3551 12.3111 554 0.5611 0.5557 0.5611 0.7491
0.3551 12.3556 556 0.5661 0.5272 0.5661 0.7524
0.3551 12.4 558 0.6003 0.5457 0.6003 0.7748
0.3551 12.4444 560 0.6967 0.4768 0.6967 0.8347
0.3551 12.4889 562 0.7667 0.4381 0.7667 0.8756
0.3551 12.5333 564 0.7095 0.4759 0.7095 0.8423
0.3551 12.5778 566 0.6175 0.4336 0.6175 0.7858
0.3551 12.6222 568 0.5669 0.4747 0.5669 0.7529

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
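
The versions above can be verified at runtime; the snippet below is only an illustrative sanity check.

```python
# Illustrative environment check against the versions listed above.
import datasets
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected 4.44.2
print(torch.__version__)         # expected 2.4.0+cu118
print(datasets.__version__)      # expected 2.21.0
print(tokenizers.__version__)    # expected 0.19.1
```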