ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5074
  • Qwk (quadratic weighted kappa): 0.3974
  • Mse (mean squared error): 0.5074
  • Rmse (root mean squared error): 0.7123
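The exact evaluation code is not part of this card, but the three metrics are standard. A minimal sketch of how they can be computed, assuming integer ordinal labels (the label set and prediction format here are hypothetical):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Qwk: agreement between ordinal ratings, penalizing large disagreements quadratically."""
    # Observed co-occurrence matrix of (true, predicted) labels
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms, used to build the expected matrix under independence
    hist_t = [sum(O[i]) for i in range(n_classes)]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * O[i][j]
            den += w * (hist_t[i] * hist_p[j] / n)
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Note that in the table below Mse equals the validation loss, which suggests the model was trained as a regressor with an MSE objective; Rmse is then just its square root.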

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0741 2 2.5262 -0.0262 2.5262 1.5894
No log 0.1481 4 1.3142 0.1262 1.3142 1.1464
No log 0.2222 6 1.0509 -0.0970 1.0509 1.0251
No log 0.2963 8 0.8892 -0.0764 0.8892 0.9430
No log 0.3704 10 0.8148 0.0026 0.8148 0.9027
No log 0.4444 12 0.8081 0.0053 0.8081 0.8989
No log 0.5185 14 0.8261 0.0481 0.8261 0.9089
No log 0.5926 16 0.8230 0.0481 0.8230 0.9072
No log 0.6667 18 0.8115 0.0 0.8115 0.9009
No log 0.7407 20 0.7755 0.0 0.7755 0.8806
No log 0.8148 22 0.7757 0.0 0.7757 0.8808
No log 0.8889 24 0.8157 0.0393 0.8157 0.9032
No log 0.9630 26 0.8129 -0.0027 0.8129 0.9016
No log 1.0370 28 0.7998 0.0 0.7998 0.8943
No log 1.1111 30 0.8472 -0.0027 0.8472 0.9205
No log 1.1852 32 0.7863 0.0 0.7863 0.8867
No log 1.2593 34 0.7741 0.0937 0.7741 0.8798
No log 1.3333 36 0.7823 0.0937 0.7823 0.8845
No log 1.4074 38 0.7801 0.0481 0.7801 0.8832
No log 1.4815 40 0.7897 0.0937 0.7897 0.8887
No log 1.5556 42 0.7700 0.0481 0.7700 0.8775
No log 1.6296 44 0.7677 0.0481 0.7677 0.8762
No log 1.7037 46 0.7744 0.0 0.7744 0.8800
No log 1.7778 48 0.7892 0.0 0.7892 0.8884
No log 1.8519 50 0.8407 0.1050 0.8407 0.9169
No log 1.9259 52 0.9376 0.1264 0.9376 0.9683
No log 2.0 54 0.9221 0.1264 0.9221 0.9603
No log 2.0741 56 0.7818 0.2285 0.7818 0.8842
No log 2.1481 58 0.6869 0.1983 0.6869 0.8288
No log 2.2222 60 0.7223 0.0481 0.7223 0.8499
No log 2.2963 62 0.7632 0.2132 0.7632 0.8736
No log 2.3704 64 0.6797 0.1372 0.6797 0.8244
No log 2.4444 66 0.6410 0.1604 0.6410 0.8006
No log 2.5185 68 0.6390 0.2270 0.6390 0.7994
No log 2.5926 70 0.6376 0.2085 0.6376 0.7985
No log 2.6667 72 0.6776 0.3564 0.6776 0.8232
No log 2.7407 74 0.6286 0.2379 0.6286 0.7928
No log 2.8148 76 0.6215 0.2675 0.6215 0.7883
No log 2.8889 78 0.6335 0.2418 0.6335 0.7959
No log 2.9630 80 0.6532 0.2237 0.6532 0.8082
No log 3.0370 82 0.7027 0.3060 0.7027 0.8383
No log 3.1111 84 0.6886 0.2440 0.6886 0.8298
No log 3.1852 86 0.7167 0.2960 0.7167 0.8466
No log 3.2593 88 0.9161 0.2993 0.9161 0.9571
No log 3.3333 90 1.0609 0.2861 1.0609 1.0300
No log 3.4074 92 1.1054 0.3273 1.1054 1.0514
No log 3.4815 94 0.8779 0.2843 0.8779 0.9370
No log 3.5556 96 0.6992 0.4029 0.6992 0.8362
No log 3.6296 98 0.7449 0.3710 0.7449 0.8631
No log 3.7037 100 0.7743 0.3579 0.7743 0.8800
No log 3.7778 102 0.6422 0.3452 0.6422 0.8014
No log 3.8519 104 0.6555 0.4028 0.6555 0.8096
No log 3.9259 106 0.7082 0.3434 0.7082 0.8415
No log 4.0 108 0.6697 0.2857 0.6697 0.8183
No log 4.0741 110 0.6381 0.4019 0.6381 0.7988
No log 4.1481 112 0.6531 0.2973 0.6531 0.8081
No log 4.2222 114 0.6519 0.3502 0.6519 0.8074
No log 4.2963 116 0.6581 0.3170 0.6581 0.8112
No log 4.3704 118 0.6603 0.3385 0.6603 0.8126
No log 4.4444 120 0.6563 0.3228 0.6563 0.8101
No log 4.5185 122 0.6962 0.3701 0.6962 0.8344
No log 4.5926 124 0.7455 0.3099 0.7455 0.8634
No log 4.6667 126 0.6938 0.3387 0.6938 0.8329
No log 4.7407 128 0.6459 0.3385 0.6459 0.8037
No log 4.8148 130 0.6755 0.2822 0.6755 0.8219
No log 4.8889 132 0.6635 0.2822 0.6635 0.8146
No log 4.9630 134 0.6467 0.3523 0.6467 0.8042
No log 5.0370 136 0.7359 0.2632 0.7359 0.8578
No log 5.1111 138 0.7449 0.2812 0.7449 0.8631
No log 5.1852 140 0.6527 0.3649 0.6527 0.8079
No log 5.2593 142 0.6268 0.4517 0.6268 0.7917
No log 5.3333 144 0.6738 0.4776 0.6738 0.8208
No log 5.4074 146 0.6421 0.4841 0.6421 0.8013
No log 5.4815 148 0.6105 0.4838 0.6105 0.7813
No log 5.5556 150 0.6009 0.4891 0.6009 0.7752
No log 5.6296 152 0.6003 0.3445 0.6003 0.7748
No log 5.7037 154 0.6317 0.3622 0.6317 0.7948
No log 5.7778 156 0.6048 0.4086 0.6048 0.7777
No log 5.8519 158 0.6325 0.3093 0.6325 0.7953
No log 5.9259 160 0.7240 0.3140 0.7240 0.8509
No log 6.0 162 0.6809 0.4129 0.6809 0.8252
No log 6.0741 164 0.5758 0.5010 0.5758 0.7588
No log 6.1481 166 0.6095 0.4260 0.6095 0.7807
No log 6.2222 168 0.6169 0.3816 0.6169 0.7854
No log 6.2963 170 0.5878 0.4402 0.5878 0.7667
No log 6.3704 172 0.6144 0.5083 0.6144 0.7838
No log 6.4444 174 0.6836 0.4904 0.6836 0.8268
No log 6.5185 176 0.6362 0.4610 0.6362 0.7976
No log 6.5926 178 0.5855 0.3728 0.5855 0.7652
No log 6.6667 180 0.6370 0.4234 0.6370 0.7981
No log 6.7407 182 0.7271 0.4430 0.7271 0.8527
No log 6.8148 184 0.7246 0.4606 0.7246 0.8512
No log 6.8889 186 0.6460 0.4523 0.6460 0.8037
No log 6.9630 188 0.5938 0.5107 0.5938 0.7706
No log 7.0370 190 0.6003 0.5075 0.6003 0.7748
No log 7.1111 192 0.6168 0.4615 0.6168 0.7854
No log 7.1852 194 0.6367 0.5159 0.6367 0.7980
No log 7.2593 196 0.6437 0.4934 0.6437 0.8023
No log 7.3333 198 0.6543 0.4742 0.6543 0.8089
No log 7.4074 200 0.6661 0.4345 0.6661 0.8162
No log 7.4815 202 0.6549 0.4701 0.6549 0.8093
No log 7.5556 204 0.7174 0.4212 0.7174 0.8470
No log 7.6296 206 0.7242 0.4212 0.7242 0.8510
No log 7.7037 208 0.6684 0.4828 0.6684 0.8175
No log 7.7778 210 0.6639 0.4828 0.6639 0.8148
No log 7.8519 212 0.6553 0.4006 0.6553 0.8095
No log 7.9259 214 0.6695 0.3163 0.6695 0.8182
No log 8.0 216 0.6838 0.4139 0.6838 0.8269
No log 8.0741 218 0.6683 0.4096 0.6683 0.8175
No log 8.1481 220 0.6464 0.4194 0.6464 0.8040
No log 8.2222 222 0.6723 0.4301 0.6723 0.8200
No log 8.2963 224 0.6752 0.4523 0.6752 0.8217
No log 8.3704 226 0.6351 0.4423 0.6351 0.7969
No log 8.4444 228 0.6678 0.3953 0.6678 0.8172
No log 8.5185 230 0.6381 0.4444 0.6381 0.7988
No log 8.5926 232 0.6217 0.4029 0.6217 0.7885
No log 8.6667 234 0.6036 0.5042 0.6036 0.7769
No log 8.7407 236 0.5920 0.5042 0.5920 0.7694
No log 8.8148 238 0.5880 0.5114 0.5880 0.7668
No log 8.8889 240 0.5938 0.5095 0.5938 0.7706
No log 8.9630 242 0.6156 0.4555 0.6156 0.7846
No log 9.0370 244 0.6230 0.5254 0.6230 0.7893
No log 9.1111 246 0.5954 0.4463 0.5954 0.7716
No log 9.1852 248 0.5865 0.4463 0.5865 0.7658
No log 9.2593 250 0.5885 0.4681 0.5885 0.7671
No log 9.3333 252 0.5838 0.4885 0.5838 0.7641
No log 9.4074 254 0.6167 0.4614 0.6167 0.7853
No log 9.4815 256 0.6355 0.3925 0.6355 0.7972
No log 9.5556 258 0.6043 0.4044 0.6043 0.7774
No log 9.6296 260 0.6239 0.4660 0.6239 0.7899
No log 9.7037 262 0.7763 0.3955 0.7763 0.8811
No log 9.7778 264 0.9238 0.3522 0.9238 0.9612
No log 9.8519 266 0.8822 0.4479 0.8822 0.9393
No log 9.9259 268 0.7096 0.4721 0.7096 0.8424
No log 10.0 270 0.6596 0.3754 0.6596 0.8122
No log 10.0741 272 0.7437 0.3383 0.7437 0.8624
No log 10.1481 274 0.7278 0.3425 0.7278 0.8531
No log 10.2222 276 0.6538 0.4240 0.6538 0.8086
No log 10.2963 278 0.6749 0.4207 0.6749 0.8215
No log 10.3704 280 0.6974 0.4303 0.6974 0.8351
No log 10.4444 282 0.6433 0.4724 0.6433 0.8021
No log 10.5185 284 0.6146 0.4768 0.6146 0.7840
No log 10.5926 286 0.6791 0.4058 0.6791 0.8241
No log 10.6667 288 0.7750 0.3180 0.7750 0.8803
No log 10.7407 290 0.7824 0.3239 0.7824 0.8845
No log 10.8148 292 0.6786 0.4411 0.6786 0.8238
No log 10.8889 294 0.5744 0.5246 0.5744 0.7579
No log 10.9630 296 0.5690 0.4314 0.5690 0.7543
No log 11.0370 298 0.6112 0.3763 0.6112 0.7818
No log 11.1111 300 0.5937 0.4875 0.5937 0.7705
No log 11.1852 302 0.5596 0.4639 0.5596 0.7481
No log 11.2593 304 0.5695 0.4948 0.5695 0.7546
No log 11.3333 306 0.5516 0.4883 0.5516 0.7427
No log 11.4074 308 0.5307 0.4726 0.5307 0.7285
No log 11.4815 310 0.5082 0.5324 0.5082 0.7129
No log 11.5556 312 0.5540 0.4389 0.5540 0.7443
No log 11.6296 314 0.6185 0.4509 0.6185 0.7865
No log 11.7037 316 0.6032 0.4489 0.6032 0.7766
No log 11.7778 318 0.5672 0.5171 0.5672 0.7531
No log 11.8519 320 0.5569 0.5114 0.5569 0.7462
No log 11.9259 322 0.6008 0.4029 0.6008 0.7751
No log 12.0 324 0.6082 0.3393 0.6082 0.7799
No log 12.0741 326 0.5893 0.3640 0.5893 0.7677
No log 12.1481 328 0.5611 0.4908 0.5611 0.7491
No log 12.2222 330 0.5589 0.3781 0.5589 0.7476
No log 12.2963 332 0.5732 0.4703 0.5732 0.7571
No log 12.3704 334 0.5881 0.3970 0.5881 0.7668
No log 12.4444 336 0.6008 0.4044 0.6008 0.7751
No log 12.5185 338 0.6201 0.3569 0.6201 0.7874
No log 12.5926 340 0.5815 0.4314 0.5815 0.7626
No log 12.6667 342 0.5434 0.4147 0.5434 0.7372
No log 12.7407 344 0.5234 0.5440 0.5234 0.7235
No log 12.8148 346 0.5441 0.4466 0.5441 0.7376
No log 12.8889 348 0.5787 0.4713 0.5787 0.7608
No log 12.9630 350 0.5563 0.4653 0.5563 0.7458
No log 13.0370 352 0.5221 0.4361 0.5221 0.7226
No log 13.1111 354 0.5767 0.4867 0.5767 0.7594
No log 13.1852 356 0.6607 0.3169 0.6607 0.8128
No log 13.2593 358 0.6406 0.3519 0.6406 0.8004
No log 13.3333 360 0.5682 0.4524 0.5682 0.7538
No log 13.4074 362 0.5283 0.4052 0.5283 0.7269
No log 13.4815 364 0.5265 0.4634 0.5265 0.7256
No log 13.5556 366 0.5316 0.4918 0.5316 0.7291
No log 13.6296 368 0.5481 0.4888 0.5481 0.7404
No log 13.7037 370 0.5427 0.4655 0.5427 0.7367
No log 13.7778 372 0.5133 0.5379 0.5133 0.7164
No log 13.8519 374 0.5580 0.5605 0.5580 0.7470
No log 13.9259 376 0.5931 0.5246 0.5931 0.7702
No log 14.0 378 0.5565 0.5403 0.5565 0.7460
No log 14.0741 380 0.5063 0.5567 0.5063 0.7115
No log 14.1481 382 0.4965 0.5304 0.4965 0.7046
No log 14.2222 384 0.5062 0.4611 0.5062 0.7115
No log 14.2963 386 0.5000 0.5379 0.5000 0.7071
No log 14.3704 388 0.5001 0.5076 0.5001 0.7072
No log 14.4444 390 0.5314 0.4493 0.5314 0.7289
No log 14.5185 392 0.5524 0.5254 0.5524 0.7432
No log 14.5926 394 0.5242 0.5272 0.5242 0.7240
No log 14.6667 396 0.4911 0.6242 0.4911 0.7008
No log 14.7407 398 0.4932 0.5875 0.4932 0.7023
No log 14.8148 400 0.5078 0.5523 0.5078 0.7126
No log 14.8889 402 0.4938 0.5797 0.4938 0.7027
No log 14.9630 404 0.4851 0.6242 0.4851 0.6965
No log 15.0370 406 0.5041 0.5455 0.5041 0.7100
No log 15.1111 408 0.5075 0.5232 0.5075 0.7124
No log 15.1852 410 0.5124 0.5455 0.5124 0.7158
No log 15.2593 412 0.4981 0.5550 0.4981 0.7057
No log 15.3333 414 0.4971 0.5550 0.4971 0.7051
No log 15.4074 416 0.5127 0.5307 0.5127 0.7160
No log 15.4815 418 0.5725 0.4568 0.5725 0.7566
No log 15.5556 420 0.6189 0.4423 0.6189 0.7867
No log 15.6296 422 0.5766 0.4795 0.5766 0.7593
No log 15.7037 424 0.5441 0.5501 0.5441 0.7377
No log 15.7778 426 0.5194 0.5550 0.5194 0.7207
No log 15.8519 428 0.5197 0.5042 0.5197 0.7209
No log 15.9259 430 0.5429 0.4849 0.5429 0.7368
No log 16.0 432 0.5420 0.4849 0.5420 0.7362
No log 16.0741 434 0.5297 0.4634 0.5297 0.7278
No log 16.1481 436 0.5313 0.4857 0.5313 0.7289
No log 16.2222 438 0.5429 0.4591 0.5429 0.7368
No log 16.2963 440 0.5770 0.4618 0.5770 0.7596
No log 16.3704 442 0.5905 0.4451 0.5905 0.7685
No log 16.4444 444 0.5919 0.4602 0.5919 0.7693
No log 16.5185 446 0.5422 0.5516 0.5422 0.7364
No log 16.5926 448 0.5211 0.5852 0.5211 0.7219
No log 16.6667 450 0.5107 0.5732 0.5107 0.7147
No log 16.7407 452 0.5179 0.5404 0.5179 0.7197
No log 16.8148 454 0.5353 0.5195 0.5353 0.7316
No log 16.8889 456 0.5278 0.5289 0.5278 0.7265
No log 16.9630 458 0.5107 0.4949 0.5107 0.7146
No log 17.0370 460 0.5209 0.4838 0.5209 0.7217
No log 17.1111 462 0.5227 0.4838 0.5227 0.7230
No log 17.1852 464 0.5347 0.4838 0.5347 0.7313
No log 17.2593 466 0.5396 0.4407 0.5396 0.7346
No log 17.3333 468 0.5331 0.4101 0.5331 0.7301
No log 17.4074 470 0.5349 0.4613 0.5349 0.7314
No log 17.4815 472 0.5464 0.4229 0.5464 0.7392
No log 17.5556 474 0.5649 0.4397 0.5649 0.7516
No log 17.6296 476 0.5978 0.4430 0.5978 0.7732
No log 17.7037 478 0.6183 0.4430 0.6183 0.7863
No log 17.7778 480 0.6370 0.4430 0.6370 0.7981
No log 17.8519 482 0.6048 0.4020 0.6048 0.7777
No log 17.9259 484 0.5751 0.4100 0.5751 0.7583
No log 18.0 486 0.5463 0.4124 0.5463 0.7391
No log 18.0741 488 0.5340 0.4067 0.5340 0.7307
No log 18.1481 490 0.5411 0.4576 0.5411 0.7356
No log 18.2222 492 0.5161 0.4569 0.5161 0.7184
No log 18.2963 494 0.5074 0.4569 0.5074 0.7123
No log 18.3704 496 0.5090 0.4857 0.5090 0.7135
No log 18.4444 498 0.5104 0.5189 0.5104 0.7144
0.333 18.5185 500 0.5138 0.4929 0.5138 0.7168
0.333 18.5926 502 0.5180 0.5307 0.5180 0.7197
0.333 18.6667 504 0.5173 0.4929 0.5173 0.7192
0.333 18.7407 506 0.5183 0.5189 0.5183 0.7199
0.333 18.8148 508 0.5190 0.5208 0.5190 0.7204
0.333 18.8889 510 0.5291 0.5902 0.5291 0.7274
0.333 18.9630 512 0.5406 0.5283 0.5406 0.7352
0.333 19.0370 514 0.5414 0.5208 0.5414 0.7358
0.333 19.1111 516 0.5229 0.5171 0.5229 0.7231
0.333 19.1852 518 0.5210 0.5152 0.5210 0.7218
0.333 19.2593 520 0.5137 0.4898 0.5137 0.7167
0.333 19.3333 522 0.5092 0.5422 0.5092 0.7136
0.333 19.4074 524 0.5150 0.5404 0.5150 0.7176
0.333 19.4815 526 0.5161 0.4314 0.5161 0.7184
0.333 19.5556 528 0.5176 0.4314 0.5176 0.7194
0.333 19.6296 530 0.5254 0.4027 0.5254 0.7249
0.333 19.7037 532 0.5446 0.5345 0.5446 0.7380
0.333 19.7778 534 0.5657 0.4997 0.5657 0.7521
0.333 19.8519 536 0.5376 0.5195 0.5376 0.7332
0.333 19.9259 538 0.5149 0.4895 0.5149 0.7176
0.333 20.0 540 0.5010 0.5071 0.5010 0.7078
0.333 20.0741 542 0.5054 0.4895 0.5054 0.7109
0.333 20.1481 544 0.5118 0.4895 0.5118 0.7154
0.333 20.2222 546 0.5024 0.4828 0.5024 0.7088
0.333 20.2963 548 0.4979 0.4224 0.4979 0.7056
0.333 20.3704 550 0.5074 0.3974 0.5074 0.7123
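Note that the final checkpoint (step 550) is not the best row in the log above: validation loss bottoms out at 0.4851 (step 404) and Qwk peaks at 0.6242. A small sketch of selecting a checkpoint from such a log, using three rows excerpted from the table:

```python
# (step, validation_loss, qwk) rows excerpted from the training table
rows = [
    (396, 0.4911, 0.6242),
    (404, 0.4851, 0.6242),
    (550, 0.5074, 0.3974),
]

best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
best_by_qwk = max(rows, key=lambda r: r[2])   # highest quadratic weighted kappa
```

Whether an earlier checkpoint was retained depends on the training setup (e.g. a save/load-best policy), which this card does not state.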

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1