ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.5106
  • Qwk: 0.5079
  • Mse: 0.5106
  • Rmse: 0.7146
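For reference, Qwk is Cohen's quadratic weighted kappa (agreement between predicted and gold ordinal scores, with quadratic penalties for larger disagreements), and Rmse is simply the square root of Mse. A minimal stdlib-only sketch of both, implemented from the definitions (the function name and label encoding are illustrative, not from the card):

```python
import math

# Quadratic weighted kappa (Qwk), from its definition:
#   kappa = 1 - sum(w_ij * O_ij) / sum(w_ij * E_ij)
# with quadratic weights w_ij = (i - j)^2 / (n - 1)^2,
# O the observed agreement matrix, E the chance-expected one.
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Observed label-agreement matrix
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1.0
    total = float(len(y_true))
    # Marginal histograms, used to build the expected (chance) matrix
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes))
                 for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / total
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

# Rmse in this card is just the square root of Mse: 0.5106 -> ~0.7146
print(round(math.sqrt(0.5106), 4))  # -> 0.7146
```

Perfect agreement gives a kappa of 1.0, and the Mse/Rmse pairs throughout the table below are consistent with this square-root relationship.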

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
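The hyperparameters above map directly onto the Hugging Face Trainer API. The following is a hedged sketch of that setup, not the author's actual training script: the output directory, dataset objects, and the choice of AdamW variant are assumptions (AdamW with the listed betas/epsilon is the Trainer default).

```python
# Sketch only: reconstructs the reported hyperparameters with
# transformers.TrainingArguments. Dataset loading is elided.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam betas/epsilon match the card; these are also the defaults.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=..., eval_dataset=...)
# trainer.train()
```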

Training results

Training loss is logged every 500 steps (the Trainer default), so the "Training Loss" column reads "No log" until step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0417 2 2.5798 -0.0262 2.5798 1.6062
No log 0.0833 4 1.3015 0.0257 1.3015 1.1408
No log 0.125 6 0.9846 0.0952 0.9846 0.9922
No log 0.1667 8 0.9859 -0.0622 0.9859 0.9929
No log 0.2083 10 1.0198 0.1010 1.0198 1.0098
No log 0.25 12 1.0319 0.1573 1.0319 1.0158
No log 0.2917 14 0.8183 0.2604 0.8183 0.9046
No log 0.3333 16 0.7493 0.1604 0.7493 0.8656
No log 0.375 18 0.7263 0.1604 0.7263 0.8522
No log 0.4167 20 0.7026 0.2713 0.7026 0.8382
No log 0.4583 22 0.6832 0.2374 0.6832 0.8265
No log 0.5 24 0.6945 0.1272 0.6945 0.8333
No log 0.5417 26 0.6746 0.1282 0.6746 0.8214
No log 0.5833 28 0.6806 0.1282 0.6806 0.8250
No log 0.625 30 0.7425 0.1358 0.7425 0.8617
No log 0.6667 32 0.8876 0.3051 0.8876 0.9421
No log 0.7083 34 0.9346 0.2908 0.9346 0.9667
No log 0.75 36 0.8546 0.2109 0.8546 0.9244
No log 0.7917 38 0.7977 0.0 0.7977 0.8931
No log 0.8333 40 0.7529 0.0 0.7529 0.8677
No log 0.875 42 0.7419 0.0840 0.7419 0.8614
No log 0.9167 44 0.7544 0.1236 0.7544 0.8686
No log 0.9583 46 0.7678 0.1617 0.7678 0.8763
No log 1.0 48 0.7639 0.1617 0.7639 0.8740
No log 1.0417 50 0.7656 0.1617 0.7656 0.8750
No log 1.0833 52 0.7988 0.1648 0.7988 0.8937
No log 1.125 54 0.9725 0.2939 0.9725 0.9861
No log 1.1667 56 0.8658 0.3231 0.8658 0.9305
No log 1.2083 58 0.6718 0.2145 0.6718 0.8197
No log 1.25 60 0.6417 0.2405 0.6417 0.8011
No log 1.2917 62 0.6544 0.3151 0.6544 0.8089
No log 1.3333 64 0.5926 0.2923 0.5926 0.7698
No log 1.375 66 0.5840 0.3502 0.5840 0.7642
No log 1.4167 68 0.5950 0.4526 0.5950 0.7714
No log 1.4583 70 0.6134 0.4294 0.6134 0.7832
No log 1.5 72 0.6334 0.4850 0.6334 0.7959
No log 1.5417 74 0.6144 0.4294 0.6144 0.7838
No log 1.5833 76 0.6195 0.5142 0.6195 0.7871
No log 1.625 78 0.7888 0.3567 0.7888 0.8882
No log 1.6667 80 0.8591 0.3095 0.8591 0.9269
No log 1.7083 82 0.6896 0.4516 0.6896 0.8304
No log 1.75 84 0.6866 0.3953 0.6866 0.8286
No log 1.7917 86 0.6918 0.3706 0.6918 0.8318
No log 1.8333 88 0.6630 0.3656 0.6630 0.8143
No log 1.875 90 0.7336 0.3840 0.7336 0.8565
No log 1.9167 92 0.6924 0.3296 0.6924 0.8321
No log 1.9583 94 0.6084 0.2923 0.6084 0.7800
No log 2.0 96 0.6327 0.3563 0.6327 0.7954
No log 2.0417 98 0.7051 0.3522 0.7051 0.8397
No log 2.0833 100 0.6547 0.4083 0.6547 0.8091
No log 2.125 102 0.6031 0.2783 0.6031 0.7766
No log 2.1667 104 0.7852 0.3319 0.7852 0.8861
No log 2.2083 106 0.8404 0.3562 0.8404 0.9167
No log 2.25 108 0.6887 0.3840 0.6887 0.8299
No log 2.2917 110 0.6108 0.4074 0.6108 0.7815
No log 2.3333 112 0.5918 0.3859 0.5918 0.7693
No log 2.375 114 0.5867 0.3478 0.5867 0.7660
No log 2.4167 116 0.6434 0.4597 0.6434 0.8021
No log 2.4583 118 0.6364 0.4835 0.6364 0.7978
No log 2.5 120 0.5779 0.4684 0.5779 0.7602
No log 2.5417 122 0.5992 0.4782 0.5992 0.7740
No log 2.5833 124 0.5553 0.4455 0.5553 0.7452
No log 2.625 126 0.5550 0.4681 0.5550 0.7450
No log 2.6667 128 0.7089 0.5215 0.7089 0.8420
No log 2.7083 130 0.9969 0.3798 0.9969 0.9985
No log 2.75 132 0.7939 0.4366 0.7939 0.8910
No log 2.7917 134 0.5935 0.4212 0.5935 0.7704
No log 2.8333 136 0.6088 0.4051 0.6088 0.7802
No log 2.875 138 0.6471 0.4466 0.6471 0.8044
No log 2.9167 140 0.6424 0.3517 0.6424 0.8015
No log 2.9583 142 0.6223 0.4434 0.6223 0.7889
No log 3.0 144 0.8060 0.3909 0.8060 0.8978
No log 3.0417 146 0.7326 0.4444 0.7326 0.8559
No log 3.0833 148 0.5918 0.4555 0.5918 0.7693
No log 3.125 150 0.6317 0.3069 0.6317 0.7948
No log 3.1667 152 0.6055 0.3373 0.6055 0.7781
No log 3.2083 154 0.5699 0.4661 0.5699 0.7550
No log 3.25 156 0.5772 0.4473 0.5772 0.7597
No log 3.2917 158 0.5857 0.4352 0.5857 0.7653
No log 3.3333 160 0.5822 0.2852 0.5822 0.7630
No log 3.375 162 0.5746 0.3474 0.5746 0.7580
No log 3.4167 164 0.5620 0.3728 0.5620 0.7497
No log 3.4583 166 0.5711 0.3738 0.5711 0.7557
No log 3.5 168 0.6099 0.4222 0.6099 0.7809
No log 3.5417 170 0.5911 0.4516 0.5911 0.7688
No log 3.5833 172 0.5865 0.5319 0.5865 0.7658
No log 3.625 174 0.5601 0.5286 0.5601 0.7484
No log 3.6667 176 0.5481 0.5682 0.5481 0.7404
No log 3.7083 178 0.5313 0.5951 0.5313 0.7289
No log 3.75 180 0.5334 0.4904 0.5334 0.7303
No log 3.7917 182 0.6121 0.4395 0.6121 0.7823
No log 3.8333 184 0.6647 0.4341 0.6647 0.8153
No log 3.875 186 0.7135 0.4380 0.7135 0.8447
No log 3.9167 188 0.6309 0.4967 0.6309 0.7943
No log 3.9583 190 0.5569 0.5234 0.5569 0.7463
No log 4.0 192 0.5487 0.5084 0.5487 0.7407
No log 4.0417 194 0.5491 0.4757 0.5491 0.7410
No log 4.0833 196 0.5601 0.5009 0.5601 0.7484
No log 4.125 198 0.5529 0.4776 0.5529 0.7436
No log 4.1667 200 0.5447 0.4343 0.5447 0.7381
No log 4.2083 202 0.5522 0.4149 0.5522 0.7431
No log 4.25 204 0.5946 0.4732 0.5946 0.7711
No log 4.2917 206 0.5471 0.5587 0.5471 0.7397
No log 4.3333 208 0.5259 0.5234 0.5259 0.7252
No log 4.375 210 0.5152 0.5039 0.5152 0.7178
No log 4.4167 212 0.5098 0.4973 0.5098 0.7140
No log 4.4583 214 0.5374 0.5779 0.5374 0.7331
No log 4.5 216 0.5731 0.5922 0.5731 0.7570
No log 4.5417 218 0.6122 0.5664 0.6122 0.7824
No log 4.5833 220 0.5549 0.5584 0.5549 0.7449
No log 4.625 222 0.5120 0.5533 0.5120 0.7156
No log 4.6667 224 0.5581 0.6070 0.5581 0.7470
No log 4.7083 226 0.6007 0.5673 0.6007 0.7751
No log 4.75 228 0.5447 0.5886 0.5447 0.7380
No log 4.7917 230 0.5234 0.5009 0.5234 0.7235
No log 4.8333 232 0.5211 0.4776 0.5211 0.7219
No log 4.875 234 0.5226 0.4329 0.5226 0.7229
No log 4.9167 236 0.5155 0.4114 0.5155 0.7180
No log 4.9583 238 0.5322 0.4493 0.5322 0.7295
No log 5.0 240 0.5173 0.4937 0.5173 0.7192
No log 5.0417 242 0.5125 0.4991 0.5125 0.7159
No log 5.0833 244 0.5113 0.5523 0.5113 0.7150
No log 5.125 246 0.5229 0.5483 0.5229 0.7231
No log 5.1667 248 0.5016 0.4923 0.5016 0.7083
No log 5.2083 250 0.5542 0.5166 0.5542 0.7445
No log 5.25 252 0.6414 0.4606 0.6414 0.8009
No log 5.2917 254 0.5573 0.4948 0.5573 0.7465
No log 5.3333 256 0.5294 0.3806 0.5294 0.7276
No log 5.375 258 0.6844 0.3777 0.6844 0.8273
No log 5.4167 260 0.7034 0.3777 0.7034 0.8387
No log 5.4583 262 0.5797 0.3999 0.5797 0.7614
No log 5.5 264 0.5183 0.4837 0.5183 0.7199
No log 5.5417 266 0.5427 0.5034 0.5427 0.7367
No log 5.5833 268 0.5146 0.4660 0.5146 0.7173
No log 5.625 270 0.5005 0.5125 0.5005 0.7075
No log 5.6667 272 0.5952 0.4842 0.5952 0.7715
No log 5.7083 274 0.6344 0.4783 0.6344 0.7965
No log 5.75 276 0.5641 0.5283 0.5641 0.7511
No log 5.7917 278 0.5278 0.5177 0.5278 0.7265
No log 5.8333 280 0.6026 0.4721 0.6026 0.7763
No log 5.875 282 0.6115 0.4721 0.6115 0.7820
No log 5.9167 284 0.5563 0.4337 0.5563 0.7459
No log 5.9583 286 0.5612 0.4504 0.5612 0.7491
No log 6.0 288 0.5681 0.3651 0.5681 0.7537
No log 6.0417 290 0.5756 0.3915 0.5756 0.7587
No log 6.0833 292 0.6349 0.4292 0.6349 0.7968
No log 6.125 294 0.7296 0.4387 0.7296 0.8542
No log 6.1667 296 0.7129 0.4224 0.7129 0.8444
No log 6.2083 298 0.6361 0.4292 0.6361 0.7976
No log 6.25 300 0.5897 0.4229 0.5897 0.7679
No log 6.2917 302 0.5821 0.4171 0.5821 0.7629
No log 6.3333 304 0.6046 0.4314 0.6046 0.7776
No log 6.375 306 0.6341 0.4292 0.6341 0.7963
No log 6.4167 308 0.6629 0.4020 0.6629 0.8142
No log 6.4583 310 0.6282 0.4547 0.6282 0.7926
No log 6.5 312 0.6143 0.3702 0.6143 0.7838
No log 6.5417 314 0.5975 0.3995 0.5975 0.7730
No log 6.5833 316 0.6535 0.4451 0.6535 0.8084
No log 6.625 318 0.7171 0.4239 0.7171 0.8468
No log 6.6667 320 0.7063 0.4408 0.7063 0.8404
No log 6.7083 322 0.6532 0.4642 0.6532 0.8082
No log 6.75 324 0.6406 0.4103 0.6406 0.8004
No log 6.7917 326 0.5943 0.3864 0.5943 0.7709
No log 6.8333 328 0.5725 0.4617 0.5725 0.7566
No log 6.875 330 0.5639 0.4380 0.5639 0.7509
No log 6.9167 332 0.5652 0.3837 0.5652 0.7518
No log 6.9583 334 0.5607 0.4722 0.5607 0.7488
No log 7.0 336 0.5584 0.4722 0.5584 0.7473
No log 7.0417 338 0.5579 0.4484 0.5579 0.7469
No log 7.0833 340 0.5703 0.4849 0.5703 0.7552
No log 7.125 342 0.5642 0.3426 0.5642 0.7511
No log 7.1667 344 0.5915 0.3675 0.5915 0.7691
No log 7.2083 346 0.6453 0.3640 0.6453 0.8033
No log 7.25 348 0.6273 0.3640 0.6273 0.7920
No log 7.2917 350 0.5626 0.3970 0.5626 0.7501
No log 7.3333 352 0.5447 0.4526 0.5447 0.7380
No log 7.375 354 0.5423 0.4441 0.5423 0.7364
No log 7.4167 356 0.5610 0.4314 0.5610 0.7490
No log 7.4583 358 0.5503 0.4596 0.5503 0.7418
No log 7.5 360 0.5313 0.4617 0.5313 0.7289
No log 7.5417 362 0.5302 0.4617 0.5302 0.7281
No log 7.5833 364 0.5545 0.4234 0.5545 0.7446
No log 7.625 366 0.5731 0.5237 0.5731 0.7570
No log 7.6667 368 0.5178 0.4314 0.5178 0.7196
No log 7.7083 370 0.5037 0.5022 0.5037 0.7097
No log 7.75 372 0.5018 0.4885 0.5018 0.7084
No log 7.7917 374 0.4934 0.5125 0.4934 0.7025
No log 7.8333 376 0.4885 0.4640 0.4885 0.6989
No log 7.875 378 0.4899 0.4985 0.4899 0.6999
No log 7.9167 380 0.4863 0.4719 0.4863 0.6973
No log 7.9583 382 0.4990 0.5722 0.4990 0.7064
No log 8.0 384 0.5661 0.5414 0.5661 0.7524
No log 8.0417 386 0.5382 0.5607 0.5382 0.7336
No log 8.0833 388 0.5169 0.5874 0.5169 0.7190
No log 8.125 390 0.6998 0.5719 0.6998 0.8365
No log 8.1667 392 0.8034 0.5364 0.8034 0.8963
No log 8.2083 394 0.7192 0.5185 0.7192 0.8480
No log 8.25 396 0.5696 0.4724 0.5696 0.7547
No log 8.2917 398 0.5242 0.4444 0.5242 0.7240
No log 8.3333 400 0.5710 0.4633 0.5710 0.7557
No log 8.375 402 0.5605 0.4486 0.5605 0.7487
No log 8.4167 404 0.5360 0.4160 0.5360 0.7322
No log 8.4583 406 0.5396 0.4114 0.5396 0.7346
No log 8.5 408 0.5383 0.4114 0.5383 0.7337
No log 8.5417 410 0.5386 0.4114 0.5386 0.7339
No log 8.5833 412 0.5351 0.4059 0.5351 0.7315
No log 8.625 414 0.5256 0.4505 0.5256 0.7250
No log 8.6667 416 0.5312 0.4737 0.5312 0.7288
No log 8.7083 418 0.5295 0.5479 0.5295 0.7276
No log 8.75 420 0.5167 0.5397 0.5167 0.7188
No log 8.7917 422 0.5478 0.4681 0.5478 0.7401
No log 8.8333 424 0.6316 0.5624 0.6316 0.7948
No log 8.875 426 0.6512 0.5659 0.6512 0.8070
No log 8.9167 428 0.5834 0.4681 0.5834 0.7638
No log 8.9583 430 0.5190 0.5565 0.5190 0.7204
No log 9.0 432 0.5208 0.5736 0.5208 0.7217
No log 9.0417 434 0.5738 0.5636 0.5738 0.7575
No log 9.0833 436 0.5646 0.5706 0.5646 0.7514
No log 9.125 438 0.5118 0.5765 0.5118 0.7154
No log 9.1667 440 0.5616 0.4763 0.5616 0.7494
No log 9.2083 442 0.6635 0.4892 0.6635 0.8146
No log 9.25 444 0.6913 0.4892 0.6913 0.8314
No log 9.2917 446 0.7040 0.4892 0.7040 0.8390
No log 9.3333 448 0.6796 0.3867 0.6796 0.8244
No log 9.375 450 0.6293 0.4393 0.6293 0.7933
No log 9.4167 452 0.5808 0.4914 0.5808 0.7621
No log 9.4583 454 0.5748 0.4463 0.5748 0.7581
No log 9.5 456 0.5773 0.4463 0.5773 0.7598
No log 9.5417 458 0.6058 0.4681 0.6058 0.7783
No log 9.5833 460 0.6704 0.4352 0.6704 0.8188
No log 9.625 462 0.6879 0.4597 0.6879 0.8294
No log 9.6667 464 0.6756 0.3894 0.6756 0.8220
No log 9.7083 466 0.6484 0.3919 0.6484 0.8052
No log 9.75 468 0.6179 0.4001 0.6179 0.7861
No log 9.7917 470 0.6085 0.4147 0.6085 0.7801
No log 9.8333 472 0.5844 0.4617 0.5844 0.7645
No log 9.875 474 0.5597 0.4617 0.5597 0.7481
No log 9.9167 476 0.5562 0.4763 0.5562 0.7458
No log 9.9583 478 0.5559 0.4543 0.5559 0.7456
No log 10.0 480 0.5446 0.5434 0.5446 0.7379
No log 10.0417 482 0.5460 0.4763 0.5460 0.7389
No log 10.0833 484 0.5487 0.4763 0.5487 0.7408
No log 10.125 486 0.5628 0.4985 0.5628 0.7502
No log 10.1667 488 0.5608 0.4934 0.5608 0.7488
No log 10.2083 490 0.5485 0.4701 0.5485 0.7406
No log 10.25 492 0.5446 0.4701 0.5446 0.7380
No log 10.2917 494 0.5856 0.4763 0.5856 0.7652
No log 10.3333 496 0.6613 0.5211 0.6613 0.8132
No log 10.375 498 0.6473 0.5459 0.6473 0.8045
0.3249 10.4167 500 0.5972 0.5173 0.5972 0.7728
0.3249 10.4583 502 0.5334 0.5428 0.5334 0.7304
0.3249 10.5 504 0.4846 0.5565 0.4846 0.6961
0.3249 10.5417 506 0.4767 0.5565 0.4767 0.6904
0.3249 10.5833 508 0.4783 0.5413 0.4783 0.6916
0.3249 10.625 510 0.4808 0.5413 0.4808 0.6934
0.3249 10.6667 512 0.4803 0.5177 0.4803 0.6931
0.3249 10.7083 514 0.4806 0.5177 0.4806 0.6932
0.3249 10.75 516 0.4708 0.5413 0.4708 0.6862
0.3249 10.7917 518 0.4775 0.5640 0.4775 0.6910
0.3249 10.8333 520 0.5044 0.5640 0.5044 0.7102
0.3249 10.875 522 0.5136 0.5677 0.5136 0.7166
0.3249 10.9167 524 0.5008 0.5565 0.5008 0.7077
0.3249 10.9583 526 0.4925 0.5640 0.4925 0.7018
0.3249 11.0 528 0.5083 0.5640 0.5083 0.7130
0.3249 11.0417 530 0.4929 0.5640 0.4929 0.7021
0.3249 11.0833 532 0.5011 0.5177 0.5011 0.7079
0.3249 11.125 534 0.5547 0.4596 0.5547 0.7448
0.3249 11.1667 536 0.5710 0.5141 0.5710 0.7556
0.3249 11.2083 538 0.5293 0.4681 0.5293 0.7275
0.3249 11.25 540 0.4933 0.5177 0.4933 0.7024
0.3249 11.2917 542 0.5009 0.5397 0.5009 0.7078
0.3249 11.3333 544 0.5004 0.5397 0.5004 0.7074
0.3249 11.375 546 0.4975 0.5413 0.4975 0.7054
0.3249 11.4167 548 0.5188 0.4934 0.5188 0.7203
0.3249 11.4583 550 0.6036 0.5499 0.6036 0.7769
0.3249 11.5 552 0.6145 0.4704 0.6145 0.7839
0.3249 11.5417 554 0.5630 0.4660 0.5630 0.7503
0.3249 11.5833 556 0.5299 0.4934 0.5299 0.7279
0.3249 11.625 558 0.5109 0.4701 0.5109 0.7148
0.3249 11.6667 560 0.5106 0.5079 0.5106 0.7146

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
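With the framework versions above, the checkpoint can be loaded for inference as sketched below. The repo id comes from this card's model tree; interpreting the output as an essay "organization" score is an assumption based on the model name, and the card does not state whether the head is regression or classification.

```python
# Sketch only: load the fine-tuned checkpoint and score a text.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k19_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

text = "..."  # an Arabic passage to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # raw head output; interpretation depends on the task head
```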
Model details

  • Model size: 135M params
  • Tensor type: F32
  • Format: Safetensors

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k19_task7_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02