ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k12_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6880
  • QWK (quadratic weighted kappa): 0.5028
  • MSE: 0.6880
  • RMSE: 0.8294
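The card does not state how these metrics are computed, but QWK, MSE, and RMSE have standard definitions. The following is a self-contained sketch of those definitions (pure Python, no library assumed); the function names are illustrative, not taken from the training code:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms of true and predicted labels.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * observed[i][j]                # weighted observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n  # weighted chance disagreement
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement gives a QWK of 1.0, and RMSE is simply the square root of MSE, which is why the Loss and MSE columns in this card coincide (the model is trained with an MSE loss on the score).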

Model description

More information needed

Intended uses & limitations

More information needed
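Pending fuller documentation, a minimal sketch of how this checkpoint could be loaded for scoring with the transformers library. The single-output regression head is an assumption inferred from the MSE/RMSE metrics above; verify the label configuration of the checkpoint before relying on the output:

```python
def predict_score(text, repo_id="MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k12_task5_organization"):
    # Imports are kept inside the function so the sketch can be read and
    # tested without transformers/torch installed.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSequenceClassification.from_pretrained(repo_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumption: a regression-style head with one output. If the head is
    # actually classification, use logits.argmax(-1) instead.
    return logits.squeeze().tolist()
```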

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
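The hyperparameters above map onto a transformers TrainingArguments configuration roughly as follows (a sketch, not the original training script; output_dir is a placeholder assumption):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed in this card.
training_args = TrainingArguments(
    output_dir="./arabert-task5-organization",  # assumption: any writable path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```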

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0625 2 3.8998 0.0124 3.8998 1.9748
No log 0.125 4 1.8693 0.0318 1.8693 1.3672
No log 0.1875 6 1.1996 -0.0627 1.1996 1.0953
No log 0.25 8 1.0795 0.2441 1.0795 1.0390
No log 0.3125 10 1.0976 0.1418 1.0976 1.0476
No log 0.375 12 1.2351 0.0249 1.2351 1.1114
No log 0.4375 14 1.4945 -0.0858 1.4945 1.2225
No log 0.5 16 1.6856 -0.0411 1.6856 1.2983
No log 0.5625 18 1.5115 -0.0560 1.5115 1.2294
No log 0.625 20 1.2983 -0.0328 1.2983 1.1394
No log 0.6875 22 1.1399 0.1268 1.1399 1.0676
No log 0.75 24 1.0546 0.2416 1.0546 1.0270
No log 0.8125 26 1.0514 0.0762 1.0514 1.0254
No log 0.875 28 1.0289 0.1076 1.0289 1.0143
No log 0.9375 30 1.0153 0.4051 1.0153 1.0076
No log 1.0 32 1.0239 0.2343 1.0239 1.0119
No log 1.0625 34 1.1216 0.1142 1.1216 1.0591
No log 1.125 36 1.1869 0.0 1.1869 1.0894
No log 1.1875 38 1.1328 0.0996 1.1328 1.0643
No log 1.25 40 0.9713 0.4167 0.9713 0.9855
No log 1.3125 42 0.9117 0.4031 0.9117 0.9548
No log 1.375 44 0.9185 0.4218 0.9185 0.9584
No log 1.4375 46 0.9131 0.4512 0.9131 0.9556
No log 1.5 48 0.9881 0.3790 0.9881 0.9940
No log 1.5625 50 1.1039 0.2513 1.1039 1.0507
No log 1.625 52 1.1056 0.2850 1.1056 1.0515
No log 1.6875 54 0.9419 0.375 0.9419 0.9705
No log 1.75 56 0.9125 0.2314 0.9125 0.9553
No log 1.8125 58 1.0117 0.1799 1.0117 1.0058
No log 1.875 60 1.0029 0.1545 1.0029 1.0014
No log 1.9375 62 0.9630 0.1783 0.9630 0.9813
No log 2.0 64 0.9759 0.3310 0.9759 0.9879
No log 2.0625 66 0.9466 0.4167 0.9466 0.9730
No log 2.125 68 0.8062 0.3435 0.8062 0.8979
No log 2.1875 70 0.7766 0.3652 0.7766 0.8813
No log 2.25 72 0.8180 0.3164 0.8180 0.9044
No log 2.3125 74 0.7902 0.3603 0.7902 0.8889
No log 2.375 76 0.7130 0.4831 0.7130 0.8444
No log 2.4375 78 0.7098 0.5763 0.7098 0.8425
No log 2.5 80 0.7035 0.5559 0.7035 0.8387
No log 2.5625 82 0.6607 0.5153 0.6607 0.8128
No log 2.625 84 0.6674 0.5562 0.6674 0.8170
No log 2.6875 86 0.6472 0.6272 0.6472 0.8045
No log 2.75 88 0.6978 0.6015 0.6978 0.8353
No log 2.8125 90 0.8254 0.5614 0.8254 0.9085
No log 2.875 92 1.0173 0.3942 1.0173 1.0086
No log 2.9375 94 1.0334 0.4073 1.0334 1.0166
No log 3.0 96 0.9522 0.4668 0.9522 0.9758
No log 3.0625 98 0.8172 0.6035 0.8172 0.9040
No log 3.125 100 0.7584 0.5902 0.7584 0.8708
No log 3.1875 102 0.7434 0.5675 0.7434 0.8622
No log 3.25 104 0.7553 0.5521 0.7553 0.8691
No log 3.3125 106 0.6617 0.6071 0.6617 0.8134
No log 3.375 108 0.6579 0.6445 0.6579 0.8111
No log 3.4375 110 0.7094 0.6529 0.7094 0.8423
No log 3.5 112 0.7746 0.5275 0.7746 0.8801
No log 3.5625 114 0.8172 0.5485 0.8172 0.9040
No log 3.625 116 0.7899 0.5239 0.7899 0.8887
No log 3.6875 118 0.8011 0.5968 0.8011 0.8950
No log 3.75 120 0.8089 0.6141 0.8089 0.8994
No log 3.8125 122 0.7102 0.6147 0.7102 0.8427
No log 3.875 124 0.6779 0.5495 0.6779 0.8234
No log 3.9375 126 0.6320 0.5603 0.6320 0.7950
No log 4.0 128 0.6061 0.5934 0.6061 0.7785
No log 4.0625 130 0.5689 0.6886 0.5689 0.7543
No log 4.125 132 0.5828 0.6719 0.5828 0.7634
No log 4.1875 134 0.5386 0.6878 0.5386 0.7339
No log 4.25 136 0.4913 0.7231 0.4913 0.7009
No log 4.3125 138 0.4835 0.7182 0.4835 0.6954
No log 4.375 140 0.5285 0.7483 0.5285 0.7270
No log 4.4375 142 0.6167 0.7469 0.6167 0.7853
No log 4.5 144 0.5436 0.7437 0.5436 0.7373
No log 4.5625 146 0.4737 0.7544 0.4737 0.6883
No log 4.625 148 0.4855 0.7449 0.4855 0.6967
No log 4.6875 150 0.5315 0.7437 0.5315 0.7291
No log 4.75 152 0.6915 0.6653 0.6915 0.8315
No log 4.8125 154 0.7098 0.6061 0.7098 0.8425
No log 4.875 156 0.6298 0.6053 0.6298 0.7936
No log 4.9375 158 0.6191 0.6301 0.6191 0.7868
No log 5.0 160 0.6033 0.6311 0.6033 0.7767
No log 5.0625 162 0.6026 0.5798 0.6026 0.7763
No log 5.125 164 0.7041 0.6170 0.7041 0.8391
No log 5.1875 166 0.9354 0.4854 0.9354 0.9671
No log 5.25 168 0.9265 0.5404 0.9265 0.9626
No log 5.3125 170 0.7382 0.6071 0.7382 0.8592
No log 5.375 172 0.6831 0.6743 0.6831 0.8265
No log 5.4375 174 0.7235 0.5995 0.7235 0.8506
No log 5.5 176 0.7447 0.5800 0.7447 0.8629
No log 5.5625 178 0.6538 0.6362 0.6538 0.8086
No log 5.625 180 0.5523 0.6973 0.5523 0.7432
No log 5.6875 182 0.5142 0.6788 0.5142 0.7171
No log 5.75 184 0.5317 0.6748 0.5317 0.7292
No log 5.8125 186 0.6390 0.6079 0.6390 0.7994
No log 5.875 188 0.6886 0.5943 0.6886 0.8298
No log 5.9375 190 0.6433 0.6229 0.6433 0.8021
No log 6.0 192 0.5722 0.6746 0.5722 0.7564
No log 6.0625 194 0.5545 0.7436 0.5545 0.7447
No log 6.125 196 0.5597 0.7436 0.5597 0.7481
No log 6.1875 198 0.5615 0.7079 0.5615 0.7493
No log 6.25 200 0.7072 0.6563 0.7072 0.8410
No log 6.3125 202 0.7178 0.6466 0.7178 0.8472
No log 6.375 204 0.5644 0.7368 0.5644 0.7513
No log 6.4375 206 0.4260 0.6980 0.4260 0.6527
No log 6.5 208 0.5349 0.6974 0.5349 0.7314
No log 6.5625 210 0.5654 0.6974 0.5654 0.7520
No log 6.625 212 0.4946 0.6087 0.4946 0.7033
No log 6.6875 214 0.5122 0.6296 0.5122 0.7157
No log 6.75 216 0.5672 0.5811 0.5672 0.7531
No log 6.8125 218 0.5607 0.6301 0.5607 0.7488
No log 6.875 220 0.5755 0.6014 0.5755 0.7586
No log 6.9375 222 0.6424 0.6015 0.6424 0.8015
No log 7.0 224 0.7807 0.6029 0.7807 0.8836
No log 7.0625 226 0.9065 0.5123 0.9065 0.9521
No log 7.125 228 0.8704 0.5145 0.8704 0.9330
No log 7.1875 230 0.7475 0.5147 0.7475 0.8646
No log 7.25 232 0.6928 0.4809 0.6928 0.8324
No log 7.3125 234 0.6443 0.5232 0.6443 0.8027
No log 7.375 236 0.6312 0.5949 0.6312 0.7945
No log 7.4375 238 0.6608 0.4937 0.6608 0.8129
No log 7.5 240 0.7649 0.5405 0.7649 0.8746
No log 7.5625 242 0.7997 0.6110 0.7997 0.8942
No log 7.625 244 0.6729 0.6275 0.6729 0.8203
No log 7.6875 246 0.5453 0.7477 0.5453 0.7384
No log 7.75 248 0.4845 0.7283 0.4845 0.6961
No log 7.8125 250 0.5035 0.7283 0.5035 0.7096
No log 7.875 252 0.5655 0.7477 0.5655 0.7520
No log 7.9375 254 0.5849 0.7531 0.5849 0.7648
No log 8.0 256 0.5314 0.7217 0.5314 0.7290
No log 8.0625 258 0.4647 0.7171 0.4647 0.6817
No log 8.125 260 0.4610 0.7179 0.4610 0.6790
No log 8.1875 262 0.4578 0.7066 0.4578 0.6766
No log 8.25 264 0.5117 0.7492 0.5117 0.7153
No log 8.3125 266 0.6155 0.6401 0.6155 0.7846
No log 8.375 268 0.6728 0.6151 0.6728 0.8203
No log 8.4375 270 0.6915 0.5734 0.6915 0.8316
No log 8.5 272 0.6398 0.6102 0.6398 0.7998
No log 8.5625 274 0.6145 0.6065 0.6145 0.7839
No log 8.625 276 0.6022 0.5747 0.6022 0.7760
No log 8.6875 278 0.5641 0.6198 0.5641 0.7511
No log 8.75 280 0.6241 0.5579 0.6241 0.7900
No log 8.8125 282 0.6690 0.5346 0.6690 0.8179
No log 8.875 284 0.6612 0.5463 0.6612 0.8131
No log 8.9375 286 0.6295 0.5663 0.6295 0.7934
No log 9.0 288 0.6285 0.5856 0.6285 0.7928
No log 9.0625 290 0.5589 0.6310 0.5589 0.7476
No log 9.125 292 0.5466 0.6420 0.5466 0.7393
No log 9.1875 294 0.6236 0.5833 0.6236 0.7897
No log 9.25 296 0.7751 0.5920 0.7751 0.8804
No log 9.3125 298 0.8750 0.5668 0.8750 0.9354
No log 9.375 300 0.8819 0.5330 0.8819 0.9391
No log 9.4375 302 0.7336 0.5320 0.7336 0.8565
No log 9.5 304 0.6952 0.5644 0.6952 0.8338
No log 9.5625 306 0.6473 0.5663 0.6473 0.8045
No log 9.625 308 0.6357 0.6151 0.6357 0.7973
No log 9.6875 310 0.7411 0.5614 0.7411 0.8609
No log 9.75 312 0.9103 0.5943 0.9103 0.9541
No log 9.8125 314 0.9162 0.5943 0.9162 0.9572
No log 9.875 316 0.7175 0.5631 0.7175 0.8470
No log 9.9375 318 0.5872 0.5927 0.5872 0.7663
No log 10.0 320 0.5730 0.6301 0.5730 0.7570
No log 10.0625 322 0.5982 0.5733 0.5982 0.7734
No log 10.125 324 0.6469 0.5437 0.6469 0.8043
No log 10.1875 326 0.7588 0.5320 0.7588 0.8711
No log 10.25 328 0.8064 0.5272 0.8064 0.8980
No log 10.3125 330 0.7763 0.5272 0.7763 0.8811
No log 10.375 332 0.7256 0.5750 0.7256 0.8518
No log 10.4375 334 0.7657 0.5562 0.7657 0.8750
No log 10.5 336 0.8493 0.4969 0.8493 0.9216
No log 10.5625 338 0.8219 0.4775 0.8219 0.9066
No log 10.625 340 0.8065 0.4775 0.8065 0.8981
No log 10.6875 342 0.6869 0.5265 0.6869 0.8288
No log 10.75 344 0.5937 0.6004 0.5937 0.7705
No log 10.8125 346 0.5117 0.7277 0.5117 0.7153
No log 10.875 348 0.4934 0.7171 0.4934 0.7024
No log 10.9375 350 0.5352 0.6719 0.5352 0.7316
No log 11.0 352 0.6561 0.5636 0.6561 0.8100
No log 11.0625 354 0.6766 0.5543 0.6766 0.8225
No log 11.125 356 0.6126 0.5491 0.6126 0.7827
No log 11.1875 358 0.5561 0.6413 0.5561 0.7457
No log 11.25 360 0.5310 0.7051 0.5310 0.7287
No log 11.3125 362 0.5524 0.6639 0.5524 0.7432
No log 11.375 364 0.6326 0.6226 0.6326 0.7954
No log 11.4375 366 0.6798 0.6385 0.6798 0.8245
No log 11.5 368 0.6136 0.6247 0.6136 0.7833
No log 11.5625 370 0.5218 0.6946 0.5218 0.7224
No log 11.625 372 0.4883 0.6597 0.4883 0.6988
No log 11.6875 374 0.4914 0.6175 0.4914 0.7010
No log 11.75 376 0.5096 0.6499 0.5096 0.7139
No log 11.8125 378 0.5218 0.6392 0.5218 0.7223
No log 11.875 380 0.5099 0.6764 0.5099 0.7141
No log 11.9375 382 0.5140 0.7012 0.5140 0.7169
No log 12.0 384 0.5074 0.7012 0.5074 0.7123
No log 12.0625 386 0.4942 0.7012 0.4942 0.7030
No log 12.125 388 0.4984 0.7213 0.4984 0.7059
No log 12.1875 390 0.5431 0.6940 0.5431 0.7370
No log 12.25 392 0.6403 0.7149 0.6403 0.8002
No log 12.3125 394 0.6484 0.6878 0.6484 0.8052
No log 12.375 396 0.5750 0.7036 0.5750 0.7583
No log 12.4375 398 0.4893 0.7341 0.4893 0.6995
No log 12.5 400 0.4747 0.7035 0.4747 0.6890
No log 12.5625 402 0.4704 0.7101 0.4704 0.6858
No log 12.625 404 0.4819 0.7141 0.4819 0.6942
No log 12.6875 406 0.5444 0.6815 0.5444 0.7378
No log 12.75 408 0.7225 0.6020 0.7225 0.8500
No log 12.8125 410 0.8058 0.5546 0.8058 0.8977
No log 12.875 412 0.7505 0.5177 0.7505 0.8663
No log 12.9375 414 0.6641 0.5463 0.6641 0.8149
No log 13.0 416 0.6724 0.5515 0.6724 0.8200
No log 13.0625 418 0.7042 0.5045 0.7042 0.8392
No log 13.125 420 0.7629 0.4157 0.7629 0.8735
No log 13.1875 422 0.7825 0.4157 0.7825 0.8846
No log 13.25 424 0.8108 0.4175 0.8108 0.9004
No log 13.3125 426 0.8044 0.4197 0.8044 0.8969
No log 13.375 428 0.8090 0.4326 0.8090 0.8995
No log 13.4375 430 0.7798 0.4667 0.7798 0.8831
No log 13.5 432 0.7228 0.5360 0.7228 0.8502
No log 13.5625 434 0.6733 0.5824 0.6733 0.8205
No log 13.625 436 0.6322 0.5875 0.6322 0.7951
No log 13.6875 438 0.6288 0.5875 0.6288 0.7930
No log 13.75 440 0.6844 0.5390 0.6844 0.8273
No log 13.8125 442 0.7568 0.5320 0.7568 0.8699
No log 13.875 444 0.7395 0.5420 0.7395 0.8599
No log 13.9375 446 0.6493 0.5567 0.6493 0.8058
No log 14.0 448 0.6001 0.6021 0.6001 0.7746
No log 14.0625 450 0.5753 0.6121 0.5753 0.7585
No log 14.125 452 0.5772 0.6121 0.5772 0.7598
No log 14.1875 454 0.5761 0.5970 0.5761 0.7590
No log 14.25 456 0.5841 0.6220 0.5841 0.7643
No log 14.3125 458 0.6320 0.6336 0.6320 0.7950
No log 14.375 460 0.6716 0.6053 0.6716 0.8195
No log 14.4375 462 0.6388 0.6154 0.6388 0.7992
No log 14.5 464 0.5529 0.6601 0.5529 0.7435
No log 14.5625 466 0.4954 0.6728 0.4954 0.7038
No log 14.625 468 0.4680 0.7402 0.4680 0.6841
No log 14.6875 470 0.4710 0.7285 0.4710 0.6863
No log 14.75 472 0.5060 0.6871 0.5060 0.7113
No log 14.8125 474 0.5924 0.6290 0.5924 0.7697
No log 14.875 476 0.5959 0.6489 0.5959 0.7720
No log 14.9375 478 0.5627 0.6422 0.5627 0.7502
No log 15.0 480 0.5402 0.6983 0.5402 0.7349
No log 15.0625 482 0.4998 0.7193 0.4998 0.7070
No log 15.125 484 0.4732 0.6659 0.4732 0.6879
No log 15.1875 486 0.4898 0.6779 0.4898 0.6998
No log 15.25 488 0.4895 0.7012 0.4895 0.6996
No log 15.3125 490 0.5249 0.7388 0.5249 0.7245
No log 15.375 492 0.5612 0.6885 0.5612 0.7491
No log 15.4375 494 0.5642 0.6619 0.5642 0.7511
No log 15.5 496 0.6264 0.6640 0.6264 0.7914
No log 15.5625 498 0.6910 0.6589 0.6910 0.8313
0.2924 15.625 500 0.6767 0.6094 0.6767 0.8226
0.2924 15.6875 502 0.6553 0.5952 0.6553 0.8095
0.2924 15.75 504 0.6032 0.6021 0.6032 0.7767
0.2924 15.8125 506 0.5653 0.6290 0.5653 0.7519
0.2924 15.875 508 0.5607 0.6582 0.5607 0.7488
0.2924 15.9375 510 0.5676 0.6132 0.5676 0.7534
0.2924 16.0 512 0.6098 0.6275 0.6098 0.7809
0.2924 16.0625 514 0.6297 0.6281 0.6297 0.7935
0.2924 16.125 516 0.6173 0.6608 0.6173 0.7857
0.2924 16.1875 518 0.5480 0.7013 0.5480 0.7403
0.2924 16.25 520 0.4986 0.7198 0.4986 0.7061
0.2924 16.3125 522 0.5003 0.7198 0.5003 0.7073
0.2924 16.375 524 0.5152 0.7191 0.5152 0.7177
0.2924 16.4375 526 0.5144 0.7348 0.5144 0.7172
0.2924 16.5 528 0.5287 0.6821 0.5287 0.7271
0.2924 16.5625 530 0.5562 0.6791 0.5562 0.7458
0.2924 16.625 532 0.5832 0.6529 0.5832 0.7637
0.2924 16.6875 534 0.5558 0.6993 0.5558 0.7455
0.2924 16.75 536 0.5247 0.6842 0.5247 0.7244
0.2924 16.8125 538 0.5110 0.6995 0.5110 0.7149
0.2924 16.875 540 0.5042 0.7131 0.5042 0.7100
0.2924 16.9375 542 0.4960 0.6886 0.4960 0.7043
0.2924 17.0 544 0.5230 0.6871 0.5230 0.7232
0.2924 17.0625 546 0.5488 0.6556 0.5488 0.7408
0.2924 17.125 548 0.5529 0.6278 0.5529 0.7436
0.2924 17.1875 550 0.5724 0.6450 0.5724 0.7566
0.2924 17.25 552 0.5595 0.6417 0.5595 0.7480
0.2924 17.3125 554 0.5368 0.7050 0.5368 0.7326
0.2924 17.375 556 0.5374 0.7015 0.5374 0.7331
0.2924 17.4375 558 0.5410 0.6906 0.5410 0.7355
0.2924 17.5 560 0.5600 0.6411 0.5600 0.7483
0.2924 17.5625 562 0.5271 0.7191 0.5271 0.7260
0.2924 17.625 564 0.4938 0.6916 0.4938 0.7027
0.2924 17.6875 566 0.4951 0.6916 0.4951 0.7036
0.2924 17.75 568 0.5220 0.7059 0.5220 0.7225
0.2924 17.8125 570 0.5961 0.6413 0.5961 0.7721
0.2924 17.875 572 0.6279 0.5902 0.6279 0.7924
0.2924 17.9375 574 0.6013 0.6596 0.6013 0.7754
0.2924 18.0 576 0.5338 0.6584 0.5338 0.7306
0.2924 18.0625 578 0.5031 0.6806 0.5031 0.7093
0.2924 18.125 580 0.4860 0.7081 0.4860 0.6971
0.2924 18.1875 582 0.4981 0.6929 0.4981 0.7058
0.2924 18.25 584 0.5627 0.7203 0.5627 0.7501
0.2924 18.3125 586 0.6312 0.6878 0.6312 0.7945
0.2924 18.375 588 0.6485 0.6909 0.6485 0.8053
0.2924 18.4375 590 0.5977 0.6738 0.5977 0.7731
0.2924 18.5 592 0.5620 0.6520 0.5620 0.7496
0.2924 18.5625 594 0.5461 0.6753 0.5461 0.7390
0.2924 18.625 596 0.5394 0.6925 0.5394 0.7344
0.2924 18.6875 598 0.5242 0.7109 0.5242 0.7240
0.2924 18.75 600 0.5296 0.6946 0.5296 0.7277
0.2924 18.8125 602 0.5359 0.6983 0.5359 0.7321
0.2924 18.875 604 0.5611 0.6871 0.5611 0.7491
0.2924 18.9375 606 0.5999 0.6035 0.5999 0.7745
0.2924 19.0 608 0.6188 0.6035 0.6188 0.7867
0.2924 19.0625 610 0.6217 0.6181 0.6217 0.7885
0.2924 19.125 612 0.5918 0.7030 0.5918 0.7693
0.2924 19.1875 614 0.5289 0.6766 0.5289 0.7273
0.2924 19.25 616 0.4887 0.7291 0.4887 0.6990
0.2924 19.3125 618 0.4783 0.7253 0.4783 0.6916
0.2924 19.375 620 0.4849 0.7049 0.4849 0.6963
0.2924 19.4375 622 0.4865 0.7253 0.4865 0.6975
0.2924 19.5 624 0.5522 0.6869 0.5522 0.7431
0.2924 19.5625 626 0.6427 0.6071 0.6427 0.8017
0.2924 19.625 628 0.6374 0.6071 0.6374 0.7984
0.2924 19.6875 630 0.5602 0.6869 0.5602 0.7485
0.2924 19.75 632 0.4917 0.6753 0.4917 0.7012
0.2924 19.8125 634 0.4830 0.6750 0.4830 0.6950
0.2924 19.875 636 0.4962 0.6566 0.4962 0.7044
0.2924 19.9375 638 0.4991 0.6598 0.4991 0.7065
0.2924 20.0 640 0.5182 0.6805 0.5182 0.7198
0.2924 20.0625 642 0.5493 0.6878 0.5493 0.7412
0.2924 20.125 644 0.6080 0.6605 0.6080 0.7797
0.2924 20.1875 646 0.6353 0.6136 0.6353 0.7970
0.2924 20.25 648 0.6259 0.6296 0.6259 0.7912
0.2924 20.3125 650 0.6217 0.6263 0.6217 0.7885
0.2924 20.375 652 0.5919 0.6740 0.5919 0.7694
0.2924 20.4375 654 0.5643 0.6938 0.5643 0.7512
0.2924 20.5 656 0.5456 0.7109 0.5456 0.7386
0.2924 20.5625 658 0.5330 0.7223 0.5330 0.7300
0.2924 20.625 660 0.5325 0.7335 0.5325 0.7297
0.2924 20.6875 662 0.5473 0.7385 0.5473 0.7398
0.2924 20.75 664 0.5997 0.7001 0.5997 0.7744
0.2924 20.8125 666 0.6200 0.6455 0.6200 0.7874
0.2924 20.875 668 0.6244 0.6385 0.6244 0.7902
0.2924 20.9375 670 0.6483 0.5953 0.6483 0.8052
0.2924 21.0 672 0.6735 0.5463 0.6735 0.8207
0.2924 21.0625 674 0.6887 0.5229 0.6887 0.8299
0.2924 21.125 676 0.6850 0.5028 0.6850 0.8277
0.2924 21.1875 678 0.6880 0.5028 0.6880 0.8294

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 135M parameters (Safetensors, F32)
