ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k8_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.6984
  • Qwk (quadratic weighted kappa): 0.2506
  • Mse (mean squared error): 0.6984
  • Rmse (root mean squared error): 0.8357
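
The metric code actually used during training is not included in this card, but below is a minimal, hypothetical sketch of a `compute_metrics` function that would produce values of this shape, assuming a single-score regression output that is rounded to integer labels for the kappa calculation. Such a function could be passed to `Trainer` via its `compute_metrics` argument. Note that Loss equals Mse here, which is consistent with an MSE training objective (the default for a single-label regression head in Transformers).

```python
# Hypothetical compute_metrics sketch; assumes a single-score regression head
# whose outputs are rounded to integer ratings for the kappa computation.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    preds = np.asarray(predictions).squeeze()
    mse = mean_squared_error(labels, preds)
    # Quadratic weighted kappa (Qwk) is defined over discrete ratings,
    # so both predictions and labels are rounded to the nearest integer.
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```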

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch in code follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
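
For reproduction, here is a minimal sketch of how these settings map onto the Hugging Face `Trainer` API. The dataset variables (`train_ds`, `eval_ds`) and the single-label regression head (`num_labels=1`) are assumptions, since the card does not document the training data or the task head:

```python
# Sketch of the training setup under the hyperparameters listed above.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")
model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # assumption: single-score regression head (Loss == Mse above)
)

args = TrainingArguments(
    output_dir="ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k8_task7_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,     # Adam betas=(0.9, 0.999), as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",
    eval_steps=2,       # the results table reports an evaluation every 2 steps
    logging_steps=500,  # training loss first appears at step 500 in the table
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,  # hypothetical placeholder: data not documented
    eval_dataset=eval_ds,    # hypothetical placeholder: data not documented
    tokenizer=tokenizer,
)
trainer.train()
```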

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0909 2 2.5813 -0.0262 2.5813 1.6066
No log 0.1818 4 1.3516 0.1000 1.3516 1.1626
No log 0.2727 6 0.7837 0.1372 0.7837 0.8852
No log 0.3636 8 0.8019 -0.0500 0.8019 0.8955
No log 0.4545 10 0.8678 0.0715 0.8678 0.9316
No log 0.5455 12 0.8221 0.1103 0.8221 0.9067
No log 0.6364 14 0.7702 0.1561 0.7702 0.8776
No log 0.7273 16 0.9536 0.1318 0.9536 0.9765
No log 0.8182 18 1.0841 0.0044 1.0841 1.0412
No log 0.9091 20 0.8890 0.1637 0.8890 0.9429
No log 1.0 22 0.7669 0.0771 0.7669 0.8757
No log 1.0909 24 0.7564 0.0679 0.7564 0.8697
No log 1.1818 26 0.7537 0.1050 0.7537 0.8681
No log 1.2727 28 0.7880 0.3518 0.7880 0.8877
No log 1.3636 30 0.7801 0.2652 0.7801 0.8832
No log 1.4545 32 0.7622 0.1598 0.7622 0.8730
No log 1.5455 34 0.7702 -0.0054 0.7702 0.8776
No log 1.6364 36 0.7821 0.0 0.7821 0.8844
No log 1.7273 38 0.8127 0.0 0.8127 0.9015
No log 1.8182 40 0.8914 0.0481 0.8914 0.9441
No log 1.9091 42 0.8700 0.0481 0.8700 0.9328
No log 2.0 44 0.7873 0.0 0.7873 0.8873
No log 2.0909 46 0.7074 -0.0500 0.7074 0.8410
No log 2.1818 48 0.7108 0.2407 0.7108 0.8431
No log 2.2727 50 0.7094 0.2883 0.7094 0.8423
No log 2.3636 52 0.6804 0.3622 0.6804 0.8249
No log 2.4545 54 0.7412 0.2940 0.7412 0.8610
No log 2.5455 56 0.8426 0.3754 0.8426 0.9179
No log 2.6364 58 0.7594 0.3590 0.7594 0.8714
No log 2.7273 60 0.6594 0.3243 0.6594 0.8120
No log 2.8182 62 0.7013 0.3425 0.7013 0.8375
No log 2.9091 64 0.8077 0.3483 0.8077 0.8987
No log 3.0 66 0.7763 0.3149 0.7763 0.8811
No log 3.0909 68 0.6635 0.2852 0.6635 0.8145
No log 3.1818 70 0.6558 0.3183 0.6558 0.8098
No log 3.2727 72 0.6607 0.3523 0.6607 0.8128
No log 3.3636 74 0.7774 0.3688 0.7774 0.8817
No log 3.4545 76 0.7903 0.3913 0.7903 0.8890
No log 3.5455 78 0.6966 0.4179 0.6966 0.8346
No log 3.6364 80 0.8274 0.4064 0.8274 0.9096
No log 3.7273 82 0.9230 0.3357 0.9230 0.9607
No log 3.8182 84 0.7740 0.3946 0.7740 0.8798
No log 3.9091 86 0.7511 0.3287 0.7511 0.8667
No log 4.0 88 0.8605 0.3560 0.8605 0.9276
No log 4.0909 90 0.7630 0.3344 0.7630 0.8735
No log 4.1818 92 0.7016 0.3669 0.7016 0.8376
No log 4.2727 94 0.7425 0.4245 0.7425 0.8617
No log 4.3636 96 0.7187 0.4533 0.7187 0.8478
No log 4.4545 98 0.7081 0.4840 0.7081 0.8415
No log 4.5455 100 0.7661 0.4725 0.7661 0.8753
No log 4.6364 102 0.9251 0.3123 0.9251 0.9618
No log 4.7273 104 0.8998 0.2705 0.8998 0.9486
No log 4.8182 106 0.7254 0.4315 0.7254 0.8517
No log 4.9091 108 0.7856 0.3092 0.7856 0.8864
No log 5.0 110 0.9254 0.2613 0.9254 0.9620
No log 5.0909 112 0.8718 0.3074 0.8718 0.9337
No log 5.1818 114 0.7104 0.3144 0.7104 0.8428
No log 5.2727 116 0.7101 0.3363 0.7101 0.8427
No log 5.3636 118 0.7154 0.3363 0.7154 0.8458
No log 5.4545 120 0.7147 0.3363 0.7147 0.8454
No log 5.5455 122 0.7153 0.3398 0.7153 0.8458
No log 5.6364 124 0.7360 0.2313 0.7360 0.8579
No log 5.7273 126 0.7316 0.2661 0.7316 0.8554
No log 5.8182 128 0.8090 0.3673 0.8090 0.8994
No log 5.9091 130 0.8603 0.4131 0.8603 0.9275
No log 6.0 132 0.7911 0.3433 0.7911 0.8894
No log 6.0909 134 0.7350 0.3144 0.7350 0.8573
No log 6.1818 136 0.8063 0.3719 0.8063 0.8980
No log 6.2727 138 0.8505 0.3067 0.8505 0.9222
No log 6.3636 140 0.8749 0.2849 0.8749 0.9354
No log 6.4545 142 0.9379 0.2916 0.9379 0.9685
No log 6.5455 144 0.8864 0.1209 0.8864 0.9415
No log 6.6364 146 0.8792 0.2202 0.8792 0.9377
No log 6.7273 148 0.8577 0.2223 0.8577 0.9261
No log 6.8182 150 0.9255 0.2993 0.9255 0.9620
No log 6.9091 152 0.8734 0.2555 0.8734 0.9346
No log 7.0 154 0.7431 0.4562 0.7431 0.8620
No log 7.0909 156 0.7506 0.3586 0.7506 0.8664
No log 7.1818 158 0.7558 0.3946 0.7558 0.8694
No log 7.2727 160 0.7060 0.2862 0.7060 0.8402
No log 7.3636 162 0.7041 0.3417 0.7041 0.8391
No log 7.4545 164 0.6996 0.2862 0.6996 0.8364
No log 7.5455 166 0.7059 0.3417 0.7059 0.8402
No log 7.6364 168 0.7070 0.3667 0.7070 0.8409
No log 7.7273 170 0.7286 0.3408 0.7286 0.8536
No log 7.8182 172 0.7527 0.3723 0.7527 0.8676
No log 7.9091 174 0.7112 0.3417 0.7112 0.8433
No log 8.0 176 0.7087 0.2958 0.7087 0.8418
No log 8.0909 178 0.7182 0.2889 0.7182 0.8475
No log 8.1818 180 0.7076 0.2683 0.7076 0.8412
No log 8.2727 182 0.7225 0.2780 0.7225 0.8500
No log 8.3636 184 0.7126 0.2621 0.7126 0.8441
No log 8.4545 186 0.6692 0.2360 0.6692 0.8181
No log 8.5455 188 0.6661 0.3320 0.6661 0.8161
No log 8.6364 190 0.6466 0.3407 0.6466 0.8041
No log 8.7273 192 0.6406 0.2683 0.6406 0.8004
No log 8.8182 194 0.6470 0.3293 0.6470 0.8043
No log 8.9091 196 0.6534 0.2994 0.6534 0.8083
No log 9.0 198 0.6747 0.3813 0.6747 0.8214
No log 9.0909 200 0.6798 0.4455 0.6798 0.8245
No log 9.1818 202 0.6754 0.4342 0.6754 0.8218
No log 9.2727 204 0.6723 0.4078 0.6723 0.8199
No log 9.3636 206 0.7221 0.4284 0.7221 0.8498
No log 9.4545 208 0.7750 0.4468 0.7750 0.8803
No log 9.5455 210 0.7073 0.4665 0.7073 0.8410
No log 9.6364 212 0.6447 0.4206 0.6447 0.8029
No log 9.7273 214 0.6189 0.2715 0.6189 0.7867
No log 9.8182 216 0.6360 0.3607 0.6360 0.7975
No log 9.9091 218 0.6527 0.3762 0.6527 0.8079
No log 10.0 220 0.6486 0.4137 0.6486 0.8053
No log 10.0909 222 0.7404 0.4124 0.7404 0.8604
No log 10.1818 224 0.8895 0.3669 0.8895 0.9432
No log 10.2727 226 0.8278 0.3466 0.8278 0.9099
No log 10.3636 228 0.6770 0.4642 0.6770 0.8228
No log 10.4545 230 0.6995 0.4329 0.6995 0.8364
No log 10.5455 232 0.7484 0.4373 0.7484 0.8651
No log 10.6364 234 0.7683 0.4391 0.7683 0.8765
No log 10.7273 236 0.7444 0.4901 0.7444 0.8628
No log 10.8182 238 0.7077 0.3673 0.7077 0.8412
No log 10.9091 240 0.7174 0.3774 0.7174 0.8470
No log 11.0 242 0.7449 0.3836 0.7449 0.8631
No log 11.0909 244 0.7672 0.2942 0.7672 0.8759
No log 11.1818 246 0.7519 0.3369 0.7519 0.8671
No log 11.2727 248 0.7247 0.3336 0.7247 0.8513
No log 11.3636 250 0.7267 0.2901 0.7267 0.8525
No log 11.4545 252 0.7307 0.3069 0.7307 0.8548
No log 11.5455 254 0.7482 0.3836 0.7482 0.8650
No log 11.6364 256 0.7629 0.3838 0.7629 0.8734
No log 11.7273 258 0.7835 0.3283 0.7836 0.8852
No log 11.8182 260 0.8375 0.3548 0.8375 0.9152
No log 11.9091 262 0.8422 0.3489 0.8422 0.9177
No log 12.0 264 0.7885 0.2643 0.7885 0.8880
No log 12.0909 266 0.7560 0.3754 0.7560 0.8695
No log 12.1818 268 0.7566 0.3608 0.7566 0.8698
No log 12.2727 270 0.7343 0.2955 0.7343 0.8569
No log 12.3636 272 0.7587 0.2294 0.7587 0.8710
No log 12.4545 274 0.8671 0.2153 0.8671 0.9312
No log 12.5455 276 0.8989 0.3280 0.8989 0.9481
No log 12.6364 278 0.8433 0.2871 0.8433 0.9183
No log 12.7273 280 0.7617 0.1699 0.7617 0.8727
No log 12.8182 282 0.7272 0.2063 0.7272 0.8528
No log 12.9091 284 0.7330 0.2413 0.7330 0.8562
No log 13.0 286 0.7551 0.2590 0.7551 0.8690
No log 13.0909 288 0.7672 0.2661 0.7672 0.8759
No log 13.1818 290 0.7985 0.2549 0.7985 0.8936
No log 13.2727 292 0.8359 0.2516 0.8359 0.9143
No log 13.3636 294 0.7867 0.3121 0.7867 0.8869
No log 13.4545 296 0.7347 0.2471 0.7347 0.8572
No log 13.5455 298 0.6973 0.2007 0.6973 0.8351
No log 13.6364 300 0.6712 0.2413 0.6712 0.8193
No log 13.7273 302 0.6581 0.2413 0.6581 0.8113
No log 13.8182 304 0.6605 0.2353 0.6605 0.8127
No log 13.9091 306 0.6546 0.2413 0.6546 0.8091
No log 14.0 308 0.6599 0.2715 0.6599 0.8123
No log 14.0909 310 0.6647 0.2715 0.6647 0.8153
No log 14.1818 312 0.6833 0.2749 0.6833 0.8266
No log 14.2727 314 0.6919 0.2780 0.6919 0.8318
No log 14.3636 316 0.7011 0.2953 0.7011 0.8373
No log 14.4545 318 0.7504 0.3942 0.7504 0.8663
No log 14.5455 320 0.8293 0.2697 0.8293 0.9107
No log 14.6364 322 0.8069 0.2904 0.8069 0.8983
No log 14.7273 324 0.7521 0.2285 0.7521 0.8672
No log 14.8182 326 0.7431 0.2685 0.7431 0.8621
No log 14.9091 328 0.7302 0.2652 0.7302 0.8545
No log 15.0 330 0.7311 0.2867 0.7311 0.8550
No log 15.0909 332 0.7351 0.3122 0.7351 0.8574
No log 15.1818 334 0.7335 0.2834 0.7335 0.8565
No log 15.2727 336 0.7462 0.3478 0.7462 0.8638
No log 15.3636 338 0.7496 0.2907 0.7496 0.8658
No log 15.4545 340 0.7873 0.2129 0.7873 0.8873
No log 15.5455 342 0.8742 0.3012 0.8742 0.9350
No log 15.6364 344 0.8902 0.2826 0.8902 0.9435
No log 15.7273 346 0.8160 0.3206 0.8160 0.9033
No log 15.8182 348 0.7125 0.3478 0.7125 0.8441
No log 15.9091 350 0.6848 0.3492 0.6848 0.8275
No log 16.0 352 0.6953 0.4294 0.6953 0.8339
No log 16.0909 354 0.6994 0.4536 0.6994 0.8363
No log 16.1818 356 0.6781 0.4595 0.6781 0.8234
No log 16.2727 358 0.6467 0.4051 0.6467 0.8042
No log 16.3636 360 0.6629 0.4375 0.6629 0.8142
No log 16.4545 362 0.7142 0.4444 0.7142 0.8451
No log 16.5455 364 0.6926 0.4190 0.6926 0.8322
No log 16.6364 366 0.6508 0.3840 0.6508 0.8067
No log 16.7273 368 0.6159 0.2715 0.6159 0.7848
No log 16.8182 370 0.6169 0.4036 0.6169 0.7854
No log 16.9091 372 0.6739 0.4931 0.6739 0.8209
No log 17.0 374 0.6658 0.4931 0.6658 0.8159
No log 17.0909 376 0.6300 0.3833 0.6300 0.7937
No log 17.1818 378 0.6588 0.4854 0.6588 0.8117
No log 17.2727 380 0.7556 0.4937 0.7556 0.8693
No log 17.3636 382 0.7598 0.4805 0.7598 0.8717
No log 17.4545 384 0.6818 0.4602 0.6818 0.8257
No log 17.5455 386 0.6229 0.4124 0.6229 0.7892
No log 17.6364 388 0.6167 0.3324 0.6167 0.7853
No log 17.7273 390 0.6203 0.3324 0.6203 0.7876
No log 17.8182 392 0.6287 0.3599 0.6287 0.7929
No log 17.9091 394 0.6440 0.3478 0.6440 0.8025
No log 18.0 396 0.6780 0.3640 0.6780 0.8234
No log 18.0909 398 0.7009 0.4602 0.7009 0.8372
No log 18.1818 400 0.7178 0.4193 0.7178 0.8473
No log 18.2727 402 0.6957 0.3329 0.6957 0.8341
No log 18.3636 404 0.6691 0.3524 0.6691 0.8180
No log 18.4545 406 0.6570 0.3253 0.6570 0.8106
No log 18.5455 408 0.6498 0.2852 0.6498 0.8061
No log 18.6364 410 0.6451 0.2852 0.6451 0.8032
No log 18.7273 412 0.6448 0.2852 0.6448 0.8030
No log 18.8182 414 0.6503 0.3523 0.6503 0.8064
No log 18.9091 416 0.6651 0.3523 0.6651 0.8155
No log 19.0 418 0.6791 0.3144 0.6791 0.8241
No log 19.0909 420 0.6628 0.3183 0.6628 0.8141
No log 19.1818 422 0.6456 0.3213 0.6456 0.8035
No log 19.2727 424 0.6450 0.3860 0.6450 0.8031
No log 19.3636 426 0.6417 0.3780 0.6417 0.8011
No log 19.4545 428 0.6471 0.4345 0.6471 0.8044
No log 19.5455 430 0.6398 0.4345 0.6398 0.7999
No log 19.6364 432 0.6381 0.3183 0.6381 0.7988
No log 19.7273 434 0.6548 0.2907 0.6548 0.8092
No log 19.8182 436 0.6629 0.3572 0.6629 0.8142
No log 19.9091 438 0.6628 0.3498 0.6628 0.8141
No log 20.0 440 0.6658 0.3498 0.6658 0.8160
No log 20.0909 442 0.6671 0.3498 0.6671 0.8167
No log 20.1818 444 0.6548 0.3011 0.6548 0.8092
No log 20.2727 446 0.6505 0.2852 0.6505 0.8065
No log 20.3636 448 0.6386 0.2852 0.6386 0.7991
No log 20.4545 450 0.6392 0.2852 0.6392 0.7995
No log 20.5455 452 0.6441 0.2852 0.6441 0.8026
No log 20.6364 454 0.6735 0.2007 0.6735 0.8206
No log 20.7273 456 0.7331 0.3060 0.7331 0.8562
No log 20.8182 458 0.7511 0.3060 0.7511 0.8666
No log 20.9091 460 0.7029 0.3525 0.7029 0.8384
No log 21.0 462 0.6607 0.3352 0.6607 0.8129
No log 21.0909 464 0.6603 0.3213 0.6603 0.8126
No log 21.1818 466 0.6631 0.3141 0.6631 0.8143
No log 21.2727 468 0.6822 0.2943 0.6822 0.8260
No log 21.3636 470 0.7324 0.3918 0.7324 0.8558
No log 21.4545 472 0.7397 0.3918 0.7397 0.8601
No log 21.5455 474 0.7124 0.4352 0.7124 0.8440
No log 21.6364 476 0.6854 0.2943 0.6854 0.8279
No log 21.7273 478 0.6573 0.3141 0.6573 0.8107
No log 21.8182 480 0.6478 0.3454 0.6478 0.8048
No log 21.9091 482 0.6381 0.3141 0.6381 0.7988
No log 22.0 484 0.6327 0.2715 0.6327 0.7954
No log 22.0909 486 0.6368 0.2936 0.6368 0.7980
No log 22.1818 488 0.6319 0.2652 0.6319 0.7949
No log 22.2727 490 0.6371 0.2622 0.6371 0.7982
No log 22.3636 492 0.6566 0.3478 0.6566 0.8103
No log 22.4545 494 0.6914 0.3942 0.6914 0.8315
No log 22.5455 496 0.7001 0.3688 0.7001 0.8367
No log 22.6364 498 0.6745 0.3478 0.6745 0.8213
0.2995 22.7273 500 0.6705 0.3002 0.6705 0.8188
0.2995 22.8182 502 0.6852 0.3280 0.6852 0.8278
0.2995 22.9091 504 0.7146 0.3224 0.7146 0.8453
0.2995 23.0 506 0.7422 0.3648 0.7422 0.8615
0.2995 23.0909 508 0.7573 0.3586 0.7573 0.8702
0.2995 23.1818 510 0.7709 0.3906 0.7709 0.8780
0.2995 23.2727 512 0.7832 0.3679 0.7832 0.8850
0.2995 23.3636 514 0.7432 0.3626 0.7432 0.8621
0.2995 23.4545 516 0.7188 0.3865 0.7188 0.8478
0.2995 23.5455 518 0.6860 0.4186 0.6860 0.8282
0.2995 23.6364 520 0.6585 0.4281 0.6585 0.8115
0.2995 23.7273 522 0.6430 0.4067 0.6430 0.8019
0.2995 23.8182 524 0.6403 0.3840 0.6403 0.8002
0.2995 23.9091 526 0.6410 0.3840 0.6410 0.8006
0.2995 24.0 528 0.6476 0.3840 0.6476 0.8047
0.2995 24.0909 530 0.6512 0.3840 0.6512 0.8070
0.2995 24.1818 532 0.6415 0.3945 0.6415 0.8010
0.2995 24.2727 534 0.6195 0.3141 0.6195 0.7871
0.2995 24.3636 536 0.6105 0.3407 0.6105 0.7814
0.2995 24.4545 538 0.6054 0.3703 0.6054 0.7781
0.2995 24.5455 540 0.6069 0.4441 0.6069 0.7790
0.2995 24.6364 542 0.6096 0.4171 0.6096 0.7808
0.2995 24.7273 544 0.6066 0.4482 0.6066 0.7788
0.2995 24.8182 546 0.5959 0.4229 0.5959 0.7720
0.2995 24.9091 548 0.6000 0.4482 0.6000 0.7746
0.2995 25.0 550 0.6161 0.3763 0.6161 0.7849
0.2995 25.0909 552 0.6458 0.4827 0.6458 0.8036
0.2995 25.1818 554 0.6552 0.4964 0.6552 0.8094
0.2995 25.2727 556 0.6344 0.4777 0.6344 0.7965
0.2995 25.3636 558 0.6404 0.5056 0.6404 0.8002
0.2995 25.4545 560 0.6373 0.4997 0.6373 0.7983
0.2995 25.5455 562 0.6457 0.4777 0.6457 0.8035
0.2995 25.6364 564 0.6606 0.4476 0.6606 0.8127
0.2995 25.7273 566 0.6827 0.5293 0.6827 0.8263
0.2995 25.8182 568 0.6603 0.4294 0.6603 0.8126
0.2995 25.9091 570 0.6343 0.4292 0.6343 0.7964
0.2995 26.0 572 0.6240 0.5123 0.6240 0.7899
0.2995 26.0909 574 0.6221 0.3995 0.6221 0.7887
0.2995 26.1818 576 0.6289 0.5383 0.6289 0.7931
0.2995 26.2727 578 0.6427 0.5156 0.6427 0.8017
0.2995 26.3636 580 0.6546 0.5114 0.6546 0.8091
0.2995 26.4545 582 0.6601 0.5360 0.6601 0.8125
0.2995 26.5455 584 0.6574 0.5513 0.6574 0.8108
0.2995 26.6364 586 0.6593 0.4507 0.6593 0.8120
0.2995 26.7273 588 0.6541 0.4096 0.6541 0.8088
0.2995 26.8182 590 0.6419 0.3934 0.6419 0.8012
0.2995 26.9091 592 0.6364 0.4194 0.6364 0.7978
0.2995 27.0 594 0.6542 0.4212 0.6542 0.8088
0.2995 27.0909 596 0.6887 0.4190 0.6887 0.8299
0.2995 27.1818 598 0.7024 0.3918 0.7024 0.8381
0.2995 27.2727 600 0.6907 0.4190 0.6907 0.8311
0.2995 27.3636 602 0.6855 0.4190 0.6855 0.8279
0.2995 27.4545 604 0.6883 0.4190 0.6883 0.8296
0.2995 27.5455 606 0.6807 0.4451 0.6807 0.8250
0.2995 27.6364 608 0.6762 0.4542 0.6762 0.8223
0.2995 27.7273 610 0.6726 0.4542 0.6726 0.8202
0.2995 27.8182 612 0.6518 0.3914 0.6518 0.8073
0.2995 27.9091 614 0.6349 0.4201 0.6349 0.7968
0.2995 28.0 616 0.6248 0.3835 0.6248 0.7904
0.2995 28.0909 618 0.6221 0.3835 0.6221 0.7887
0.2995 28.1818 620 0.6224 0.3886 0.6224 0.7889
0.2995 28.2727 622 0.6233 0.4278 0.6233 0.7895
0.2995 28.3636 624 0.6320 0.3841 0.6320 0.7950
0.2995 28.4545 626 0.6503 0.4625 0.6503 0.8064
0.2995 28.5455 628 0.6487 0.4625 0.6487 0.8054
0.2995 28.6364 630 0.6269 0.4705 0.6269 0.7918
0.2995 28.7273 632 0.6202 0.3939 0.6202 0.7875
0.2995 28.8182 634 0.6260 0.3939 0.6260 0.7912
0.2995 28.9091 636 0.6299 0.3939 0.6299 0.7936
0.2995 29.0 638 0.6369 0.3995 0.6369 0.7981
0.2995 29.0909 640 0.6448 0.3995 0.6448 0.8030
0.2995 29.1818 642 0.6544 0.3407 0.6544 0.8090
0.2995 29.2727 644 0.6656 0.3100 0.6656 0.8159
0.2995 29.3636 646 0.6828 0.3689 0.6828 0.8263
0.2995 29.4545 648 0.6793 0.3375 0.6793 0.8242
0.2995 29.5455 650 0.6766 0.3460 0.6766 0.8226
0.2995 29.6364 652 0.6816 0.3504 0.6816 0.8256
0.2995 29.7273 654 0.6908 0.3144 0.6908 0.8311
0.2995 29.8182 656 0.6744 0.2852 0.6744 0.8212
0.2995 29.9091 658 0.6596 0.2852 0.6596 0.8121
0.2995 30.0 660 0.6581 0.2852 0.6581 0.8112
0.2995 30.0909 662 0.6694 0.2506 0.6694 0.8181
0.2995 30.1818 664 0.6984 0.2506 0.6984 0.8357

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1