ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k8_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset is not reported in the original card). It achieves the following results on the evaluation set:

  • Loss: 0.6554
  • Qwk: 0.6571
  • Mse: 0.6554
  • Rmse: 0.8096
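Qwk is quadratic weighted kappa, an agreement measure suited to ordinal labels such as essay scores; since the training objective is mean squared error, Loss and Mse coincide and Rmse is the square root of Mse. The metrics can be sketched in pure Python; this is an illustrative implementation, not the exact evaluation code used for this run:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, for ordinal labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal histograms of true and predicted labels
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, 1 at the extremes
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    observed = sum(w[i][j] * O[i][j]
                   for i in range(n_classes) for j in range(n_classes))
    # Expected matrix under chance agreement: outer product of marginals / n
    expected = sum(w[i][j] * hist_t[i] * hist_p[j] / n
                   for i in range(n_classes) for j in range(n_classes))
    return 1.0 - observed / expected

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement gives a kappa of 1.0; chance-level agreement gives 0, and systematic disagreement can go negative, which explains the negative Qwk values in the first epochs of the training log below.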

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
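The `linear` scheduler decays the learning rate from `learning_rate` to zero over the planned number of optimizer steps. From the Step column in the results below, one epoch corresponds to 40 optimizer steps, so with `num_epochs: 100` the scheduler horizon would be 4000 steps (the logged checkpoints stop earlier, around epoch 16.55). A minimal sketch of the schedule shape, assuming no warmup:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linearly decay the learning rate from base_lr to 0 over total_steps.

    Mirrors the shape of a linear scheduler with optional warmup;
    warmup_steps=0 matches this run's reported configuration.
    """
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))

# 40 optimizer steps per epoch (from the Step column) * 100 planned epochs
total_steps = 40 * 100  # 4000
print(linear_lr(0, total_steps))     # 2e-05 at the start
print(linear_lr(2000, total_steps))  # 1e-05 halfway through
```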

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.05 2 5.3439 -0.0693 5.3439 2.3117
No log 0.1 4 2.9562 0.0320 2.9562 1.7194
No log 0.15 6 2.1972 -0.0453 2.1972 1.4823
No log 0.2 8 1.5175 0.0225 1.5175 1.2319
No log 0.25 10 1.1941 0.2749 1.1941 1.0928
No log 0.3 12 1.3445 0.1938 1.3445 1.1595
No log 0.35 14 1.2249 0.2233 1.2249 1.1068
No log 0.4 16 1.0395 0.2854 1.0395 1.0195
No log 0.45 18 1.0929 0.1972 1.0929 1.0454
No log 0.5 20 1.3457 0.0822 1.3457 1.1600
No log 0.55 22 1.2030 0.1516 1.2030 1.0968
No log 0.6 24 0.9788 0.2853 0.9788 0.9894
No log 0.65 26 0.9535 0.3542 0.9535 0.9765
No log 0.7 28 0.9477 0.3516 0.9477 0.9735
No log 0.75 30 0.9016 0.4080 0.9016 0.9495
No log 0.8 32 0.9215 0.3613 0.9215 0.9600
No log 0.85 34 1.2176 0.1484 1.2176 1.1035
No log 0.9 36 1.9507 -0.2452 1.9507 1.3967
No log 0.95 38 2.4507 -0.3487 2.4507 1.5655
No log 1.0 40 1.9297 -0.1422 1.9297 1.3892
No log 1.05 42 1.0974 0.3697 1.0974 1.0476
No log 1.1 44 0.8417 0.4528 0.8417 0.9174
No log 1.15 46 0.8709 0.5354 0.8709 0.9332
No log 1.2 48 0.9094 0.4874 0.9094 0.9536
No log 1.25 50 0.9189 0.4607 0.9189 0.9586
No log 1.3 52 0.7797 0.5524 0.7797 0.8830
No log 1.35 54 0.7526 0.5399 0.7526 0.8675
No log 1.4 56 0.9173 0.4904 0.9173 0.9577
No log 1.45 58 0.9143 0.5205 0.9143 0.9562
No log 1.5 60 0.7557 0.6050 0.7557 0.8693
No log 1.55 62 0.7100 0.5327 0.7100 0.8426
No log 1.6 64 0.7509 0.5546 0.7509 0.8665
No log 1.65 66 0.7273 0.5670 0.7273 0.8528
No log 1.7 68 0.7183 0.5579 0.7183 0.8475
No log 1.75 70 0.8210 0.5432 0.8210 0.9061
No log 1.8 72 1.1345 0.4242 1.1345 1.0652
No log 1.85 74 1.3067 0.3355 1.3067 1.1431
No log 1.9 76 1.1706 0.4347 1.1706 1.0820
No log 1.95 78 1.0619 0.4761 1.0619 1.0305
No log 2.0 80 0.9337 0.5242 0.9337 0.9663
No log 2.05 82 0.9310 0.5403 0.9310 0.9649
No log 2.1 84 0.9275 0.5804 0.9275 0.9631
No log 2.15 86 1.1061 0.5163 1.1061 1.0517
No log 2.2 88 1.4262 0.4044 1.4262 1.1943
No log 2.25 90 1.6503 0.2678 1.6503 1.2846
No log 2.3 92 1.5481 0.3429 1.5481 1.2442
No log 2.35 94 1.2331 0.4619 1.2331 1.1105
No log 2.4 96 0.9281 0.5655 0.9281 0.9634
No log 2.45 98 0.7484 0.6247 0.7484 0.8651
No log 2.5 100 0.6861 0.6210 0.6861 0.8283
No log 2.55 102 0.6624 0.6400 0.6624 0.8139
No log 2.6 104 0.7324 0.6635 0.7324 0.8558
No log 2.65 106 0.8994 0.5743 0.8994 0.9484
No log 2.7 108 0.9086 0.5794 0.9086 0.9532
No log 2.75 110 0.7822 0.6280 0.7822 0.8844
No log 2.8 112 0.6311 0.6989 0.6311 0.7944
No log 2.85 114 0.5880 0.7068 0.5880 0.7668
No log 2.9 116 0.5936 0.7112 0.5936 0.7704
No log 2.95 118 0.5927 0.7121 0.5927 0.7699
No log 3.0 120 0.6444 0.6964 0.6444 0.8027
No log 3.05 122 0.6129 0.6744 0.6129 0.7829
No log 3.1 124 0.6003 0.7093 0.6003 0.7748
No log 3.15 126 0.6434 0.6820 0.6434 0.8021
No log 3.2 128 0.6931 0.6958 0.6931 0.8325
No log 3.25 130 0.6151 0.6896 0.6151 0.7843
No log 3.3 132 0.6473 0.6790 0.6473 0.8045
No log 3.35 134 0.7359 0.6908 0.7359 0.8578
No log 3.4 136 0.7979 0.6752 0.7979 0.8933
No log 3.45 138 0.8438 0.6455 0.8438 0.9186
No log 3.5 140 0.7062 0.6773 0.7062 0.8404
No log 3.55 142 0.6216 0.6592 0.6216 0.7884
No log 3.6 144 0.6615 0.6635 0.6615 0.8133
No log 3.65 146 0.7222 0.6625 0.7222 0.8499
No log 3.7 148 0.8972 0.6079 0.8972 0.9472
No log 3.75 150 0.9323 0.5978 0.9323 0.9655
No log 3.8 152 0.7974 0.6331 0.7974 0.8930
No log 3.85 154 0.7785 0.6466 0.7785 0.8823
No log 3.9 156 0.9308 0.6122 0.9308 0.9648
No log 3.95 158 0.9557 0.5998 0.9557 0.9776
No log 4.0 160 0.7783 0.6657 0.7783 0.8822
No log 4.05 162 0.7265 0.6844 0.7265 0.8524
No log 4.1 164 0.8858 0.5847 0.8858 0.9412
No log 4.15 166 0.9395 0.5874 0.9395 0.9693
No log 4.2 168 0.7850 0.6189 0.7850 0.8860
No log 4.25 170 0.6524 0.6238 0.6524 0.8077
No log 4.3 172 0.6474 0.6124 0.6474 0.8046
No log 4.35 174 0.7375 0.6586 0.7375 0.8588
No log 4.4 176 0.8866 0.5926 0.8866 0.9416
No log 4.45 178 0.8874 0.5835 0.8874 0.9420
No log 4.5 180 0.7833 0.6339 0.7833 0.8851
No log 4.55 182 0.6775 0.6776 0.6775 0.8231
No log 4.6 184 0.6345 0.6631 0.6345 0.7965
No log 4.65 186 0.6682 0.6923 0.6682 0.8174
No log 4.7 188 0.8632 0.6153 0.8632 0.9291
No log 4.75 190 0.9101 0.6133 0.9101 0.9540
No log 4.8 192 0.8525 0.6323 0.8525 0.9233
No log 4.85 194 0.7584 0.6638 0.7584 0.8708
No log 4.9 196 0.6339 0.6581 0.6339 0.7962
No log 4.95 198 0.6583 0.6567 0.6583 0.8113
No log 5.0 200 0.7603 0.6245 0.7603 0.8720
No log 5.05 202 0.9329 0.5656 0.9329 0.9659
No log 5.1 204 0.8766 0.5822 0.8766 0.9362
No log 5.15 206 0.6982 0.6617 0.6982 0.8356
No log 5.2 208 0.5848 0.7166 0.5848 0.7647
No log 5.25 210 0.5861 0.7027 0.5861 0.7656
No log 5.3 212 0.6114 0.7059 0.6114 0.7820
No log 5.35 214 0.6378 0.6865 0.6378 0.7986
No log 5.4 216 0.6596 0.6806 0.6596 0.8121
No log 5.45 218 0.6713 0.6949 0.6713 0.8193
No log 5.5 220 0.6009 0.7153 0.6009 0.7752
No log 5.55 222 0.6123 0.7077 0.6123 0.7825
No log 5.6 224 0.6066 0.7281 0.6066 0.7789
No log 5.65 226 0.5872 0.7337 0.5872 0.7663
No log 5.7 228 0.6485 0.7209 0.6485 0.8053
No log 5.75 230 0.7906 0.6507 0.7906 0.8892
No log 5.8 232 0.7896 0.6648 0.7896 0.8886
No log 5.85 234 0.6849 0.7111 0.6849 0.8276
No log 5.9 236 0.7515 0.6783 0.7515 0.8669
No log 5.95 238 0.8608 0.6512 0.8608 0.9278
No log 6.0 240 0.7947 0.6696 0.7947 0.8915
No log 6.05 242 0.6747 0.7016 0.6747 0.8214
No log 6.1 244 0.6005 0.7167 0.6005 0.7749
No log 6.15 246 0.6068 0.7556 0.6068 0.7790
No log 6.2 248 0.6246 0.7225 0.6246 0.7903
No log 6.25 250 0.7600 0.7102 0.7600 0.8718
No log 6.3 252 0.9091 0.6546 0.9091 0.9535
No log 6.35 254 0.8496 0.6651 0.8496 0.9217
No log 6.4 256 0.6804 0.6974 0.6804 0.8249
No log 6.45 258 0.5953 0.7155 0.5953 0.7716
No log 6.5 260 0.6071 0.6967 0.6071 0.7792
No log 6.55 262 0.6186 0.6879 0.6186 0.7865
No log 6.6 264 0.6622 0.7007 0.6622 0.8138
No log 6.65 266 0.8012 0.6604 0.8012 0.8951
No log 6.7 268 0.8045 0.6702 0.8045 0.8969
No log 6.75 270 0.7086 0.7014 0.7086 0.8418
No log 6.8 272 0.6528 0.7130 0.6528 0.8079
No log 6.85 274 0.6500 0.7114 0.6500 0.8062
No log 6.9 276 0.6563 0.6912 0.6563 0.8101
No log 6.95 278 0.7313 0.7049 0.7313 0.8552
No log 7.0 280 0.8553 0.5992 0.8553 0.9248
No log 7.05 282 0.9077 0.5898 0.9077 0.9527
No log 7.1 284 0.9059 0.5898 0.9059 0.9518
No log 7.15 286 0.8132 0.6557 0.8132 0.9018
No log 7.2 288 0.7017 0.7056 0.7017 0.8377
No log 7.25 290 0.6567 0.7027 0.6567 0.8104
No log 7.3 292 0.7147 0.7012 0.7147 0.8454
No log 7.35 294 0.7266 0.7012 0.7266 0.8524
No log 7.4 296 0.6658 0.7033 0.6658 0.8159
No log 7.45 298 0.6178 0.7318 0.6178 0.7860
No log 7.5 300 0.6479 0.7043 0.6479 0.8050
No log 7.55 302 0.7513 0.6864 0.7513 0.8668
No log 7.6 304 0.7410 0.6742 0.7410 0.8608
No log 7.65 306 0.6312 0.7114 0.6312 0.7945
No log 7.7 308 0.5915 0.6997 0.5915 0.7691
No log 7.75 310 0.5921 0.7209 0.5921 0.7695
No log 7.8 312 0.5929 0.7068 0.5929 0.7700
No log 7.85 314 0.6313 0.7270 0.6313 0.7945
No log 7.9 316 0.6313 0.7184 0.6313 0.7946
No log 7.95 318 0.6113 0.7486 0.6113 0.7819
No log 8.0 320 0.6016 0.7460 0.6016 0.7757
No log 8.05 322 0.6367 0.7323 0.6367 0.7980
No log 8.1 324 0.7631 0.6631 0.7631 0.8736
No log 8.15 326 0.7985 0.6090 0.7985 0.8936
No log 8.2 328 0.7173 0.6606 0.7173 0.8470
No log 8.25 330 0.6372 0.6847 0.6372 0.7983
No log 8.3 332 0.6253 0.6894 0.6253 0.7907
No log 8.35 334 0.6756 0.7107 0.6756 0.8220
No log 8.4 336 0.8049 0.6415 0.8049 0.8972
No log 8.45 338 0.8554 0.6475 0.8554 0.9249
No log 8.5 340 0.7957 0.6399 0.7957 0.8920
No log 8.55 342 0.7181 0.6954 0.7181 0.8474
No log 8.6 344 0.6480 0.7217 0.6480 0.8050
No log 8.65 346 0.5948 0.7497 0.5948 0.7712
No log 8.7 348 0.6324 0.7217 0.6324 0.7952
No log 8.75 350 0.7288 0.6919 0.7288 0.8537
No log 8.8 352 0.7849 0.6613 0.7849 0.8860
No log 8.85 354 0.7488 0.6885 0.7488 0.8654
No log 8.9 356 0.6401 0.7060 0.6401 0.8000
No log 8.95 358 0.5951 0.7349 0.5951 0.7714
No log 9.0 360 0.5834 0.7195 0.5834 0.7638
No log 9.05 362 0.6163 0.6894 0.6163 0.7850
No log 9.1 364 0.6428 0.6654 0.6428 0.8018
No log 9.15 366 0.6103 0.6992 0.6103 0.7812
No log 9.2 368 0.5822 0.7419 0.5822 0.7630
No log 9.25 370 0.5921 0.7412 0.5921 0.7695
No log 9.3 372 0.6555 0.7219 0.6555 0.8096
No log 9.35 374 0.6939 0.7006 0.6939 0.8330
No log 9.4 376 0.6495 0.7357 0.6495 0.8059
No log 9.45 378 0.6526 0.7278 0.6526 0.8078
No log 9.5 380 0.6776 0.7045 0.6776 0.8232
No log 9.55 382 0.6533 0.7157 0.6533 0.8083
No log 9.6 384 0.6725 0.7029 0.6725 0.8201
No log 9.65 386 0.6876 0.7051 0.6876 0.8292
No log 9.7 388 0.7725 0.6853 0.7725 0.8789
No log 9.75 390 0.8936 0.6353 0.8936 0.9453
No log 9.8 392 0.8600 0.6531 0.8600 0.9273
No log 9.85 394 0.7878 0.6870 0.7878 0.8876
No log 9.9 396 0.6690 0.7307 0.6690 0.8179
No log 9.95 398 0.6730 0.7422 0.6730 0.8203
No log 10.0 400 0.7370 0.7208 0.7370 0.8585
No log 10.05 402 0.7189 0.7038 0.7189 0.8479
No log 10.1 404 0.6555 0.7176 0.6555 0.8096
No log 10.15 406 0.6198 0.7155 0.6198 0.7873
No log 10.2 408 0.6131 0.7076 0.6131 0.7830
No log 10.25 410 0.6131 0.6859 0.6131 0.7830
No log 10.3 412 0.6213 0.7236 0.6213 0.7882
No log 10.35 414 0.7127 0.6978 0.7127 0.8442
No log 10.4 416 0.8023 0.6816 0.8023 0.8957
No log 10.45 418 0.7882 0.6959 0.7882 0.8878
No log 10.5 420 0.6727 0.7225 0.6727 0.8202
No log 10.55 422 0.6211 0.7261 0.6211 0.7881
No log 10.6 424 0.6019 0.7245 0.6019 0.7758
No log 10.65 426 0.5998 0.7324 0.5998 0.7745
No log 10.7 428 0.6257 0.7317 0.6257 0.7910
No log 10.75 430 0.6371 0.7224 0.6371 0.7982
No log 10.8 432 0.6054 0.7237 0.6054 0.7781
No log 10.85 434 0.6011 0.7037 0.6011 0.7753
No log 10.9 436 0.6052 0.7014 0.6052 0.7779
No log 10.95 438 0.6191 0.7114 0.6191 0.7869
No log 11.0 440 0.6847 0.6856 0.6847 0.8275
No log 11.05 442 0.7629 0.6423 0.7629 0.8735
No log 11.1 444 0.7651 0.6398 0.7651 0.8747
No log 11.15 446 0.7817 0.6362 0.7817 0.8841
No log 11.2 448 0.7336 0.6628 0.7336 0.8565
No log 11.25 450 0.7301 0.6648 0.7301 0.8545
No log 11.3 452 0.7075 0.6945 0.7075 0.8411
No log 11.35 454 0.6660 0.7234 0.6660 0.8161
No log 11.4 456 0.6555 0.7379 0.6555 0.8096
No log 11.45 458 0.6425 0.7271 0.6425 0.8015
No log 11.5 460 0.6395 0.7047 0.6395 0.7997
No log 11.55 462 0.6020 0.7235 0.6020 0.7759
No log 11.6 464 0.5885 0.7293 0.5885 0.7671
No log 11.65 466 0.5702 0.7333 0.5702 0.7551
No log 11.7 468 0.5616 0.7499 0.5616 0.7494
No log 11.75 470 0.5838 0.7284 0.5838 0.7641
No log 11.8 472 0.6064 0.7602 0.6064 0.7787
No log 11.85 474 0.5898 0.7487 0.5898 0.7680
No log 11.9 476 0.5721 0.7502 0.5721 0.7564
No log 11.95 478 0.5729 0.7467 0.5729 0.7569
No log 12.0 480 0.5738 0.7375 0.5738 0.7575
No log 12.05 482 0.5927 0.7439 0.5927 0.7698
No log 12.1 484 0.6276 0.7223 0.6276 0.7922
No log 12.15 486 0.6238 0.7110 0.6238 0.7898
No log 12.2 488 0.5995 0.7165 0.5995 0.7743
No log 12.25 490 0.5984 0.7083 0.5984 0.7736
No log 12.3 492 0.6215 0.7280 0.6215 0.7883
No log 12.35 494 0.6103 0.7066 0.6103 0.7812
No log 12.4 496 0.6105 0.7268 0.6105 0.7813
No log 12.45 498 0.6966 0.7168 0.6966 0.8346
0.4029 12.5 500 0.8494 0.6677 0.8494 0.9216
0.4029 12.55 502 0.8573 0.6560 0.8573 0.9259
0.4029 12.6 504 0.7573 0.6769 0.7573 0.8702
0.4029 12.65 506 0.6593 0.7309 0.6593 0.8119
0.4029 12.7 508 0.6340 0.6826 0.6340 0.7963
0.4029 12.75 510 0.6348 0.6826 0.6348 0.7967
0.4029 12.8 512 0.6451 0.7222 0.6451 0.8032
0.4029 12.85 514 0.7177 0.7165 0.7177 0.8472
0.4029 12.9 516 0.8889 0.6502 0.8889 0.9428
0.4029 12.95 518 0.9648 0.6309 0.9648 0.9822
0.4029 13.0 520 0.8958 0.6377 0.8958 0.9465
0.4029 13.05 522 0.7968 0.6797 0.7968 0.8926
0.4029 13.1 524 0.7423 0.7098 0.7423 0.8616
0.4029 13.15 526 0.6836 0.7328 0.6836 0.8268
0.4029 13.2 528 0.6558 0.7393 0.6558 0.8098
0.4029 13.25 530 0.6214 0.7207 0.6214 0.7883
0.4029 13.3 532 0.5996 0.7450 0.5996 0.7743
0.4029 13.35 534 0.5930 0.7551 0.5930 0.7701
0.4029 13.4 536 0.5952 0.7491 0.5952 0.7715
0.4029 13.45 538 0.6510 0.6894 0.6510 0.8068
0.4029 13.5 540 0.6421 0.7039 0.6421 0.8013
0.4029 13.55 542 0.5743 0.7429 0.5743 0.7578
0.4029 13.6 544 0.5908 0.7521 0.5908 0.7686
0.4029 13.65 546 0.6324 0.7182 0.6324 0.7952
0.4029 13.7 548 0.6226 0.7364 0.6226 0.7891
0.4029 13.75 550 0.5820 0.7108 0.5820 0.7629
0.4029 13.8 552 0.5859 0.7159 0.5859 0.7655
0.4029 13.85 554 0.6185 0.7051 0.6185 0.7865
0.4029 13.9 556 0.6932 0.7016 0.6932 0.8326
0.4029 13.95 558 0.7544 0.6947 0.7544 0.8686
0.4029 14.0 560 0.7631 0.7075 0.7631 0.8736
0.4029 14.05 562 0.6924 0.6952 0.6924 0.8321
0.4029 14.1 564 0.6409 0.7377 0.6409 0.8005
0.4029 14.15 566 0.6291 0.7334 0.6291 0.7931
0.4029 14.2 568 0.6388 0.7407 0.6388 0.7992
0.4029 14.25 570 0.6254 0.7476 0.6254 0.7908
0.4029 14.3 572 0.6506 0.7179 0.6506 0.8066
0.4029 14.35 574 0.7173 0.6855 0.7173 0.8469
0.4029 14.4 576 0.7796 0.6937 0.7796 0.8830
0.4029 14.45 578 0.7762 0.6873 0.7762 0.8810
0.4029 14.5 580 0.7096 0.7118 0.7096 0.8424
0.4029 14.55 582 0.6801 0.7168 0.6801 0.8247
0.4029 14.6 584 0.6542 0.7319 0.6542 0.8088
0.4029 14.65 586 0.6212 0.7172 0.6212 0.7881
0.4029 14.7 588 0.5947 0.7315 0.5947 0.7712
0.4029 14.75 590 0.5902 0.7198 0.5902 0.7683
0.4029 14.8 592 0.6216 0.7418 0.6216 0.7884
0.4029 14.85 594 0.6555 0.7463 0.6555 0.8096
0.4029 14.9 596 0.6741 0.7322 0.6741 0.8211
0.4029 14.95 598 0.6695 0.7306 0.6695 0.8182
0.4029 15.0 600 0.6204 0.7386 0.6204 0.7877
0.4029 15.05 602 0.5906 0.7422 0.5906 0.7685
0.4029 15.1 604 0.5949 0.7479 0.5949 0.7713
0.4029 15.15 606 0.6110 0.7283 0.6110 0.7817
0.4029 15.2 608 0.6049 0.7329 0.6049 0.7778
0.4029 15.25 610 0.6332 0.7344 0.6332 0.7957
0.4029 15.3 612 0.6390 0.7497 0.6390 0.7994
0.4029 15.35 614 0.6682 0.7201 0.6682 0.8174
0.4029 15.4 616 0.7189 0.6887 0.7189 0.8479
0.4029 15.45 618 0.6986 0.6956 0.6986 0.8358
0.4029 15.5 620 0.6949 0.7020 0.6949 0.8336
0.4029 15.55 622 0.7464 0.6686 0.7464 0.8639
0.4029 15.6 624 0.8500 0.6309 0.8500 0.9220
0.4029 15.65 626 0.8816 0.6278 0.8816 0.9389
0.4029 15.7 628 0.8340 0.6309 0.8340 0.9132
0.4029 15.75 630 0.7190 0.6733 0.7190 0.8479
0.4029 15.8 632 0.6530 0.6977 0.6530 0.8081
0.4029 15.85 634 0.6363 0.7043 0.6363 0.7977
0.4029 15.9 636 0.6472 0.6901 0.6472 0.8045
0.4029 15.95 638 0.6651 0.6813 0.6651 0.8155
0.4029 16.0 640 0.6765 0.6978 0.6765 0.8225
0.4029 16.05 642 0.7030 0.6821 0.7030 0.8384
0.4029 16.1 644 0.7166 0.6821 0.7166 0.8465
0.4029 16.15 646 0.7528 0.6621 0.7528 0.8676
0.4029 16.2 648 0.7146 0.6763 0.7146 0.8454
0.4029 16.25 650 0.6723 0.6832 0.6723 0.8200
0.4029 16.3 652 0.7106 0.6763 0.7106 0.8429
0.4029 16.35 654 0.7708 0.6618 0.7708 0.8780
0.4029 16.4 656 0.8478 0.6241 0.8478 0.9207
0.4029 16.45 658 0.8415 0.6288 0.8415 0.9173
0.4029 16.5 660 0.7510 0.6586 0.7510 0.8666
0.4029 16.55 662 0.6554 0.6571 0.6554 0.8096

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1