ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k3_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9881
  • Qwk: 0.0344
  • Mse: 0.9881
  • Rmse: 0.9941
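
Since the usage sections of this card are still placeholders, the snippet below is a minimal loading sketch, not part of the original card. The repository id is taken from this model page; treating the checkpoint as a single-output sequence-classification (scoring) head is an assumption inferred from the MSE/RMSE/QWK metrics reported above.

```python
# Minimal usage sketch (assumptions noted above; the input text is a placeholder).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k3_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("ضع النص العربي هنا", return_tensors="pt")  # placeholder Arabic text
with torch.no_grad():
    score = model(**inputs).logits  # predicted organization score(s)
print(score)
```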

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
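
For reference, the hyperparameters listed above map onto transformers.TrainingArguments roughly as in the sketch below; the output directory and any data/model wiring are hypothetical and not taken from the card.

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task3_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```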

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.1818 2 3.8267 0.0017 3.8267 1.9562
No log 0.3636 4 2.3003 -0.0302 2.3003 1.5167
No log 0.5455 6 1.2880 0.0279 1.2880 1.1349
No log 0.7273 8 1.1572 0.0 1.1572 1.0757
No log 0.9091 10 0.8319 -0.0101 0.8319 0.9121
No log 1.0909 12 0.6837 0.0506 0.6837 0.8269
No log 1.2727 14 0.7543 0.0374 0.7543 0.8685
No log 1.4545 16 0.8269 0.0129 0.8269 0.9093
No log 1.6364 18 0.8428 0.0099 0.8428 0.9181
No log 1.8182 20 0.9464 -0.1002 0.9464 0.9728
No log 2.0 22 1.2491 0.0 1.2491 1.1176
No log 2.1818 24 1.3809 0.0 1.3809 1.1751
No log 2.3636 26 1.4664 0.0 1.4664 1.2110
No log 2.5455 28 1.6516 0.0 1.6516 1.2852
No log 2.7273 30 1.3908 0.0 1.3908 1.1793
No log 2.9091 32 0.9918 0.0016 0.9918 0.9959
No log 3.0909 34 0.8116 0.0225 0.8116 0.9009
No log 3.2727 36 0.7994 0.0260 0.7994 0.8941
No log 3.4545 38 0.7553 0.0374 0.7553 0.8691
No log 3.6364 40 0.7506 0.0506 0.7506 0.8664
No log 3.8182 42 0.7964 -0.0725 0.7964 0.8924
No log 4.0 44 0.7641 -0.0711 0.7641 0.8741
No log 4.1818 46 0.7159 0.0416 0.7159 0.8461
No log 4.3636 48 0.7260 0.0374 0.7260 0.8521
No log 4.5455 50 0.8205 0.0714 0.8205 0.9058
No log 4.7273 52 0.9739 -0.0253 0.9739 0.9868
No log 4.9091 54 0.9989 -0.0236 0.9989 0.9994
No log 5.0909 56 0.9364 0.0316 0.9364 0.9677
No log 5.2727 58 0.7354 0.0334 0.7354 0.8575
No log 5.4545 60 0.7159 0.0416 0.7159 0.8461
No log 5.6364 62 0.7317 0.0374 0.7317 0.8554
No log 5.8182 64 0.8202 -0.0753 0.8202 0.9057
No log 6.0 66 0.8460 -0.1715 0.8460 0.9198
No log 6.1818 68 0.8380 -0.0753 0.8380 0.9154
No log 6.3636 70 0.7560 -0.0215 0.7560 0.8695
No log 6.5455 72 0.7074 0.0374 0.7074 0.8411
No log 6.7273 74 0.6813 0.0416 0.6813 0.8254
No log 6.9091 76 0.6865 0.0416 0.6865 0.8285
No log 7.0909 78 0.7990 0.0549 0.7990 0.8939
No log 7.2727 80 0.7697 0.1202 0.7697 0.8773
No log 7.4545 82 0.6976 0.0416 0.6976 0.8352
No log 7.6364 84 0.7216 0.0 0.7216 0.8495
No log 7.8182 86 0.7251 0.0 0.7251 0.8515
No log 8.0 88 0.6758 0.0416 0.6758 0.8221
No log 8.1818 90 0.7148 0.0759 0.7148 0.8455
No log 8.3636 92 0.6839 -0.0188 0.6839 0.8270
No log 8.5455 94 0.6874 -0.0035 0.6874 0.8291
No log 8.7273 96 0.6812 -0.0069 0.6812 0.8254
No log 8.9091 98 0.6989 -0.0101 0.6989 0.8360
No log 9.0909 100 0.7647 0.0759 0.7647 0.8745
No log 9.2727 102 0.8160 0.0341 0.8160 0.9033
No log 9.4545 104 0.8363 0.0145 0.8363 0.9145
No log 9.6364 106 0.9092 0.0431 0.9092 0.9535
No log 9.8182 108 0.8943 -0.0717 0.8943 0.9457
No log 10.0 110 0.9024 0.0376 0.9024 0.9499
No log 10.1818 112 0.9119 -0.1205 0.9119 0.9549
No log 10.3636 114 0.9266 -0.0981 0.9266 0.9626
No log 10.5455 116 0.8747 -0.1817 0.8747 0.9353
No log 10.7273 118 0.8769 0.0341 0.8769 0.9365
No log 10.9091 120 0.9095 0.0049 0.9095 0.9537
No log 11.0909 122 0.9965 -0.0579 0.9965 0.9982
No log 11.2727 124 1.1007 0.0199 1.1007 1.0491
No log 11.4545 126 0.8910 0.0114 0.8910 0.9439
No log 11.6364 128 0.9502 0.1103 0.9502 0.9748
No log 11.8182 130 0.8616 0.0851 0.8616 0.9282
No log 12.0 132 1.1231 0.0213 1.1231 1.0598
No log 12.1818 134 1.0336 -0.0423 1.0336 1.0167
No log 12.3636 136 0.8311 0.0732 0.8311 0.9116
No log 12.5455 138 1.0151 0.0316 1.0151 1.0075
No log 12.7273 140 0.9575 0.0476 0.9575 0.9785
No log 12.9091 142 0.7913 0.0282 0.7913 0.8895
No log 13.0909 144 0.8370 -0.1895 0.8370 0.9149
No log 13.2727 146 0.8682 -0.0408 0.8682 0.9318
No log 13.4545 148 0.8808 -0.0279 0.8808 0.9385
No log 13.6364 150 0.9087 0.0146 0.9087 0.9533
No log 13.8182 152 1.0809 0.1379 1.0809 1.0397
No log 14.0 154 0.9108 0.0140 0.9108 0.9544
No log 14.1818 156 0.8284 0.0239 0.8284 0.9102
No log 14.3636 158 0.9058 0.0443 0.9058 0.9517
No log 14.5455 160 0.8573 -0.1011 0.8573 0.9259
No log 14.7273 162 0.7237 0.0970 0.7237 0.8507
No log 14.9091 164 0.7165 0.0869 0.7165 0.8464
No log 15.0909 166 0.7909 0.0937 0.7909 0.8893
No log 15.2727 168 0.9725 -0.0827 0.9725 0.9862
No log 15.4545 170 0.9963 -0.0474 0.9963 0.9982
No log 15.6364 172 0.8747 0.0441 0.8747 0.9353
No log 15.8182 174 0.8853 0.1094 0.8853 0.9409
No log 16.0 176 0.8854 0.0796 0.8854 0.9409
No log 16.1818 178 1.0024 0.0391 1.0024 1.0012
No log 16.3636 180 1.2141 0.0832 1.2141 1.1019
No log 16.5455 182 1.1184 0.0802 1.1184 1.0575
No log 16.7273 184 0.9348 0.0305 0.9348 0.9669
No log 16.9091 186 0.8471 0.0488 0.8471 0.9204
No log 17.0909 188 0.7983 0.0051 0.7983 0.8935
No log 17.2727 190 0.7912 0.0179 0.7912 0.8895
No log 17.4545 192 0.7738 0.0061 0.7738 0.8796
No log 17.6364 194 0.7296 -0.0101 0.7296 0.8542
No log 17.8182 196 0.7470 0.1318 0.7470 0.8643
No log 18.0 198 0.7750 0.1202 0.7750 0.8804
No log 18.1818 200 0.7720 0.1807 0.7720 0.8786
No log 18.3636 202 0.8163 0.0426 0.8163 0.9035
No log 18.5455 204 0.9439 -0.0291 0.9439 0.9715
No log 18.7273 206 1.0004 0.0101 1.0004 1.0002
No log 18.9091 208 0.9119 0.0246 0.9119 0.9549
No log 19.0909 210 0.9396 -0.0055 0.9396 0.9693
No log 19.2727 212 0.9582 0.0027 0.9582 0.9789
No log 19.4545 214 0.9113 -0.0055 0.9113 0.9546
No log 19.6364 216 0.8368 0.0441 0.8368 0.9147
No log 19.8182 218 0.8430 0.0469 0.8430 0.9181
No log 20.0 220 1.0492 -0.0137 1.0492 1.0243
No log 20.1818 222 1.2220 0.0285 1.2220 1.1054
No log 20.3636 224 1.0941 -0.0423 1.0941 1.0460
No log 20.5455 226 0.8695 0.0074 0.8695 0.9324
No log 20.7273 228 0.8797 0.1003 0.8797 0.9379
No log 20.9091 230 0.8459 0.1138 0.8459 0.9198
No log 21.0909 232 0.9294 -0.0707 0.9294 0.9640
No log 21.2727 234 1.0170 -0.0423 1.0170 1.0085
No log 21.4545 236 0.9275 -0.0627 0.9275 0.9631
No log 21.6364 238 0.8364 0.1138 0.8364 0.9145
No log 21.8182 240 0.8623 0.0956 0.8623 0.9286
No log 22.0 242 0.8312 0.0660 0.8312 0.9117
No log 22.1818 244 0.8659 -0.1355 0.8659 0.9305
No log 22.3636 246 1.0184 -0.0794 1.0184 1.0092
No log 22.5455 248 1.0164 -0.0794 1.0164 1.0082
No log 22.7273 250 0.8981 -0.1107 0.8981 0.9477
No log 22.9091 252 0.8084 0.0444 0.8084 0.8991
No log 23.0909 254 0.8523 0.1448 0.8523 0.9232
No log 23.2727 256 0.8729 0.1432 0.8729 0.9343
No log 23.4545 258 0.9209 -0.0484 0.9209 0.9597
No log 23.6364 260 0.9854 0.0391 0.9854 0.9927
No log 23.8182 262 0.9299 -0.0617 0.9299 0.9643
No log 24.0 264 0.8887 -0.1468 0.8887 0.9427
No log 24.1818 266 0.8542 -0.0173 0.8542 0.9242
No log 24.3636 268 0.8266 -0.0200 0.8266 0.9091
No log 24.5455 270 0.7812 0.1769 0.7812 0.8838
No log 24.7273 272 0.7552 0.1318 0.7552 0.8690
No log 24.9091 274 0.7649 0.1318 0.7649 0.8746
No log 25.0909 276 0.7848 0.1318 0.7848 0.8859
No log 25.2727 278 0.8327 0.0856 0.8327 0.9125
No log 25.4545 280 0.8660 0.0071 0.8660 0.9306
No log 25.6364 282 0.8320 0.0771 0.8320 0.9121
No log 25.8182 284 0.8183 0.1143 0.8183 0.9046
No log 26.0 286 0.8075 0.1141 0.8075 0.8986
No log 26.1818 288 0.8250 0.0455 0.8250 0.9083
No log 26.3636 290 0.8199 0.0488 0.8199 0.9055
No log 26.5455 292 0.8020 0.0410 0.8020 0.8956
No log 26.7273 294 0.7848 0.1298 0.7848 0.8859
No log 26.9091 296 0.7758 0.0355 0.7758 0.8808
No log 27.0909 298 0.7771 0.0488 0.7771 0.8815
No log 27.2727 300 0.7665 0.1423 0.7665 0.8755
No log 27.4545 302 0.7674 0.0759 0.7674 0.8760
No log 27.6364 304 0.8018 0.1095 0.8018 0.8955
No log 27.8182 306 0.8259 0.1094 0.8259 0.9088
No log 28.0 308 0.9164 0.0340 0.9164 0.9573
No log 28.1818 310 0.9821 -0.0137 0.9821 0.9910
No log 28.3636 312 0.9590 0.0159 0.9590 0.9793
No log 28.5455 314 0.8392 -0.0811 0.8392 0.9161
No log 28.7273 316 0.7789 0.0978 0.7789 0.8826
No log 28.9091 318 0.8005 0.0129 0.8005 0.8947
No log 29.0909 320 0.8166 -0.0608 0.8166 0.9036
No log 29.2727 322 0.8119 -0.0354 0.8119 0.9011
No log 29.4545 324 0.8090 0.0460 0.8090 0.8994
No log 29.6364 326 0.8433 -0.0220 0.8433 0.9183
No log 29.8182 328 0.8857 -0.0395 0.8857 0.9411
No log 30.0 330 0.8489 0.0178 0.8489 0.9214
No log 30.1818 332 0.7954 0.1187 0.7954 0.8919
No log 30.3636 334 0.7825 0.2431 0.7825 0.8846
No log 30.5455 336 0.7833 0.1094 0.7833 0.8851
No log 30.7273 338 0.8301 0.0139 0.8301 0.9111
No log 30.9091 340 0.9247 0.0117 0.9247 0.9616
No log 31.0909 342 0.9326 -0.0116 0.9326 0.9657
No log 31.2727 344 0.8586 -0.0762 0.8586 0.9266
No log 31.4545 346 0.7962 0.1139 0.7962 0.8923
No log 31.6364 348 0.8462 0.1943 0.8462 0.9199
No log 31.8182 350 0.8465 0.1943 0.8465 0.9201
No log 32.0 352 0.7981 0.1553 0.7981 0.8933
No log 32.1818 354 0.8161 0.0123 0.8161 0.9034
No log 32.3636 356 0.8438 -0.1111 0.8438 0.9186
No log 32.5455 358 0.8102 0.0547 0.8102 0.9001
No log 32.7273 360 0.7618 0.1617 0.7618 0.8728
No log 32.9091 362 0.7537 0.2105 0.7537 0.8681
No log 33.0909 364 0.7641 0.2053 0.7641 0.8741
No log 33.2727 366 0.7895 0.1272 0.7895 0.8885
No log 33.4545 368 0.8530 -0.0439 0.8530 0.9236
No log 33.6364 370 0.8504 -0.0456 0.8504 0.9222
No log 33.8182 372 0.8171 0.0220 0.8171 0.9039
No log 34.0 374 0.7771 0.1434 0.7771 0.8815
No log 34.1818 376 0.7275 0.1254 0.7275 0.8529
No log 34.3636 378 0.7279 0.1691 0.7279 0.8531
No log 34.5455 380 0.7466 0.1627 0.7466 0.8641
No log 34.7273 382 0.7487 0.1627 0.7487 0.8653
No log 34.9091 384 0.7562 0.1659 0.7562 0.8696
No log 35.0909 386 0.8214 0.0870 0.8214 0.9063
No log 35.2727 388 0.8580 -0.0204 0.8580 0.9263
No log 35.4545 390 0.8519 0.0129 0.8519 0.9230
No log 35.6364 392 0.7976 0.1604 0.7976 0.8931
No log 35.8182 394 0.7983 0.1495 0.7983 0.8935
No log 36.0 396 0.8031 0.1506 0.8031 0.8962
No log 36.1818 398 0.7840 0.1506 0.7840 0.8855
No log 36.3636 400 0.7547 0.1627 0.7547 0.8687
No log 36.5455 402 0.7380 0.1202 0.7380 0.8591
No log 36.7273 404 0.7251 0.1202 0.7251 0.8515
No log 36.9091 406 0.7206 0.1202 0.7206 0.8489
No log 37.0909 408 0.7181 0.1202 0.7181 0.8474
No log 37.2727 410 0.7262 0.1202 0.7262 0.8522
No log 37.4545 412 0.7490 0.2034 0.7490 0.8655
No log 37.6364 414 0.7661 0.1627 0.7661 0.8753
No log 37.8182 416 0.7714 0.1249 0.7714 0.8783
No log 38.0 418 0.7993 0.0856 0.7993 0.8940
No log 38.1818 420 0.8358 0.0153 0.8358 0.9142
No log 38.3636 422 0.8219 0.0114 0.8219 0.9066
No log 38.5455 424 0.7841 0.1236 0.7841 0.8855
No log 38.7273 426 0.7704 0.1659 0.7704 0.8777
No log 38.9091 428 0.7687 0.1675 0.7687 0.8767
No log 39.0909 430 0.7628 0.1675 0.7628 0.8734
No log 39.2727 432 0.7592 0.1675 0.7592 0.8713
No log 39.4545 434 0.7665 0.1244 0.7665 0.8755
No log 39.6364 436 0.7715 0.1244 0.7715 0.8783
No log 39.8182 438 0.7716 0.1244 0.7716 0.8784
No log 40.0 440 0.7743 0.1644 0.7743 0.8799
No log 40.1818 442 0.7964 0.1644 0.7964 0.8924
No log 40.3636 444 0.8262 0.1232 0.8262 0.9089
No log 40.5455 446 0.8377 0.0816 0.8377 0.9152
No log 40.7273 448 0.8459 0.0069 0.8459 0.9197
No log 40.9091 450 0.8231 0.0798 0.8231 0.9072
No log 41.0909 452 0.8029 0.1585 0.8029 0.8961
No log 41.2727 454 0.8089 0.1541 0.8089 0.8994
No log 41.4545 456 0.8057 0.1541 0.8057 0.8976
No log 41.6364 458 0.8046 0.1529 0.8046 0.8970
No log 41.8182 460 0.8027 0.1192 0.8027 0.8959
No log 42.0 462 0.7881 0.1189 0.7881 0.8877
No log 42.1818 464 0.7821 0.1189 0.7821 0.8843
No log 42.3636 466 0.7844 0.1189 0.7844 0.8857
No log 42.5455 468 0.7828 0.1599 0.7828 0.8848
No log 42.7273 470 0.7947 0.1097 0.7947 0.8915
No log 42.9091 472 0.8147 0.1506 0.8147 0.9026
No log 43.0909 474 0.8004 0.1506 0.8004 0.8947
No log 43.2727 476 0.7764 0.1599 0.7764 0.8811
No log 43.4545 478 0.7940 0.0376 0.7940 0.8911
No log 43.6364 480 0.8488 -0.0375 0.8488 0.9213
No log 43.8182 482 0.9103 0.0183 0.9103 0.9541
No log 44.0 484 0.9020 -0.0556 0.9020 0.9497
No log 44.1818 486 0.8645 0.0392 0.8645 0.9298
No log 44.3636 488 0.8436 0.1048 0.8436 0.9185
No log 44.5455 490 0.8428 0.1495 0.8428 0.9180
No log 44.7273 492 0.8222 0.1495 0.8222 0.9068
No log 44.9091 494 0.8028 0.1143 0.8028 0.8960
No log 45.0909 496 0.8081 0.0810 0.8081 0.8989
No log 45.2727 498 0.8176 0.0423 0.8176 0.9042
0.2983 45.4545 500 0.7915 0.0344 0.7915 0.8897
0.2983 45.6364 502 0.7856 0.1196 0.7856 0.8864
0.2983 45.8182 504 0.7968 0.1139 0.7968 0.8926
0.2983 46.0 506 0.8079 0.0725 0.8079 0.8989
0.2983 46.1818 508 0.8120 0.0771 0.8120 0.9011
0.2983 46.3636 510 0.8246 0.0407 0.8246 0.9081
0.2983 46.5455 512 0.8530 0.0518 0.8530 0.9236
0.2983 46.7273 514 0.8303 0.0833 0.8303 0.9112
0.2983 46.9091 516 0.8033 0.0423 0.8033 0.8963
0.2983 47.0909 518 0.7719 0.0798 0.7719 0.8786
0.2983 47.2727 520 0.7491 0.1192 0.7491 0.8655
0.2983 47.4545 522 0.7453 0.1196 0.7453 0.8633
0.2983 47.6364 524 0.7492 0.1627 0.7492 0.8655
0.2983 47.8182 526 0.7525 0.1599 0.7525 0.8674
0.2983 48.0 528 0.7559 0.1585 0.7559 0.8694
0.2983 48.1818 530 0.7868 -0.0076 0.7868 0.8870
0.2983 48.3636 532 0.8245 0.0802 0.8245 0.9080
0.2983 48.5455 534 0.8537 0.0802 0.8537 0.9240
0.2983 48.7273 536 0.8608 0.0761 0.8608 0.9278
0.2983 48.9091 538 0.8578 -0.0073 0.8578 0.9262
0.2983 49.0909 540 0.8691 0.0313 0.8691 0.9322
0.2983 49.2727 542 0.8938 0.0761 0.8938 0.9454
0.2983 49.4545 544 0.9266 0.1126 0.9266 0.9626
0.2983 49.6364 546 0.9709 -0.0423 0.9709 0.9853
0.2983 49.8182 548 0.9692 -0.0111 0.9692 0.9845
0.2983 50.0 550 0.9318 -0.0133 0.9318 0.9653
0.2983 50.1818 552 0.8765 0.0183 0.8765 0.9362
0.2983 50.3636 554 0.8567 0.0500 0.8567 0.9256
0.2983 50.5455 556 0.8086 0.0764 0.8086 0.8992
0.2983 50.7273 558 0.7767 0.1573 0.7767 0.8813
0.2983 50.9091 560 0.7837 0.1585 0.7837 0.8852
0.2983 51.0909 562 0.7944 0.1585 0.7944 0.8913
0.2983 51.2727 564 0.7973 0.1585 0.7973 0.8929
0.2983 51.4545 566 0.7908 0.1585 0.7908 0.8893
0.2983 51.6364 568 0.7911 0.0393 0.7911 0.8895
0.2983 51.8182 570 0.8314 -0.0220 0.8314 0.9118
0.2983 52.0 572 0.8595 0.0589 0.8595 0.9271
0.2983 52.1818 574 0.8887 0.0906 0.8887 0.9427
0.2983 52.3636 576 0.8955 0.0906 0.8955 0.9463
0.2983 52.5455 578 0.8688 0.0129 0.8688 0.9321
0.2983 52.7273 580 0.8510 0.0392 0.8510 0.9225
0.2983 52.9091 582 0.8298 0.1561 0.8298 0.9109
0.2983 53.0909 584 0.8254 0.1095 0.8254 0.9085
0.2983 53.2727 586 0.8196 0.1599 0.8196 0.9053
0.2983 53.4545 588 0.8132 0.2028 0.8132 0.9018
0.2983 53.6364 590 0.8069 0.0764 0.8069 0.8983
0.2983 53.8182 592 0.8133 0.0024 0.8133 0.9019
0.2983 54.0 594 0.8062 0.0 0.8062 0.8979
0.2983 54.1818 596 0.7829 0.0749 0.7829 0.8848
0.2983 54.3636 598 0.7742 0.1192 0.7742 0.8799
0.2983 54.5455 600 0.7829 0.0359 0.7829 0.8848
0.2983 54.7273 602 0.7866 0.0359 0.7866 0.8869
0.2983 54.9091 604 0.7862 0.0791 0.7862 0.8867
0.2983 55.0909 606 0.7793 0.1196 0.7793 0.8828
0.2983 55.2727 608 0.7821 0.1659 0.7821 0.8843
0.2983 55.4545 610 0.7969 0.1232 0.7969 0.8927
0.2983 55.6364 612 0.8116 0.0771 0.8116 0.9009
0.2983 55.8182 614 0.8267 0.0771 0.8267 0.9092
0.2983 56.0 616 0.8355 0.0771 0.8355 0.9140
0.2983 56.1818 618 0.8283 0.0771 0.8283 0.9101
0.2983 56.3636 620 0.8169 0.0771 0.8169 0.9038
0.2983 56.5455 622 0.8105 0.0822 0.8105 0.9003
0.2983 56.7273 624 0.7996 0.0327 0.7996 0.8942
0.2983 56.9091 626 0.7937 0.0741 0.7937 0.8909
0.2983 57.0909 628 0.7925 0.0741 0.7925 0.8902
0.2983 57.2727 630 0.7905 0.0741 0.7905 0.8891
0.2983 57.4545 632 0.7914 0.0757 0.7914 0.8896
0.2983 57.6364 634 0.7977 0.0359 0.7977 0.8931
0.2983 57.8182 636 0.8090 0.0423 0.8090 0.8994
0.2983 58.0 638 0.8157 0.0047 0.8157 0.9032
0.2983 58.1818 640 0.8118 0.0071 0.8118 0.9010
0.2983 58.3636 642 0.8074 0.0097 0.8074 0.8985
0.2983 58.5455 644 0.7744 0.0810 0.7744 0.8800
0.2983 58.7273 646 0.7493 0.1249 0.7493 0.8656
0.2983 58.9091 648 0.7302 0.1254 0.7302 0.8545
0.2983 59.0909 650 0.7236 0.1675 0.7236 0.8506
0.2983 59.2727 652 0.7292 0.1254 0.7292 0.8539
0.2983 59.4545 654 0.7421 0.1675 0.7421 0.8615
0.2983 59.6364 656 0.7619 0.1675 0.7619 0.8729
0.2983 59.8182 658 0.7926 -0.0025 0.7926 0.8903
0.2983 60.0 660 0.8729 -0.0167 0.8729 0.9343
0.2983 60.1818 662 0.9356 0.0262 0.9356 0.9673
0.2983 60.3636 664 0.9863 0.0344 0.9863 0.9931
0.2983 60.5455 666 0.9881 0.0344 0.9881 0.9941
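
A plausible compute_metrics implementation that would produce the Qwk, Mse, and Rmse columns above is sketched below. Rounding predictions to integer scores for QWK is an assumption; the card does not document how the metrics were computed.

```python
# Hypothetical compute_metrics for the Trainer (assumes a single regression output).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    preds = predictions.squeeze()                 # one score per example
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        labels.astype(int),
        np.rint(preds).astype(int),               # assumption: round to integer scores
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": np.sqrt(mse)}
```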

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1