ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k2_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.4352
  • QWK: 0.6132
  • MSE: 0.4352
  • RMSE: 0.6597
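
The following is a minimal loading sketch, assuming the checkpoint exposes a single-logit regression head (which the MSE/RMSE metrics above suggest); the input sentence is a placeholder, not evaluation data.

```python
# Minimal inference sketch. Assumes a single-logit regression head
# (suggested by the MSE/RMSE metrics); the input sentence is a placeholder.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k2_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("نص عربي للتقييم", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.3f}")
```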

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
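
For reference, the same settings can be expressed as Hugging Face TrainingArguments. The sketch below only mirrors the values listed above; the output directory is a placeholder, and the Trainer/dataset wiring is omitted because the training data is not documented.

```python
# Sketch of the hyperparameters above as TrainingArguments; output_dir is a
# placeholder and the dataset/Trainer wiring is intentionally omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # The default optimizer (AdamW with betas=(0.9, 0.999), eps=1e-8)
    # matches the Adam settings listed above.
)
```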

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.25 2 2.4023 0.0052 2.4023 1.5499
No log 0.5 4 1.1655 0.1259 1.1655 1.0796
No log 0.75 6 0.7541 0.1372 0.7541 0.8684
No log 1.0 8 0.7285 0.2319 0.7285 0.8535
No log 1.25 10 0.6985 0.2783 0.6985 0.8357
No log 1.5 12 0.8415 0.2375 0.8415 0.9173
No log 1.75 14 0.6713 0.3029 0.6713 0.8194
No log 2.0 16 0.6546 0.3690 0.6546 0.8091
No log 2.25 18 0.7346 0.4074 0.7346 0.8571
No log 2.5 20 0.6949 0.4172 0.6949 0.8336
No log 2.75 22 0.5933 0.4569 0.5933 0.7702
No log 3.0 24 0.5082 0.5046 0.5082 0.7129
No log 3.25 26 0.4662 0.4656 0.4662 0.6828
No log 3.5 28 0.4559 0.5488 0.4559 0.6752
No log 3.75 30 0.4866 0.6052 0.4866 0.6976
No log 4.0 32 0.5134 0.5950 0.5134 0.7165
No log 4.25 34 0.4695 0.4471 0.4695 0.6852
No log 4.5 36 0.6077 0.5112 0.6077 0.7795
No log 4.75 38 0.4910 0.5373 0.4910 0.7007
No log 5.0 40 0.4398 0.5939 0.4398 0.6632
No log 5.25 42 0.4167 0.6517 0.4167 0.6456
No log 5.5 44 0.4541 0.5779 0.4541 0.6738
No log 5.75 46 0.5788 0.5587 0.5788 0.7608
No log 6.0 48 0.8070 0.4464 0.8070 0.8983
No log 6.25 50 0.7189 0.4633 0.7189 0.8479
No log 6.5 52 0.4613 0.6017 0.4613 0.6792
No log 6.75 54 0.3903 0.6929 0.3903 0.6247
No log 7.0 56 0.4025 0.6598 0.4025 0.6345
No log 7.25 58 0.4032 0.6339 0.4032 0.6349
No log 7.5 60 0.4184 0.6946 0.4184 0.6468
No log 7.75 62 0.4942 0.6047 0.4942 0.7030
No log 8.0 64 0.4830 0.6114 0.4830 0.6950
No log 8.25 66 0.4558 0.6010 0.4558 0.6751
No log 8.5 68 0.4602 0.6068 0.4602 0.6784
No log 8.75 70 0.4621 0.5985 0.4621 0.6798
No log 9.0 72 0.4760 0.6514 0.4760 0.6899
No log 9.25 74 0.4554 0.6828 0.4554 0.6748
No log 9.5 76 0.4965 0.5514 0.4965 0.7046
No log 9.75 78 0.4453 0.6241 0.4453 0.6673
No log 10.0 80 0.4359 0.7012 0.4359 0.6602
No log 10.25 82 0.4308 0.7012 0.4308 0.6564
No log 10.5 84 0.4461 0.6252 0.4461 0.6679
No log 10.75 86 0.4604 0.5723 0.4604 0.6786
No log 11.0 88 0.4782 0.5723 0.4782 0.6916
No log 11.25 90 0.4452 0.6542 0.4452 0.6672
No log 11.5 92 0.4517 0.5549 0.4517 0.6721
No log 11.75 94 0.4680 0.5723 0.4680 0.6841
No log 12.0 96 0.4415 0.6078 0.4415 0.6645
No log 12.25 98 0.4280 0.7041 0.4280 0.6542
No log 12.5 100 0.4186 0.6847 0.4186 0.6470
No log 12.75 102 0.4381 0.6307 0.4381 0.6619
No log 13.0 104 0.4293 0.6484 0.4293 0.6552
No log 13.25 106 0.4267 0.6928 0.4267 0.6532
No log 13.5 108 0.4959 0.6206 0.4959 0.7042
No log 13.75 110 0.4875 0.6379 0.4875 0.6982
No log 14.0 112 0.4718 0.6547 0.4718 0.6869
No log 14.25 114 0.4790 0.6547 0.4790 0.6921
No log 14.5 116 0.4202 0.6900 0.4202 0.6483
No log 14.75 118 0.4143 0.6570 0.4143 0.6437
No log 15.0 120 0.4101 0.6639 0.4101 0.6404
No log 15.25 122 0.4147 0.6772 0.4147 0.6440
No log 15.5 124 0.3999 0.7032 0.3999 0.6324
No log 15.75 126 0.4301 0.6349 0.4301 0.6558
No log 16.0 128 0.5999 0.4617 0.5999 0.7745
No log 16.25 130 0.5035 0.6658 0.5035 0.7096
No log 16.5 132 0.4140 0.6890 0.4140 0.6435
No log 16.75 134 0.4616 0.6259 0.4616 0.6794
No log 17.0 136 0.4258 0.6552 0.4258 0.6525
No log 17.25 138 0.4281 0.5929 0.4281 0.6543
No log 17.5 140 0.4265 0.5875 0.4265 0.6531
No log 17.75 142 0.4325 0.5999 0.4325 0.6576
No log 18.0 144 0.4263 0.6068 0.4263 0.6529
No log 18.25 146 0.4761 0.6091 0.4761 0.6900
No log 18.5 148 0.5250 0.6038 0.5250 0.7246
No log 18.75 150 0.4500 0.6162 0.4500 0.6708
No log 19.0 152 0.4550 0.5657 0.4550 0.6746
No log 19.25 154 0.4499 0.5841 0.4499 0.6707
No log 19.5 156 0.4387 0.6530 0.4387 0.6623
No log 19.75 158 0.4610 0.6309 0.4610 0.6790
No log 20.0 160 0.4242 0.6448 0.4242 0.6513
No log 20.25 162 0.4733 0.5384 0.4733 0.6879
No log 20.5 164 0.5117 0.4969 0.5117 0.7154
No log 20.75 166 0.4441 0.6047 0.4441 0.6664
No log 21.0 168 0.4198 0.6344 0.4198 0.6479
No log 21.25 170 0.4222 0.6389 0.4222 0.6498
No log 21.5 172 0.4226 0.6125 0.4226 0.6501
No log 21.75 174 0.4213 0.6125 0.4213 0.6490
No log 22.0 176 0.4237 0.6125 0.4237 0.6509
No log 22.25 178 0.4467 0.5999 0.4467 0.6683
No log 22.5 180 0.4398 0.5985 0.4398 0.6632
No log 22.75 182 0.4332 0.5765 0.4332 0.6582
No log 23.0 184 0.4306 0.5831 0.4306 0.6562
No log 23.25 186 0.4306 0.5831 0.4306 0.6562
No log 23.5 188 0.4300 0.6448 0.4300 0.6558
No log 23.75 190 0.4559 0.5765 0.4559 0.6752
No log 24.0 192 0.4698 0.5779 0.4698 0.6855
No log 24.25 194 0.4195 0.6567 0.4195 0.6477
No log 24.5 196 0.4269 0.6919 0.4269 0.6534
No log 24.75 198 0.4420 0.6381 0.4420 0.6648
No log 25.0 200 0.4068 0.6747 0.4068 0.6378
No log 25.25 202 0.4562 0.6457 0.4562 0.6754
No log 25.5 204 0.5250 0.6042 0.5250 0.7245
No log 25.75 206 0.4513 0.6445 0.4513 0.6718
No log 26.0 208 0.4405 0.7025 0.4405 0.6637
No log 26.25 210 0.5498 0.6275 0.5498 0.7415
No log 26.5 212 0.5134 0.6200 0.5134 0.7165
No log 26.75 214 0.4229 0.6388 0.4229 0.6503
No log 27.0 216 0.4346 0.6419 0.4346 0.6592
No log 27.25 218 0.4205 0.6101 0.4205 0.6485
No log 27.5 220 0.4118 0.6357 0.4118 0.6417
No log 27.75 222 0.4378 0.6518 0.4378 0.6617
No log 28.0 224 0.4250 0.6431 0.4250 0.6519
No log 28.25 226 0.4136 0.5890 0.4136 0.6431
No log 28.5 228 0.4245 0.6305 0.4245 0.6515
No log 28.75 230 0.4131 0.6448 0.4131 0.6427
No log 29.0 232 0.4334 0.6809 0.4334 0.6583
No log 29.25 234 0.4306 0.6503 0.4306 0.6562
No log 29.5 236 0.4177 0.6448 0.4177 0.6463
No log 29.75 238 0.4224 0.6448 0.4224 0.6499
No log 30.0 240 0.4304 0.6068 0.4304 0.6560
No log 30.25 242 0.4300 0.5555 0.4300 0.6558
No log 30.5 244 0.4118 0.6154 0.4118 0.6417
No log 30.75 246 0.4114 0.6832 0.4114 0.6414
No log 31.0 248 0.4104 0.7055 0.4104 0.6406
No log 31.25 250 0.4232 0.6154 0.4232 0.6505
No log 31.5 252 0.4238 0.6370 0.4238 0.6510
No log 31.75 254 0.4069 0.6843 0.4069 0.6379
No log 32.0 256 0.4122 0.6611 0.4122 0.6421
No log 32.25 258 0.4061 0.6344 0.4061 0.6373
No log 32.5 260 0.4093 0.6344 0.4093 0.6398
No log 32.75 262 0.4081 0.6448 0.4081 0.6389
No log 33.0 264 0.4130 0.6542 0.4130 0.6427
No log 33.25 266 0.4134 0.6542 0.4134 0.6429
No log 33.5 268 0.4097 0.6448 0.4097 0.6401
No log 33.75 270 0.4173 0.6243 0.4173 0.6460
No log 34.0 272 0.4377 0.5208 0.4377 0.6616
No log 34.25 274 0.4245 0.6032 0.4245 0.6516
No log 34.5 276 0.4351 0.5899 0.4351 0.6597
No log 34.75 278 0.4475 0.6032 0.4475 0.6690
No log 35.0 280 0.4400 0.6241 0.4400 0.6633
No log 35.25 282 0.4339 0.6553 0.4339 0.6587
No log 35.5 284 0.4344 0.6289 0.4344 0.6591
No log 35.75 286 0.4354 0.6105 0.4354 0.6599
No log 36.0 288 0.4335 0.6154 0.4335 0.6584
No log 36.25 290 0.4489 0.6446 0.4489 0.6700
No log 36.5 292 0.4909 0.5944 0.4909 0.7006
No log 36.75 294 0.4658 0.5859 0.4658 0.6825
No log 37.0 296 0.4247 0.6357 0.4247 0.6517
No log 37.25 298 0.4252 0.6770 0.4252 0.6521
No log 37.5 300 0.4181 0.6357 0.4181 0.6466
No log 37.75 302 0.4199 0.6344 0.4199 0.6480
No log 38.0 304 0.4211 0.6435 0.4211 0.6490
No log 38.25 306 0.4346 0.6293 0.4346 0.6592
No log 38.5 308 0.4235 0.6215 0.4235 0.6508
No log 38.75 310 0.4264 0.5986 0.4264 0.6530
No log 39.0 312 0.4368 0.5836 0.4368 0.6609
No log 39.25 314 0.4633 0.5715 0.4633 0.6807
No log 39.5 316 0.4949 0.5802 0.4949 0.7035
No log 39.75 318 0.4754 0.5955 0.4754 0.6895
No log 40.0 320 0.4618 0.5693 0.4618 0.6796
No log 40.25 322 0.4533 0.5693 0.4533 0.6733
No log 40.5 324 0.4483 0.5693 0.4483 0.6696
No log 40.75 326 0.4428 0.5587 0.4428 0.6654
No log 41.0 328 0.4310 0.5836 0.4310 0.6565
No log 41.25 330 0.4209 0.5749 0.4209 0.6488
No log 41.5 332 0.4207 0.5749 0.4207 0.6486
No log 41.75 334 0.4378 0.6602 0.4378 0.6617
No log 42.0 336 0.4589 0.5939 0.4589 0.6774
No log 42.25 338 0.4726 0.5939 0.4726 0.6875
No log 42.5 340 0.4620 0.5939 0.4620 0.6797
No log 42.75 342 0.4261 0.6111 0.4261 0.6528
No log 43.0 344 0.4149 0.6542 0.4149 0.6442
No log 43.25 346 0.4128 0.6542 0.4128 0.6425
No log 43.5 348 0.4161 0.6830 0.4161 0.6451
No log 43.75 350 0.4240 0.6818 0.4240 0.6511
No log 44.0 352 0.4307 0.6627 0.4307 0.6563
No log 44.25 354 0.4357 0.6627 0.4357 0.6600
No log 44.5 356 0.4334 0.6542 0.4334 0.6583
No log 44.75 358 0.4296 0.6344 0.4296 0.6554
No log 45.0 360 0.4345 0.6096 0.4345 0.6592
No log 45.25 362 0.4430 0.6395 0.4430 0.6656
No log 45.5 364 0.4426 0.6395 0.4426 0.6653
No log 45.75 366 0.4290 0.6344 0.4290 0.6549
No log 46.0 368 0.4280 0.6243 0.4280 0.6542
No log 46.25 370 0.4249 0.6448 0.4249 0.6518
No log 46.5 372 0.4231 0.6448 0.4231 0.6505
No log 46.75 374 0.4246 0.6154 0.4246 0.6516
No log 47.0 376 0.4255 0.6489 0.4255 0.6523
No log 47.25 378 0.4195 0.6448 0.4195 0.6477
No log 47.5 380 0.4281 0.5883 0.4281 0.6543
No log 47.75 382 0.4340 0.5970 0.4340 0.6588
No log 48.0 384 0.4241 0.6001 0.4241 0.6512
No log 48.25 386 0.4213 0.6448 0.4213 0.6491
No log 48.5 388 0.4204 0.6229 0.4204 0.6484
No log 48.75 390 0.4198 0.6229 0.4198 0.6479
No log 49.0 392 0.4251 0.6370 0.4251 0.6520
No log 49.25 394 0.4347 0.6127 0.4347 0.6593
No log 49.5 396 0.4502 0.6260 0.4502 0.6710
No log 49.75 398 0.4312 0.6370 0.4312 0.6567
No log 50.0 400 0.4326 0.6240 0.4326 0.6577
No log 50.25 402 0.4485 0.5947 0.4485 0.6697
No log 50.5 404 0.4378 0.5631 0.4378 0.6616
No log 50.75 406 0.4292 0.5550 0.4292 0.6552
No log 51.0 408 0.4166 0.6111 0.4166 0.6454
No log 51.25 410 0.4168 0.6111 0.4168 0.6456
No log 51.5 412 0.4217 0.5550 0.4217 0.6494
No log 51.75 414 0.4245 0.6241 0.4245 0.6515
No log 52.0 416 0.4365 0.6518 0.4365 0.6607
No log 52.25 418 0.4304 0.6252 0.4304 0.6561
No log 52.5 420 0.4290 0.6678 0.4290 0.6550
No log 52.75 422 0.4385 0.6601 0.4385 0.6622
No log 53.0 424 0.4374 0.6601 0.4374 0.6614
No log 53.25 426 0.4276 0.6370 0.4276 0.6539
No log 53.5 428 0.4336 0.5883 0.4336 0.6585
No log 53.75 430 0.4437 0.5631 0.4437 0.6661
No log 54.0 432 0.4411 0.5897 0.4411 0.6642
No log 54.25 434 0.4297 0.5883 0.4297 0.6555
No log 54.5 436 0.4288 0.6282 0.4288 0.6548
No log 54.75 438 0.4365 0.6601 0.4365 0.6607
No log 55.0 440 0.4310 0.6601 0.4310 0.6565
No log 55.25 442 0.4246 0.6542 0.4246 0.6516
No log 55.5 444 0.4390 0.6321 0.4390 0.6625
No log 55.75 446 0.4444 0.6321 0.4444 0.6666
No log 56.0 448 0.4368 0.6313 0.4368 0.6609
No log 56.25 450 0.4242 0.6632 0.4242 0.6513
No log 56.5 452 0.4239 0.6242 0.4239 0.6511
No log 56.75 454 0.4261 0.5831 0.4261 0.6528
No log 57.0 456 0.4248 0.6214 0.4248 0.6518
No log 57.25 458 0.4347 0.6401 0.4347 0.6593
No log 57.5 460 0.4594 0.6388 0.4594 0.6778
No log 57.75 462 0.4710 0.6399 0.4710 0.6863
No log 58.0 464 0.4591 0.6388 0.4591 0.6775
No log 58.25 466 0.4352 0.6401 0.4352 0.6597
No log 58.5 468 0.4237 0.6111 0.4237 0.6510
No log 58.75 470 0.4304 0.6101 0.4304 0.6561
No log 59.0 472 0.4409 0.6330 0.4409 0.6640
No log 59.25 474 0.4321 0.5781 0.4321 0.6574
No log 59.5 476 0.4213 0.6024 0.4213 0.6491
No log 59.75 478 0.4270 0.5550 0.4270 0.6535
No log 60.0 480 0.4382 0.6492 0.4382 0.6620
No log 60.25 482 0.4518 0.6388 0.4518 0.6722
No log 60.5 484 0.4583 0.6173 0.4583 0.6770
No log 60.75 486 0.4425 0.5980 0.4425 0.6652
No log 61.0 488 0.4300 0.6303 0.4300 0.6558
No log 61.25 490 0.4195 0.6330 0.4195 0.6477
No log 61.5 492 0.4219 0.6068 0.4219 0.6495
No log 61.75 494 0.4225 0.6068 0.4225 0.6500
No log 62.0 496 0.4186 0.6377 0.4186 0.6470
No log 62.25 498 0.4158 0.6830 0.4158 0.6449
0.1924 62.5 500 0.4253 0.6517 0.4253 0.6521
0.1924 62.75 502 0.4523 0.6388 0.4523 0.6725
0.1924 63.0 504 0.4734 0.5802 0.4734 0.6880
0.1924 63.25 506 0.4719 0.5802 0.4719 0.6870
0.1924 63.5 508 0.4543 0.5980 0.4543 0.6740
0.1924 63.75 510 0.4352 0.6132 0.4352 0.6597
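
For reference, QWK is the quadratic weighted kappa and RMSE the root mean squared error; values like those in the table are typically computed as in the sketch below. The labels and predictions shown are hypothetical, and predictions are rounded to integers for the kappa.

```python
# Illustrative metric computation only; these labels/predictions are
# hypothetical and not taken from this model's evaluation set.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3, 1])               # hypothetical gold scores
y_pred = np.array([0.2, 1.1, 1.8, 2.4, 2.6, 1.0])   # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```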

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1