# ArabicNewSplits8_FineTuningAraBERT_noAug_task7_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set (see the metric sketch after this list):
- Loss: 0.6075
- QWK: 0.5101
- MSE: 0.6075
- RMSE: 0.7794
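Loss and MSE are identical here, which suggests the checkpoint was trained as a single-output regression head with an MSE objective; QWK (quadratically weighted Cohen's kappa) is then computed after discretizing the continuous predictions. A minimal sketch of how these metrics can be reproduced, assuming scikit-learn and purely illustrative `y_true`/`y_pred` arrays (not taken from this card's evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and continuous model outputs; values are
# illustrative only.
y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0.2, 1.1, 1.8, 2.4, 2.6, 0.9])

mse = mean_squared_error(y_true, y_pred)   # the "MSE" column
rmse = float(np.sqrt(mse))                 # the "RMSE" column

# QWK is defined over discrete labels, so round and clip the regression
# outputs to the label range before scoring.
y_pred_discrete = np.clip(np.rint(y_pred), y_true.min(), y_true.max()).astype(int)
qwk = cohen_kappa_score(y_true, y_pred_discrete, weights="quadratic")

print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```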
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the equivalent `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
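For reference, a minimal sketch of equivalent Hugging Face `TrainingArguments` mirroring the list above; `output_dir` and every setting not listed in this card are placeholders or assumptions:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed in this card; anything else
# (output_dir, evaluation/saving strategy, ...) is a placeholder.
training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # placeholder name, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```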
### Training results

"No log" in the Training Loss column indicates that the training loss was never recorded during this run, most likely because the logging interval was larger than the run's 300 total optimizer steps.
Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE |
---|---|---|---|---|---|---|
No log | 0.6667 | 2 | 2.1474 | 0.0339 | 2.1474 | 1.4654 |
No log | 1.3333 | 4 | 1.3213 | 0.0723 | 1.3213 | 1.1495 |
No log | 2.0 | 6 | 0.6446 | 0.1622 | 0.6446 | 0.8028 |
No log | 2.6667 | 8 | 0.8130 | 0.2940 | 0.8130 | 0.9017 |
No log | 3.3333 | 10 | 1.0316 | 0.2027 | 1.0316 | 1.0157 |
No log | 4.0 | 12 | 0.8163 | 0.3321 | 0.8163 | 0.9035 |
No log | 4.6667 | 14 | 0.5551 | 0.5315 | 0.5551 | 0.7451 |
No log | 5.3333 | 16 | 0.5100 | 0.5123 | 0.5100 | 0.7141 |
No log | 6.0 | 18 | 0.5523 | 0.5025 | 0.5523 | 0.7432 |
No log | 6.6667 | 20 | 0.5506 | 0.4589 | 0.5506 | 0.7420 |
No log | 7.3333 | 22 | 0.5711 | 0.4759 | 0.5711 | 0.7557 |
No log | 8.0 | 24 | 0.5892 | 0.4492 | 0.5892 | 0.7676 |
No log | 8.6667 | 26 | 0.5732 | 0.4292 | 0.5732 | 0.7571 |
No log | 9.3333 | 28 | 0.6004 | 0.4185 | 0.6004 | 0.7749 |
No log | 10.0 | 30 | 0.6516 | 0.3661 | 0.6516 | 0.8072 |
No log | 10.6667 | 32 | 0.6818 | 0.3765 | 0.6818 | 0.8257 |
No log | 11.3333 | 34 | 0.6758 | 0.3988 | 0.6758 | 0.8221 |
No log | 12.0 | 36 | 0.6504 | 0.3905 | 0.6504 | 0.8065 |
No log | 12.6667 | 38 | 0.5567 | 0.4414 | 0.5567 | 0.7462 |
No log | 13.3333 | 40 | 0.5557 | 0.4669 | 0.5557 | 0.7454 |
No log | 14.0 | 42 | 0.5239 | 0.5430 | 0.5239 | 0.7238 |
No log | 14.6667 | 44 | 0.7165 | 0.4218 | 0.7165 | 0.8465 |
No log | 15.3333 | 46 | 0.6867 | 0.4343 | 0.6867 | 0.8287 |
No log | 16.0 | 48 | 0.5641 | 0.4725 | 0.5641 | 0.7511 |
No log | 16.6667 | 50 | 0.5551 | 0.5205 | 0.5551 | 0.7450 |
No log | 17.3333 | 52 | 0.5751 | 0.5035 | 0.5751 | 0.7583 |
No log | 18.0 | 54 | 0.6803 | 0.4549 | 0.6803 | 0.8248 |
No log | 18.6667 | 56 | 0.6782 | 0.5051 | 0.6782 | 0.8235 |
No log | 19.3333 | 58 | 0.5668 | 0.4929 | 0.5668 | 0.7528 |
No log | 20.0 | 60 | 0.5338 | 0.4315 | 0.5338 | 0.7306 |
No log | 20.6667 | 62 | 0.5276 | 0.4640 | 0.5276 | 0.7264 |
No log | 21.3333 | 64 | 0.5527 | 0.4722 | 0.5527 | 0.7435 |
No log | 22.0 | 66 | 0.6596 | 0.5376 | 0.6596 | 0.8122 |
No log | 22.6667 | 68 | 0.7277 | 0.4822 | 0.7277 | 0.8531 |
No log | 23.3333 | 70 | 0.5849 | 0.5272 | 0.5849 | 0.7648 |
No log | 24.0 | 72 | 0.5113 | 0.5053 | 0.5113 | 0.7151 |
No log | 24.6667 | 74 | 0.5041 | 0.5223 | 0.5041 | 0.7100 |
No log | 25.3333 | 76 | 0.5902 | 0.4664 | 0.5902 | 0.7682 |
No log | 26.0 | 78 | 0.7216 | 0.5015 | 0.7216 | 0.8495 |
No log | 26.6667 | 80 | 0.8025 | 0.4086 | 0.8025 | 0.8958 |
No log | 27.3333 | 82 | 0.7171 | 0.4218 | 0.7171 | 0.8468 |
No log | 28.0 | 84 | 0.5938 | 0.5055 | 0.5938 | 0.7706 |
No log | 28.6667 | 86 | 0.5747 | 0.4748 | 0.5747 | 0.7581 |
No log | 29.3333 | 88 | 0.6186 | 0.4123 | 0.6186 | 0.7865 |
No log | 30.0 | 90 | 0.7446 | 0.4676 | 0.7446 | 0.8629 |
No log | 30.6667 | 92 | 0.8161 | 0.4241 | 0.8161 | 0.9034 |
No log | 31.3333 | 94 | 0.8996 | 0.3973 | 0.8996 | 0.9485 |
No log | 32.0 | 96 | 0.7428 | 0.4369 | 0.7428 | 0.8619 |
No log | 32.6667 | 98 | 0.6311 | 0.4400 | 0.6311 | 0.7944 |
No log | 33.3333 | 100 | 0.6400 | 0.4848 | 0.6400 | 0.8000 |
No log | 34.0 | 102 | 0.7560 | 0.3974 | 0.7560 | 0.8695 |
No log | 34.6667 | 104 | 0.9413 | 0.3497 | 0.9413 | 0.9702 |
No log | 35.3333 | 106 | 0.9029 | 0.4086 | 0.9029 | 0.9502 |
No log | 36.0 | 108 | 0.7306 | 0.4360 | 0.7306 | 0.8548 |
No log | 36.6667 | 110 | 0.6129 | 0.4282 | 0.6129 | 0.7829 |
No log | 37.3333 | 112 | 0.6088 | 0.4249 | 0.6088 | 0.7803 |
No log | 38.0 | 114 | 0.7047 | 0.4360 | 0.7047 | 0.8394 |
No log | 38.6667 | 116 | 0.8933 | 0.3611 | 0.8933 | 0.9451 |
No log | 39.3333 | 118 | 0.9288 | 0.3568 | 0.9288 | 0.9638 |
No log | 40.0 | 120 | 0.8323 | 0.3761 | 0.8323 | 0.9123 |
No log | 40.6667 | 122 | 0.6955 | 0.4009 | 0.6955 | 0.8340 |
No log | 41.3333 | 124 | 0.6258 | 0.4705 | 0.6258 | 0.7911 |
No log | 42.0 | 126 | 0.5640 | 0.4905 | 0.5640 | 0.7510 |
No log | 42.6667 | 128 | 0.5791 | 0.4281 | 0.5791 | 0.7610 |
No log | 43.3333 | 130 | 0.6355 | 0.4606 | 0.6355 | 0.7972 |
No log | 44.0 | 132 | 0.7685 | 0.4066 | 0.7685 | 0.8766 |
No log | 44.6667 | 134 | 0.9025 | 0.2998 | 0.9025 | 0.9500 |
No log | 45.3333 | 136 | 0.9050 | 0.2861 | 0.9050 | 0.9513 |
No log | 46.0 | 138 | 0.8015 | 0.3321 | 0.8015 | 0.8953 |
No log | 46.6667 | 140 | 0.7314 | 0.3671 | 0.7314 | 0.8552 |
No log | 47.3333 | 142 | 0.7086 | 0.4331 | 0.7086 | 0.8418 |
No log | 48.0 | 144 | 0.6768 | 0.4747 | 0.6768 | 0.8227 |
No log | 48.6667 | 146 | 0.6122 | 0.4409 | 0.6122 | 0.7824 |
No log | 49.3333 | 148 | 0.5908 | 0.4241 | 0.5908 | 0.7686 |
No log | 50.0 | 150 | 0.6497 | 0.4589 | 0.6497 | 0.8060 |
No log | 50.6667 | 152 | 0.7459 | 0.3980 | 0.7459 | 0.8636 |
No log | 51.3333 | 154 | 0.8399 | 0.3844 | 0.8399 | 0.9165 |
No log | 52.0 | 156 | 0.9210 | 0.3730 | 0.9210 | 0.9597 |
No log | 52.6667 | 158 | 0.8614 | 0.4118 | 0.8614 | 0.9281 |
No log | 53.3333 | 160 | 0.7090 | 0.4469 | 0.7090 | 0.8420 |
No log | 54.0 | 162 | 0.6224 | 0.4262 | 0.6224 | 0.7889 |
No log | 54.6667 | 164 | 0.5817 | 0.4749 | 0.5817 | 0.7627 |
No log | 55.3333 | 166 | 0.5629 | 0.4963 | 0.5629 | 0.7503 |
No log | 56.0 | 168 | 0.5647 | 0.5250 | 0.5647 | 0.7515 |
No log | 56.6667 | 170 | 0.5722 | 0.5196 | 0.5722 | 0.7564 |
No log | 57.3333 | 172 | 0.6110 | 0.4436 | 0.6110 | 0.7816 |
No log | 58.0 | 174 | 0.6854 | 0.4598 | 0.6854 | 0.8279 |
No log | 58.6667 | 176 | 0.7666 | 0.3496 | 0.7666 | 0.8756 |
No log | 59.3333 | 178 | 0.8282 | 0.3293 | 0.8282 | 0.9100 |
No log | 60.0 | 180 | 0.8424 | 0.3426 | 0.8424 | 0.9178 |
No log | 60.6667 | 182 | 0.8670 | 0.3426 | 0.8670 | 0.9311 |
No log | 61.3333 | 184 | 0.8375 | 0.3426 | 0.8375 | 0.9152 |
No log | 62.0 | 186 | 0.7924 | 0.3415 | 0.7924 | 0.8902 |
No log | 62.6667 | 188 | 0.7521 | 0.3935 | 0.7521 | 0.8673 |
No log | 63.3333 | 190 | 0.7321 | 0.3980 | 0.7321 | 0.8556 |
No log | 64.0 | 192 | 0.6943 | 0.4598 | 0.6943 | 0.8333 |
No log | 64.6667 | 194 | 0.6372 | 0.4705 | 0.6372 | 0.7982 |
No log | 65.3333 | 196 | 0.5764 | 0.4072 | 0.5764 | 0.7592 |
No log | 66.0 | 198 | 0.5649 | 0.4623 | 0.5649 | 0.7516 |
No log | 66.6667 | 200 | 0.5604 | 0.4799 | 0.5604 | 0.7486 |
No log | 67.3333 | 202 | 0.5796 | 0.4260 | 0.5796 | 0.7613 |
No log | 68.0 | 204 | 0.6050 | 0.4357 | 0.6050 | 0.7778 |
No log | 68.6667 | 206 | 0.6350 | 0.4367 | 0.6350 | 0.7969 |
No log | 69.3333 | 208 | 0.6583 | 0.4367 | 0.6583 | 0.8113 |
No log | 70.0 | 210 | 0.6856 | 0.4520 | 0.6856 | 0.8280 |
No log | 70.6667 | 212 | 0.7125 | 0.4198 | 0.7125 | 0.8441 |
No log | 71.3333 | 214 | 0.7093 | 0.4363 | 0.7093 | 0.8422 |
No log | 72.0 | 216 | 0.6728 | 0.4198 | 0.6728 | 0.8202 |
No log | 72.6667 | 218 | 0.6735 | 0.4198 | 0.6735 | 0.8206 |
No log | 73.3333 | 220 | 0.6626 | 0.4367 | 0.6626 | 0.8140 |
No log | 74.0 | 222 | 0.6762 | 0.4198 | 0.6762 | 0.8223 |
No log | 74.6667 | 224 | 0.6924 | 0.4198 | 0.6924 | 0.8321 |
No log | 75.3333 | 226 | 0.6963 | 0.4198 | 0.6963 | 0.8344 |
No log | 76.0 | 228 | 0.6720 | 0.4198 | 0.6720 | 0.8198 |
No log | 76.6667 | 230 | 0.6603 | 0.4367 | 0.6603 | 0.8126 |
No log | 77.3333 | 232 | 0.6394 | 0.4533 | 0.6394 | 0.7996 |
No log | 78.0 | 234 | 0.6330 | 0.4533 | 0.6330 | 0.7956 |
No log | 78.6667 | 236 | 0.6186 | 0.4533 | 0.6186 | 0.7865 |
No log | 79.3333 | 238 | 0.6221 | 0.4479 | 0.6221 | 0.7887 |
No log | 80.0 | 240 | 0.6380 | 0.4222 | 0.6380 | 0.7988 |
No log | 80.6667 | 242 | 0.6631 | 0.4896 | 0.6631 | 0.8143 |
No log | 81.3333 | 244 | 0.6821 | 0.4896 | 0.6821 | 0.8259 |
No log | 82.0 | 246 | 0.6991 | 0.4896 | 0.6991 | 0.8361 |
No log | 82.6667 | 248 | 0.7052 | 0.4896 | 0.7052 | 0.8398 |
No log | 83.3333 | 250 | 0.6773 | 0.4896 | 0.6773 | 0.8230 |
No log | 84.0 | 252 | 0.6617 | 0.4748 | 0.6617 | 0.8134 |
No log | 84.6667 | 254 | 0.6431 | 0.4687 | 0.6431 | 0.8019 |
No log | 85.3333 | 256 | 0.6186 | 0.4543 | 0.6186 | 0.7865 |
No log | 86.0 | 258 | 0.5900 | 0.4688 | 0.5900 | 0.7681 |
No log | 86.6667 | 260 | 0.5775 | 0.4629 | 0.5775 | 0.7599 |
No log | 87.3333 | 262 | 0.5773 | 0.4678 | 0.5773 | 0.7598 |
No log | 88.0 | 264 | 0.5884 | 0.4688 | 0.5884 | 0.7671 |
No log | 88.6667 | 266 | 0.5977 | 0.5043 | 0.5977 | 0.7731 |
No log | 89.3333 | 268 | 0.6007 | 0.5101 | 0.6007 | 0.7751 |
No log | 90.0 | 270 | 0.6084 | 0.4836 | 0.6084 | 0.7800 |
No log | 90.6667 | 272 | 0.6202 | 0.4415 | 0.6202 | 0.7875 |
No log | 91.3333 | 274 | 0.6380 | 0.4369 | 0.6380 | 0.7987 |
No log | 92.0 | 276 | 0.6614 | 0.4748 | 0.6614 | 0.8133 |
No log | 92.6667 | 278 | 0.6784 | 0.4896 | 0.6784 | 0.8237 |
No log | 93.3333 | 280 | 0.6848 | 0.4896 | 0.6848 | 0.8275 |
No log | 94.0 | 282 | 0.6794 | 0.4896 | 0.6794 | 0.8243 |
No log | 94.6667 | 284 | 0.6719 | 0.4896 | 0.6719 | 0.8197 |
No log | 95.3333 | 286 | 0.6606 | 0.4748 | 0.6606 | 0.8128 |
No log | 96.0 | 288 | 0.6476 | 0.4687 | 0.6476 | 0.8047 |
No log | 96.6667 | 290 | 0.6362 | 0.4687 | 0.6362 | 0.7976 |
No log | 97.3333 | 292 | 0.6252 | 0.4333 | 0.6252 | 0.7907 |
No log | 98.0 | 294 | 0.6183 | 0.4586 | 0.6183 | 0.7863 |
No log | 98.6667 | 296 | 0.6130 | 0.4845 | 0.6130 | 0.7829 |
No log | 99.3333 | 298 | 0.6089 | 0.4676 | 0.6089 | 0.7803 |
No log | 100.0 | 300 | 0.6075 | 0.5101 | 0.6075 | 0.7794 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
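The card provides no usage example; below is a minimal loading sketch. It assumes the checkpoint exposes a single-output sequence-classification (regression) head, which the identical Loss and MSE values suggest, and takes the repository ID from the model listing:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits8_FineTuningAraBERT_noAug_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Hypothetical Arabic input; replace with the text to be scored.
text = "ضع نص المقال هنا"
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# With a single-output regression head, the raw logit is the predicted
# organization score on the task's scale.
print(logits.squeeze().item())
```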