RoBERTa-Base-SE2025T11A-sun-v20241225073733

This model is a fine-tuned version of w11wo/sundanese-roberta-base-emotion-classifier on an unknown dataset. It achieves the following results on the evaluation set (a sketch for reproducing these multi-label F1 metrics follows the list):

  • Loss: 0.5770
  • F1 Macro: 0.4036
  • F1 Micro: 0.6979
  • F1 Weighted: 0.6788
  • F1 Samples: 0.7240
  • F1 Label Marah: 0.3158
  • F1 Label Jijik: 0.4
  • F1 Label Takut: 0.0
  • F1 Label Senang: 0.8776
  • F1 Label Sedih: 0.7273
  • F1 Label Terkejut: 0.3043
  • F1 Label Biasa: 0.2
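
The four averaged scores above are the standard multi-label F1 aggregations over the seven emotion labels (marah, jijik, takut, senang, sedih, terkejut, biasa). Below is a minimal sketch of how such a report can be computed with scikit-learn, assuming binarized predictions and this label order; the actual evaluation script and label mapping are not documented here:

```python
import numpy as np
from sklearn.metrics import f1_score

# Assumed label order; the checkpoint's actual index-to-label mapping may differ.
LABELS = ["marah", "jijik", "takut", "senang", "sedih", "terkejut", "biasa"]

def f1_report(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Averaged and per-label F1 for binary indicator arrays of shape
    (num_examples, num_labels)."""
    report = {
        "f1_macro": f1_score(y_true, y_pred, average="macro", zero_division=0),
        "f1_micro": f1_score(y_true, y_pred, average="micro", zero_division=0),
        "f1_weighted": f1_score(y_true, y_pred, average="weighted", zero_division=0),
        "f1_samples": f1_score(y_true, y_pred, average="samples", zero_division=0),
    }
    per_label = f1_score(y_true, y_pred, average=None, zero_division=0)
    for label, score in zip(LABELS, per_label):
        report[f"f1_label_{label}"] = float(score)
    return report
```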

Model description

More information needed

Intended uses & limitations

More information needed
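
No intended uses are documented, but the per-label metrics above indicate a multi-label emotion classifier for Sundanese text. Below is a minimal inference sketch; the Hub id is taken from the card title, and the sigmoid activation with a 0.5 threshold is an assumption about how the checkpoint is meant to be decoded:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hub id taken from the card title; verify it matches the published repository.
MODEL_ID = "alxxtexxr/RoBERTa-Base-SE2025T11A-sun-v20241225073733"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

text = "Abdi bagja pisan dinten ieu."  # Sundanese: "I am very happy today."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label decoding: an independent sigmoid per label with a 0.5 threshold (assumed).
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```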

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 25
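
A sketch of how these settings map onto a transformers TrainingArguments/Trainer call, assuming the base checkpoint above, a multi-label problem with seven labels, and evaluation every 100 steps as in the results table below; the dataset objects and metric function are placeholders, since the training data is not documented:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "w11wo/sundanese-roberta-base-emotion-classifier"
NUM_LABELS = 7  # marah, jijik, takut, senang, sedih, terkejut, biasa

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForSequenceClassification.from_pretrained(
    BASE_MODEL,
    num_labels=NUM_LABELS,
    problem_type="multi_label_classification",  # assumed from the multi-label metrics
    ignore_mismatched_sizes=True,  # the base classifier head has a different label count
)

args = TrainingArguments(
    output_dir="RoBERTa-Base-SE2025T11A-sun",
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    num_train_epochs=25,
    lr_scheduler_type="linear",
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    eval_strategy="steps",  # the results table below logs evaluation every 100 steps
    eval_steps=100,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=None,   # placeholder: the training data is not documented
    eval_dataset=None,    # placeholder: the evaluation data is not documented
    processing_class=tokenizer,
    # compute_metrics=...  # placeholder: the multi-label F1 report shown earlier
)
# trainer.train()
```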

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Micro | F1 Weighted | F1 Samples | F1 Label Marah | F1 Label Jijik | F1 Label Takut | F1 Label Senang | F1 Label Sedih | F1 Label Terkejut | F1 Label Biasa |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4233 | 0.2994 | 100 | 0.3265 | 0.1217 | 0.6312 | 0.4679 | 0.6615 | 0.0 | 0.0 | 0.0 | 0.8520 | 0.0 | 0.0 | 0.0 |
| 0.4072 | 0.5988 | 200 | 0.2954 | 0.2063 | 0.6780 | 0.5708 | 0.6771 | 0.0 | 0.0 | 0.0 | 0.8788 | 0.5652 | 0.0 | 0.0 |
| 0.3778 | 0.8982 | 300 | 0.2838 | 0.2633 | 0.6887 | 0.5790 | 0.7161 | 0.4444 | 0.0 | 0.0 | 0.8720 | 0.5263 | 0.0 | 0.0 |
| 0.2968 | 1.1976 | 400 | 0.3056 | 0.3553 | 0.7055 | 0.6063 | 0.7292 | 0.5455 | 0.5 | 0.0 | 0.8774 | 0.5641 | 0.0 | 0.0 |
| 0.3193 | 1.4970 | 500 | 0.2859 | 0.3452 | 0.6623 | 0.5986 | 0.6654 | 0.4706 | 0.4444 | 0.0 | 0.8457 | 0.6557 | 0.0 | 0.0 |
| 0.308 | 1.7964 | 600 | 0.2684 | 0.3539 | 0.7086 | 0.6257 | 0.7240 | 0.4 | 0.5 | 0.0 | 0.8877 | 0.6897 | 0.0 | 0.0 |
| 0.2574 | 2.0958 | 700 | 0.2980 | 0.3544 | 0.6993 | 0.6130 | 0.7234 | 0.5455 | 0.4444 | 0.0 | 0.8788 | 0.6122 | 0.0 | 0.0 |
| 0.2352 | 2.3952 | 800 | 0.2827 | 0.3865 | 0.6965 | 0.6427 | 0.7117 | 0.5455 | 0.4444 | 0.0 | 0.8602 | 0.6885 | 0.1667 | 0.0 |
| 0.2533 | 2.6946 | 900 | 0.2573 | 0.4206 | 0.7492 | 0.6959 | 0.7638 | 0.5455 | 0.4444 | 0.0 | 0.8980 | 0.8 | 0.2564 | 0.0 |
| 0.2436 | 2.9940 | 1000 | 0.2990 | 0.3576 | 0.6769 | 0.6503 | 0.7013 | 0.4211 | 0.2857 | 0.0 | 0.8691 | 0.6667 | 0.2609 | 0.0 |
| 0.179 | 3.2934 | 1100 | 0.2911 | 0.3428 | 0.6730 | 0.6307 | 0.6977 | 0.375 | 0.3333 | 0.0 | 0.8649 | 0.6552 | 0.1714 | 0.0 |
| 0.1982 | 3.5928 | 1200 | 0.3006 | 0.3789 | 0.6951 | 0.6622 | 0.7172 | 0.4211 | 0.375 | 0.0 | 0.8691 | 0.7308 | 0.2564 | 0.0 |
| 0.1744 | 3.8922 | 1300 | 0.3002 | 0.3791 | 0.7015 | 0.6571 | 0.7169 | 0.4615 | 0.3636 | 0.0 | 0.8643 | 0.72 | 0.2439 | 0.0 |
| 0.1464 | 4.1916 | 1400 | 0.3178 | 0.3823 | 0.7032 | 0.6427 | 0.7146 | 0.5455 | 0.4444 | 0.0 | 0.8731 | 0.6957 | 0.1176 | 0.0 |
| 0.136 | 4.4910 | 1500 | 0.3143 | 0.3620 | 0.7107 | 0.6619 | 0.7266 | 0.4 | 0.25 | 0.0 | 0.8673 | 0.7170 | 0.3 | 0.0 |
| 0.1258 | 4.7904 | 1600 | 0.3286 | 0.4306 | 0.7021 | 0.6939 | 0.7208 | 0.4211 | 0.25 | 0.0 | 0.8705 | 0.7755 | 0.3333 | 0.3636 |
| 0.1144 | 5.0898 | 1700 | 0.3240 | 0.4429 | 0.7152 | 0.6865 | 0.7339 | 0.375 | 0.4444 | 0.0 | 0.8687 | 0.7755 | 0.2727 | 0.3636 |
| 0.0899 | 5.3892 | 1800 | 0.3375 | 0.4518 | 0.7256 | 0.6901 | 0.7427 | 0.4615 | 0.4444 | 0.0 | 0.8687 | 0.7547 | 0.3 | 0.3333 |
| 0.0959 | 5.6886 | 1900 | 0.3258 | 0.4375 | 0.7323 | 0.6995 | 0.7432 | 0.4615 | 0.25 | 0.0 | 0.8776 | 0.7843 | 0.3256 | 0.3636 |
| 0.0934 | 5.9880 | 2000 | 0.3437 | 0.4257 | 0.7055 | 0.6789 | 0.7240 | 0.4 | 0.3333 | 0.0 | 0.8586 | 0.7547 | 0.3 | 0.3333 |
| 0.0713 | 6.2874 | 2100 | 0.3629 | 0.4337 | 0.6977 | 0.6919 | 0.7195 | 0.3333 | 0.3636 | 0.0 | 0.8660 | 0.7170 | 0.3922 | 0.3636 |
| 0.0584 | 6.5868 | 2200 | 0.3695 | 0.4064 | 0.6982 | 0.6751 | 0.7180 | 0.4211 | 0.3636 | 0.0 | 0.875 | 0.7368 | 0.2667 | 0.1818 |
| 0.059 | 6.8862 | 2300 | 0.3909 | 0.3894 | 0.7126 | 0.6848 | 0.7333 | 0.375 | 0.3636 | 0.0 | 0.8780 | 0.7692 | 0.3396 | 0.0 |
| 0.0586 | 7.1856 | 2400 | 0.3815 | 0.4200 | 0.7025 | 0.6931 | 0.7263 | 0.375 | 0.4 | 0.0 | 0.8856 | 0.7778 | 0.3019 | 0.2 |
| 0.0314 | 7.4850 | 2500 | 0.3910 | 0.3824 | 0.6934 | 0.6882 | 0.7156 | 0.3 | 0.3333 | 0.0 | 0.8731 | 0.8 | 0.3704 | 0.0 |
| 0.0438 | 7.7844 | 2600 | 0.3702 | 0.4755 | 0.7359 | 0.7155 | 0.7451 | 0.4615 | 0.4444 | 0.0 | 0.8731 | 0.7778 | 0.4082 | 0.3636 |
| 0.0402 | 8.0838 | 2700 | 0.4042 | 0.4384 | 0.7147 | 0.7078 | 0.7362 | 0.3158 | 0.3333 | 0.0 | 0.8788 | 0.8 | 0.3774 | 0.3636 |
| 0.0264 | 8.3832 | 2800 | 0.4175 | 0.4423 | 0.7080 | 0.6986 | 0.7312 | 0.3 | 0.3636 | 0.0 | 0.8796 | 0.75 | 0.3415 | 0.4615 |
| 0.042 | 8.6826 | 2900 | 0.3898 | 0.4036 | 0.7152 | 0.6863 | 0.7331 | 0.3333 | 0.2222 | 0.0 | 0.8832 | 0.7451 | 0.3077 | 0.3333 |
| 0.0342 | 8.9820 | 3000 | 0.4055 | 0.4390 | 0.7209 | 0.7063 | 0.7448 | 0.3333 | 0.3636 | 0.0 | 0.8788 | 0.7636 | 0.4 | 0.3333 |
| 0.0268 | 9.2814 | 3100 | 0.4006 | 0.4312 | 0.7147 | 0.6957 | 0.7362 | 0.3529 | 0.3333 | 0.0 | 0.8808 | 0.7692 | 0.3182 | 0.3636 |
| 0.0253 | 9.5808 | 3200 | 0.4121 | 0.4302 | 0.7062 | 0.6983 | 0.7302 | 0.3636 | 0.2667 | 0.0 | 0.8705 | 0.7917 | 0.3556 | 0.3636 |
| 0.0179 | 9.8802 | 3300 | 0.4356 | 0.3574 | 0.6967 | 0.6606 | 0.7206 | 0.4 | 0.25 | 0.0 | 0.8788 | 0.7407 | 0.2326 | 0.0 |
| 0.018 | 10.1796 | 3400 | 0.4594 | 0.4081 | 0.7066 | 0.6713 | 0.7299 | 0.4706 | 0.3636 | 0.0 | 0.8821 | 0.7241 | 0.2162 | 0.2 |
| 0.0151 | 10.4790 | 3500 | 0.4579 | 0.3911 | 0.7190 | 0.6807 | 0.7422 | 0.375 | 0.2222 | 0.0 | 0.8844 | 0.7636 | 0.2703 | 0.2222 |
| 0.0124 | 10.7784 | 3600 | 0.4415 | 0.3710 | 0.6988 | 0.6648 | 0.7286 | 0.2857 | 0.2222 | 0.0 | 0.8744 | 0.7547 | 0.2381 | 0.2222 |
| 0.0193 | 11.0778 | 3700 | 0.4514 | 0.3797 | 0.6903 | 0.6708 | 0.7172 | 0.3529 | 0.2 | 0.0 | 0.8763 | 0.7018 | 0.3043 | 0.2222 |
| 0.0138 | 11.3772 | 3800 | 0.4427 | 0.3681 | 0.6909 | 0.6637 | 0.7065 | 0.25 | 0.25 | 0.0 | 0.8705 | 0.7273 | 0.2791 | 0.2 |
| 0.0101 | 11.6766 | 3900 | 0.4641 | 0.4035 | 0.6964 | 0.6718 | 0.7203 | 0.3478 | 0.4 | 0.0 | 0.8705 | 0.75 | 0.2564 | 0.2 |
| 0.0112 | 11.9760 | 4000 | 0.4713 | 0.4260 | 0.7048 | 0.6820 | 0.7260 | 0.3158 | 0.4 | 0.0 | 0.8731 | 0.7917 | 0.2381 | 0.3636 |
| 0.0118 | 12.2754 | 4100 | 0.4722 | 0.3811 | 0.6941 | 0.6751 | 0.7148 | 0.2857 | 0.25 | 0.0 | 0.8731 | 0.7547 | 0.3043 | 0.2 |
| 0.0086 | 12.5749 | 4200 | 0.4911 | 0.4104 | 0.7009 | 0.6925 | 0.7216 | 0.3077 | 0.3636 | 0.0 | 0.8776 | 0.7636 | 0.36 | 0.2 |
| 0.0099 | 12.8743 | 4300 | 0.4928 | 0.3666 | 0.6786 | 0.6584 | 0.7076 | 0.2857 | 0.25 | 0.0 | 0.8796 | 0.7018 | 0.2273 | 0.2222 |
| 0.0074 | 13.1737 | 4400 | 0.4873 | 0.3970 | 0.7072 | 0.6954 | 0.7234 | 0.3158 | 0.2222 | 0.0 | 0.8776 | 0.7636 | 0.4 | 0.2 |
| 0.0066 | 13.4731 | 4500 | 0.4947 | 0.4096 | 0.7080 | 0.6896 | 0.7286 | 0.3158 | 0.3636 | 0.0 | 0.8832 | 0.7843 | 0.2979 | 0.2222 |
| 0.0071 | 13.7725 | 4600 | 0.4939 | 0.3922 | 0.7055 | 0.6848 | 0.7302 | 0.3529 | 0.2222 | 0.0 | 0.8832 | 0.7241 | 0.3404 | 0.2222 |
| 0.0083 | 14.0719 | 4700 | 0.5015 | 0.3803 | 0.6997 | 0.6775 | 0.7255 | 0.3 | 0.2222 | 0.0 | 0.8832 | 0.7368 | 0.2979 | 0.2222 |
| 0.0065 | 14.3713 | 4800 | 0.5024 | 0.4186 | 0.7038 | 0.6867 | 0.7255 | 0.3636 | 0.4 | 0.0 | 0.8763 | 0.7636 | 0.3043 | 0.2222 |
| 0.0055 | 14.6707 | 4900 | 0.5100 | 0.4132 | 0.7035 | 0.6862 | 0.7292 | 0.3478 | 0.4 | 0.0 | 0.8788 | 0.7547 | 0.3111 | 0.2 |
| 0.0056 | 14.9701 | 5000 | 0.5013 | 0.4411 | 0.7059 | 0.6961 | 0.7357 | 0.2727 | 0.4 | 0.0 | 0.8808 | 0.7547 | 0.3182 | 0.4615 |
| 0.0044 | 15.2695 | 5100 | 0.5120 | 0.3840 | 0.6977 | 0.6843 | 0.7201 | 0.2857 | 0.2222 | 0.0 | 0.8776 | 0.7273 | 0.375 | 0.2 |
| 0.0052 | 15.5689 | 5200 | 0.5334 | 0.3811 | 0.7035 | 0.6818 | 0.7294 | 0.3 | 0.2 | 0.0 | 0.8844 | 0.75 | 0.3111 | 0.2222 |
| 0.0041 | 15.8683 | 5300 | 0.5184 | 0.4135 | 0.7143 | 0.6895 | 0.7352 | 0.3158 | 0.4 | 0.0 | 0.8832 | 0.7547 | 0.3182 | 0.2222 |
| 0.0052 | 16.1677 | 5400 | 0.5182 | 0.4188 | 0.7186 | 0.6910 | 0.7352 | 0.3529 | 0.4 | 0.0 | 0.8832 | 0.7547 | 0.3182 | 0.2222 |
| 0.0036 | 16.4671 | 5500 | 0.5208 | 0.4125 | 0.7164 | 0.6949 | 0.7370 | 0.3 | 0.3636 | 0.0 | 0.8832 | 0.8 | 0.3182 | 0.2222 |
| 0.004 | 16.7665 | 5600 | 0.5382 | 0.4086 | 0.7059 | 0.6842 | 0.7315 | 0.3 | 0.4 | 0.0 | 0.8788 | 0.7407 | 0.3182 | 0.2222 |
| 0.0037 | 17.0659 | 5700 | 0.5258 | 0.4208 | 0.7118 | 0.6959 | 0.7299 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7547 | 0.375 | 0.2222 |
| 0.0036 | 17.3653 | 5800 | 0.5283 | 0.4065 | 0.7101 | 0.6873 | 0.7326 | 0.3158 | 0.3636 | 0.0 | 0.8788 | 0.7692 | 0.3182 | 0.2 |
| 0.0031 | 17.6647 | 5900 | 0.5315 | 0.4107 | 0.7041 | 0.6840 | 0.7247 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7547 | 0.3043 | 0.2222 |
| 0.0041 | 17.9641 | 6000 | 0.5419 | 0.4084 | 0.6977 | 0.6806 | 0.7232 | 0.3636 | 0.3636 | 0.0 | 0.8776 | 0.7273 | 0.3043 | 0.2222 |
| 0.0031 | 18.2635 | 6100 | 0.5469 | 0.4071 | 0.7 | 0.6784 | 0.7253 | 0.3158 | 0.4 | 0.0 | 0.8731 | 0.7273 | 0.3111 | 0.2222 |
| 0.003 | 18.5629 | 6200 | 0.5477 | 0.4213 | 0.7080 | 0.6877 | 0.7292 | 0.4 | 0.4 | 0.0 | 0.8821 | 0.7407 | 0.3043 | 0.2222 |
| 0.0032 | 18.8623 | 6300 | 0.5395 | 0.4054 | 0.7003 | 0.6842 | 0.7208 | 0.3158 | 0.4 | 0.0 | 0.8808 | 0.7547 | 0.3043 | 0.1818 |
| 0.0033 | 19.1617 | 6400 | 0.5444 | 0.3989 | 0.7038 | 0.6968 | 0.7240 | 0.3478 | 0.2222 | 0.0 | 0.8808 | 0.7843 | 0.375 | 0.1818 |
| 0.0032 | 19.4611 | 6500 | 0.5482 | 0.4047 | 0.6982 | 0.6817 | 0.7182 | 0.3158 | 0.4 | 0.0 | 0.8763 | 0.7547 | 0.3043 | 0.1818 |
| 0.0025 | 19.7605 | 6600 | 0.5498 | 0.4047 | 0.7003 | 0.6818 | 0.7221 | 0.3158 | 0.4 | 0.0 | 0.8763 | 0.7407 | 0.3182 | 0.1818 |
| 0.0027 | 20.0599 | 6700 | 0.5542 | 0.4134 | 0.6997 | 0.6879 | 0.7232 | 0.3636 | 0.4 | 0.0 | 0.8808 | 0.7273 | 0.3404 | 0.1818 |
| 0.003 | 20.3593 | 6800 | 0.5599 | 0.3788 | 0.6962 | 0.6762 | 0.7214 | 0.3158 | 0.2222 | 0.0 | 0.8821 | 0.7273 | 0.3043 | 0.2 |
| 0.0025 | 20.6587 | 6900 | 0.5556 | 0.4121 | 0.7041 | 0.6838 | 0.7247 | 0.3158 | 0.4 | 0.0 | 0.8731 | 0.7692 | 0.3043 | 0.2222 |
| 0.0027 | 20.9581 | 7000 | 0.5579 | 0.4010 | 0.6959 | 0.6781 | 0.7214 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7273 | 0.3043 | 0.1818 |
| 0.0025 | 21.2575 | 7100 | 0.5583 | 0.3762 | 0.6941 | 0.6754 | 0.7188 | 0.3158 | 0.2222 | 0.0 | 0.8821 | 0.7273 | 0.3043 | 0.1818 |
| 0.0025 | 21.5569 | 7200 | 0.5586 | 0.4021 | 0.6941 | 0.6771 | 0.7169 | 0.3158 | 0.4 | 0.0 | 0.8718 | 0.7407 | 0.3043 | 0.1818 |
| 0.0025 | 21.8563 | 7300 | 0.5612 | 0.3767 | 0.6923 | 0.6721 | 0.7188 | 0.3158 | 0.2222 | 0.0 | 0.8718 | 0.7273 | 0.3182 | 0.1818 |
| 0.0025 | 22.1557 | 7400 | 0.5688 | 0.3807 | 0.6962 | 0.6744 | 0.7214 | 0.3333 | 0.2222 | 0.0 | 0.8776 | 0.7273 | 0.3043 | 0.2 |
| 0.0025 | 22.4551 | 7500 | 0.5707 | 0.4054 | 0.6979 | 0.6771 | 0.7227 | 0.3333 | 0.4 | 0.0 | 0.8731 | 0.7273 | 0.3043 | 0.2 |
| 0.0024 | 22.7545 | 7600 | 0.5702 | 0.4055 | 0.7 | 0.6809 | 0.7247 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7407 | 0.3043 | 0.2 |
| 0.0022 | 23.0539 | 7700 | 0.5706 | 0.4119 | 0.7038 | 0.6858 | 0.7266 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7273 | 0.3404 | 0.2222 |
| 0.0017 | 23.3533 | 7800 | 0.5725 | 0.4067 | 0.7 | 0.6797 | 0.7240 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7273 | 0.3043 | 0.2222 |
| 0.0026 | 23.6527 | 7900 | 0.5774 | 0.4061 | 0.6979 | 0.6773 | 0.7227 | 0.3158 | 0.4 | 0.0 | 0.8731 | 0.7273 | 0.3043 | 0.2222 |
| 0.0023 | 23.9521 | 8000 | 0.5758 | 0.4087 | 0.7021 | 0.6818 | 0.7253 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7407 | 0.3043 | 0.2222 |
| 0.0022 | 24.2515 | 8100 | 0.5774 | 0.4067 | 0.7 | 0.6797 | 0.7240 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7273 | 0.3043 | 0.2222 |
| 0.0022 | 24.5509 | 8200 | 0.5764 | 0.4067 | 0.7 | 0.6797 | 0.7240 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7273 | 0.3043 | 0.2222 |
| 0.0023 | 24.8503 | 8300 | 0.5770 | 0.4036 | 0.6979 | 0.6788 | 0.7240 | 0.3158 | 0.4 | 0.0 | 0.8776 | 0.7273 | 0.3043 | 0.2 |

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0
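
As a quick reproducibility aid, a small sketch (not part of the original card) that compares the local environment against the versions listed above:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions reported in this model card.
expected = {
    "transformers": "4.47.1",
    "torch": "2.5.1+cu121",
    "datasets": "3.2.0",
    "tokenizers": "0.21.0",
}

for name, module in [
    ("transformers", transformers),
    ("torch", torch),
    ("datasets", datasets),
    ("tokenizers", tokenizers),
]:
    print(f"{name}: installed {module.__version__}, card reports {expected[name]}")
```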