robbert-v2-dutch-base-finetuned-emotion

This model is a fine-tuned version of pdelobelle/robbert-v2-dutch-base on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows these metrics):

  • Loss: 3.3545
  • Accuracy: 0.52
  • F1: 0.5123
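
A minimal usage sketch, assuming the checkpoint is loaded from the antalvdb/robbert-v2-dutch-base-finetuned-emotion repository via the Transformers pipeline API. The emotion label set is not documented in this card, so the labels in the output are whatever the classification head was trained with, and the Dutch example sentence is purely illustrative:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a text-classification pipeline.
classifier = pipeline(
    "text-classification",
    model="antalvdb/robbert-v2-dutch-base-finetuned-emotion",
)

# Illustrative Dutch input; the returned label names depend on the
# (undocumented) emotion label set used during fine-tuning.
print(classifier("Wat een geweldige dag, ik ben zo blij!"))
```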

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map to a Trainer configuration):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
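
As a hedged sketch, the hyperparameters above roughly correspond to the following transformers.TrainingArguments. The output_dir and the per-epoch evaluation strategy are assumptions (the card only reports per-epoch validation rows), and the adam_beta1/adam_beta2/adam_epsilon values map the listed Adam settings:

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments object matching the listed hyperparameters.
# output_dir and eval_strategy are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="robbert-v2-dutch-base-finetuned-emotion",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",  # assumption: matches the per-epoch validation rows below
)
# These arguments would then be passed to transformers.Trainer together with
# the model, tokenizer, and the (unspecified) train/validation datasets.
```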

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 1.5586        | 1.0   | 25   | 1.4429          | 0.42     | 0.2485 |
| 1.4425        | 2.0   | 50   | 1.3576          | 0.46     | 0.3533 |
| 1.2834        | 3.0   | 75   | 1.3207          | 0.5      | 0.4369 |
| 1.1051        | 4.0   | 100  | 1.3228          | 0.48     | 0.4217 |
| 0.9053        | 5.0   | 125  | 1.3705          | 0.49     | 0.4302 |
| 0.7326        | 6.0   | 150  | 1.4522          | 0.53     | 0.5019 |
| 0.5724        | 7.0   | 175  | 1.5445          | 0.53     | 0.5064 |
| 0.4411        | 8.0   | 200  | 1.6560          | 0.54     | 0.5120 |
| 0.3476        | 9.0   | 225  | 1.7233          | 0.51     | 0.4845 |
| 0.2324        | 10.0  | 250  | 1.9150          | 0.52     | 0.5056 |
| 0.1866        | 11.0  | 275  | 2.0207          | 0.52     | 0.4975 |
| 0.165         | 12.0  | 300  | 2.0863          | 0.52     | 0.5094 |
| 0.1291        | 13.0  | 325  | 2.1584          | 0.5      | 0.4833 |
| 0.0762        | 14.0  | 350  | 2.2296          | 0.55     | 0.5332 |
| 0.0577        | 15.0  | 375  | 2.3171          | 0.5      | 0.4986 |
| 0.0424        | 16.0  | 400  | 2.4509          | 0.5      | 0.4795 |
| 0.0253        | 17.0  | 425  | 2.5444          | 0.49     | 0.4917 |
| 0.0191        | 18.0  | 450  | 2.5894          | 0.51     | 0.5031 |
| 0.0123        | 19.0  | 475  | 2.7144          | 0.5      | 0.4995 |
| 0.01          | 20.0  | 500  | 2.7358          | 0.53     | 0.5231 |
| 0.0086        | 21.0  | 525  | 2.8282          | 0.48     | 0.4825 |
| 0.0064        | 22.0  | 550  | 2.8421          | 0.52     | 0.5244 |
| 0.0059        | 23.0  | 575  | 2.9267          | 0.53     | 0.5200 |
| 0.005         | 24.0  | 600  | 2.9568          | 0.52     | 0.5074 |
| 0.0044        | 25.0  | 625  | 3.0420          | 0.47     | 0.4755 |
| 0.0066        | 26.0  | 650  | 3.0421          | 0.48     | 0.4881 |
| 0.0039        | 27.0  | 675  | 3.1039          | 0.51     | 0.4960 |
| 0.0033        | 28.0  | 700  | 3.1226          | 0.51     | 0.4955 |
| 0.0033        | 29.0  | 725  | 3.1215          | 0.51     | 0.4999 |
| 0.003         | 30.0  | 750  | 3.1649          | 0.51     | 0.4980 |
| 0.0025        | 31.0  | 775  | 3.1716          | 0.5      | 0.4921 |
| 0.0028        | 32.0  | 800  | 3.2371          | 0.5      | 0.4956 |
| 0.0028        | 33.0  | 825  | 3.1730          | 0.52     | 0.5154 |
| 0.0055        | 34.0  | 850  | 3.1842          | 0.49     | 0.4884 |
| 0.0022        | 35.0  | 875  | 3.2324          | 0.51     | 0.4955 |
| 0.0023        | 36.0  | 900  | 3.2221          | 0.52     | 0.5089 |
| 0.002         | 37.0  | 925  | 3.2756          | 0.51     | 0.4981 |
| 0.0021        | 38.0  | 950  | 3.2866          | 0.51     | 0.5010 |
| 0.0019        | 39.0  | 975  | 3.2882          | 0.51     | 0.5010 |
| 0.0018        | 40.0  | 1000 | 3.2864          | 0.51     | 0.4967 |
| 0.0017        | 41.0  | 1025 | 3.3101          | 0.51     | 0.4967 |
| 0.0017        | 42.0  | 1050 | 3.3215          | 0.52     | 0.5089 |
| 0.0016        | 43.0  | 1075 | 3.3253          | 0.51     | 0.5043 |
| 0.0056        | 44.0  | 1100 | 3.3118          | 0.51     | 0.5043 |
| 0.0016        | 45.0  | 1125 | 3.3566          | 0.51     | 0.4981 |
| 0.0016        | 46.0  | 1150 | 3.3593          | 0.51     | 0.4981 |
| 0.0016        | 47.0  | 1175 | 3.3638          | 0.51     | 0.4981 |
| 0.0017        | 48.0  | 1200 | 3.3605          | 0.52     | 0.5089 |
| 0.0017        | 49.0  | 1225 | 3.3526          | 0.52     | 0.5123 |
| 0.0016        | 50.0  | 1250 | 3.3545          | 0.52     | 0.5123 |

Framework versions

  • Transformers 4.42.4
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1