---
license: cc-by-sa-4.0
base_model: ClassCat/roberta-small-basque
tags:
  - generated_from_trainer
datasets:
  - basque_glue
metrics:
  - accuracy
  - f1
  - precision
  - recall
model-index:
  - name: XLM-EusBERTa-sentiment-classification
    results:
      - task:
          name: Text Classification
          type: text-classification
        dataset:
          name: basque_glue
          type: basque_glue
          config: bec
          split: validation
          args: bec
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.6290322580645161
          - name: F1
            type: f1
            value: 0.6290834931512662
          - name: Precision
            type: precision
            value: 0.630304630215078
          - name: Recall
            type: recall
            value: 0.6290322580645161
---

# XLM-EusBERTa-sentiment-classification

This model is a fine-tuned version of [ClassCat/roberta-small-basque](https://huggingface.co/ClassCat/roberta-small-basque) on the basque_glue dataset. It achieves the following results on the evaluation set:

- Loss: 4.0012
- Accuracy: 0.6290
- F1: 0.6291
- Precision: 0.6303
- Recall: 0.6290
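Note that these metrics come from the final (epoch 50) checkpoint; as the training results table below shows, validation loss is lowest after epoch 1 and climbs steadily afterwards. A minimal inference sketch using the standard `pipeline` API; the example sentence is illustrative, and the mapping from output labels to sentiment classes is not documented in this card:

```python
from transformers import pipeline

# Minimal usage sketch. Label names (e.g. LABEL_0/LABEL_1/LABEL_2) come from
# the model config; their sentiment mapping is an assumption, not documented here.
classifier = pipeline(
    "text-classification",
    model="IParraMartin/XLM-EusBERTa-sentiment-classification",
)

print(classifier("Oso film ederra da!"))  # Basque: "It's a very beautiful film!"
```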

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
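Per the metadata above, the model was trained on the `bec` config of `basque_glue` and evaluated on its validation split. A loading sketch under that assumption; the exact preprocessing used for this model is undocumented:

```python
from datasets import load_dataset

# Dataset id and config taken from the model-index metadata above; the
# column layout (text plus a 3-class sentiment label) is an assumption
# based on the BasqueGLUE BEC task description.
dataset = load_dataset("basque_glue", "bec")
print(dataset["validation"][0])
```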

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
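A `TrainingArguments` sketch reconstructing the listed values; anything not listed above (output directory, evaluation cadence, and so on) is an assumption:

```python
from transformers import TrainingArguments

# Only the values listed above are taken from the card; the rest are guesses.
training_args = TrainingArguments(
    output_dir="XLM-EusBERTa-sentiment-classification",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed from the per-epoch results below
)
```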

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--:|:---------:|:------:|
| No log | 1.0 | 380 | 0.7366 | 0.6736 | 0.6589 | 0.6711 | 0.6736 |
| 0.7679 | 2.0 | 760 | 0.7654 | 0.6767 | 0.6692 | 0.6726 | 0.6767 |
| 0.4846 | 3.0 | 1140 | 0.9844 | 0.6621 | 0.6599 | 0.6681 | 0.6621 |
| 0.2952 | 4.0 | 1520 | 1.1162 | 0.6375 | 0.6371 | 0.6473 | 0.6375 |
| 0.2952 | 5.0 | 1900 | 1.4234 | 0.6329 | 0.6343 | 0.6425 | 0.6329 |
| 0.192 | 6.0 | 2280 | 1.8570 | 0.6413 | 0.6362 | 0.6424 | 0.6413 |
| 0.159 | 7.0 | 2660 | 2.1968 | 0.6152 | 0.6086 | 0.6152 | 0.6152 |
| 0.1265 | 8.0 | 3040 | 2.1853 | 0.6283 | 0.6267 | 0.6267 | 0.6283 |
| 0.1265 | 9.0 | 3420 | 2.1953 | 0.6467 | 0.6441 | 0.6435 | 0.6467 |
| 0.0807 | 10.0 | 3800 | 2.2806 | 0.6367 | 0.6381 | 0.6480 | 0.6367 |
| 0.0688 | 11.0 | 4180 | 2.7982 | 0.6175 | 0.6167 | 0.6356 | 0.6175 |
| 0.0675 | 12.0 | 4560 | 2.5182 | 0.6605 | 0.6587 | 0.6584 | 0.6605 |
| 0.0675 | 13.0 | 4940 | 2.6544 | 0.6413 | 0.6315 | 0.6391 | 0.6413 |
| 0.0451 | 14.0 | 5320 | 2.5889 | 0.6459 | 0.6427 | 0.6424 | 0.6459 |
| 0.0432 | 15.0 | 5700 | 2.8100 | 0.6290 | 0.6299 | 0.6359 | 0.6290 |
| 0.0297 | 16.0 | 6080 | 2.9983 | 0.6275 | 0.6262 | 0.6263 | 0.6275 |
| 0.0297 | 17.0 | 6460 | 2.7803 | 0.6313 | 0.6289 | 0.6311 | 0.6313 |
| 0.0369 | 18.0 | 6840 | 2.9602 | 0.6283 | 0.6287 | 0.6353 | 0.6283 |
| 0.0289 | 19.0 | 7220 | 2.9911 | 0.6298 | 0.6309 | 0.6356 | 0.6298 |
| 0.0251 | 20.0 | 7600 | 2.8634 | 0.6344 | 0.6350 | 0.6364 | 0.6344 |
| 0.0251 | 21.0 | 7980 | 2.7171 | 0.6406 | 0.6378 | 0.6375 | 0.6406 |
| 0.0332 | 22.0 | 8360 | 3.0386 | 0.6275 | 0.6215 | 0.6245 | 0.6275 |
| 0.0212 | 23.0 | 8740 | 2.9876 | 0.6313 | 0.6319 | 0.6344 | 0.6313 |
| 0.0218 | 24.0 | 9120 | 2.9776 | 0.6283 | 0.6267 | 0.6348 | 0.6283 |
| 0.0189 | 25.0 | 9500 | 2.9596 | 0.6329 | 0.6340 | 0.6381 | 0.6329 |
| 0.0189 | 26.0 | 9880 | 3.0420 | 0.6329 | 0.6324 | 0.6380 | 0.6329 |
| 0.0172 | 27.0 | 10260 | 3.3335 | 0.6336 | 0.6348 | 0.6369 | 0.6336 |
| 0.0054 | 28.0 | 10640 | 3.2843 | 0.6429 | 0.6442 | 0.6466 | 0.6429 |
| 0.0065 | 29.0 | 11020 | 3.4868 | 0.6275 | 0.6291 | 0.6399 | 0.6275 |
| 0.0065 | 30.0 | 11400 | 3.8241 | 0.6175 | 0.6174 | 0.6209 | 0.6175 |
| 0.0108 | 31.0 | 11780 | 3.5833 | 0.6260 | 0.6275 | 0.6317 | 0.6260 |
| 0.0127 | 32.0 | 12160 | 3.5452 | 0.6183 | 0.6203 | 0.6283 | 0.6183 |
| 0.0092 | 33.0 | 12540 | 3.8349 | 0.6167 | 0.6167 | 0.6389 | 0.6167 |
| 0.0092 | 34.0 | 12920 | 3.6464 | 0.6244 | 0.6260 | 0.6313 | 0.6244 |
| 0.0069 | 35.0 | 13300 | 3.7538 | 0.6352 | 0.6352 | 0.6359 | 0.6352 |
| 0.0028 | 36.0 | 13680 | 3.8862 | 0.6221 | 0.6243 | 0.6350 | 0.6221 |
| 0.0001 | 37.0 | 14060 | 3.9846 | 0.6229 | 0.6206 | 0.6252 | 0.6229 |
| 0.0001 | 38.0 | 14440 | 3.7743 | 0.6275 | 0.6287 | 0.6309 | 0.6275 |
| 0.0057 | 39.0 | 14820 | 3.9002 | 0.6290 | 0.6300 | 0.6319 | 0.6290 |
| 0.0004 | 40.0 | 15200 | 3.9651 | 0.6306 | 0.6315 | 0.6333 | 0.6306 |
| 0.0032 | 41.0 | 15580 | 4.0279 | 0.6206 | 0.6213 | 0.6365 | 0.6206 |
| 0.0032 | 42.0 | 15960 | 3.8244 | 0.6344 | 0.6342 | 0.6344 | 0.6344 |
| 0.0033 | 43.0 | 16340 | 3.9036 | 0.6198 | 0.6205 | 0.6237 | 0.6198 |
| 0.003 | 44.0 | 16720 | 4.0028 | 0.6198 | 0.6214 | 0.6263 | 0.6198 |
| 0.0005 | 45.0 | 17100 | 3.9621 | 0.6306 | 0.6315 | 0.6361 | 0.6306 |
| 0.0005 | 46.0 | 17480 | 3.9682 | 0.6306 | 0.6297 | 0.6298 | 0.6306 |
| 0.0003 | 47.0 | 17860 | 4.0103 | 0.6321 | 0.6310 | 0.6305 | 0.6321 |
| 0.0003 | 48.0 | 18240 | 3.9968 | 0.6321 | 0.6316 | 0.6317 | 0.6321 |
| 0.003 | 49.0 | 18620 | 3.9835 | 0.6298 | 0.6297 | 0.6304 | 0.6298 |
| 0.0005 | 50.0 | 19000 | 4.0012 | 0.6290 | 0.6291 | 0.6303 | 0.6290 |
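In every row above, recall equals accuracy, which is consistent with weighted averaging over classes. A hedged `compute_metrics` sketch under that assumption (the actual metric code used for this run is not published in the card):

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Hypothetical metric function; "weighted" averaging is an assumption,
# inferred from recall matching accuracy in every table row above.
def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```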

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0