---
license: mit
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: pos_final_mono_nl
    results: []
---

# pos_final_mono_nl

This model is a fine-tuned version of [pdelobelle/robbert-v2-dutch-base](https://huggingface.co/pdelobelle/robbert-v2-dutch-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.1115
- Precision: 0.9783
- Recall: 0.9784
- F1: 0.9783
- Accuracy: 0.9791

## Model description

Judging by the model name and the Dutch RobBERT base model, this appears to be a monolingual Dutch part-of-speech (POS) tagging model. No further description is provided.

## Intended uses & limitations

More information needed
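No usage example is documented. Below is a minimal, untested sketch of how a token-classification checkpoint like this one is typically loaded with the `transformers` pipeline API; the repository id is an assumption inferred from the upload path, not something this card confirms.

```python
# Minimal usage sketch -- the repo id below is an assumption, not
# confirmed by this card.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "pranaydeeps/lettuce_pos_nl_mono"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges sub-word pieces back into
# word-level tags.
tagger = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(tagger("De kat zit op de mat."))  # Dutch: "The cat sits on the mat."
```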

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them follows the list):

- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 1024
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 40.0
- mixed_precision_training: Native AMP
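As announced above, here is a hedged reconstruction of the equivalent `TrainingArguments`, not the original training script; `output_dir` is a placeholder, and `fp16=True` is the usual way "Native AMP" is enabled.

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="pos_final_mono_nl",   # placeholder path, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    gradient_accumulation_steps=4,    # effective batch size: 256 * 4 = 1024
    num_train_epochs=40,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,                        # "Native AMP" mixed precision
)
```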

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 69   | 3.7703          | 0.2597    | 0.1252 | 0.1689 | 0.2575   |
| No log        | 2.0   | 138  | 1.0148          | 0.8058    | 0.8008 | 0.8033 | 0.8066   |
| No log        | 3.0   | 207  | 0.3402          | 0.9302    | 0.9278 | 0.9290 | 0.9299   |
| No log        | 4.0   | 276  | 0.2016          | 0.9559    | 0.9551 | 0.9555 | 0.9561   |
| No log        | 5.0   | 345  | 0.1486          | 0.9643    | 0.9638 | 0.9641 | 0.9648   |
| No log        | 6.0   | 414  | 0.1206          | 0.9697    | 0.9696 | 0.9697 | 0.9702   |
| No log        | 7.0   | 483  | 0.1063          | 0.9720    | 0.9719 | 0.9720 | 0.9727   |
| 1.2192        | 8.0   | 552  | 0.0983          | 0.9734    | 0.9735 | 0.9735 | 0.9742   |
| 1.2192        | 9.0   | 621  | 0.0947          | 0.9746    | 0.9747 | 0.9746 | 0.9754   |
| 1.2192        | 10.0  | 690  | 0.0913          | 0.9753    | 0.9755 | 0.9754 | 0.9761   |
| 1.2192        | 11.0  | 759  | 0.0885          | 0.9761    | 0.9763 | 0.9762 | 0.9770   |
| 1.2192        | 12.0  | 828  | 0.0877          | 0.9764    | 0.9765 | 0.9764 | 0.9772   |
| 1.2192        | 13.0  | 897  | 0.0878          | 0.9767    | 0.9769 | 0.9768 | 0.9775   |
| 1.2192        | 14.0  | 966  | 0.0873          | 0.9767    | 0.9769 | 0.9768 | 0.9776   |
| 0.0688        | 15.0  | 1035 | 0.0877          | 0.9771    | 0.9773 | 0.9772 | 0.9779   |
| 0.0688        | 16.0  | 1104 | 0.0878          | 0.9773    | 0.9774 | 0.9773 | 0.9781   |
| 0.0688        | 17.0  | 1173 | 0.0897          | 0.9772    | 0.9773 | 0.9773 | 0.9781   |
| 0.0688        | 18.0  | 1242 | 0.0909          | 0.9775    | 0.9776 | 0.9776 | 0.9783   |
| 0.0688        | 19.0  | 1311 | 0.0917          | 0.9776    | 0.9778 | 0.9777 | 0.9785   |
| 0.0688        | 20.0  | 1380 | 0.0924          | 0.9778    | 0.9780 | 0.9779 | 0.9787   |
| 0.0688        | 21.0  | 1449 | 0.0949          | 0.9777    | 0.9779 | 0.9778 | 0.9785   |
| 0.0366        | 22.0  | 1518 | 0.0956          | 0.9776    | 0.9777 | 0.9777 | 0.9784   |
| 0.0366        | 23.0  | 1587 | 0.0962          | 0.9778    | 0.9780 | 0.9779 | 0.9786   |
| 0.0366        | 24.0  | 1656 | 0.0992          | 0.9777    | 0.9780 | 0.9779 | 0.9786   |
| 0.0366        | 25.0  | 1725 | 0.0999          | 0.9779    | 0.9781 | 0.9780 | 0.9787   |
| 0.0366        | 26.0  | 1794 | 0.1007          | 0.9780    | 0.9782 | 0.9781 | 0.9789   |
| 0.0366        | 27.0  | 1863 | 0.1022          | 0.9781    | 0.9782 | 0.9782 | 0.9789   |
| 0.0366        | 28.0  | 1932 | 0.1030          | 0.9781    | 0.9783 | 0.9782 | 0.9790   |
| 0.0226        | 29.0  | 2001 | 0.1055          | 0.9781    | 0.9782 | 0.9781 | 0.9789   |
| 0.0226        | 30.0  | 2070 | 0.1057          | 0.9780    | 0.9782 | 0.9781 | 0.9789   |
| 0.0226        | 31.0  | 2139 | 0.1067          | 0.9780    | 0.9781 | 0.9780 | 0.9788   |
| 0.0226        | 32.0  | 2208 | 0.1077          | 0.9780    | 0.9782 | 0.9781 | 0.9789   |
| 0.0226        | 33.0  | 2277 | 0.1085          | 0.9780    | 0.9781 | 0.9781 | 0.9789   |
| 0.0226        | 34.0  | 2346 | 0.1094          | 0.9781    | 0.9782 | 0.9781 | 0.9789   |
| 0.0226        | 35.0  | 2415 | 0.1095          | 0.9783    | 0.9784 | 0.9783 | 0.9791   |
| 0.0226        | 36.0  | 2484 | 0.1101          | 0.9780    | 0.9782 | 0.9781 | 0.9789   |
| 0.0159        | 37.0  | 2553 | 0.1114          | 0.9782    | 0.9784 | 0.9783 | 0.9791   |
| 0.0159        | 38.0  | 2622 | 0.1111          | 0.9782    | 0.9784 | 0.9783 | 0.9791   |
| 0.0159        | 39.0  | 2691 | 0.1114          | 0.9782    | 0.9784 | 0.9783 | 0.9791   |
| 0.0159        | 40.0  | 2760 | 0.1115          | 0.9783    | 0.9784 | 0.9783 | 0.9791   |
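The card does not say which metric script produced the numbers above. For illustration only, a generic sketch of word-level tagging metrics computed with scikit-learn (the tags and values here are made up, not from this run):

```python
# Illustrative sketch: word-level precision/recall/F1/accuracy for a
# POS tagger, computed with scikit-learn. Not the card's metric script.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = ["NOUN", "VERB", "DET", "ADJ", "NOUN"]   # gold tags (made up)
y_pred = ["NOUN", "VERB", "DET", "NOUN", "NOUN"]  # predicted tags (made up)

precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted", zero_division=0
)
accuracy = accuracy_score(y_true, y_pred)
print(f"P={precision:.4f} R={recall:.4f} F1={f1:.4f} Acc={accuracy:.4f}")
```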

### Framework versions

- Transformers 4.25.1
- Pytorch 1.12.0
- Datasets 2.18.0
- Tokenizers 0.13.2