---
library_name: transformers
license: mit
base_model: vinai/phobert-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: phobert_classification
  results: []
---

# phobert_classification

This model is a fine-tuned version of vinai/phobert-base on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3677
- F1: 0.9408
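
The snippet below is a minimal inference sketch; the Hub repository id `hts98/phobert_classification` is an assumption inferred from this card's name, and the class-label names are not documented. Note that PhoBERT expects word-segmented Vietnamese input (e.g. segmented with VnCoreNLP), so raw text should be segmented before it is passed to the tokenizer.

```python
from transformers import pipeline

# Assumed Hub id -- replace with the actual repository id if it differs.
classifier = pipeline("text-classification", model="hts98/phobert_classification")

# PhoBERT was pre-trained on word-segmented Vietnamese, so multi-syllable
# words should be joined with underscores before classification.
text = "Chúng_tôi là những nghiên_cứu_viên ."
print(classifier(text))  # e.g. [{'label': 'LABEL_0', 'score': ...}]
```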

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
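
Since the training script is not included, the following is a reproduction sketch of a `Trainer` setup matching the hyperparameters above. The dataset is a placeholder (the actual data is undocumented), `num_labels` is assumed, and the F1 averaging mode is a guess since the card only reports "f1".

```python
import numpy as np
from datasets import Dataset
from sklearn.metrics import f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("vinai/phobert-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "vinai/phobert-base",
    num_labels=2,  # assumption: the real number of classes is undocumented
)

# Placeholder data -- the actual training/evaluation sets are undocumented.
raw = Dataset.from_dict({
    "text": ["ví_dụ tích_cực .", "ví_dụ tiêu_cực ."],
    "label": [1, 0],
})
dataset = raw.map(lambda b: tokenizer(b["text"], truncation=True), batched=True)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Averaging mode is an assumption; the card only reports "f1".
    return {"f1": f1_score(labels, preds, average="weighted")}

args = TrainingArguments(
    output_dir="phobert_classification",
    learning_rate=3e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=40,
    eval_strategy="epoch",  # "evaluation_strategy" on older Transformers
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,  # placeholder split
    eval_dataset=dataset,   # placeholder split
    processing_class=tokenizer,  # "tokenizer=" on versions before v4.46
    compute_metrics=compute_metrics,
)
trainer.train()
```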

### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| No log        | 1.0   | 388   | 0.2170          | 0.9252 |
| 0.3098        | 2.0   | 776   | 0.2119          | 0.9369 |
| 0.1763        | 3.0   | 1164  | 0.2023          | 0.9330 |
| 0.1286        | 4.0   | 1552  | 0.2606          | 0.9344 |
| 0.1286        | 5.0   | 1940  | 0.2378          | 0.9391 |
| 0.0943        | 6.0   | 2328  | 0.2651          | 0.9336 |
| 0.0706        | 7.0   | 2716  | 0.2986          | 0.9366 |
| 0.0626        | 8.0   | 3104  | 0.3168          | 0.9305 |
| 0.0626        | 9.0   | 3492  | 0.3020          | 0.9358 |
| 0.0515        | 10.0  | 3880  | 0.3062          | 0.9366 |
| 0.0397        | 11.0  | 4268  | 0.3487          | 0.9344 |
| 0.0337        | 12.0  | 4656  | 0.4043          | 0.9291 |
| 0.031         | 13.0  | 5044  | 0.3779          | 0.9366 |
| 0.031         | 14.0  | 5432  | 0.3934          | 0.9294 |
| 0.0266        | 15.0  | 5820  | 0.3677          | 0.9408 |
| 0.0246        | 16.0  | 6208  | 0.3874          | 0.9355 |
| 0.0222        | 17.0  | 6596  | 0.4257          | 0.9344 |
| 0.0222        | 18.0  | 6984  | 0.4372          | 0.9369 |
| 0.022         | 19.0  | 7372  | 0.4408          | 0.9363 |
| 0.0176        | 20.0  | 7760  | 0.4601          | 0.9358 |
| 0.0142        | 21.0  | 8148  | 0.4503          | 0.9361 |
| 0.0134        | 22.0  | 8536  | 0.4835          | 0.9366 |
| 0.0134        | 23.0  | 8924  | 0.4594          | 0.9391 |
| 0.0126        | 24.0  | 9312  | 0.4809          | 0.9366 |
| 0.0109        | 25.0  | 9700  | 0.4859          | 0.9369 |
| 0.012         | 26.0  | 10088 | 0.4824          | 0.9386 |
| 0.012         | 27.0  | 10476 | 0.5067          | 0.9361 |
| 0.0101        | 28.0  | 10864 | 0.4870          | 0.9375 |
| 0.0102        | 29.0  | 11252 | 0.5302          | 0.9355 |
| 0.0088        | 30.0  | 11640 | 0.4953          | 0.9366 |
| 0.0093        | 31.0  | 12028 | 0.4914          | 0.9361 |
| 0.0093        | 32.0  | 12416 | 0.5014          | 0.9389 |
| 0.0084        | 33.0  | 12804 | 0.5026          | 0.9383 |
| 0.0078        | 34.0  | 13192 | 0.5043          | 0.9380 |
| 0.0075        | 35.0  | 13580 | 0.5035          | 0.9377 |
| 0.0075        | 36.0  | 13968 | 0.5007          | 0.9377 |
| 0.0077        | 37.0  | 14356 | 0.5102          | 0.9377 |
| 0.0076        | 38.0  | 14744 | 0.5069          | 0.9391 |
| 0.0059        | 39.0  | 15132 | 0.5105          | 0.9386 |
| 0.0071        | 40.0  | 15520 | 0.5111          | 0.9383 |
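
The evaluation results reported at the top of this card (loss 0.3677, F1 0.9408) match the epoch-15 row, where validation F1 peaks even though validation loss keeps climbing in later epochs. This is consistent with keeping the best checkpoint by F1 rather than the final one; if that is the intent when reproducing, it can be requested as below (an assumption, not something the card confirms):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="phobert_classification",
    eval_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,  # restore the best checkpoint after training
    metric_for_best_model="f1",   # select by validation F1 (epoch 15 here)
    greater_is_better=True,
)
```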

### Framework versions

- Transformers 4.48.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 3.1.0
- Tokenizers 0.21.0