finetuned-WangchanBERTa-TSCC-property

This model is a fine-tuned version of airesearch/wangchanberta-base-att-spm-uncased on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2288
  • Accuracy: 0.9634

Model description

More information needed

Intended uses & limitations

More information needed
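The card does not document intended usage, but given the base model and the checkpoint name, this is presumably a Thai sequence classifier. A minimal inference sketch using the Transformers `pipeline` API (the repo id is taken from this model page; the label set depends on the undocumented training data, so the example input and its predicted label are illustrative only):

```python
MODEL_ID = "SirawitC/finetuned-WangchanBERTa-TSCC-property"

def classify(texts, model_id=MODEL_ID):
    """Run the fine-tuned sequence-classification head on a list of Thai texts."""
    from transformers import pipeline  # lazy import: heavy dependency

    clf = pipeline("text-classification", model=model_id)
    return clf(texts)

if __name__ == "__main__":
    # Example Thai sentence (property-related); the returned label names
    # come from the checkpoint's config and are not documented on this card.
    print(classify(["ห้องนอนกว้างขวาง มีระเบียงวิวแม่น้ำ"]))
```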

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
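The linear scheduler decays the learning rate from 5e-05 to 0 over the full run. A small sketch of the implied schedule, assuming no warmup (none is listed above) and 820 total optimizer steps (10 epochs × 82 steps per epoch, per the results table below):

```python
# Sketch of the linear LR schedule implied by the hyperparameters above.
# Assumptions: warmup_steps = 0 (no warmup is listed) and 820 total steps.
LEARNING_RATE = 5e-5
TOTAL_STEPS = 820

def linear_lr(step, base_lr=LEARNING_RATE, total_steps=TOTAL_STEPS, warmup_steps=0):
    """Learning rate at a given optimizer step under linear warmup + linear decay."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# linear_lr(0) == 5e-05, linear_lr(410) == 2.5e-05, linear_lr(820) == 0.0
```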

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.6835        | 0.1220  | 10   | 0.6029          | 0.6585   |
| 0.5255        | 0.2439  | 20   | 0.7177          | 0.6707   |
| 0.3303        | 0.3659  | 30   | 0.3454          | 0.8537   |
| 0.3471        | 0.4878  | 40   | 0.3404          | 0.8902   |
| 0.2569        | 0.6098  | 50   | 0.2818          | 0.9146   |
| 0.1546        | 0.7317  | 60   | 0.1566          | 0.9512   |
| 0.1234        | 0.8537  | 70   | 0.0562          | 0.9756   |
| 0.2935        | 0.9756  | 80   | 0.2113          | 0.9512   |
| 0.0292        | 1.0976  | 90   | 0.1369          | 0.9512   |
| 0.1396        | 1.2195  | 100  | 0.2530          | 0.9512   |
| 0.0867        | 1.3415  | 110  | 0.1326          | 0.9512   |
| 0.059         | 1.4634  | 120  | 0.0811          | 0.9512   |
| 0.0024        | 1.5854  | 130  | 0.3948          | 0.9512   |
| 0.2261        | 1.7073  | 140  | 0.2322          | 0.9268   |
| 0.0986        | 1.8293  | 150  | 0.3132          | 0.9390   |
| 0.0819        | 1.9512  | 160  | 0.2300          | 0.9390   |
| 0.0149        | 2.0732  | 170  | 0.2773          | 0.9268   |
| 0.0151        | 2.1951  | 180  | 0.2996          | 0.9268   |
| 0.001         | 2.3171  | 190  | 0.1910          | 0.9390   |
| 0.0005        | 2.4390  | 200  | 0.2285          | 0.9268   |
| 0.0379        | 2.5610  | 210  | 0.3384          | 0.9390   |
| 0.0013        | 2.6829  | 220  | 0.1087          | 0.9756   |
| 0.0002        | 2.8049  | 230  | 0.1113          | 0.9756   |
| 0.0901        | 2.9268  | 240  | 0.1219          | 0.9756   |
| 0.0537        | 3.0488  | 250  | 0.2109          | 0.9512   |
| 0.0004        | 3.1707  | 260  | 0.1496          | 0.9756   |
| 0.0012        | 3.2927  | 270  | 0.1627          | 0.9634   |
| 0.0107        | 3.4146  | 280  | 0.1552          | 0.9634   |
| 0.019         | 3.5366  | 290  | 0.1547          | 0.9634   |
| 0.0003        | 3.6585  | 300  | 0.1568          | 0.9634   |
| 0.0003        | 3.7805  | 310  | 0.1596          | 0.9634   |
| 0.0002        | 3.9024  | 320  | 0.2054          | 0.9634   |
| 0.0004        | 4.0244  | 330  | 0.3416          | 0.9268   |
| 0.0002        | 4.1463  | 340  | 0.4531          | 0.9390   |
| 0.0002        | 4.2683  | 350  | 0.4530          | 0.9390   |
| 0.0002        | 4.3902  | 360  | 0.4035          | 0.9268   |
| 0.0005        | 4.5122  | 370  | 0.3358          | 0.9268   |
| 0.0002        | 4.6341  | 380  | 0.2717          | 0.9390   |
| 0.0001        | 4.7561  | 390  | 0.2437          | 0.9512   |
| 0.0031        | 4.8780  | 400  | 0.2317          | 0.9634   |
| 0.0001        | 5.0     | 410  | 0.2259          | 0.9634   |
| 0.0001        | 5.1220  | 420  | 0.2159          | 0.9634   |
| 0.0001        | 5.2439  | 430  | 0.2098          | 0.9634   |
| 0.0001        | 5.3659  | 440  | 0.2432          | 0.9512   |
| 0.0001        | 5.4878  | 450  | 0.2555          | 0.9512   |
| 0.0001        | 5.6098  | 460  | 0.2576          | 0.9512   |
| 0.0001        | 5.7317  | 470  | 0.2557          | 0.9512   |
| 0.0364        | 5.8537  | 480  | 0.2550          | 0.9512   |
| 0.0001        | 5.9756  | 490  | 0.2543          | 0.9512   |
| 0.0001        | 6.0976  | 500  | 0.2516          | 0.9512   |
| 0.0001        | 6.2195  | 510  | 0.2487          | 0.9512   |
| 0.0001        | 6.3415  | 520  | 0.2484          | 0.9512   |
| 0.0001        | 6.4634  | 530  | 0.2082          | 0.9634   |
| 0.0001        | 6.5854  | 540  | 0.1980          | 0.9634   |
| 0.0007        | 6.7073  | 550  | 0.1934          | 0.9634   |
| 0.0001        | 6.8293  | 560  | 0.1916          | 0.9634   |
| 0.0001        | 6.9512  | 570  | 0.1900          | 0.9634   |
| 0.0066        | 7.0732  | 580  | 0.1863          | 0.9634   |
| 0.0001        | 7.1951  | 590  | 0.1829          | 0.9634   |
| 0.0001        | 7.3171  | 600  | 0.1856          | 0.9634   |
| 0.0013        | 7.4390  | 610  | 0.1972          | 0.9634   |
| 0.0001        | 7.5610  | 620  | 0.2031          | 0.9634   |
| 0.0           | 7.6829  | 630  | 0.2052          | 0.9634   |
| 0.0008        | 7.8049  | 640  | 0.2082          | 0.9634   |
| 0.001         | 7.9268  | 650  | 0.2091          | 0.9634   |
| 0.0001        | 8.0488  | 660  | 0.2103          | 0.9634   |
| 0.0001        | 8.1707  | 670  | 0.2098          | 0.9634   |
| 0.0           | 8.2927  | 680  | 0.2103          | 0.9634   |
| 0.0001        | 8.4146  | 690  | 0.2109          | 0.9634   |
| 0.0001        | 8.5366  | 700  | 0.2239          | 0.9634   |
| 0.0           | 8.6585  | 710  | 0.2281          | 0.9634   |
| 0.0           | 8.7805  | 720  | 0.2294          | 0.9634   |
| 0.0           | 8.9024  | 730  | 0.2296          | 0.9634   |
| 0.0003        | 9.0244  | 740  | 0.2300          | 0.9634   |
| 0.0001        | 9.1463  | 750  | 0.2301          | 0.9634   |
| 0.0001        | 9.2683  | 760  | 0.2294          | 0.9634   |
| 0.0001        | 9.3902  | 770  | 0.2290          | 0.9634   |
| 0.0           | 9.5122  | 780  | 0.2290          | 0.9634   |
| 0.0001        | 9.6341  | 790  | 0.2288          | 0.9634   |
| 0.0001        | 9.7561  | 800  | 0.2288          | 0.9634   |
| 0.0           | 9.8780  | 810  | 0.2288          | 0.9634   |
| 0.0001        | 10.0    | 820  | 0.2288          | 0.9634   |
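The log above reaches step 820 over 10 epochs, which constrains the training-set size: 82 optimizer steps per epoch at batch size 8 implies roughly 649–656 training examples (the upper bound assumes every batch is full; the lower bound allows a partial final batch). A quick back-of-envelope check:

```python
# Back-of-envelope estimate of the training-set size implied by the log.
total_steps = 820
num_epochs = 10
train_batch_size = 8

steps_per_epoch = total_steps // num_epochs                   # 82
max_examples = steps_per_epoch * train_batch_size             # 656, all batches full
min_examples = (steps_per_epoch - 1) * train_batch_size + 1   # 649, partial last batch
```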

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 105M params (F32, Safetensors)

Model tree for SirawitC/finetuned-WangchanBERTa-TSCC-property
