FineTuning_Method_2_SC

This model is a fine-tuned version of rafsankabir/Pretrained_E13_Method2 on an unknown dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the list):

  • Loss: 2.3223
  • Accuracy: 0.6790
  • F1 Macro: 0.6487
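
For quick inspection, here is a minimal inference sketch. It assumes this checkpoint is a sequence-classification model (suggested by the "_SC" suffix and the Accuracy/F1 metrics) published on the Hub; the repo id below is an assumption based on the card title, so substitute the actual Hub path or a local checkpoint directory.

```python
# Minimal inference sketch. The repo id is assumed from the card title;
# replace it with the real Hub path or a local checkpoint directory.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "rafsankabir/FineTuning_Method_2_SC"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Example input text", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted, predicted))
```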

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 40
  • mixed_precision_training: Native AMP
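
As a rough reconstruction, the list above maps onto a transformers TrainingArguments configuration like the sketch below. Only the listed values come from the card; the output path and the evaluation/logging cadence (inferred from the 500-step evaluation rows and the 1000-step training-loss updates in the results table) are assumptions.

```python
# Sketch of the TrainingArguments implied by the hyperparameter list
# (Transformers 4.29.x). Unlisted settings are defaults or illustrative.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="finetuning_method_2_sc",   # illustrative path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                        # "Adam with betas=(0.9,0.999)";
    adam_beta2=0.999,                      # the Trainer default is AdamW
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=40,
    fp16=True,                             # Native AMP mixed precision
    evaluation_strategy="steps",
    eval_steps=500,                        # evaluation rows every 500 steps
    logging_steps=1000,                    # training loss updates every 1000 steps
)
```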

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| No log | 0.32 | 500 | 1.0745 | 0.3976 | 0.1896 |
| 1.0543 | 0.64 | 1000 | 0.9059 | 0.5967 | 0.4614 |
| 1.0543 | 0.95 | 1500 | 0.8259 | 0.6414 | 0.5633 |
| 0.8389 | 1.27 | 2000 | 0.8177 | 0.6394 | 0.5715 |
| 0.8389 | 1.59 | 2500 | 0.8269 | 0.6356 | 0.5724 |
| 0.7713 | 1.91 | 3000 | 0.7916 | 0.6631 | 0.6238 |
| 0.7713 | 2.23 | 3500 | 0.7996 | 0.6745 | 0.6155 |
| 0.6734 | 2.54 | 4000 | 0.7921 | 0.6624 | 0.6307 |
| 0.6734 | 2.86 | 4500 | 0.7743 | 0.6726 | 0.6459 |
| 0.6309 | 3.18 | 5000 | 0.8343 | 0.6803 | 0.6382 |
| 0.6309 | 3.5 | 5500 | 0.8233 | 0.6784 | 0.6390 |
| 0.5582 | 3.82 | 6000 | 0.8678 | 0.6631 | 0.6273 |
| 0.5582 | 4.13 | 6500 | 0.8621 | 0.6758 | 0.6368 |
| 0.4988 | 4.45 | 7000 | 0.9389 | 0.6720 | 0.6386 |
| 0.4988 | 4.77 | 7500 | 0.9067 | 0.6918 | 0.6505 |
| 0.4885 | 5.09 | 8000 | 0.9116 | 0.6937 | 0.6583 |
| 0.4885 | 5.41 | 8500 | 1.0357 | 0.6822 | 0.6459 |
| 0.427 | 5.73 | 9000 | 0.9428 | 0.6847 | 0.6479 |
| 0.427 | 6.04 | 9500 | 1.0233 | 0.6752 | 0.6531 |
| 0.4034 | 6.36 | 10000 | 1.1578 | 0.6835 | 0.6515 |
| 0.4034 | 6.68 | 10500 | 1.1870 | 0.6790 | 0.6545 |
| 0.4053 | 7.0 | 11000 | 1.0370 | 0.7007 | 0.6651 |
| 0.4053 | 7.32 | 11500 | 1.2087 | 0.6822 | 0.6497 |
| 0.3545 | 7.63 | 12000 | 1.2255 | 0.6847 | 0.6605 |
| 0.3545 | 7.95 | 12500 | 1.2710 | 0.6905 | 0.6609 |
| 0.3437 | 8.27 | 13000 | 1.3646 | 0.6918 | 0.6618 |
| 0.3437 | 8.59 | 13500 | 1.3767 | 0.6879 | 0.6563 |
| 0.3407 | 8.91 | 14000 | 1.2705 | 0.6796 | 0.6506 |
| 0.3407 | 9.22 | 14500 | 1.4605 | 0.6803 | 0.6496 |
| 0.2876 | 9.54 | 15000 | 1.4202 | 0.6860 | 0.6555 |
| 0.2876 | 9.86 | 15500 | 1.4151 | 0.6847 | 0.6517 |
| 0.3035 | 10.18 | 16000 | 1.4536 | 0.6713 | 0.6514 |
| 0.3035 | 10.5 | 16500 | 1.4806 | 0.6828 | 0.6469 |
| 0.2733 | 10.81 | 17000 | 1.4596 | 0.6899 | 0.6552 |
| 0.2733 | 11.13 | 17500 | 1.6183 | 0.6886 | 0.6557 |
| 0.2562 | 11.45 | 18000 | 1.6054 | 0.6771 | 0.6591 |
| 0.2562 | 11.77 | 18500 | 1.5966 | 0.6701 | 0.6503 |
| 0.2582 | 12.09 | 19000 | 1.5659 | 0.6822 | 0.6531 |
| 0.2582 | 12.4 | 19500 | 1.6146 | 0.6867 | 0.6575 |
| 0.2368 | 12.72 | 20000 | 1.6207 | 0.6899 | 0.6629 |
| 0.2368 | 13.04 | 20500 | 1.5220 | 0.6918 | 0.6640 |
| 0.245 | 13.36 | 21000 | 1.6572 | 0.6720 | 0.6489 |
| 0.245 | 13.68 | 21500 | 1.6443 | 0.6860 | 0.6590 |
| 0.2226 | 13.99 | 22000 | 1.6238 | 0.6847 | 0.6589 |
| 0.2226 | 14.31 | 22500 | 1.7241 | 0.6777 | 0.6521 |
| 0.2117 | 14.63 | 23000 | 1.6134 | 0.6867 | 0.6580 |
| 0.2117 | 14.95 | 23500 | 1.6723 | 0.6911 | 0.6618 |
| 0.2056 | 15.27 | 24000 | 1.6257 | 0.6892 | 0.6529 |
| 0.2056 | 15.59 | 24500 | 1.7072 | 0.6796 | 0.6531 |
| 0.1859 | 15.9 | 25000 | 1.7174 | 0.6771 | 0.6554 |
| 0.1859 | 16.22 | 25500 | 1.6951 | 0.6879 | 0.6555 |
| 0.1725 | 16.54 | 26000 | 1.7240 | 0.6905 | 0.6632 |
| 0.1725 | 16.86 | 26500 | 1.7126 | 0.6879 | 0.6608 |
| 0.1817 | 17.18 | 27000 | 1.7949 | 0.6847 | 0.6520 |
| 0.1817 | 17.49 | 27500 | 1.7694 | 0.6911 | 0.6622 |
| 0.1617 | 17.81 | 28000 | 1.7891 | 0.6828 | 0.6527 |
| 0.1617 | 18.13 | 28500 | 1.7860 | 0.6790 | 0.6526 |
| 0.1628 | 18.45 | 29000 | 1.8127 | 0.6867 | 0.6605 |
| 0.1628 | 18.77 | 29500 | 1.7317 | 0.6892 | 0.6610 |
| 0.1736 | 19.08 | 30000 | 1.7273 | 0.6899 | 0.6569 |
| 0.1736 | 19.4 | 30500 | 1.7853 | 0.6854 | 0.6584 |
| 0.1441 | 19.72 | 31000 | 1.7866 | 0.6918 | 0.6624 |
| 0.1441 | 20.04 | 31500 | 1.7842 | 0.6873 | 0.6580 |
| 0.1392 | 20.36 | 32000 | 1.8669 | 0.6860 | 0.6597 |
| 0.1392 | 20.67 | 32500 | 1.8392 | 0.6899 | 0.6639 |
| 0.159 | 20.99 | 33000 | 1.8412 | 0.6784 | 0.6552 |
| 0.159 | 21.31 | 33500 | 1.8673 | 0.6854 | 0.6584 |
| 0.1275 | 21.63 | 34000 | 1.8622 | 0.6854 | 0.6571 |
| 0.1275 | 21.95 | 34500 | 1.8622 | 0.6796 | 0.6583 |
| 0.1216 | 22.26 | 35000 | 1.9509 | 0.6854 | 0.6604 |
| 0.1216 | 22.58 | 35500 | 1.9425 | 0.6809 | 0.6550 |
| 0.1351 | 22.9 | 36000 | 1.9496 | 0.6784 | 0.6559 |
| 0.1351 | 23.22 | 36500 | 1.9685 | 0.6847 | 0.6582 |
| 0.1221 | 23.54 | 37000 | 1.9112 | 0.6911 | 0.6642 |
| 0.1221 | 23.85 | 37500 | 1.9341 | 0.6726 | 0.6526 |
| 0.1155 | 24.17 | 38000 | 1.9573 | 0.6899 | 0.6614 |
| 0.1155 | 24.49 | 38500 | 1.9853 | 0.6873 | 0.6580 |
| 0.1139 | 24.81 | 39000 | 1.9915 | 0.6790 | 0.6533 |
| 0.1139 | 25.13 | 39500 | 1.9997 | 0.6796 | 0.6539 |
| 0.1166 | 25.45 | 40000 | 1.9994 | 0.6847 | 0.6592 |
| 0.1166 | 25.76 | 40500 | 1.9848 | 0.6745 | 0.6513 |
| 0.1128 | 26.08 | 41000 | 2.0095 | 0.6867 | 0.6578 |
| 0.1128 | 26.4 | 41500 | 2.0585 | 0.6822 | 0.6547 |
| 0.1048 | 26.72 | 42000 | 2.0293 | 0.6777 | 0.6510 |
| 0.1048 | 27.04 | 42500 | 2.0797 | 0.6758 | 0.6512 |
| 0.1 | 27.35 | 43000 | 2.1162 | 0.6822 | 0.6544 |
| 0.1 | 27.67 | 43500 | 2.0569 | 0.6835 | 0.6538 |
| 0.1106 | 27.99 | 44000 | 2.0991 | 0.6828 | 0.6565 |
| 0.1106 | 28.31 | 44500 | 2.0976 | 0.6841 | 0.6563 |
| 0.0886 | 28.63 | 45000 | 2.1305 | 0.6854 | 0.6532 |
| 0.0886 | 28.94 | 45500 | 2.1015 | 0.6867 | 0.6564 |
| 0.1027 | 29.26 | 46000 | 2.1105 | 0.6867 | 0.6559 |
| 0.1027 | 29.58 | 46500 | 2.1396 | 0.6765 | 0.6499 |
| 0.1057 | 29.9 | 47000 | 2.1237 | 0.6790 | 0.6501 |
| 0.1057 | 30.22 | 47500 | 2.1849 | 0.6790 | 0.6518 |
| 0.0876 | 30.53 | 48000 | 2.1346 | 0.6841 | 0.6533 |
| 0.0876 | 30.85 | 48500 | 2.1441 | 0.6828 | 0.6540 |
| 0.0856 | 31.17 | 49000 | 2.1528 | 0.6911 | 0.6600 |
| 0.0856 | 31.49 | 49500 | 2.1725 | 0.6847 | 0.6509 |
| 0.0869 | 31.81 | 50000 | 2.2085 | 0.6771 | 0.6503 |
| 0.0869 | 32.12 | 50500 | 2.2606 | 0.6688 | 0.6434 |
| 0.0848 | 32.44 | 51000 | 2.2510 | 0.6745 | 0.6451 |
| 0.0848 | 32.76 | 51500 | 2.2528 | 0.6739 | 0.6496 |
| 0.0816 | 33.08 | 52000 | 2.2532 | 0.6758 | 0.6503 |
| 0.0816 | 33.4 | 52500 | 2.2356 | 0.6803 | 0.6500 |
| 0.0793 | 33.72 | 53000 | 2.2579 | 0.6745 | 0.6483 |
| 0.0793 | 34.03 | 53500 | 2.2126 | 0.6816 | 0.6520 |
| 0.0767 | 34.35 | 54000 | 2.2504 | 0.6803 | 0.6497 |
| 0.0767 | 34.67 | 54500 | 2.2601 | 0.6803 | 0.6524 |
| 0.0844 | 34.99 | 55000 | 2.2785 | 0.6733 | 0.6470 |
| 0.0844 | 35.31 | 55500 | 2.2756 | 0.6784 | 0.6520 |
| 0.0755 | 35.62 | 56000 | 2.2813 | 0.6816 | 0.6542 |
| 0.0755 | 35.94 | 56500 | 2.2752 | 0.6803 | 0.6518 |
| 0.077 | 36.26 | 57000 | 2.2815 | 0.6796 | 0.6518 |
| 0.077 | 36.58 | 57500 | 2.2861 | 0.6803 | 0.6514 |
| 0.0752 | 36.9 | 58000 | 2.2929 | 0.6771 | 0.6505 |
| 0.0752 | 37.21 | 58500 | 2.2859 | 0.6816 | 0.6537 |
| 0.0698 | 37.53 | 59000 | 2.3117 | 0.6796 | 0.6525 |
| 0.0698 | 37.85 | 59500 | 2.3038 | 0.6816 | 0.6511 |
| 0.0613 | 38.17 | 60000 | 2.3176 | 0.6765 | 0.6477 |
| 0.0613 | 38.49 | 60500 | 2.3131 | 0.6796 | 0.6493 |
| 0.0706 | 38.8 | 61000 | 2.3161 | 0.6777 | 0.6477 |
| 0.0706 | 39.12 | 61500 | 2.3127 | 0.6784 | 0.6484 |
| 0.0678 | 39.44 | 62000 | 2.3174 | 0.6765 | 0.6467 |
| 0.0678 | 39.76 | 62500 | 2.3223 | 0.6790 | 0.6487 |
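
Note that validation loss rises steadily after roughly epoch 3 while Accuracy and F1 Macro plateau around 0.68 and 0.65, so the headline metrics at the top of this card correspond to the final step (62500) rather than the best checkpoint (e.g. accuracy 0.7007 at step 11000). The Accuracy and F1 Macro columns can be reproduced with a standard compute_metrics callback; the original metric code is not documented, so the scikit-learn sketch below is an assumption.

```python
# Sketch of a compute_metrics callback producing the Accuracy and F1 Macro
# columns. The original run's metric implementation is not documented;
# scikit-learn is used here as a stand-in.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, predictions),
        "f1_macro": f1_score(labels, predictions, average="macro"),
    }
```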

Framework versions

  • Transformers 4.29.2
  • PyTorch 2.0.1+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3