Fine_Tuning_SC_Method_2_Epoch_13B

This model is a fine-tuned version of rafsankabir/Pretrained_E13B_Method2 on an unknown dataset. It achieves the following results on the evaluation set (final checkpoint; a usage sketch follows the metrics):

  • Loss: 1.4244
  • Accuracy: 0.6873
  • F1 Macro: 0.6544
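
Since the card does not yet document usage, here is a minimal inference sketch. It assumes the checkpoint is published under the author's namespace as `rafsankabir/Fine_Tuning_SC_Method_2_Epoch_13B` (the repo id is not confirmed by the card) and carries a standard sequence-classification head, which the accuracy and F1-macro metrics suggest; the example input is a placeholder.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Assumed repo id; the card only names the model and its base checkpoint.
model_id = "rafsankabir/Fine_Tuning_SC_Method_2_Epoch_13B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder input; the training data and label set are undocumented.
inputs = tokenizer("Example input sentence.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
# id2label falls back to generic LABEL_i names when labels are unnamed.
print(model.config.id2label[pred])
```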

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 40
  • mixed_precision_training: Native AMP
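
For reproducibility, the settings above translate into a `TrainingArguments` object roughly as follows (Transformers 4.29 API, per the versions listed below). This is a sketch, not the author's script: the output path is a placeholder, and the 500-step evaluation/logging cadence is an assumption inferred from the results table.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder path, not from the card
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=40,
    fp16=True,                       # Native AMP mixed precision
    evaluation_strategy="steps",     # assumed: metrics appear every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```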

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | F1 Macro |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|
| No log        | 1.27  | 500   | 1.0673          | 0.3976   | 0.1896   |
| 1.0138        | 2.54  | 1000  | 0.8217          | 0.6331   | 0.5569   |
| 1.0138        | 3.82  | 1500  | 0.7889          | 0.6662   | 0.6049   |
| 0.7305        | 5.09  | 2000  | 0.7821          | 0.6765   | 0.6382   |
| 0.7305        | 6.36  | 2500  | 0.7867          | 0.6918   | 0.6457   |
| 0.5856        | 7.63  | 3000  | 0.8236          | 0.6892   | 0.6623   |
| 0.5856        | 8.91  | 3500  | 0.8490          | 0.6835   | 0.6551   |
| 0.4723        | 10.18 | 4000  | 0.9057          | 0.6854   | 0.6533   |
| 0.4723        | 11.45 | 4500  | 0.9237          | 0.6796   | 0.6455   |
| 0.3896        | 12.72 | 5000  | 0.9814          | 0.6879   | 0.6499   |
| 0.3896        | 13.99 | 5500  | 0.9984          | 0.6745   | 0.6487   |
| 0.3299        | 15.27 | 6000  | 1.0226          | 0.6822   | 0.6545   |
| 0.3299        | 16.54 | 6500  | 1.0579          | 0.6758   | 0.6485   |
| 0.2783        | 17.81 | 7000  | 1.0932          | 0.6796   | 0.6487   |
| 0.2783        | 19.08 | 7500  | 1.1047          | 0.6950   | 0.6609   |
| 0.2455        | 20.36 | 8000  | 1.1643          | 0.6860   | 0.6559   |
| 0.2455        | 21.63 | 8500  | 1.1953          | 0.6841   | 0.6548   |
| 0.2181        | 22.9  | 9000  | 1.2043          | 0.6835   | 0.6516   |
| 0.2181        | 24.17 | 9500  | 1.2603          | 0.6867   | 0.6502   |
| 0.1894        | 25.45 | 10000 | 1.2652          | 0.6860   | 0.6552   |
| 0.1894        | 26.72 | 10500 | 1.2860          | 0.6790   | 0.6474   |
| 0.1757        | 27.99 | 11000 | 1.2892          | 0.6854   | 0.6541   |
| 0.1757        | 29.26 | 11500 | 1.3400          | 0.6803   | 0.6496   |
| 0.1599        | 30.53 | 12000 | 1.3630          | 0.6828   | 0.6493   |
| 0.1599        | 31.81 | 12500 | 1.3688          | 0.6854   | 0.6538   |
| 0.1531        | 33.08 | 13000 | 1.3962          | 0.6854   | 0.6534   |
| 0.1531        | 34.35 | 13500 | 1.4021          | 0.6841   | 0.6523   |
| 0.1452        | 35.62 | 14000 | 1.4029          | 0.6847   | 0.6524   |
| 0.1452        | 36.9  | 14500 | 1.4130          | 0.6886   | 0.6562   |
| 0.1391        | 38.17 | 15000 | 1.4203          | 0.6879   | 0.6553   |
| 0.1391        | 39.44 | 15500 | 1.4244          | 0.6873   | 0.6544   |

Framework versions

  • Transformers 4.29.2
  • Pytorch 2.0.1+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3