---
library_name: transformers
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: defect-classification-llama-baseline-25-epochs
    results: []
---

# defect-classification-llama-baseline-25-epochs

This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B) on an unknown dataset. It achieves the following results on the evaluation set (a loading sketch follows the results):

- Loss: 0.1655
- Accuracy: 0.9421
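
The checkpoint can presumably be loaded through the standard `transformers` sequence-classification API. The minimal sketch below is an assumption, not documentation: the repository id, the sequence-classification head, and the input text are guesses, and the defect label set is not described in this card.

```python
# Minimal loading sketch (assumptions: repo id, sequence-classification head,
# and input format; the actual defect labels are not documented in this card).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "ppak10/defect-classification-llama-baseline-25-epochs"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Classify one example; replace the text with a real input from the target domain.
inputs = tokenizer("Example process description to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted, predicted))
```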

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 512
- eval_batch_size: 512
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 25
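
For reference, the settings above map onto a standard `TrainingArguments` configuration. The sketch below is a reconstruction under assumptions: the classification head, `num_labels`, the dataset, and whether 512 is a per-device or effective batch size are not documented in this card.

```python
# Sketch of a Trainer setup mirroring the listed hyperparameters.
# num_labels, the dataset objects, and the batch-size interpretation are assumptions.
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

model = AutoModelForSequenceClassification.from_pretrained(
    "meta-llama/Llama-3.2-1B",
    num_labels=2,  # placeholder: the real number of defect classes is not documented
)

training_args = TrainingArguments(
    output_dir="defect-classification-llama-baseline-25-epochs",
    learning_rate=2e-5,
    per_device_train_batch_size=512,  # card reports 512; per-device vs. effective is assumed
    per_device_eval_batch_size=512,
    seed=42,
    optim="adamw_torch",              # betas=(0.9, 0.999) and eps=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=25,
    eval_strategy="epoch",            # inferred from the per-epoch validation rows below
)

# train_dataset and eval_dataset are placeholders; the dataset is not documented.
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```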

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.7994        | 1.0   | 1062  | 0.7879          | 0.8243   |
| 0.5403        | 2.0   | 2124  | 0.5089          | 0.8545   |
| 0.4235        | 3.0   | 3186  | 0.3855          | 0.8819   |
| 0.3556        | 4.0   | 4248  | 0.4072          | 0.8656   |
| 0.3133        | 5.0   | 5310  | 0.3077          | 0.8999   |
| 0.2998        | 6.0   | 6372  | 0.3031          | 0.9025   |
| 0.2842        | 7.0   | 7434  | 0.2610          | 0.9100   |
| 0.2773        | 8.0   | 8496  | 0.2443          | 0.9157   |
| 0.2413        | 9.0   | 9558  | 0.2339          | 0.9204   |
| 0.2394        | 10.0  | 10620 | 0.2241          | 0.9223   |
| 0.2305        | 11.0  | 11682 | 0.2230          | 0.9195   |
| 0.2119        | 12.0  | 12744 | 0.2129          | 0.9273   |
| 0.2106        | 13.0  | 13806 | 0.2186          | 0.9228   |
| 0.1973        | 14.0  | 14868 | 0.1961          | 0.9319   |
| 0.1993        | 15.0  | 15930 | 0.1903          | 0.9337   |
| 0.1863        | 16.0  | 16992 | 0.1888          | 0.9322   |
| 0.1883        | 17.0  | 18054 | 0.1966          | 0.9288   |
| 0.1879        | 18.0  | 19116 | 0.1794          | 0.9380   |
| 0.1856        | 19.0  | 20178 | 0.1786          | 0.9366   |
| 0.1808        | 20.0  | 21240 | 0.1838          | 0.9344   |
| 0.1711        | 21.0  | 22302 | 0.1749          | 0.9383   |
| 0.1689        | 22.0  | 23364 | 0.1694          | 0.9405   |
| 0.17          | 23.0  | 24426 | 0.1687          | 0.9411   |
| 0.1648        | 24.0  | 25488 | 0.1684          | 0.9403   |
| 0.1665        | 25.0  | 26550 | 0.1655          | 0.9421   |

### Framework versions

- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0