---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: interview_classifier
    results: []
---

# interview_classifier

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.1049
- Accuracy: 0.9682
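
As a quick usage reference, here is a minimal inference sketch. It assumes the checkpoint is published under the Hub repo id `eskayML/interview_classfier` (taken from the repo page header; verify before use) and that the fine-tuned label names live in the model's config, which this card does not document:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
classifier = pipeline("text-classification", model="eskayML/interview_classfier")

# Example input; returns the top label and its score.
print(classifier("Tell me about a time you resolved a conflict at work."))
```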

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
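
A minimal `Trainer` sketch under these hyperparameters. This is an approximation, not the author's script: the dataset splits (`train_ds`, `eval_ds`) and the class count are placeholders, since neither is specified in this card; the Adam betas/epsilon listed above are the `TrainingArguments` defaults.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=10,  # placeholder: the actual class count is not stated in this card
)

args = TrainingArguments(
    output_dir="interview_classifier",
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",
    eval_strategy="epoch",  # evaluate once per epoch, as in the results table below
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,  # placeholder: tokenized train split (not provided here)
    eval_dataset=eval_ds,    # placeholder: tokenized eval split (not provided here)
    tokenizer=tokenizer,
)
trainer.train()
```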

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 79   | 2.2043          | 0.1720   |
| No log        | 2.0   | 158  | 1.8878          | 0.4331   |
| No log        | 3.0   | 237  | 1.4540          | 0.6624   |
| No log        | 4.0   | 316  | 1.1512          | 0.7134   |
| No log        | 5.0   | 395  | 0.8028          | 0.8153   |
| No log        | 6.0   | 474  | 0.5610          | 0.8854   |
| 1.5247        | 7.0   | 553  | 0.4031          | 0.9299   |
| 1.5247        | 8.0   | 632  | 0.3275          | 0.9172   |
| 1.5247        | 9.0   | 711  | 0.2420          | 0.9363   |
| 1.5247        | 10.0  | 790  | 0.1941          | 0.9490   |
| 1.5247        | 11.0  | 869  | 0.1656          | 0.9682   |
| 1.5247        | 12.0  | 948  | 0.1444          | 0.9682   |
| 0.3164        | 13.0  | 1027 | 0.1325          | 0.9682   |
| 0.3164        | 14.0  | 1106 | 0.1194          | 0.9682   |
| 0.3164        | 15.0  | 1185 | 0.1145          | 0.9682   |
| 0.3164        | 16.0  | 1264 | 0.1138          | 0.9682   |
| 0.3164        | 17.0  | 1343 | 0.1101          | 0.9682   |
| 0.3164        | 18.0  | 1422 | 0.1074          | 0.9682   |
| 0.1327        | 19.0  | 1501 | 0.1050          | 0.9682   |
| 0.1327        | 20.0  | 1580 | 0.1049          | 0.9682   |
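
The card lists accuracy as the tracked metric but does not show the actual callback; a typical implementation (an assumption, using the `evaluate` library, which is not among the framework versions below) would look like:

```python
import evaluate
import numpy as np

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # Logits from the classification head and the gold labels for the eval split.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```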

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1