# cdp-multi-classifier-weighted
This model is a fine-tuned version of [alex-miller/ODABert](https://huggingface.co/alex-miller/ODABert), which is itself derived from google-bert/bert-base-multilingual-uncased. It achieves the following results on the evaluation set:
- Loss: 0.8564
- Accuracy: 0.9716
- F1: 0.8484
- Precision: 0.7788
- Recall: 0.9316
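
## How to use

The card ships without usage instructions, so here is a minimal inference sketch. It assumes the checkpoint is published as `devinitorg/cdp-multi-classifier-weighted` and loads as a standard `transformers` sequence-classification model; the "multi-classifier" in the name suggests a multi-label (sigmoid) head, but that, the label names, and the input domain are assumptions rather than facts stated in this card.

```python
# Minimal inference sketch; the repo id and head type are assumptions (see above).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "devinitorg/cdp-multi-classifier-weighted"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Example project description to classify."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits[0]

# Sigmoid scores per label, assuming a multi-label head; use
# logits.softmax(-1) instead if the head turns out to be single-label.
scores = torch.sigmoid(logits)
for i, score in enumerate(scores.tolist()):
    print(model.config.id2label.get(i, f"LABEL_{i}"), round(score, 3))
```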
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 1e-06
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
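
For orientation, the list above maps onto `transformers.TrainingArguments` roughly as sketched below. This is a hedged reconstruction, not the author's training script: the dataset, the class-weighting scheme implied by the model name, and the exact evaluation setup are not documented in this card (per-epoch evaluation is inferred from the results table).

```python
# Hedged reconstruction of the hyperparameters above (Transformers 4.40.1).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cdp-multi-classifier-weighted",
    learning_rate=1e-6,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    adam_beta1=0.9,              # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,           # epsilon=1e-08
    evaluation_strategy="epoch", # assumption: eval once per epoch, per the table
)
```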
### Training results
| Training Loss | Epoch | Step   | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.0497        | 1.0   | 11302  | 1.5640          | 0.9621   | 0.8011 | 0.7244    | 0.8958 |
| 0.9103        | 2.0   | 22604  | 1.4417          | 0.9663   | 0.8203 | 0.7522    | 0.9021 |
| 0.7629        | 3.0   | 33906  | 0.9562          | 0.9661   | 0.8235 | 0.7406    | 0.9272 |
| 0.6321        | 4.0   | 45208  | 0.9106          | 0.9697   | 0.8376 | 0.7720    | 0.9153 |
| 0.5464        | 5.0   | 56510  | 0.9811          | 0.9705   | 0.8419 | 0.7760    | 0.9200 |
| 0.5043        | 6.0   | 67812  | 0.9484          | 0.9700   | 0.8409 | 0.7677    | 0.9296 |
| 0.4647        | 7.0   | 79114  | 0.8569          | 0.9713   | 0.8465 | 0.7781    | 0.9281 |
| 0.4215        | 8.0   | 90416  | 0.8620          | 0.9703   | 0.8430 | 0.7682    | 0.9338 |
| 0.3794        | 9.0   | 101718 | 0.8569          | 0.9704   | 0.8437 | 0.7682    | 0.9357 |
| 0.344         | 10.0  | 113020 | 0.8305          | 0.9708   | 0.8448 | 0.7720    | 0.9328 |
| 0.3247        | 11.0  | 124322 | 0.7900          | 0.9707   | 0.8446 | 0.7709    | 0.9338 |
| 0.3159        | 12.0  | 135624 | 0.7838          | 0.9711   | 0.8463 | 0.7734    | 0.9344 |
| 0.3166        | 13.0  | 146926 | 0.8381          | 0.9710   | 0.8462 | 0.7727    | 0.9351 |
| 0.279         | 14.0  | 158228 | 0.8694          | 0.9718   | 0.8487 | 0.7821    | 0.9277 |
| 0.281         | 15.0  | 169530 | 0.8564          | 0.9716   | 0.8484 | 0.7788    | 0.9316 |
### Framework versions
- Transformers 4.40.1
- PyTorch 2.0.1
- Datasets 2.19.0
- Tokenizers 0.19.1