# iati-disability-multi-classifier-weighted

This model is a fine-tuned version of [alex-miller/ODABert](https://huggingface.co/alex-miller/ODABert) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.7257
- Accuracy: 0.9124
- F1: 0.8335
- Precision: 0.7943
- Recall: 0.8767
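As a quick consistency check, the reported F1 should be the harmonic mean of the reported precision and recall. A minimal sketch, using only the values listed above:

```python
# F1 is the harmonic mean of precision and recall; the precision and
# recall values below are copied from the evaluation results above.

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

f1 = f1_score(0.7943, 0.8767)
print(round(f1, 4))  # 0.8335, matching the reported F1
```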
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
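With 617 steps per epoch and 10 epochs (per the results table), the linear scheduler decays the learning rate from 2e-06 to zero over 6,170 steps. A minimal pure-Python sketch of that schedule, assuming zero warmup steps (the Transformers default when none is specified; the warmup setting is not stated in this card):

```python
# Sketch of the linear LR schedule implied by the hyperparameters above:
# lr=2e-6, 617 steps/epoch, 10 epochs => 6170 total optimizer steps.
# Mirrors the shape of transformers' linear schedule with warmup;
# warmup_steps=0 is an assumption, not stated in the card.

def linear_lr(step: int, base_lr: float = 2e-6,
              total_steps: int = 6170, warmup_steps: int = 0) -> float:
    """Linearly warm up to base_lr, then decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining

print(linear_lr(0))     # 2e-06 at the first step (no warmup)
print(linear_lr(3085))  # 1e-06, half-decayed at the end of epoch 5
print(linear_lr(6170))  # 0.0 at the final step
```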
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 0.9484 | 1.0 | 617 | 0.7796 | 0.8627 | 0.7541 | 0.6829 | 0.8418 |
| 0.6499 | 2.0 | 1234 | 0.6850 | 0.8976 | 0.8017 | 0.7770 | 0.8281 |
| 0.5907 | 3.0 | 1851 | 0.6667 | 0.9069 | 0.8182 | 0.7995 | 0.8378 |
| 0.5445 | 4.0 | 2468 | 0.6762 | 0.9086 | 0.8311 | 0.7719 | 0.9002 |
| 0.5034 | 5.0 | 3085 | 0.6412 | 0.9079 | 0.8238 | 0.7900 | 0.8605 |
| 0.4721 | 6.0 | 3702 | 0.6969 | 0.9092 | 0.8285 | 0.7846 | 0.8775 |
| 0.4565 | 7.0 | 4319 | 0.7236 | 0.9130 | 0.8339 | 0.7978 | 0.8735 |
| 0.4569 | 8.0 | 4936 | 0.6893 | 0.9114 | 0.8307 | 0.7953 | 0.8694 |
| 0.4233 | 9.0 | 5553 | 0.7279 | 0.9110 | 0.8294 | 0.7963 | 0.8654 |
| 0.432  | 10.0 | 6170 | 0.7257 | 0.9124 | 0.8335 | 0.7943 | 0.8767 |
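Reading the table above, validation loss bottoms out at epoch 5 while F1 peaks at epoch 7; the metrics this card reports are from the final epoch-10 checkpoint. A quick check, with the per-epoch values copied from the table:

```python
# Per-epoch validation loss and F1, copied from the training results table.
val_loss = [0.7796, 0.6850, 0.6667, 0.6762, 0.6412,
            0.6969, 0.7236, 0.6893, 0.7279, 0.7257]
f1 = [0.7541, 0.8017, 0.8182, 0.8311, 0.8238,
      0.8285, 0.8339, 0.8307, 0.8294, 0.8335]

# Epochs are 1-indexed in the table, hence the +1.
best_loss_epoch = 1 + min(range(len(val_loss)), key=lambda i: val_loss[i])
best_f1_epoch = 1 + max(range(len(f1)), key=lambda i: f1[i])
print(best_loss_epoch, best_f1_epoch)  # 5 7
```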
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
## Model lineage

- Base model: google-bert/bert-base-multilingual-uncased
- Fine-tuned from: alex-miller/ODABert