# Herbal Multilabel Classification
This model is a fine-tuned version of medicalai/ClinicalBERT on a custom dataset. It achieves the following results on the evaluation set:
- Loss: 0.0108
- F1: 0.9834
- ROC AUC: 0.9930
- Accuracy: 0.9853
## Model description
This is a multilabel classification model covering 10 herbal plants (Jackfruit, Sambong, Lemon, Jasmine, Mango, Mint, Ampalaya, Malunggay, Guava, Lagundi) that are abundant in the Philippines. Given a symptom described by the user, the model predicts which of these herbals are applicable.
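A minimal inference sketch is shown below. It assumes the checkpoint is published under the repo id `khygopole/NLP_HerbalMultilabelClassification`, that the model's `id2label` mapping holds the ten herbal names, and that predictions are obtained by applying a sigmoid to the logits and keeping labels above a 0.5 threshold; adjust these details to match your deployment.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id for this card; adjust if the checkpoint is hosted elsewhere.
model_id = "khygopole/NLP_HerbalMultilabelClassification"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "I have an upset stomach and a mild cough."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel decision: sigmoid per label, keep everything above 0.5.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)  # e.g. a subset of the ten herbal labels
```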
## Intended uses & limitations
The model was created as part of a university course and will be integrated into a React Native mobile application for the project. It performs well when the user's input contains a symptom represented in the training dataset. Inputs that do not describe such a symptom are likely to produce unreliable and inaccurate predictions.
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
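These values map onto a Hugging Face `TrainingArguments` object roughly as sketched below. Only the values listed above come from the card; the output directory and per-epoch evaluation strategy are assumptions for illustration.

```python
from transformers import TrainingArguments

# Hypothetical output_dir; hyperparameter values taken from the list above.
training_args = TrainingArguments(
    output_dir="herbal-multilabel",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    adam_beta1=0.9,       # Adam betas/epsilon as reported above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed, based on the per-epoch results table
)
```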
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 | ROC AUC | Accuracy |
|---|---|---|---|---|---|---|
| No log | 1.0 | 136 | 0.0223 | 0.9834 | 0.9930 | 0.9853 |
| No log | 2.0 | 272 | 0.0163 | 0.9881 | 0.9959 | 0.9926 |
| No log | 3.0 | 408 | 0.0137 | 0.9834 | 0.9930 | 0.9853 |
| 0.0216 | 4.0 | 544 | 0.0120 | 0.9834 | 0.9930 | 0.9853 |
| 0.0216 | 5.0 | 680 | 0.0108 | 0.9834 | 0.9930 | 0.9853 |
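For reference, the F1, ROC AUC, and accuracy columns above could be produced by a `compute_metrics` callback along the lines of the sketch below. The 0.5 decision threshold and micro averaging are assumptions, since the card does not state how the metrics were aggregated.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    probs = 1 / (1 + np.exp(-logits))   # sigmoid over the raw logits
    preds = (probs >= 0.5).astype(int)  # per-label decision at an assumed 0.5 threshold
    return {
        "f1": f1_score(labels, preds, average="micro"),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        "accuracy": accuracy_score(labels, preds),  # exact-match (subset) accuracy
    }
```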
### Framework versions
- Transformers 4.37.0
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0