---
license: apache-2.0
base_model: indolem/indobertweet-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: damand2061/innermore-x-indobertweet-base-uncased
  results: []
datasets:
- damand2061/innermore-x
language:
- id
metrics:
- accuracy
- f1
- recall
- precision
pipeline_tag: token-classification
inference:
  parameters:
    aggregation_strategy: "first"
---

# damand2061/innermore-x-indobertweet-base-uncased

This model is a fine-tuned version of [indolem/indobertweet-base-uncased](https://huggingface.co/indolem/indobertweet-base-uncased) on the [Innermore-X dataset](https://huggingface.co/datasets/damand2061/innermore-x), an Indonesian NER dataset of movie reviews from X (Twitter). It achieves the following results on the evaluation set:

- Train Loss: 0.0062
- Validation Loss: 0.1791
- Validation Precision: 0.8073
- Validation Recall: 0.7652
- Validation F1: 0.7857
- Validation Accuracy: 0.9622
- Epoch: 14

## Model description

This model is a fine-tuned version of [indolem/indobertweet-base-uncased](https://huggingface.co/indolem/indobertweet-base-uncased) on the [Innermore-X dataset](https://huggingface.co/datasets/damand2061/innermore-x), an Indonesian NER dataset of movie reviews from X (Twitter).
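As a minimal usage sketch, the model can be loaded for inference with the `transformers` pipeline; the `aggregation_strategy="first"` setting mirrors the inference parameters in the card metadata. The example sentence is an illustrative Indonesian input, not taken from the dataset.

```python
from transformers import pipeline

# Token-classification pipeline for the fine-tuned NER model.
# aggregation_strategy="first" merges sub-word tokens into whole-word
# entity spans, matching the card's inference parameters.
ner = pipeline(
    "token-classification",
    model="damand2061/innermore-x-indobertweet-base-uncased",
    aggregation_strategy="first",
)

# Illustrative Indonesian movie-review text (hypothetical example input)
results = ner("film pengabdi setan karya joko anwar sangat seru!")
for entity in results:
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

Each result is a dict with the aggregated entity group, the matched word span, and a confidence score.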
## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 0.0002, 'decay_steps': 420, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Validation Precision | Validation Recall | Validation F1 | Validation Accuracy | Epoch |
|:----------:|:---------------:|:--------------------:|:-----------------:|:-------------:|:-------------------:|:-----:|
| 0.8107     | 0.5161          | 0.1270               | 0.1348            | 0.1308        | 0.8446              | 0     |
| 0.4721     | 0.3254          | 0.3232               | 0.2304            | 0.2690        | 0.9003              | 1     |
| 0.3198     | 0.2431          | 0.4776               | 0.5087            | 0.4926        | 0.9211              | 2     |
| 0.1784     | 0.1581          | 0.6741               | 0.6565            | 0.6652        | 0.9497              | 3     |
| 0.1177     | 0.1304          | 0.7890               | 0.7478            | 0.7679        | 0.9627              | 4     |
| 0.0666     | 0.1428          | 0.7545               | 0.7348            | 0.7445        | 0.9598              | 5     |
| 0.0499     | 0.1526          | 0.7456               | 0.7391            | 0.7424        | 0.9584              | 6     |
| 0.0339     | 0.1677          | 0.7945               | 0.7565            | 0.7751        | 0.9627              | 7     |
| 0.0261     | 0.1598          | 0.6996               | 0.7087            | 0.7041        | 0.9540              | 8     |
| 0.0178     | 0.1792          | 0.7668               | 0.7435            | 0.7550        | 0.9598              | 9     |
| 0.0127     | 0.1943          | 0.8186               | 0.7261            | 0.7696        | 0.9593              | 10    |
| 0.0102     | 0.1825          | 0.7890               | 0.7478            | 0.7679        | 0.9598              | 11    |
| 0.0083     | 0.1765          | 0.8102               | 0.7609            | 0.7848        | 0.9622              | 12    |
| 0.0062     | 0.1778          | 0.8018               | 0.7565            | 0.7785        | 0.9618              | 13    |
| 0.0062     | 0.1791          | 0.8073               | 0.7652            | 0.7857        | 0.9622              | 14    |

### Framework versions

- Transformers 4.38.2
- TensorFlow 2.15.0
- Tokenizers 0.15.2
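The learning-rate schedule in the training hyperparameters above is a linear `PolynomialDecay` (power 1.0, cycle=False) from 2e-4 down to 0 over 420 steps. A minimal pure-Python sketch of the formula Keras applies:

```python
def polynomial_decay(step, initial_lr=2e-4, end_lr=0.0, decay_steps=420, power=1.0):
    """Replicates keras.optimizers.schedules.PolynomialDecay with cycle=False."""
    step = min(step, decay_steps)  # with cycle=False, steps past decay_steps are clipped
    fraction = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * fraction ** power + end_lr

# Learning rate at the start, midpoint, and end of the schedule
print(polynomial_decay(0))    # 0.0002
print(polynomial_decay(210))  # 0.0001
print(polynomial_decay(420))  # 0.0
```

With power 1.0 this is a plain linear ramp, so the learning rate halves by the schedule's midpoint and reaches the end value exactly at step 420.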