---
base_model: zhihan1996/DNABERT-2-117M
tags:
- generated_from_trainer
metrics:
- precision
- recall
- accuracy
model-index:
- name: DNABERT-2-117M_ft_BioS73_1kbpHG19_DHSs_H3K27AC
  results: []
---

# DNABERT-2-117M_ft_BioS73_1kbpHG19_DHSs_H3K27AC

This model is a fine-tuned version of [zhihan1996/DNABERT-2-117M](https://huggingface.co/zhihan1996/DNABERT-2-117M) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4847
- F1 Score: 0.8003
- Precision: 0.8578
- Recall: 0.75
- Accuracy: 0.8002
- Auc: 0.8937
- Prc: 0.8857

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy | Auc    | Prc    |
|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:--------:|:------:|:------:|
| 0.6224        | 0.3726 | 500  | 0.7088          | 0.7453   | 0.5976    | 0.9902 | 0.6388   | 0.8483 | 0.8530 |
| 0.5527        | 0.7452 | 1000 | 0.4761          | 0.8021   | 0.7657    | 0.8422 | 0.7782   | 0.8540 | 0.8575 |
| 0.5232        | 1.1177 | 1500 | 0.4819          | 0.7961   | 0.8003    | 0.7919 | 0.7835   | 0.8598 | 0.8622 |
| 0.5198        | 1.4903 | 2000 | 0.4690          | 0.8087   | 0.7914    | 0.8268 | 0.7913   | 0.8641 | 0.8664 |
| 0.5012        | 1.8629 | 2500 | 0.4830          | 0.7676   | 0.8429    | 0.7046 | 0.7723   | 0.8693 | 0.8699 |
| 0.4683        | 2.2355 | 3000 | 0.4665          | 0.8259   | 0.7532    | 0.9141 | 0.7943   | 0.8768 | 0.8756 |
| 0.4689        | 2.6080 | 3500 | 0.4254          | 0.8285   | 0.7695    | 0.8973 | 0.8017   | 0.8845 | 0.8808 |
| 0.435         | 2.9806 | 4000 | 0.4477          | 0.8340   | 0.7540    | 0.9330 | 0.8017   | 0.8893 | 0.8834 |
| 0.4208        | 3.3532 | 4500 | 0.4703          | 0.8134   | 0.8368    | 0.7912 | 0.8062   | 0.8892 | 0.8829 |
| 0.4145        | 3.7258 | 5000 | 0.5007          | 0.7988   | 0.8421    | 0.7598 | 0.7958   | 0.8921 | 0.8850 |
| 0.4031        | 4.0984 | 5500 | 0.4230          | 0.8363   | 0.7945    | 0.8827 | 0.8155   | 0.8921 | 0.8844 |
| 0.3825        | 4.4709 | 6000 | 0.4299          | 0.8397   | 0.7611    | 0.9365 | 0.8092   | 0.8936 | 0.8853 |
| 0.3875        | 4.8435 | 6500 | 0.5066          | 0.7971   | 0.8664    | 0.7381 | 0.7995   | 0.8946 | 0.8871 |
| 0.3771        | 5.2161 | 7000 | 0.6022          | 0.7480   | 0.8870    | 0.6466 | 0.7674   | 0.8949 | 0.8869 |
| 0.363         | 5.5887 | 7500 | 0.4847          | 0.8003   | 0.8578    | 0.75   | 0.8002   | 0.8937 | 0.8857 |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.0
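
As a quick sanity check on the evaluation numbers reported above, the F1 score is the harmonic mean of precision and recall, so the reported value can be recomputed from the other two metrics (a minimal sketch using only the numbers from this card):

```python
# Recompute the reported F1 Score (0.8003) from the final-checkpoint
# precision (0.8578) and recall (0.75) listed in this model card.
precision = 0.8578
recall = 0.75

# F1 = harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)

print(round(f1, 4))  # → 0.8003, matching the reported F1 Score
```

The same check applied to any row of the training-results table confirms the logged metrics are internally consistent.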