---
license: cc-by-nc-4.0
language:
- en
tags:
- cybersecurity
---
This model is an enhanced iteration of the [SecureBERT](https://huggingface.co/ehsanaghaei/SecureBERT) model, trained on a corpus eight times larger than its predecessor's over the course of 400 hours on 8xA100 GPUs. This new model, SecureBERT_Plus, delivers an average improvement of 9% on the Masked Language Model (MLM) task, a substantial step toward stronger language understanding and representation learning in the cybersecurity domain.
SecureBERT is a domain-specific language model based on RoBERTa, trained on a large corpus of cybersecurity data and fine-tuned to understand and represent cybersecurity text.
Related cybersecurity language models:
- [SecureGPT](https://huggingface.co/ehsanaghaei/SecureGPT)
- [SecureDeBERTa](https://huggingface.co/ehsanaghaei/SecureDeBERTa)
- [SecureBERT](https://huggingface.co/ehsanaghaei/SecureBERT)
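Since the model is evaluated on the MLM task, a minimal fill-mask sketch with the `transformers` library illustrates typical usage. The repository id `ehsanaghaei/SecureBERT_Plus` and the example sentence are assumptions for illustration; RoBERTa-based models such as this one use `<mask>` as the mask token.

```python
from transformers import pipeline

# Assumed Hub repository id for this model card.
fill_mask = pipeline("fill-mask", model="ehsanaghaei/SecureBERT_Plus")

# RoBERTa-style models expect the literal "<mask>" token.
text = "The attacker gained initial access through a phishing <mask>."

# Print the top predictions with their scores.
for prediction in fill_mask(text, top_k=5):
    print(f"{prediction['token_str'].strip()}  (score: {prediction['score']:.3f})")
```

Each prediction is a dict containing the filled-in token (`token_str`), its probability (`score`), and the completed sequence.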