
# CYBERT

A BERT model dedicated to the cyber security domain. The model was trained on a corpus of high-quality cyber security and computer science text, and it is unlikely to perform well outside this domain.

## Model architecture

The model uses the original RoBERTa architecture, and the training corpus was tokenized with a byte-level tokenizer.
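The byte-level tokenization choice can be sketched in plain Python. This is only an illustration of the idea, not the repository's actual tokenizer: RoBERTa-style tokenizers are byte-level BPE, so the raw byte IDs shown here form just the base alphabet on top of which merges are learned.

```python
# Minimal sketch of why a byte-level tokenizer has no out-of-vocabulary
# tokens: every string decomposes into UTF-8 bytes, so the base alphabet
# is exactly 256 symbols, regardless of how unusual the input text is.

def to_byte_ids(text: str) -> list[int]:
    """Map a string to its UTF-8 byte values (the base vocabulary)."""
    return list(text.encode("utf-8"))

# Cyber-security text full of rare identifiers still maps cleanly to bytes.
sample = "CVE-2021-44228 exploits JNDI lookups in log4j 2.x"
ids = to_byte_ids(sample)

assert all(0 <= b < 256 for b in ids)        # base alphabet of 256 symbols
assert bytes(ids).decode("utf-8") == sample  # lossless round trip
```

In practice this means domain-specific strings such as CVE identifiers or file paths never hit an unknown-token fallback; at worst they are split into many byte-level pieces.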

## Hardware

The model was trained on an NVIDIA GPU (NVIDIA-SMI driver version 510.54).

