To produce BioELECTRA, we pretrain ELECTRA on a corpus of over 20 million abstracts from PubMed.

How to use the generator in transformers:

```python
from transformers import ElectraForMaskedLM, ElectraTokenizerFast
import torch

# Load the BioELECTRA generator (a masked language model) and its tokenizer.
generator = ElectraForMaskedLM.from_pretrained("molly-hayward/bioelectra-base-generator")
tokenizer = ElectraTokenizerFast.from_pretrained("molly-hayward/bioelectra-base-generator")
```
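
Once loaded, the generator can be queried like any masked language model. The snippet below continues from the loading code above and is only a minimal sketch of masked-token prediction; the example sentence is illustrative and not part of the original card.

```python
# Illustrative biomedical sentence with a masked token (hypothetical example).
text = "The patient was treated with [MASK] to control the infection."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = generator(**inputs).logits

# Locate the [MASK] position and decode the highest-scoring prediction.
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```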