Model Card for ModernBert-DNA-v1-37M-virus (ModernBERT for DNA)

The ModernBert-DNA-v1-37M-virus model is a pretrained DNA language model with 37M parameters. It is derived from the ModernBERT architecture, simplified for DNA: the number of layers and the hidden size were reduced. The model was pretrained on around 15,071 virus genomes longer than 1 kb, each split into 1 kb sequences.

The virus genome database was downloaded from https://www.ncbi.nlm.nih.gov/labs/virus/vssi/#/virus?SeqType_s=Genome&VirusLineage_ss=taxid:10239&SourceDB_s=RefSeq. Note that DNA sequences were used, not RNA sequences.
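
A minimal sketch of the 1 kb splitting step described above, assuming the downloaded genomes are stored in a FASTA file (the file name viruses.fasta, the use of Biopython, and non-overlapping windows are illustrative assumptions, not part of this card):

from Bio import SeqIO  # pip install biopython

chunk_size = 1000  # 1 kb windows, as used for pretraining
chunks = []
for record in SeqIO.parse("viruses.fasta", "fasta"):  # hypothetical input file
    seq = str(record.seq).upper()
    if len(seq) < chunk_size:
        continue  # genomes shorter than 1 kb are skipped
    for start in range(0, len(seq) - chunk_size + 1, chunk_size):
        chunks.append(seq[start:start + chunk_size])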

Load the model from Hugging Face:

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("RaphaelMourad/ModernBert-DNA-v1-37M-virus", trust_remote_code=True) 
model = AutoModel.from_pretrained("RaphaelMourad/ModernBert-DNA-v1-37M-virus", trust_remote_code=True)
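
As a quick sanity check after loading, the parameter count should be close to the 37M stated above (this snippet is illustrative and not part of the original card):

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # roughly 37M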

Calculate the embedding of a DNA sequence:

DNAseq = "TGATGATTGGCGCGGCTAGGATCGGCT"
inputs = tokenizer(DNAseq, return_tensors = 'pt')["input_ids"]
hidden_states = model(inputs)[0] # [1, sequence_length, 256]

# embedding with max pooling
embedding_max = torch.max(hidden_states[0], dim=0)[0]  # max over the sequence dimension
print(embedding_max.shape)  # expected: torch.Size([256])
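
Max pooling is one option; a mean-pooled embedding over the same hidden states is sketched below (the choice of pooling here is an assumption, not a recommendation from the card):

# embedding with mean pooling
embedding_mean = torch.mean(hidden_states[0], dim=0)
print(embedding_mean.shape)  # expected: torch.Size([256])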

Troubleshooting

Ensure you are using a stable release of Transformers, version 4.34.0 or newer.
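
One quick way to check the installed version (a simple sketch, not part of the original card):

import transformers
print(transformers.__version__)  # should be 4.34.0 or newer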

Notice

ModernBert-DNA-v1-37M-virus is a pretrained base model for DNA.
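
As a base model, it is intended to be fine-tuned on downstream tasks. Below is a minimal sketch of attaching a classification head to the pretrained encoder loaded above; the head, the pooling choice, and num_labels=2 are illustrative assumptions, not part of this card:

import torch.nn as nn

class DNAClassifier(nn.Module):
    # Hypothetical downstream head: max-pooled embedding followed by a linear layer.
    def __init__(self, base_model, num_labels=2):
        super().__init__()
        self.base = base_model
        self.head = nn.Linear(256, num_labels)  # 256 = hidden size of this checkpoint

    def forward(self, input_ids):
        hidden = self.base(input_ids)[0]      # [batch, seq_len, 256]
        pooled = torch.max(hidden, dim=1)[0]  # max pooling over the sequence
        return self.head(pooled)              # [batch, num_labels]

clf = DNAClassifier(model)
logits = clf(inputs)  # reuses the tokenized sequence from above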

Contact

Raphaël Mourad. [email protected]
