LoRA fine-tuned checkpoint of AIDO.Protein-16B for peptide-HLA/MHC binding affinity prediction.

How to Use

Download model

from huggingface_hub import snapshot_download
from pathlib import Path

# Download the full model snapshot into ~/genbio_models/<model_name>
model_name = "genbio-ai/AIDO.Protein-16B-peptide_HLA_MHC_affinity"
genbio_models_path = Path.home().joinpath('genbio_models', model_name)
genbio_models_path.mkdir(parents=True, exist_ok=True)
snapshot_download(repo_id=model_name, local_dir=genbio_models_path)
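
The snapshot contains the Lightning checkpoint used in the loading step below. A quick sanity check after downloading (assuming the checkpoint file is named model.ckpt, as in the loading code):

# Verify that the checkpoint file was downloaded
ckpt_path = genbio_models_path.joinpath('model.ckpt')
assert ckpt_path.exists(), f"Checkpoint not found at {ckpt_path}"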

Load model for inference

import torch
from modelgenerator.tasks import SequenceClassification

# Load the LoRA fine-tuned checkpoint and switch to evaluation mode
ckpt_path = genbio_models_path.joinpath('model.ckpt')
model = SequenceClassification.load_from_checkpoint(ckpt_path, strict_loading=False).eval()

# Collate the inputs; the strings below are placeholders, replace them with your own sequences
collated_batch = model.transform({"sequences": ["ACGT", "AGCT"]})
with torch.no_grad():
    logits = model(collated_batch)
print(logits)
print(torch.argmax(logits, dim=-1))
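
The model returns raw class logits. To interpret them as class probabilities, you can apply a softmax; this is a minimal sketch, and the mapping from class index to affinity label depends on how the fine-tuning task was configured:

# Convert logits to per-class probabilities (label semantics depend on the task configuration)
probs = torch.softmax(logits, dim=-1)
print(probs)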