---
license: mit
---
|
|
|
![image/png](https://cdn-uploads.huggingface.co/production/uploads/63df6223f351dc0745681f77/wnyIjhui5eXztEfPsEsWb.png)
|
|
|
**PepMLM: Target Sequence-Conditioned Generation of Peptide Binders via Masked Language Modeling**
|
|
|
In this work, we introduce **PepMLM**, a purely target sequence-conditioned *de novo* generator of linear peptide binders. By employing a novel masking strategy that uniquely positions cognate peptide sequences at the terminus of target protein sequences, PepMLM tasks the state-of-the-art ESM-2 protein language model (pLM) with fully reconstructing the binder region, achieving perplexities that match or improve upon those of previously validated peptide-protein sequence pairs. After successful *in silico* benchmarking with AlphaFold-Multimer, we experimentally verify PepMLM's efficacy by fusing model-derived peptides to E3 ubiquitin ligase domains, demonstrating endogenous degradation of target substrates in cellular models. In total, PepMLM enables the generative design of candidate binders to any target protein without requiring the target's structure, empowering downstream programmable proteome editing applications.
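To make the perplexity framing concrete: a candidate binder can be scored by fully masking the binder region at the C-terminus of the target and measuring how well the model reconstructs it. The sketch below is our own minimal illustration of that scoring idea; the sequences are hypothetical placeholders, and the authors' exact evaluation protocol may differ.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("TianlaiChen/PepMLM-650M")
model = AutoModelForMaskedLM.from_pretrained("TianlaiChen/PepMLM-650M")
model.eval()

target = "MTEYKLVVVGAGGVGKSALTIQLIQNHF"  # hypothetical target fragment
binder = "RRWPWWPWRR"                    # hypothetical candidate binder

# Concatenate target and binder; ESM-2's tokenizer wraps the sequence in
# <cls> ... <eos>, so the binder occupies the last len(binder) positions
# before <eos>.
ids = tokenizer.encode(target + binder, return_tensors="pt")
binder_pos = torch.arange(ids.shape[1] - 1 - len(binder), ids.shape[1] - 1)

# Fully mask the binder region, mirroring PepMLM's masking scheme
masked = ids.clone()
masked[0, binder_pos] = tokenizer.mask_token_id

with torch.no_grad():
    logits = model(masked).logits

# Pseudo-perplexity: exp of the mean negative log-likelihood that the model
# assigns to the true binder residues at the masked positions
log_probs = logits[0].log_softmax(dim=-1)
nll = -log_probs[binder_pos, ids[0, binder_pos]].mean()
print(f"pseudo-perplexity: {nll.exp().item():.2f}")
```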
|
|
|
- Demo: HuggingFace Space [Link](https://huggingface.co/spaces/TianlaiChen/PepMLM)
- Colab Notebook: [Link](https://colab.research.google.com/drive/1u0i-LBog_lvQ5YRKs7QLKh_RtI-tV8qM?usp=sharing)
- Preprint: [Link](https://arxiv.org/abs/2310.03842)
|
|
|
**Graphical Summary**:

![image/png](https://cdn-uploads.huggingface.co/production/uploads/63df6223f351dc0745681f77/Ov_GhpwQHCFDQd7qK5oyq.png)
|
|
|
```python
# Load the PepMLM-650M tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("TianlaiChen/PepMLM-650M")
model = AutoModelForMaskedLM.from_pretrained("TianlaiChen/PepMLM-650M")
```
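With the tokenizer and model loaded as above, a rough way to generate a binder is to append `<mask>` tokens to the target's C-terminus and let the model fill them in. The sketch below is a single greedy pass over a hypothetical target fragment, not the authors' full generation routine; see the Colab notebook linked above for the complete procedure.

```python
import torch

target_seq = "MTEYKLVVVGAGGVGKSALTIQLIQNHF"  # hypothetical target fragment
binder_length = 12

# Append a fully masked binder region to the target's C-terminus
input_ids = tokenizer.encode(
    target_seq + tokenizer.mask_token * binder_length, return_tensors="pt"
)
mask_pos = (input_ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]

with torch.no_grad():
    logits = model(input_ids).logits

# Greedy decode: take the highest-probability residue at each masked position
best_ids = logits[0, mask_pos].argmax(dim=-1)
binder = tokenizer.decode(best_ids).replace(" ", "")
print(binder)
```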