DistilDNA model
This is a distilled version of DNABERT, obtained with the DistilBERT distillation technique. It has a BERT-like architecture with 6 layers and 768 hidden units, and was pre-trained on 6-mer DNA sequences. For more details on the pre-training scheme and methods, please see the original thesis report.
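As a brief illustration of the input format, a 6-mer representation is simply the sequence of overlapping 6-nucleotide substrings of a DNA string. The helper name and example sequence below are illustrative only, not taken from the repository:
def to_kmers(sequence, k=6):
    # Split a DNA string into overlapping k-mers, space-separated as expected by DNABERT-style models
    return " ".join(sequence[i:i + k] for i in range(len(sequence) - k + 1))
print(to_kmers("ATGCATGCA"))  # "ATGCAT TGCATG GCATGC CATGCA"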
How to Use
The model can be fine-tuned on downstream genomic tasks, e.g. promoter identification.
import torch
from transformers import DistilBertForSequenceClassification
# Load the distilled DNABERT weights into a DistilBERT sequence-classification model
model = DistilBertForSequenceClassification.from_pretrained('Peltarion/dnabert-distilbert')
More details on how to fine-tune the model, the dataset, and additional source code are available at github.com/joanaapa/Distillation-DNABERT-Promoter.
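As a minimal sketch only (not the repository's fine-tuning code), the following shows how a 6-mer input could be passed through the classification model; it assumes the tokenizer is published in the same repository, and the example sequence and class labels are hypothetical:
import torch
from transformers import AutoTokenizer, DistilBertForSequenceClassification

# Assumption: the tokenizer ships alongside the model weights in the same repository
tokenizer = AutoTokenizer.from_pretrained('Peltarion/dnabert-distilbert')
model = DistilBertForSequenceClassification.from_pretrained('Peltarion/dnabert-distilbert')

# Toy DNA snippet, already split into overlapping 6-mers (see the helper above)
kmer_input = "TATAAA ATAAAG TAAAGG AAAGGC"
inputs = tokenizer(kmer_input, return_tensors="pt")

# Forward pass; the meaning of each class index (e.g. promoter vs. non-promoter) depends on the fine-tuning setup
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()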