Usage

from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

tokenizer = AutoTokenizer.from_pretrained("zeusfsx/title-instruction")
model = AutoModelForSequenceClassification.from_pretrained("zeusfsx/title-instruction")

# device="mps" targets Apple Silicon GPUs; use device=0 for a CUDA GPU or device=-1 (the default) for CPU
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer, device="mps")
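A minimal sketch of reading the pipeline's output, assuming the standard `text-classification` return shape (a list of `{"label", "score"}` dicts). The sample output below is illustrative; the actual label names depend on the model's `id2label` config.

```python
def top_label(results):
    """Return (label, score) for the highest-scoring prediction from a
    text-classification pipeline output like [{"label": ..., "score": ...}]."""
    best = max(results, key=lambda r: r["score"])
    return best["label"], best["score"]

# Illustrative output shape; real calls look like: classifier("some title text")
sample = [{"label": "LABEL_0", "score": 0.12}, {"label": "LABEL_1", "score": 0.88}]
label, score = top_label(sample)  # ("LABEL_1", 0.88)
```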
Downloads last month: 115
Model size: 110M params (Safetensors)
Tensor types: I64, F32
Inference Providers
This model is not currently available via any of the supported third-party Inference Providers, and it is not deployed on the HF Inference API.