# T5 Small for Conversation Summarization

## Usage

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the fine-tuned checkpoint and its tokenizer from the Hugging Face Hub.
model_checkpoint = "ahlad/t5-small-finetuned-samsum"
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint)

# A SAMSum-style dialogue: one speaker turn per line.
input_text = """
Emma: Did you finish the book I lent you?
Liam: Yes, I couldn’t put it down! The twist at the end was insane.
Emma: I know, right? I didn’t see it coming at all. What did you think of the main character?
Liam: Honestly, I thought they were a bit frustrating at first, but they grew on me.
Emma: Same here. I loved how they developed by the end. Are you up for another book from the series?
Liam: Absolutely! Pass it my way.
"""

# Tokenize the dialogue and generate a summary with default settings.
inputs = tokenizer(input_text, return_tensors="pt")

outputs = model.generate(**inputs)
summary = tokenizer.decode(outputs[0], skip_special_tokens=True)

print("Summary:", summary)
```
## Model details

- Format: Safetensors
- Parameters: 60.5M
- Tensor type: F32
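To confirm the reported parameter count locally, a quick check (reusing the `model` loaded in the Usage snippet):

```python
# Sanity check: count the model's parameters (should print roughly 60.5M).
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")
```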

## Model tree

- Base model: google-t5/t5-small (this model is a fine-tune of it)

## Training data

- Dataset: samsum
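A minimal sketch for loading SAMSum with the `datasets` library, if you want to evaluate the model on real dialogues (the dataset archive additionally requires `py7zr` to be installed):

```python
from datasets import load_dataset

# SAMSum provides "dialogue"/"summary" pairs for training and evaluation.
samsum = load_dataset("samsum")
example = samsum["train"][0]
print(example["dialogue"])
print(example["summary"])
```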