Usage

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_path = "KameronB/sitcc-t5-base-v3.0"

# Load the model weights from the safetensors checkpoint
model = T5ForConditionalGeneration.from_pretrained(model_path, use_safetensors=True)

# Load the matching tokenizer
tokenizer = T5Tokenizer.from_pretrained(model_path)


def summarize_ticket(ticket_text):
    # Tokenize the input, truncating to T5's 512-token context window
    input_ids = tokenizer.encode(
        "Summarize: " + ticket_text,
        return_tensors="pt",
        max_length=512,
        truncation=True,
    )

    # Generate the summary (between 10 and 100 tokens)
    summary_ids = model.generate(input_ids, min_length=10, max_length=100)

    # Decode the generated token IDs back into text
    summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
    return summary
```
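Tickets longer than T5's 512-token context are silently truncated by the tokenizer, so the tail of a long ticket never reaches the model. One workaround is to split long tickets into pieces, summarize each piece with `summarize_ticket`, and join the results. A minimal, model-free sketch of such a splitter (the name `chunk_ticket` and the 2000-character default are illustrative assumptions, not part of this model card's API):

```python
def chunk_ticket(text, max_chars=2000):
    """Split ticket text into chunks of at most max_chars characters,
    breaking only on whitespace so words are never cut in half."""
    words = text.split()
    chunks, current, length = [], [], 0
    for word in words:
        # Flush the current chunk if adding this word would exceed the budget
        if length + len(word) + 1 > max_chars and current:
            chunks.append(" ".join(current))
            current, length = [], 0
        current.append(word)
        length += len(word) + 1  # +1 accounts for the joining space
    if current:
        chunks.append(" ".join(current))
    return chunks
```

Each chunk can then be passed to `summarize_ticket` individually; a character budget of roughly 2000 keeps most chunks under the 512-token limit, though the exact token count depends on the tokenizer.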
Model size: 248M parameters (F32, safetensors)
