---
license: mit
datasets:
- databricks/databricks-dolly-15k
language:
- en
metrics:
- rouge
widget:
- text: 'Please answer this question: What is Artificial Intelligence?'
library_name: transformers
---
# Uploaded model

- **Developed by:** Vicky
- **License:** mit
- **Finetuned from model:** Bart summarization
## Inference
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Mr-Vicky-01/Facebook-Bart-Qna")
model = AutoModelForSeq2SeqLM.from_pretrained("Mr-Vicky-01/Facebook-Bart-Qna")

def generate_answer(text):
    # Tokenize the prompt, truncating to the model's maximum input length
    inputs = tokenizer([text], return_tensors='pt', truncation=True)
    # Generate the answer token ids
    summary_ids = model.generate(inputs['input_ids'], max_length=512)
    # Decode the generated ids back into a plain string
    summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
    return summary

question = "Please answer this question: What is Artificial Intelligence?"
answer = generate_answer(question)
print(answer)
```
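As the widget and the example above suggest, the model expects questions prefixed with `Please answer this question: `. A small helper for building such prompts might look like the following; `build_prompt` is a hypothetical convenience function, not part of this repository:

```python
PROMPT_PREFIX = "Please answer this question: "

def build_prompt(question: str) -> str:
    # Hypothetical helper: prepend the prompt prefix the model was
    # fine-tuned with, avoiding a double prefix if one is already present.
    if question.startswith(PROMPT_PREFIX):
        return question
    return PROMPT_PREFIX + question

print(build_prompt("What is Artificial Intelligence?"))
# → Please answer this question: What is Artificial Intelligence?
```

The formatted string can then be passed directly to `generate_answer` above.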