---
license: mit
datasets:
- databricks/databricks-dolly-15k
language:
- en
metrics:
- rouge
library_name: transformers
---

# Uploaded model

- **Developed by:** [Vicky](https://huggingface.co/Mr-Vicky-01)
- **License:** MIT
- **Fine-tuned from model:** [Bart summarization](https://huggingface.co/Mr-Vicky-01/Bart-Finetuned-conversational-summarization)

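The card lists ROUGE as the evaluation metric. As a rough illustration only (not the official `rouge_score` implementation), ROUGE-1 scores a generated answer by its unigram overlap with a reference answer:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Toy ROUGE-1 F1: clipped unigram overlap between candidate and reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # matches, clipped per token
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("AI is the simulation of human intelligence",
                "AI is the simulation of human intelligence by machines"))
# → 0.875
```

Production evaluations should use a maintained implementation such as the `evaluate` library's `rouge` metric rather than this sketch.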
# Inference

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Mr-Vicky-01/Facebook-Bart-Qna")
model = AutoModelForSeq2SeqLM.from_pretrained("Mr-Vicky-01/Facebook-Bart-Qna")

def generate_answer(text):
    # Tokenize the prompt, truncating to the model's maximum input length
    inputs = tokenizer([text], return_tensors='pt', truncation=True)
    # Generate up to 512 output tokens
    answer_ids = model.generate(inputs['input_ids'], max_length=512)
    answer = tokenizer.decode(answer_ids[0], skip_special_tokens=True)
    return answer

question = """Please answer this question: What is Artificial Intelligence?"""
answer = generate_answer(question)
print(answer)
```
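The same model can also be driven through the higher-level `pipeline` helper. This is a sketch assuming the checkpoint above is reachable from your environment:

```python
from transformers import pipeline

# "text2text-generation" is the pipeline task for seq2seq (BART-style) models
qa = pipeline("text2text-generation", model="Mr-Vicky-01/Facebook-Bart-Qna")

result = qa("Please answer this question: What is Artificial Intelligence?",
            max_length=512)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts, one per input, each with a `generated_text` key.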