---
license: mit
datasets:
- databricks/databricks-dolly-15k
language:
- en
metrics:
- rouge
widget:
- text: 'Please answer this question: What is Artificial Intelligence?'
library_name: transformers
---
# Uploaded model
- **Developed by:** [Vicky](https://huggingface.co/Mr-Vicky-01)
- **License:** mit
- **Finetuned from model:** [Bart summarization](https://huggingface.co/Mr-Vicky-01/Bart-Finetuned-conversational-summarization)
# Inference
```bash
pip install transformers
```
```python
# Load the model and tokenizer directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Mr-Vicky-01/Facebook-Bart-Qna")
model = AutoModelForSeq2SeqLM.from_pretrained("Mr-Vicky-01/Facebook-Bart-Qna")

def generate_answer(text):
    # Tokenize the question and generate an answer with the fine-tuned model
    inputs = tokenizer([text], return_tensors='pt', truncation=True)
    answer_ids = model.generate(inputs['input_ids'], max_length=512)
    answer = tokenizer.decode(answer_ids[0], skip_special_tokens=True)
    return answer

question = """Please answer this question: What is Artificial Intelligence?"""
answer = generate_answer(question)
print(answer)
```