
Model Card for HBART

This model is a fine-tuned version of bart-large-cnn on the Newsroom dataset for generating news headlines. To ask the model to generate a headline, prepend "Headline: " to the beginning of the article.

Intended uses & limitations

You can use this model for the headline-generation task on English news articles.
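
For a quick check of this use case, the model can also be wrapped in the transformers summarization pipeline. The snippet below is a minimal sketch rather than the authors' reference usage: it assumes the pipeline's default generation settings are acceptable, and it reuses the "Headline: " prefix and the omidvaramin/HBART checkpoint shown in the Usage section below; the article text is a hypothetical placeholder.

from transformers import pipeline

# Minimal sketch: the summarization pipeline handles tokenization,
# generation, and decoding in a single call.
headline_generator = pipeline("summarization", model="omidvaramin/HBART")

# Placeholder text; prepend the task identifier as described above
article = "Headline: " + "Two of the OPEC oil cartel's 11 members said today that ..."

result = headline_generator(article,
                            truncation=True,
                            num_beams=4,
                            max_length=20,
                            min_length=1)
print(result[0]["summary_text"])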

Usage


article = """Two of the OPEC oil cartel’s 11 members, Nigeria and Venezuela, said today \
that they would voluntarily cut production in response to declining crude oil prices, which \
have fallen 20 percent from their peak two months ago.
The move, which would take less than 200,000 barrels of oil a day off the market, follows days \
of mixed signals from some OPEC officials, who have voiced increasing concern about the rapid \
drop in prices. Nigeria’s oil minister, Edmund Daukoru, who is president of OPEC this year, \
recently said the price of oil was very low.
Nigeria and Venezuela, which have generally been price hawks within the group, said their decision \
to cut production grew out of an informal deal reached at OPEC’s last meeting, earlier this month, \
to pare output if prices fell steeply. Some OPEC representatives have grown anxious at the slide in \
the oil futures markets, where prices for benchmark contracts have fallen from a midsummer high of \
$77.03 a barrel.
But traders shrugged off the announcement of the production cuts today. On the New York Mercantile \
Exchange, the most widely watched contract price light, low-sulfur crude for delivery next month \
traded this afternoon at $62.30 a barrel, down 0.7 percent.
Mr. Daukoru has been in contact with other OPEC ministers to discuss prices, which on Monday briefly \
slipped below $60 a barrel for the first time in six months. But the Organization of the Petroleum \
Exporting Countries, as the cartel is formally known, denied any shift in policy.
We are not currently concerned, a delegate from one of OPECs Gulf members said. The prices are \
currently manageable and fair. We are not overly alarmed by the prices. It is not a cause for alarm. \
It's the market working.
It is not unusual for oil prices to fall after Labor Day and the conclusion of the summer travel season. \
Demand tends to slow in the third quarter, and refiners reduce their output for seasonal maintenance; \
consumption picks up again with the first winter cold in the Western Hemisphere, and prices sometimes do as well.
We are not going to push extra oil in the market or force it down our customers throats, we just respond to demand, \
the delegate from the Gulf said.
Still, contradictory statements from senior OPEC representatives have sown doubt about the oil cartel's strategy. \
Whether OPEC countries actually reduce their output or not, the mixed messages have at least succeeded in one way: \
oil traders have been persuaded that OPEC is willing to step in to defend prices, and have traded on that belief, \
slowing the recent price decline.
While apparently fanciful, reports of an imminent output cut reflect two hard facts: stocks are building faster than \
expected, and several producers have an incredibly low pain threshold when it comes to price drops, Antoine Halff, an \
energy analyst with Fimat, wrote in a note to clients today. “However, more price declines will likely be needed before \
OPEC producers decide on any coordinated move.
Venezuela, which pumps about 2.5 million barrels a day, said it would cut its daily output by 50,000 barrels, or about 2 \
percent, starting Oct. 1. Nigeria said it would trim its exports by 5 percent on the same date, a reduction of about \
120,000 barrels a day from its current output of about 3.8 million barrels a day.
They are trying to influence the psychology of the market, said Larry Goldstein, a veteran oil analyst and the president \
of the Petroleum Industry Research Foundation in New York. Although they are reacting to the reduction in demand, they \
are trying to convince the market that they are actually anticipating it, by making cuts ahead of the market. But they \
are simply reacting to it, which is how markets should operate."""

import os

# If you have more than one GPU, you can specify here which one to use
os.environ["CUDA_VISIBLE_DEVICES"] = "5"

import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(device)

# Prepend the task identifier to the input article
article = "Headline: " + article

model = AutoModelForSeq2SeqLM.from_pretrained("omidvaramin/HBART").to(device)
tokenizer = AutoTokenizer.from_pretrained("omidvaramin/HBART")
# Encode the article with the tokenizer
encoding = tokenizer(article,
                     max_length=1024,
                     truncation=True,
                     padding="longest",
                     return_tensors="pt")

input_ids = encoding['input_ids']
attention_masks = encoding['attention_mask']

# Move the inputs to the selected device
input_ids = input_ids.to(device)
attention_masks = attention_masks.to(device)


# Generate a headline with beam search
beam_outputs = model.generate(input_ids=input_ids,
                              attention_mask=attention_masks,
                              do_sample=False,
                              num_beams=4,
                              max_length=20,
                              min_length=1,
                              num_return_sequences=1)

result = tokenizer.batch_decode(beam_outputs,
                                skip_special_tokens=True)
print(result[0])
>>> 2 OPEC Nations Agree to Cut Oil Output
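
The same tokenizer and generate calls also work on several articles at once. The sketch below is only an illustration: the articles list is a hypothetical placeholder, and it reuses the model, tokenizer, and device objects created above.

# Hypothetical batch of articles, each prefixed with the task identifier
articles = ["Headline: " + text for text in [
    "First news article ...",
    "Second news article ...",
]]

# Pad to the longest article in the batch so the inputs can be stacked
batch = tokenizer(articles,
                  max_length=1024,
                  truncation=True,
                  padding="longest",
                  return_tensors="pt").to(device)

batch_outputs = model.generate(input_ids=batch["input_ids"],
                               attention_mask=batch["attention_mask"],
                               num_beams=4,
                               max_length=20,
                               min_length=1)

for headline in tokenizer.batch_decode(batch_outputs, skip_special_tokens=True):
    print(headline)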

BibTeX entry and citation info

@ARTICLE{10154027,
  author={Omidvar, Amin and An, Aijun},
  journal={IEEE Access}, 
  title={Learning to Generate Popular Headlines}, 
  year={2023},
  volume={11},
  number={},
  pages={60904-60914},
  doi={10.1109/ACCESS.2023.3286853}}