Papers
arxiv:1908.10063

FinBERT: Financial Sentiment Analysis with Pre-trained Language Models

Published on Aug 27, 2019
Authors: Dogu Araci

Abstract

Financial sentiment analysis is a challenging task due to the specialized language and lack of labeled data in that domain. General-purpose models are not effective enough because of the specialized language used in a financial context. We hypothesize that pre-trained language models can help with this problem because they require fewer labeled examples and they can be further trained on domain-specific corpora. We introduce FinBERT, a language model based on BERT, to tackle NLP tasks in the financial domain. Our results show improvement in every measured metric on current state-of-the-art results for two financial sentiment analysis datasets. We find that even with a smaller training set and fine-tuning only a part of the model, FinBERT outperforms state-of-the-art machine learning methods.
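The abstract describes fine-tuning BERT for three-way financial sentiment classification. The final step of such a model is a dense layer over BERT's [CLS] representation followed by a softmax over the sentiment classes. A minimal pure-Python sketch of that step, using toy stand-in weights and a 4-dimensional vector in place of the real 768-dimensional [CLS] output (the label set and all numbers here are illustrative assumptions, not the paper's parameters):

```python
import math

# Hypothetical label set for three-way financial sentiment.
LABELS = ["positive", "negative", "neutral"]

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(cls_vector, weights, bias):
    # logits[i] = weights[i] . cls_vector + bias[i]
    logits = [sum(w * x for w, x in zip(row, cls_vector)) + b
              for row, b in zip(weights, bias)]
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs

# Toy 4-dim stand-in for the [CLS] representation of a sentence.
cls_vec = [0.2, -0.1, 0.4, 0.05]
W = [[1.0, 0.0, 0.5, 0.0],    # row for "positive"
     [-1.0, 0.2, -0.5, 0.1],  # row for "negative"
     [0.0, 0.0, 0.0, 0.0]]    # row for "neutral"
b = [0.0, 0.0, 0.0]

label, probs = classify(cls_vec, W, b)
print(label, [round(p, 3) for p in probs])
```

In the actual model, the weights of this head are learned jointly while part or all of the underlying BERT encoder is fine-tuned on the financial corpus; the sketch only illustrates the inference-time classification step.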


Models citing this paper 10

Datasets citing this paper 2

Spaces citing this paper 134

Collections including this paper 0