FinBERT Fine-Tuned on Financial Sentiment (Financial PhraseBank + GitHub Dataset)

πŸ“Œ Model Description

This model is a fine-tuned version of FinBERT (ProsusAI/finbert) for financial sentiment classification.
It classifies financial text into one of three categories:

  • Negative (0)
  • Neutral (1)
  • Positive (2)
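
For programmatic use, the class indices above can be mapped to label names. A minimal sketch, assuming the index-to-label ordering listed above (the `id2label` dict here is written out by hand, not read from the model config):

```python
# Assumed mapping between class ids and sentiment labels, following the list above
id2label = {0: "Negative", 1: "Neutral", 2: "Positive"}
label2id = {label: idx for idx, label in id2label.items()}

print(id2label[2])   # "Positive"
print(label2id["Negative"])  # 0
```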

πŸ“‚ Dataset Used

This model was trained on:
βœ… Financial PhraseBank - A widely used financial sentiment dataset.
βœ… GitHub Generated Sentiment Dataset - An additional dataset used to evaluate the model.

βš™οΈ Training Parameters

| Parameter | Value |
|---|---|
| Model Architecture | FinBERT (based on BERT) |
| Batch Size | 8 |
| Learning Rate | 2e-5 |
| Epochs | 3 |
| Optimizer | AdamW |
| Evaluation Metrics | F1-Score, Accuracy |
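
A minimal sketch of how a fine-tuning run with these hyperparameters could look using the Hugging Face Trainer API. The exact training script is not published here, so the dataset config (`sentences_allagree`) and column names are assumptions; AdamW is the Trainer's default optimizer.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "ProsusAI/finbert"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=3)

# Illustrative dataset: Financial PhraseBank ("sentence" / "label" columns, labels 0-2)
dataset = load_dataset("financial_phrasebank", "sentences_allagree")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

# Hyperparameters from the table above
args = TrainingArguments(
    output_dir="finbert-finetuned",
    per_device_train_batch_size=8,
    learning_rate=2e-5,
    num_train_epochs=3,
)

trainer = Trainer(model=model, args=args, train_dataset=dataset["train"])
trainer.train()
```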

πŸ“Š Model Performance

| Dataset | Accuracy | F1 (Weighted) | Precision | Recall |
|---|---|---|---|---|
| Financial PhraseBank (Train) | 95.21% | 95.23% | 95.32% | 95.21% |
| GitHub Test Set | 64.42% | 64.34% | 70.52% | 64.42% |
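
The metrics above (accuracy, weighted F1, precision, recall) can be computed for any labelled evaluation set with scikit-learn. A small sketch, where `y_true` and `y_pred` are placeholders for gold labels and model predictions:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# y_true: gold labels, y_pred: model predictions (both lists of 0/1/2 class ids)
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 2, 1, 1]

accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted"
)
print(f"Accuracy: {accuracy:.2%}  F1 (weighted): {f1:.2%}  "
      f"Precision: {precision:.2%}  Recall: {recall:.2%}")
```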

πŸš€ Intended Use

This model is designed for:
βœ… Financial Analysts & Investors to assess the sentiment of financial text such as reports, news articles, and stock discussions.
βœ… Financial Institutions for NLP-based sentiment analysis in automated trading.
βœ… AI Researchers exploring financial NLP models.

⚠️ Limitations

⚠️ May not generalize well to datasets with very different financial language.
⚠️ May require additional fine-tuning for specific financial domains (e.g., crypto, banking, startups).

πŸ“₯ Usage Example

You can use the model via Hugging Face Transformers:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "Driisa/finbert-finetuned-github"

# Load model and tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Example input
text = "The company's stock has seen significant growth this quarter."

# Tokenize and predict
inputs = tokenizer(text, return_tensors="pt", padding=True, truncation=True, max_length=128)
outputs = model(**inputs)

# Get predicted class
predicted_class = outputs.logits.argmax().item()
print(f"Predicted Sentiment: {['Negative', 'Neutral', 'Positive'][predicted_class]}")
```