---
language:
  - en
tags:
  - text-classification
  - emotion
  - pytorch
license: mit
datasets:
  - emotion
metrics:
  - accuracy
  - precision
  - recall
  - f1
---

# bert-base-uncased-emotion

## Model description

`bert-base-uncased` fine-tuned on the [unify-emotion-datasets](https://github.com/sarnthil/unify-emotion-datasets) corpus (~250K texts with 7 labels: neutral, happy, sad, anger, disgust, surprise, fear), then transfer-learned on a small sample of 10K hand-tagged StockTwits messages. The model is optimized for extracting emotions from financial social media text such as StockTwits.
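The 7 labels above can be exposed through the model config as an `id2label` mapping; the sketch below is only illustrative, and the index order is an assumption that should be checked against the actual training setup.

```python
# Hypothetical id2label mapping for the 7 emotion classes.
# The index order is an assumption, not taken from the training script.
id2label = {
    0: "neutral",
    1: "happy",
    2: "sad",
    3: "anger",
    4: "disgust",
    5: "surprise",
    6: "fear",
}
label2id = {label: idx for idx, label in id2label.items()}
```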

Fine-tuning hyperparameters: maximum sequence length 64, learning rate 2e-5, batch size 128, 8 epochs.
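A minimal sketch of how these hyperparameters could be expressed with the Hugging Face `Trainer` API; everything besides the learning rate, batch size, epochs, and sequence length (the output directory, padding strategy, and so on) is assumed rather than taken from the original training code.

```python
from transformers import AutoTokenizer, TrainingArguments

# The maximum sequence length of 64 is applied at tokenization time.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(
        batch["text"], truncation=True, padding="max_length", max_length=64
    )

# Illustrative TrainingArguments mirroring the hyperparameters above.
training_args = TrainingArguments(
    output_dir="bert-base-uncased-emotion",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=128,
    num_train_epochs=8,
)
```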

For inference, follow the `Inference.ipynb` notebook, or see the sketch below.
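A minimal inference sketch using the transformers `pipeline` API; the repository id `vamossyd/bert-base-uncased-emotion` is an assumption based on this model card and should be replaced with the actual model id if it differs.

```python
from transformers import pipeline

# Hypothetical repo id inferred from this model card; adjust if the model
# is hosted under a different name.
classifier = pipeline(
    "text-classification",
    model="vamossyd/bert-base-uncased-emotion",
    top_k=None,  # return scores for all emotion labels
)

print(classifier("$TSLA to the moon!"))
```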

## Training data

Data came from https://github.com/sarnthil/unify-emotion-datasets.