---
license: cc-by-4.0
---

## SPT-ABSA

We continue to pre-train BERT-base via sentiment-enhanced pre-training (SPT).

- Title: An Empirical Study of Sentiment-Enhanced Pre-Training for Aspect-Based Sentiment Analysis
- Authors: Yice Zhang, Yifan Yang, Bin Liang, Shiwei Chen, Bing Qin, and Ruifeng Xu
- Venue: Findings of ACL 2023 (Long Paper)

GitHub Repository: https://github.com/HITSZ-HLT/SPT-ABSA

### What Did We Do?

Aspect-Based Sentiment Analysis (ABSA) is an important problem in sentiment analysis.
Its goal is to recognize opinions and sentiments towards specific aspects from user-generated content.
Many research efforts leverage pre-training techniques to learn sentiment-aware representations and achieve significant gains in various ABSA tasks.
We conduct an empirical study of sentiment-enhanced pre-training for ABSA to systematically investigate and analyze the effectiveness of existing approaches.

We mainly concentrate on the following questions: 
- (a) what impact do different types of sentiment knowledge have on downstream ABSA tasks?
- (b) which knowledge integration method is most effective?
- (c) does injecting non-sentiment-specific linguistic knowledge (e.g., part-of-speech tags and syntactic relations) into pre-training have a positive impact?

Based on our experimental investigation of these questions, we eventually obtain a powerful sentiment-enhanced pre-trained model.
It is released in two versions, [zhang-yice/spt-absa-bert-400k](https://huggingface.co/zhang-yice/spt-absa-bert-400k) and [zhang-yice/spt-absa-bert-10k](https://huggingface.co/zhang-yice/spt-absa-bert-10k), and integrates three types of knowledge:
- aspect words: masking aspect words in their context and predicting them.
- review rating scores: rating prediction.
- syntax knowledge:
  - part-of-speech tags,
  - dependency direction,
  - dependency distance.
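
As a minimal usage sketch, assuming the released checkpoints load as standard BERT-base encoders through the `transformers` library (the SPT objectives and heads themselves are described in the paper and GitHub repository):

```python
from transformers import AutoTokenizer, AutoModel
import torch

# Load one of the released checkpoints; the 10k-step variant works the same way.
model_name = "zhang-yice/spt-absa-bert-400k"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a review sentence and obtain sentiment-aware token representations.
sentence = "The battery life is great, but the screen is too dim."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768) for a BERT-base encoder
```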
 
### Experimental Results

<img width="75%" alt="image" src="https://github.com/HITSZ-HLT/SPT-ABSA/assets/9134454/38fc2db0-6ccf-47a7-a93c-cf54667e1a23">

<img width="75%" alt="image" src="https://github.com/HITSZ-HLT/SPT-ABSA/assets/9134454/20c5a976-014e-433f-a2ec-4bb259e5a382">