zhang-yice committed
Commit 3186100 · Parent(s): 4dd6dc7

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -25,7 +25,7 @@ We mainly concentrate on the following questions:
 - (c) does injecting non-sentiment-specific linguistic knowledge (e.g., part-of-speech tags and syntactic relations) into pre-training have positive impacts?
 
 Based on the experimental investigation of these questions, we eventually obtain a powerful sentiment-enhanced pre-trained model.
-The powerful sentiment-enhanced pre-trained model is [zhang-yice/spt-absa-bert-400k](https://huggingface.co/zhang-yice/spt-absa-bert-400k) and [zhang-yice/spt-absa-bert-10k](https://huggingface.co/zhang-yice/spt-absa-bert-10k), which integrates three types of knowledge:
+The powerful sentiment-enhanced pre-trained model has two versions, namely [zhang-yice/spt-absa-bert-400k](https://huggingface.co/zhang-yice/spt-absa-bert-400k) and [zhang-yice/spt-absa-bert-10k](https://huggingface.co/zhang-yice/spt-absa-bert-10k), which integrates three types of knowledge:
 - aspect words: masking aspects' context and predicting them.
 - review's rating score: rating prediction.
 - syntax knowledge:
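
Both checkpoints named in the diff are published on the Hugging Face Hub, so a minimal loading sketch looks like the following. The model IDs come from the README; everything else is an illustrative assumption, namely that the checkpoints are standard BERT-style encoders usable through the `transformers` Auto classes.

```python
# Minimal sketch: load a released checkpoint as a feature extractor.
# Assumes a standard BERT-style encoder compatible with the Auto classes.
from transformers import AutoModel, AutoTokenizer

model_name = "zhang-yice/spt-absa-bert-400k"  # or "zhang-yice/spt-absa-bert-10k"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a review sentence and take the contextual token representations,
# which could then feed a downstream ABSA classifier head.
inputs = tokenizer("The battery life is great, but the screen is too dim.",
                   return_tensors="pt")
outputs = model(**inputs)
features = outputs.last_hidden_state  # shape: (batch, seq_len, hidden_size)
```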
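
The first bullet (masking and predicting aspect words) describes a pre-training objective, but if the uploaded checkpoint retains its masked-language-modeling head, an assumption the diff does not confirm, that behaviour could be probed with the standard `fill-mask` pipeline; the example sentence is purely illustrative.

```python
# Illustrative only: probe masked-aspect prediction via the fill-mask pipeline.
# Assumes the checkpoint ships with an MLM head, which the README does not confirm.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="zhang-yice/spt-absa-bert-400k")

# Mask an aspect word ("battery") and ask the model to recover it.
for candidate in fill_mask("The [MASK] life is great, but the screen is too dim."):
    print(candidate["token_str"], round(candidate["score"], 3))
```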