The International Arab Journal of Information Technology
T-LBERT with Domain Adaptation for Cross-Domain Sentiment Classification
Article
Hongye Cao¹, Qianru Wei¹, Jiangbin Zheng¹
[1] School of Software, Northwestern Polytechnical University | |
Keywords: Cross-domain; sentiment classification; topic model; attention; domain adaptation
DOI: 10.34028/iajit/20/1/15
Subject classification: Computer Science (General)
Source: Zarqa University
【 Abstract 】
Cross-domain sentiment classification transfers knowledge from the source domain to a target domain that lacks supervised information for sentiment classification. Existing cross-domain sentiment classification methods establish connections by extracting domain-invariant features manually. However, these methods have poor adaptability in bridging connections across different domains and ignore important sentiment information. Hence, we propose a Topic Lite Bidirectional Encoder Representations from Transformers (T-LBERT) model with domain adaptation to improve the adaptability of cross-domain sentiment classification. It combines the learning content of the source domain with the topic information of the target domain to improve the domain adaptability of the model. Because the distribution of information in the combined data is unbalanced, we apply a two-layer attention adaptive mechanism for classification. A shallow attention layer weighs the important features of the combined data. Inspired by active learning, we propose a deep domain adaptation layer, which actively adjusts model parameters to balance the difference and representativeness between domains. Experimental results on Amazon review datasets demonstrate that the T-LBERT model considerably outperforms other state-of-the-art methods, and T-LBERT shows stable classification performance on multiple metrics.
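To illustrate the shallow attention layer described in the abstract, the sketch below combines encoder token states with target-domain topic features and pools them with a learned attention weighting before classification. This is a minimal PyTorch sketch under assumptions of our own: the module names, dimensions, and additive attention form are illustrative, not the authors' implementation, and the deep domain-adaptation layer is omitted.

```python
# Minimal sketch (not the authors' code): a shallow attention layer that weighs
# combined source-content / target-topic features before classification.
# All module names and dimensions below are illustrative assumptions.
import torch
import torch.nn as nn

class ShallowAttentionClassifier(nn.Module):
    def __init__(self, hidden_dim=768, topic_dim=64, num_classes=2):
        super().__init__()
        # Project topic features (e.g., from a topic model over the target
        # domain) to the encoder width so they can be concatenated with tokens.
        self.topic_proj = nn.Linear(topic_dim, hidden_dim)
        # Additive attention: score each combined vector, softmax, then pool.
        self.attn_score = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(), nn.Linear(hidden_dim, 1)
        )
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_states, topic_feats):
        # token_states: (batch, seq_len, hidden_dim) from an ALBERT-style encoder
        # topic_feats:  (batch, num_topics, topic_dim) from the topic model
        combined = torch.cat([token_states, self.topic_proj(topic_feats)], dim=1)
        weights = torch.softmax(self.attn_score(combined), dim=1)  # (batch, L+T, 1)
        pooled = (weights * combined).sum(dim=1)  # attention-weighted pooling
        return self.classifier(pooled)

# Usage with random tensors standing in for encoder and topic-model outputs.
model = ShallowAttentionClassifier()
logits = model(torch.randn(4, 128, 768), torch.randn(4, 10, 64))
print(logits.shape)  # torch.Size([4, 2])
```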
【 License 】
Unknown
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| RO202307090002562ZK.pdf | 966 KB | PDF | download |