NEUROCOMPUTING | Volume 275
Learning better discourse representation for implicit discourse relation recognition via attention networks
Article
Zhang, Biao [1,3]; Xiong, Deyi [2]; Su, Jinsong [1,3]; Zhang, Min [2]
[1] Xiamen Univ, Xiamen 361005, Peoples R China | |
[2] Soochow Univ, Suzhou 215006, Peoples R China | |
[3] Minjiang Univ, Fujian Prov Key Lab Informat Proc & Intelligent C, Fuzhou 350121, Fujian, Peoples R China | |
Keywords: Implicit discourse relation recognition; Attention network; Memory network; Convolutional neural network
DOI: 10.1016/j.neucom.2017.09.074
Source: Elsevier
【 Abstract 】
Different words in discourse arguments usually contribute differently to the recognition of implicit discourse relations. Following this intuition, we propose two attention-based neural networks, namely an inner attention model and an outer attention model, which learn better discourse representations by automatically estimating how relevant each word is to the discourse relation. The former model uses only the information inside the discourse arguments, while the latter builds on an external semantic memory to exploit general world knowledge. Both models assign higher weights to relation-relevant words and operate in an end-to-end manner. On top of these two models, we further propose a full attention model that combines their strengths in a unified framework. Extensive experiments on the PDTB data set show that our model benefits significantly from highlighting relation-relevant words and yields results that are competitive with, and in some cases better than, several state-of-the-art systems. (C) 2017 Elsevier B.V. All rights reserved.
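The abstract describes the inner attention idea only at a high level: score each word of an argument for relevance and pool the word vectors with those scores. As a rough illustration, here is a minimal PyTorch sketch of such an attention-pooling layer; the class name, the mean-vector scoring form, and all dimensions are assumptions for illustration, not the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InnerAttention(nn.Module):
    """Hypothetical inner-attention pooling: score each word of a
    discourse argument against the argument's own mean vector, then
    build a weighted sum of the word vectors. A sketch of the idea in
    the abstract, not the paper's exact model."""

    def __init__(self, dim):
        super().__init__()
        # Bilinear-style scoring matrix between words and the summary.
        self.proj = nn.Linear(dim, dim, bias=False)

    def forward(self, words):
        # words: (batch, seq_len, dim) word embeddings of one argument.
        summary = words.mean(dim=1)                       # (batch, dim)
        scores = torch.bmm(self.proj(words),              # (batch, seq_len, 1)
                           summary.unsqueeze(2))
        weights = F.softmax(scores.squeeze(2), dim=1)     # per-word relevance
        # Weighted sum over words -> one argument representation.
        return torch.bmm(weights.unsqueeze(1), words).squeeze(1)

# Usage: pool each argument separately, then feed the pair to a
# relation classifier (classifier omitted here).
attn = InnerAttention(dim=300)
arg1 = torch.randn(8, 20, 300)   # batch of 8 arguments, 20 words each
rep1 = attn(arg1)                # (8, 300) attention-pooled argument vector
```

The outer attention model described in the abstract would differ mainly in where the scoring signal comes from: instead of the argument's own summary, the words would be scored against vectors drawn from an external semantic memory.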
【 License 】
Free
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| 10_1016_j_neucom_2017_09_074.pdf | 726KB | PDF | download |