Journal article details
BMC Bioinformatics
An attention-based effective neural model for drug-drug interactions extraction
Research Article
Zhihao Yang1  Ling Luo1  Yijia Zhang1  Jian Wang1  Zhehuan Zhao1  Hongfei Lin1  Zhengguang Li2  Wei Zheng2 
[1] College of Computer Science and Technology, Dalian University of Technology, Dalian, China; [2] College of Software, Dalian JiaoTong University, Dalian, China
Keywords: Attention; Recurrent neural network; Long short-term memory; Drug-drug interactions; Text mining
DOI: 10.1186/s12859-017-1855-x
Received: 2017-04-28; Accepted: 2017-10-02; Published: 2017
Source: Springer
【 Abstract 】

Background: Drug-drug interactions (DDIs) often bring unexpected side effects. The clinical recognition of DDIs is a crucial issue for both patient safety and healthcare cost control. However, although text-mining-based systems explore various methods to classify DDIs, classification performance on DDIs in long and complex sentences remains unsatisfactory.

Methods: In this study, we propose an effective model that classifies DDIs from the literature by combining an attention mechanism and a recurrent neural network with long short-term memory (LSTM) units. In our approach, a candidate-drug-oriented input attention acting on the word-embedding vectors first learns automatically which words are more influential for a given drug pair. Next, the inputs, merged with the position- and POS-embedding vectors, are passed to a bidirectional LSTM layer whose outputs at the last time step represent the high-level semantic information of the whole sentence. Finally, a softmax layer performs the DDI classification.

Results: Experimental results on the DDIExtraction 2013 corpus show that our system achieves the best detection and classification performance (F-scores of 84.0% and 77.3%, respectively) compared with other state-of-the-art methods. In particular, on the Medline-2013 dataset, which contains long and complex sentences, our F-score exceeds those of the top-ranking systems by 12.6%.

Conclusions: Our approach effectively improves the performance of DDI classification tasks. Experimental analysis demonstrates that our model recognizes not only close-range but also long-range patterns among words, especially in long, complex and compound sentences.
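The pipeline described in the Methods paragraph (candidate-drug-oriented input attention over word embeddings, concatenation with position and POS embeddings, a bidirectional LSTM, and a softmax classifier) can be sketched as below. This is only an illustrative reading of the abstract, not the authors' implementation; all class names, hyperparameters, and tensor dimensions (e.g. AttentionBiLSTMDDI, word_dim=100, hidden_dim=150) are assumptions.

```python
# Illustrative PyTorch sketch only; names, dimensions and details are
# assumptions inferred from the abstract, not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionBiLSTMDDI(nn.Module):
    def __init__(self, vocab_size, n_pos_tags, n_classes=5,
                 word_dim=100, pos_dim=10, dist_dim=10,
                 hidden_dim=150, max_dist=200):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.pos_emb = nn.Embedding(n_pos_tags, pos_dim)          # POS tags
        self.dist_emb = nn.Embedding(2 * max_dist + 1, dist_dim)  # relative positions
        self.bilstm = nn.LSTM(word_dim + pos_dim + 2 * dist_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, n_classes)

    def forward(self, words, pos_tags, dist1, dist2, drug1_ids, drug2_ids):
        w = self.word_emb(words)                                  # (B, T, Dw)
        # Candidate-drug-oriented input attention: score every word against the
        # embeddings of the two candidate drugs, then reweight the word vectors.
        d1 = self.word_emb(drug1_ids).unsqueeze(1)                # (B, 1, Dw)
        d2 = self.word_emb(drug2_ids).unsqueeze(1)
        alpha = (F.softmax((w * d1).sum(-1), dim=1) +
                 F.softmax((w * d2).sum(-1), dim=1)) / 2          # (B, T)
        w = w * alpha.unsqueeze(-1)
        # Merge with position (distance-to-drug) and POS embeddings.
        x = torch.cat([w, self.pos_emb(pos_tags),
                       self.dist_emb(dist1), self.dist_emb(dist2)], dim=-1)
        out, _ = self.bilstm(x)                                   # (B, T, 2H)
        h = out.size(-1) // 2
        # Sentence representation from the final step of each LSTM direction.
        sent = torch.cat([out[:, -1, :h], out[:, 0, h:]], dim=-1)
        return self.classifier(sent)   # logits; softmax applied in the loss
```

In such a setup, dist1 and dist2 would hold each token's offset from the two candidate drugs (shifted into the range [0, 2*max_dist]), and training would minimize cross-entropy over the DDI classes (the four interaction types of DDIExtraction 2013 plus a "no interaction" class).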

【 License 】

CC BY   
© The Author(s). 2017

【 Preview 】
File list
Files Size Format View
RO202311099535471ZK.pdf 862KB PDF download
Document metrics
Downloads: 3; Views: 0