BMC Bioinformatics
Dependency-based long short term memory network for drug-drug interaction extraction
Research
Chengkun Wu¹, Xiaowei Guo¹, Canqun Yang¹, Wei Wang¹, Xi Yang¹, Xiang Zhang¹
[1] School of Computer Science, National University of Defense Technology, 410073, Changsha, China
Keywords: Relation extraction; Long short term memory; Dependency tree; Data imbalance
DOI: 10.1186/s12859-017-1962-8
Source: Springer
【 Abstract 】
Background: Drug-drug interaction (DDI) extraction needs assistance from automated methods to cope with the rapidly growing volume of biomedical texts. In recent years, deep neural network based models have been developed to address this need, and they have made significant progress in relation identification.

Methods: We propose a dependency-based deep neural network model for DDI extraction. By introducing dependency-based techniques into a bi-directional long short term memory network (Bi-LSTM), we build three channels: a Linear channel, a DFS channel, and a BFS channel. Each channel is constructed from three network layers, from bottom up: an embedding layer, an LSTM layer, and a max pooling layer. In the embedding layer, we extract two types of features: distance-based features and dependency-based features. In the LSTM layer, a Bi-LSTM is instantiated in each channel to better capture relation information. Max pooling then selects the most salient features from the entire encoded sequence. Finally, we concatenate the outputs of all channels and feed the result into a softmax layer for relation identification.

Results: To the best of our knowledge, our model achieves new state-of-the-art performance, with an F-score of 72.0% on the DDIExtraction 2013 corpus. Moreover, our approach obtains a much higher Recall than existing methods.

Conclusions: The dependency-based Bi-LSTM model can learn effective relation information with less feature engineering in the task of DDI extraction. Furthermore, the experimental results show that our model excels at balancing Precision and Recall.
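The upper layers of the pipeline described in the Methods (max pooling over each channel's Bi-LSTM output, concatenation of the three channels, and a softmax classifier) can be sketched in NumPy. This is a minimal illustration only: the Bi-LSTM encoders are replaced by random placeholder matrices, and the hidden size, sequence lengths, and class count are assumed values, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Stand-ins for the three channels' Bi-LSTM outputs (Linear, DFS, BFS).
# Each is (sequence_length, 2 * hidden): forward and backward states
# concatenated per token. Lengths differ per channel; sizes are assumed.
hidden = 4
channels = [rng.normal(size=(T, 2 * hidden)) for T in (7, 5, 6)]

# Max pooling over time: keep the strongest activation per dimension.
pooled = [c.max(axis=0) for c in channels]      # three (2*hidden,) vectors

# Concatenate the channel features and classify with a softmax layer.
features = np.concatenate(pooled)               # (3 * 2 * hidden,) = (24,)
num_classes = 5                                 # illustrative, e.g. 4 DDI types + "no interaction"
W = rng.normal(size=(features.size, num_classes))
b = np.zeros(num_classes)
probs = softmax(features @ W + b)               # a valid probability distribution
```

Because pooling is taken over the time axis, each channel contributes a fixed-size vector regardless of sentence length, which is what makes the concatenation across variable-length channels possible.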
【 License 】
CC BY
© The Author(s). 2017
【 Preview 】
Files | Size | Format | View
---|---|---|---
RO202311092222727ZK.pdf | 944KB | PDF | download