| BMC Bioinformatics |
| Long short-term memory RNN for biomedical named entity recognition |
| Research Article |
| Bo Chen 1, Yafeng Ren 2, Donghong Ji 3, Chen Lyu 3 |
| [1] Department of Chinese Language & Literature, Hubei University of Art & Science, 24105, Xiangyang, Hubei, China; [2] Guangdong Collaborative Innovation Center for Language Research & Services, Guangdong University of Foreign Studies, 510420, Guangzhou, Guangdong, China; [3] School of Computer Science, Wuhan University, 430072, Wuhan, Hubei, China |
| Keywords: Biomedical named entity recognition; Word embeddings; Character representation; Recurrent neural network; LSTM |
| DOI: 10.1186/s12859-017-1868-5 |
| Received: 2016-12-21; Accepted: 2017-10-16; Published: 2017 |
| Source: Springer |
【 Abstract 】
Background
Biomedical named entity recognition (BNER) is a crucial initial step of information extraction in the biomedical domain. The task is typically modeled as a sequence labeling problem. Various machine learning algorithms, such as conditional random fields (CRFs), have been successfully used for this task. However, these state-of-the-art BNER systems largely depend on hand-crafted features.

Results
We present a recurrent neural network (RNN) framework based on word embeddings and character representation. On top of the neural network architecture, we use a CRF layer to jointly decode labels for the whole sentence. In our approach, contextual information from both directions and long-range dependencies in the sequence, both of which are useful for this task, are modeled by the bidirectional structure and the long short-term memory (LSTM) unit, respectively. Although our models use word embeddings and character embeddings as the only features, the bidirectional LSTM-RNN (BLSTM-RNN) model achieves state-of-the-art performance: 86.55% F1 on the BioCreative II gene mention (GM) corpus and 73.79% F1 on the JNLPBA 2004 corpus.

Conclusions
Our neural network architecture can be successfully used for BNER without any manual feature engineering. Experimental results show that domain-specific pre-trained word embeddings and character-level representation can improve the performance of the LSTM-RNN models. On the GM corpus, we achieve performance comparable to other systems that use complex hand-crafted features. On the JNLPBA corpus, our model achieves the best results, outperforming the previously top-performing systems. The source code of our method is freely available under the GPL at https://github.com/lvchen1989/BNER.
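The abstract describes the architecture only at a high level. The sketch below illustrates one plausible wiring of such an encoder; it is a minimal PyTorch sketch, not the authors' released implementation (see the GitHub link above for that), and all class names, parameter names, and dimension defaults are illustrative assumptions. A character-level BiLSTM builds a per-word character representation, which is concatenated with the word embedding and fed to a sentence-level BiLSTM; a linear layer then produces per-token emission scores over the label set. The CRF layer that would jointly decode these scores is omitted here.

```python
import torch
import torch.nn as nn

class CharWordBLSTM(nn.Module):
    """Hypothetical BiLSTM encoder over word + character embeddings.

    Emits one score vector per token over the label set; a linear-chain
    CRF layer (omitted here) would decode the best label sequence jointly.
    """
    def __init__(self, word_vocab, char_vocab, n_labels,
                 word_dim=100, char_dim=30, char_hidden=25, hidden=100):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim)
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        # Character-level BiLSTM: its final hidden states, one per
        # direction, are concatenated into a per-word character vector.
        self.char_lstm = nn.LSTM(char_dim, char_hidden,
                                 bidirectional=True, batch_first=True)
        # Sentence-level BiLSTM over [word embedding ; char representation].
        self.lstm = nn.LSTM(word_dim + 2 * char_hidden, hidden,
                            bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * hidden, n_labels)

    def forward(self, words, chars):
        # words: (sent_len,) word ids; chars: list of (word_len,) char ids.
        char_feats = []
        for c in chars:
            _, (h, _) = self.char_lstm(self.char_emb(c).unsqueeze(0))
            # Concatenate last forward (h[0]) and backward (h[1]) states.
            char_feats.append(torch.cat([h[0, 0], h[1, 0]], dim=-1))
        x = torch.cat([self.word_emb(words),
                       torch.stack(char_feats)], dim=-1)
        h, _ = self.lstm(x.unsqueeze(0))
        return self.out(h.squeeze(0))  # (sent_len, n_labels) emissions
```

In a full system matching the paper's description, these emission scores would be passed to a linear-chain CRF for sentence-level joint decoding, and the word embedding table would be initialized from domain-specific pre-trained vectors rather than trained from scratch.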
【 License 】
CC BY
© The Author(s) 2017
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| RO202311094339489ZK.pdf | 1018 KB | PDF | |