Journal Article Details
IEEE Access
Attention Retrieval Model for Entity Relation Extraction From Biological Literature
Kristian Schultz [1], Saptarshi Bej [1], Kristina Yordanova [1], Olaf Wolkenhauer [1], Prashant Srivastava [1]
[1] Department of Systems Biology and Bioinformatics, Institute of Computer Science, University of Rostock, Rostock, Germany;
Keywords: Attention models; biological literature mining; deep learning; knowledge graphs
DOI: 10.1109/ACCESS.2022.3154820
Source: DOAJ
【 Abstract 】

Natural Language Processing (NLP) has contributed to extracting relationships among biological entities, such as genes, their mutations, proteins, diseases, processes, phenotypes, and drugs, for a comprehensive and concise understanding of information in the literature. Self-attention-based models for Relationship Extraction (RE) have played an increasingly important role in NLP. However, self-attention models for RE are typically framed as a classification problem, which limits their practical usability in several ways. We present an alternative framework called the Attention Retrieval Model (ARM), which enhances the applicability of attention-based models for RE compared to the standard classification approach. Given a text sequence containing related entities/keywords, ARM learns the association between a chosen entity/keyword and the other entities present in the sequence, using an underlying self-attention mechanism. ARM provides a flexible framework that allows a modeller to customise their model, integrate data from multiple sources, and incorporate expert knowledge, offering a more practical approach to RE. ARM can extract unseen relationships that are not annotated in the training data, analogous to zero-shot learning. To sum up, ARM provides an alternative self-attention-based deep learning framework for RE that can capture directed entity relationships.
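The abstract's core idea, reading attention weights from a chosen entity to the other entities in a sequence as association (retrieval) scores rather than predicting a relation class, can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the toy embeddings, and the single-head scaled dot-product formulation are all assumptions made purely for illustration.

```python
import numpy as np

def attention_scores(embeddings, query_index):
    """Score how strongly the entity at `query_index` attends to every
    entity in the sequence, via one scaled dot-product attention head.
    (Illustrative sketch only; ARM's actual architecture differs.)"""
    d = embeddings.shape[1]
    q = embeddings[query_index]           # query vector for chosen entity
    logits = embeddings @ q / np.sqrt(d)  # scaled dot-product scores
    weights = np.exp(logits - logits.max())
    return weights / weights.sum()        # softmax -> association scores

# Toy sequence of 4 entity embeddings; entity 0 is the chosen keyword.
emb = np.array([[1.0, 0.0],
                [0.9, 0.1],   # similar to entity 0 -> high attention
                [0.0, 1.0],   # orthogonal to entity 0 -> low attention
                [0.1, 0.9]])
w = attention_scores(emb, 0)
ranked = np.argsort(-w)       # entities ordered by association strength
```

In this retrieval framing, `ranked` directly orders candidate entities by their learned association with the query entity, which is what allows scoring entity pairs never annotated as a relation class during training.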

【 License 】

Unknown   

  Document metrics
  Downloads: 0  Views: 0