Journal Article Details
IEEE Access, Volume 8
A Neural Relation Extraction Model for Distant Supervision in Counter-Terrorism Scenario
Rongchen Zhu1  Zeyu Wei1  Jiaqi Hou1  Chao Zhang2  Xin Li2  Chongqiang Zhu2 
[1] School of Information Technology and Cyber Security, People's Public Security University of China, Beijing, China;
[2] People's Public Security University of China, Beijing, China
Keywords: BERT; relation extraction; distant supervision; selective attention mechanism; BERT entity encoding
DOI: 10.1109/ACCESS.2020.3042672
Source: DOAJ
【 Abstract 】

Natural language processing (NLP) is well suited to the large-scale, unstructured, complex, and diverse network big data encountered in counter-terrorism. Quickly extracting the relationships between relevant entity pairs from text is the foundational and most critical step of such analysis. Relation extraction lays the groundwork for constructing a knowledge graph (KG) of terrorism and provides technical support for intelligence analysis and prediction. This paper takes distantly supervised relation extraction as its starting point, removing the limitation of manual data annotation. Combining the Bidirectional Encoder Representations from Transformers (BERT) pre-training model with sentence-level attention over multiple instances, we propose a relation extraction model named BERT-Att. Experiments show that our model is more efficient than the current leading baseline models and outperforms them on every evaluation metric. Our model can be applied to the construction of a counter-terrorism knowledge graph and used in regional security risk assessment, terrorist event prediction, and other scenarios.
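The sentence-level attention over multiple instances mentioned in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function name, dimensions, and the dot-product scoring against a relation query vector are illustrative assumptions; in the actual model the sentence vectors would come from a BERT encoder.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def selective_attention(bag, query):
    """Weight the sentences in a bag by their relevance to a relation query.

    bag:   (n, d) array of sentence embeddings (one bag per entity pair)
    query: (d,)   relation query vector
    Returns the (d,) attended bag representation and the (n,) weights.
    """
    scores = bag @ query          # relevance score per sentence
    alpha = softmax(scores)       # attention distribution over the bag
    return alpha @ bag, alpha     # weighted sum of sentence embeddings

# Toy bag: 3 sentence embeddings of dimension 4 (illustrative values).
rng = np.random.default_rng(0)
bag = rng.normal(size=(3, 4))
query = rng.normal(size=4)
rep, alpha = selective_attention(bag, query)
```

Sentences whose embeddings align with the relation query receive higher weights, so noisy distantly labeled sentences contribute less to the bag representation than in plain averaging.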

【 License 】

Unknown   
