Journal article details
BMC Bioinformatics
Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction
Peng Su [1]; K. Vijay-Shanker [1]
[1] Department of Computer and Information Science, Biomedical Text Mining Lab, University of Delaware;
Keywords: Deep learning; Transformer; BERT; Text mining; Biomedical relation extraction
DOI: 10.1186/s12859-022-04642-w
Source: DOAJ
【 Abstract 】

Background: Automatically extracting biomedical relations has become an important topic in biomedical research due to the rapid growth of the biomedical literature. Since their adaptation to the biomedical domain, transformer-based BERT models have produced leading results on many biomedical natural language processing tasks. In this work, we explore approaches to improving the BERT model for relation extraction in both the pre-training and fine-tuning stages. In the pre-training stage, we add a further level of BERT adaptation on sub-domain data to bridge the gap between domain knowledge and task-specific knowledge. We also propose methods to incorporate the otherwise ignored knowledge in the last layer of BERT to improve its fine-tuning.

Results: The experimental results demonstrate that our pre-training and fine-tuning approaches each improve BERT model performance. Combining the two proposed techniques, our approach outperforms the original BERT models with an average F1-score improvement of 2.1% on relation extraction tasks, and it achieves state-of-the-art performance on three relation extraction benchmark datasets.

Conclusions: The extra pre-training step on sub-domain data helps the BERT model generalize to specific tasks, and the proposed fine-tuning mechanism exploits the knowledge in the last layer of BERT to boost model performance. Combining the two approaches further improves BERT performance on relation extraction tasks.
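To make the pre-training idea concrete, the sketch below shows how an extra round of masked-language-model pre-training on sub-domain text could be run with the Hugging Face transformers and datasets libraries. The checkpoint name, corpus path, and hyperparameters are illustrative assumptions; the paper's exact configuration is not given in this record.

```python
# Minimal sketch of an extra sub-domain pre-training step (masked language modeling).
# Model name, corpus path, and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

model_name = "dmis-lab/biobert-base-cased-v1.1"   # assumed biomedical BERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Sub-domain corpus: one plain-text sentence or abstract per line (hypothetical path).
corpus = load_dataset("text", data_files={"train": "subdomain_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = corpus["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard masked-language-modeling objective with 15% token masking.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True,
                                           mlm_probability=0.15)

args = TrainingArguments(output_dir="bert-subdomain",
                         per_device_train_batch_size=16,
                         num_train_epochs=1,
                         learning_rate=5e-5,
                         save_strategy="epoch")

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```

The resulting checkpoint would then be fine-tuned on the downstream relation extraction datasets in place of the original domain-adapted model.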
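On the fine-tuning side, the abstract only states that knowledge in BERT's last layer is otherwise ignored. The snippet below is a generic illustration of one way to use more of the last layer than the [CLS] vector alone, by pooling entity-position states into the relation classifier; it is an assumed illustration, not the authors' exact mechanism, and the entity-position inputs (e1_pos, e2_pos) are hypothetical helpers.

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class RelationClassifier(nn.Module):
    """Relation-extraction head built on the full last-layer output of BERT
    ([CLS] plus entity-position states). Generic illustration only; not the
    exact fine-tuning mechanism proposed in the paper."""

    def __init__(self, model_name="dmis-lab/biobert-base-cased-v1.1", num_labels=2):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Concatenate [CLS] with two entity representations from the last layer.
        self.classifier = nn.Linear(3 * hidden, num_labels)

    def forward(self, input_ids, attention_mask, e1_pos, e2_pos):
        # last_hidden_state: (batch, seq_len, hidden) -- BERT's last layer.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        last = out.last_hidden_state
        batch_idx = torch.arange(last.size(0), device=last.device)
        cls_vec = last[:, 0]              # [CLS] state
        e1_vec = last[batch_idx, e1_pos]  # state at the first entity position
        e2_vec = last[batch_idx, e2_pos]  # state at the second entity position
        features = torch.cat([cls_vec, e1_vec, e2_vec], dim=-1)
        return self.classifier(features)
```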

【 License 】

Unknown   
