Journal Article Details
EAI Endorsed Transactions on Scalable Information Systems
Multi-attention mechanism based on gate recurrent unit for English text classification
article
Haiying Liu [1]
[1] Zhengzhou University of Science and Technology
Keywords: English text classification; multi-attention mechanism; GRU; deep learning
DOI: 10.4108/eai.27-1-2022.173166
Subject classification: Social Sciences, Humanities and Arts (General)
Source: Bern Open Publishing
【 Abstract 】

This article has been retracted, and the retraction notice can be found here: http://dx.doi.org/10.4108/eai.8-4-2022.173791. Text classification is one of the core tasks in natural language processing. Addressing the strengths and weaknesses of current deep learning-based English text classification methods on long texts, this paper proposes an English text classification model that introduces a multi-attention mechanism based on the gated recurrent unit (GRU) to focus on the important parts of an English text. First, sentences and documents are encoded according to the hierarchical structure of English documents. Second, the attention mechanism is applied separately at each level of the hierarchy. On the basis of the global object vector, max pooling is used to extract the specific object vector of each sentence, so that the encoded document vector carries more distinct category features and attends to the most discriminative semantic features of each English text. Finally, documents are classified according to the constructed document representation. Experimental results on public datasets show that the model achieves better classification performance on long English texts with a hierarchical structure.
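
The following is a minimal sketch (in PyTorch) of the hierarchical GRU-with-attention architecture the abstract describes: word- and sentence-level GRU encoders, attention applied separately at each level against a learned global object vector, and max pooling over the sentence states to obtain the specific features that are concatenated into the document representation. The class names, dimensions, and the exact way the attention summary and max-pooled vector are combined are illustrative assumptions, not the paper's published formulation.

# Sketch only: hyperparameters and the fusion of attention and max-pooled features
# are assumptions made for illustration.
import torch
import torch.nn as nn

class Attention(nn.Module):
    """Additive attention over a sequence of hidden states, scored against a
    learned global context (object) vector."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Parameter(torch.randn(hidden_dim))  # global object vector

    def forward(self, states):                    # states: (batch, seq, hidden)
        u = torch.tanh(self.proj(states))         # (batch, seq, hidden)
        scores = u @ self.context                 # (batch, seq)
        alpha = torch.softmax(scores, dim=1)      # attention weights over the sequence
        return (alpha.unsqueeze(-1) * states).sum(dim=1)   # (batch, hidden)

class HierarchicalGRUClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=128, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Word-level and sentence-level bidirectional GRU encoders.
        self.word_gru = nn.GRU(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.sent_gru = nn.GRU(2 * hidden_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Attention applied separately at each level of the hierarchy.
        self.word_attn = Attention(2 * hidden_dim)
        self.sent_attn = Attention(2 * hidden_dim)
        # Document vector = attention summary concatenated with max-pooled sentence features.
        self.classifier = nn.Linear(4 * hidden_dim, num_classes)

    def forward(self, docs):                      # docs: (batch, n_sents, n_words) token ids
        b, n_sents, n_words = docs.shape
        words = self.embed(docs.view(b * n_sents, n_words))   # (b*s, w, emb)
        word_states, _ = self.word_gru(words)                 # (b*s, w, 2h)
        sent_vecs = self.word_attn(word_states)               # one vector per sentence
        sent_vecs = sent_vecs.view(b, n_sents, -1)            # (b, s, 2h)
        sent_states, _ = self.sent_gru(sent_vecs)             # (b, s, 2h)
        doc_attn = self.sent_attn(sent_states)                # attention summary
        doc_max, _ = sent_states.max(dim=1)                   # max-pooled specific features
        return self.classifier(torch.cat([doc_attn, doc_max], dim=-1))

# Example: classify a batch of 2 documents, each padded to 4 sentences of 10 tokens.
model = HierarchicalGRUClassifier(vocab_size=10000)
logits = model(torch.randint(1, 10000, (2, 4, 10)))
print(logits.shape)  # torch.Size([2, 5])
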

【 License 】

CC BY   

【 Preview 】
Attachment list
File: RO202307110000925ZK.pdf    Size: 2790 KB    Format: PDF
Document metrics
Downloads: 5    Views: 0