Journal Article Details
Agriculture
A Residual LSTM and Seq2Seq Neural Network Based on GPT for Chinese Rice-Related Question and Answer System
Jingjian Zhang [1], Haiyan Zhao [2], Qinghu Wang [2], Haoriqin Wang [2], Shicheng Qiao [2], Huarui Wu [3], Huaji Zhu [3], Yisheng Miao [3], Cheng Chen [3]
[1] CangZhou Academy of Agriculture and Forestry Sciences, Cangzhou 061001, China;
[2] College of Computer Science and Technology, Inner Mongolia Minzu University, Tongliao 028043, China;
[3] Research Center of Information Technology, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
Keywords: rice-related question and answer; Residual Long Short-Term Memory; question-and-answer communities; seq2seq
DOI: 10.3390/agriculture12060813
Source: DOAJ
【 Abstract 】

Rice is one of the essential food crops in China and is planted over a wide area. The problem of diseases and pests in rice production has always been one of the main factors affecting its quality and yield, so it is essential to provide treatment methods for rice diseases and pests quickly and accurately during production. Therefore, taking the rice question-and-answer (Q&A) community as an example, this paper addresses a critical technical problem faced by agricultural Q&A communities: the accuracy of existing agricultural Q&A models is low, making it difficult to meet users’ need for real-time answers during production. A network based on Attention-ResLSTM-seq2seq was used to construct the rice Q&A model. First, the text representation of rice question-and-answer pairs was obtained using a GPT pre-trained model based on a 12-layer Transformer. Then, ResLSTM (Residual Long Short-Term Memory) was used to extract text features in the encoder and decoder, and the output projection matrix and output gate of the LSTM were used to control the spatial information flow. When the network approaches the optimal state, it retains only the identity mapping of the input vector, which effectively reduces the number of network parameters and improves performance. Next, an attention mechanism was connected between the encoder and the decoder, which effectively strengthens the weight of the keyword feature information in the question. The results showed that, compared with six other rice-related generative question answering models, the Attention-ResLSTM-seq2seq model achieved the highest BLEU and ROUGE scores of 35.3% and 37.8%, respectively.
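The following is a minimal PyTorch sketch of the encoder-decoder idea described in the abstract: GPT-style token representations fed into residual LSTM layers, with dot-product attention connecting encoder and decoder. Layer sizes, variable names, the vocabulary size, and the exact placement of the residual shortcut are illustrative assumptions, not the authors' implementation.

```python
# Sketch of an Attention-ResLSTM-seq2seq model (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResLSTM(nn.Module):
    """LSTM layer with a residual (identity) shortcut from input to output."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)

    def forward(self, x, state=None):
        out, state = self.lstm(x, state)
        # Residual shortcut: near the optimum the layer can fall back to an
        # identity mapping, easing optimization without extra parameters here.
        return out + x, state


class AttentionResLSTMSeq2Seq(nn.Module):
    """Encoder-decoder built from ResLSTM layers with dot-product attention."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.encoder = ResLSTM(hidden_size)
        self.decoder = ResLSTM(hidden_size)
        self.out_proj = nn.Linear(2 * hidden_size, vocab_size)

    def forward(self, src_emb, tgt_emb):
        # src_emb / tgt_emb: pre-trained GPT token representations,
        # shape (batch, seq_len, hidden_size).
        enc_out, enc_state = self.encoder(src_emb)
        dec_out, _ = self.decoder(tgt_emb, enc_state)

        # Dot-product attention over encoder states for each decoder step,
        # strengthening the weight of keyword features in the question.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))   # (B, T_dec, T_enc)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                  # (B, T_dec, H)

        return self.out_proj(torch.cat([dec_out, context], dim=-1))


if __name__ == "__main__":
    # Hidden size 768 and vocab size 21128 are assumed for illustration only.
    model = AttentionResLSTMSeq2Seq(hidden_size=768, vocab_size=21128)
    src = torch.randn(2, 16, 768)   # encoded question tokens
    tgt = torch.randn(2, 12, 768)   # shifted answer tokens (teacher forcing)
    print(model(src, tgt).shape)    # torch.Size([2, 12, 21128])
```

In this sketch the residual connection is added around each LSTM layer and attention is computed once over the full decoder output; a step-by-step decoder with per-step attention would follow the same pattern during inference.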

【 License 】

Unknown   
