| NEUROCOMPUTING | Volume: 458 |
| Intent-enhanced attentive Bert capsule network for zero-shot intention detection | |
| Article | |
| Xue, Siyuan [1]; Ren, Fuji [1] | |
| [1] Tokushima Univ, Fac Engn, Tokushima 7700814, Japan | |
| Keywords: Zero-shot intent detection; Label embedding attention mechanism; Pre-trained language model; Attentive capsule network; Metric learning; | |
| DOI: 10.1016/j.neucom.2021.05.085 | |
| Source: Elsevier | |
【 Abstract 】
Spoken language understanding (SLU) plays an indispensable role in dialogue systems. The traditional intent detection task is treated as a classification problem in which utterances are associated with predefined intents. However, the varied expressions of users' intents and the constantly emerging novel intents make annotation time-consuming and labor-intensive, creating major obstacles to extending a model to new tasks. Identifying unexpected user intents and fulfilling the user's desired goal is a challenging task. Therefore, we conduct zero-shot intent detection based on a transformation-based learning manner. In this paper, we propose an intent-enhanced attentive capsule network (IE-BertCapsNet) that further guides the aggregation process of the capsule network and extracts generalizable features that can be adapted to emerging intentions. Coupled with the large margin cosine loss function, the proposed model can identify discriminative features by forcing the whole network to maximize the inter-class distance and minimize the intra-class distance. Finally, we leverage IE-BertCapsNet's feature extraction ability and knowledge-transfer capability to conduct zero-shot intent detection and generalized zero-shot intent detection. Extensive experiments on five benchmark task-oriented datasets in four languages demonstrate that the proposed model achieves competitive performance, better discriminating known intents and detecting unknown intents. (c) 2021 Elsevier B.V. All rights reserved.
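To make the metric-learning objective mentioned in the abstract concrete, the following is a minimal PyTorch-style sketch of a standard large margin cosine loss (LMCL), which encourages large inter-class and small intra-class angular distances. It is an illustration of the generic technique only, not the authors' implementation; the class name, the scale `s`, and the margin `m` are assumed values, not taken from the paper.

```python
# Illustrative sketch of a large margin cosine loss (LMCL); hyperparameters
# s and m are assumptions, not values reported in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LargeMarginCosineLoss(nn.Module):
    def __init__(self, feature_dim, num_classes, s=30.0, m=0.35):
        super().__init__()
        self.s = s  # scale factor applied to the cosine logits
        self.m = m  # additive cosine margin on the target class
        self.weight = nn.Parameter(torch.randn(num_classes, feature_dim))

    def forward(self, features, labels):
        # Cosine similarity between L2-normalized features and class weights.
        cosine = F.linear(F.normalize(features), F.normalize(self.weight))
        # Subtract the margin m only from the target-class cosine, which
        # pushes intra-class features together and classes apart.
        one_hot = F.one_hot(labels, cosine.size(1)).float()
        logits = self.s * (cosine - self.m * one_hot)
        # Cross-entropy over the margin-adjusted, scaled cosine logits.
        return F.cross_entropy(logits, labels)
```

In a zero-shot setting such as the one described above, the learned feature extractor (rather than the class weights) is what transfers to unseen intents.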
【 License 】
Free
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| 10_1016_j_neucom_2021_05_085.pdf | 2443 KB | PDF | |