Journal Article Details
NEUROCOMPUTING, Volume 253
Emotion-modulated attention improves expression recognition: A deep learning model
Article
Barros, Pablo [1]; Parisi, German I. [1]; Weber, Cornelius [1]; Wermter, Stefan [1]
[1] Univ Hamburg, Dept Informat, Knowledge Technol, Vogt Koelln Str 30, D-22527 Hamburg, Germany
Keywords: Convolutional neural networks; Deep learning; Multimodal processing; Emotional attention; Emotion recognition
DOI: 10.1016/j.neucom.2017.01.096
Source: Elsevier
【 Abstract 】

Spatial attention in humans and animals involves the visual pathway and the superior colliculus, which integrate multimodal information. Recent research has shown that affective stimuli play an important role in attentional mechanisms, and behavioral studies show that the focus of attention on a given region of the visual field is increased when affective stimuli are present. This work proposes a neurocomputational model that learns to attend to emotional expressions and to modulate emotion recognition. Our model consists of a deep architecture which implements convolutional neural networks to learn the location of emotional expressions in a cluttered scene. We performed a number of experiments on detecting regions of interest based on emotion stimuli, and show that the attention model improves emotion expression recognition when used as an emotional attention modulator. Finally, we analyze the internal representations of the learned neural filters and discuss their role in the performance of our model. (C) 2017 The Authors. Published by Elsevier B.V.
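The core idea described in the abstract, using a learned spatial attention map to modulate the features that feed an expression classifier, can be illustrated with a minimal sketch. This is not the paper's actual architecture; the shapes, the softmax attention, and the sum-pooling classifier are illustrative assumptions standing in for the convolutional streams described above.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a flat vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_modulated_recognition(features, attention_logits, weights):
    """Modulate recognition features with a spatial attention map, then classify.

    features:         (C, H, W) feature maps from a recognition stream
    attention_logits: (H, W) unnormalized attention from a location stream
    weights:          (num_classes, C) linear classifier weights
    All names and shapes are illustrative, not taken from the paper.
    """
    # Normalize the attention logits into a spatial probability map.
    attn = softmax(attention_logits.ravel()).reshape(attention_logits.shape)
    # Broadcast over channels: emphasize features at attended locations.
    modulated = features * attn
    # Global sum pooling over space -> one descriptor per channel, shape (C,).
    pooled = modulated.sum(axis=(1, 2))
    # Class probabilities for the expression classes.
    return softmax(weights @ pooled)

rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 6, 6))       # hypothetical conv features
attn_logits = rng.standard_normal((6, 6))    # hypothetical attention logits
W = rng.standard_normal((4, 8))              # 4 hypothetical expression classes
probs = attention_modulated_recognition(feats, attn_logits, W)
print(probs.shape)  # probabilities over the 4 classes, summing to 1
```

The design point this sketch captures is that attention acts multiplicatively on the feature maps before classification, so regions the location stream deems emotionally salient contribute more to the final decision.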

【 License 】

Free

【 Preview 】
Attachments
File | Size | Format
10_1016_j_neucom_2017_01_096.pdf | 7707 KB | PDF