Journal Article Details
Multimodal Technologies and Interaction
Recognition of Tactile Facial Action Units by Individuals Who Are Blind and Sighted: A Comparative Study
Abhik Chowdhury [1]; Troy McDaniel [1]; Sethuraman Panchanathan [1]; Bijan Fakhri [1]; Diep Tran [1]
[1] Center for Cognitive Ubiquitous Computing (CUbiC), Arizona State University, Tempe, AZ 85281, USA;
Keywords: social assistive aids; nonverbal; haptics; vibrotactile; tactile-vision sensory substitution; sensory augmentation; tactile facial action units; technologies for individuals who are blind
DOI: 10.3390/mti3020032
Source: DOAJ
【 Abstract 】

Given that most cues exchanged during a social interaction are nonverbal (e.g., facial expressions, hand gestures, body language), individuals who are blind are at a social disadvantage compared to their sighted peers. Very little work has explored sensory augmentation in the context of social assistive aids for individuals who are blind. The purpose of this study is to explore the following questions related to visual-to-vibrotactile mapping of facial action units (the building blocks of facial expressions): (1) How well can individuals who are blind recognize tactile facial action units compared to those who are sighted? (2) How well can individuals who are blind recognize emotions from tactile facial action units compared to those who are sighted? These questions are explored in a preliminary pilot test using absolute identification tasks in which participants learn and recognize vibrotactile stimulations presented through the Haptic Chair, a custom vibrotactile display embedded on the back of a chair. Study results show that individuals who are blind are able to recognize tactile facial action units as well as those who are sighted. These results hint at the potential for tactile facial action units to augment and expand access to social interactions for individuals who are blind.
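To make the idea of a visual-to-vibrotactile mapping more concrete, the sketch below illustrates one hypothetical way facial action units could be encoded as motor activation patterns on a back-mounted vibrotactile grid and grouped into emotion presentations. The grid size, the specific action units, the motor coordinates, and the emotion-to-action-unit groupings are all illustrative assumptions for this sketch; the abstract does not describe the actual encoding used by the Haptic Chair.

    # Illustrative sketch only: the abstract does not specify the actual
    # action-unit-to-motor mapping of the Haptic Chair, so the grid size,
    # action-unit set, and patterns below are hypothetical placeholders.

    from typing import Dict, List, Tuple

    # Hypothetical 4x4 grid of vibration motors on the chair back,
    # addressed as (row, column) coordinates.
    GRID_ROWS, GRID_COLS = 4, 4

    # Hypothetical mapping from FACS action units (the "building blocks"
    # of facial expressions) to the motors activated for each tactile cue.
    ACTION_UNIT_PATTERNS: Dict[str, List[Tuple[int, int]]] = {
        "AU6_cheek_raiser":       [(1, 0), (1, 3)],                    # shoulder-level pair
        "AU12_lip_corner_puller": [(3, 0), (3, 1), (3, 2), (3, 3)],    # sweep across lower back
        "AU4_brow_lowerer":       [(0, 1), (0, 2)],                    # upper-back pair
    }

    # Hypothetical grouping of action units into an emotion presentation,
    # mirroring the idea that expressions of emotion combine several AUs.
    EMOTION_TO_ACTION_UNITS: Dict[str, List[str]] = {
        "happiness": ["AU6_cheek_raiser", "AU12_lip_corner_puller"],
        "anger":     ["AU4_brow_lowerer"],
    }

    def motors_for_emotion(emotion: str) -> List[List[Tuple[int, int]]]:
        """Return the sequence of motor patterns (one per action unit) for an emotion."""
        return [ACTION_UNIT_PATTERNS[au] for au in EMOTION_TO_ACTION_UNITS[emotion]]

    if __name__ == "__main__":
        # Example: list the motor coordinates activated, in order, for "happiness".
        for pattern in motors_for_emotion("happiness"):
            print(pattern)

In an absolute identification task of the kind described above, each such pattern would be presented in isolation and the participant asked to name the corresponding action unit or emotion.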

【 License 】

Unknown   
