Journal Article Details
ROBOMECH Journal
3D pointing gestures as target selection tools: guiding monocular UAVs during window selection in an outdoor environment
article
Medeiros, Anna C. S.[1]; Ratsamee, Photchara[2]; Orlosky, Jason[2]; Uranishi, Yuki[2]; Higashida, Manabu[2]; Takemura, Haruo[2]
[1] Graduate School of Information Science and Technology, Osaka University
[2] Cybermedia Center, Osaka University
Keywords: Gestural interface; Human–Drone interaction; Gesture development process; Pointing gesture; Object selection
DOI: 10.1186/s40648-021-00200-w
Subject classification: Social Sciences, Humanities and Arts (General)
Source: Springer
【 Abstract 】

Firefighters need to gain information from both inside and outside buildings in first-response emergency scenarios, and drones are well suited to this task. This paper presents an elicitation study that revealed firefighters' desire to collaborate with autonomous drones. We developed a Human–Drone Interaction (HDI) method for indicating a target to a drone using 3D pointing gestures estimated solely from a monocular camera. The participant points to a window without using any wearable or body-attached device; through its front-facing camera, the drone detects the gesture and computes the target window. This work describes the process of choosing the gesture, detecting and localizing objects, and carrying out the transformations between coordinate systems. Our proposed 3D pointing gesture interface improves on 2D interfaces by integrating depth information with SLAM, resolving the ambiguity among multiple objects aligned on the same plane in a large-scale outdoor environment. Experimental results showed that our 3D pointing gesture interface achieved average F1 scores of 0.85 in simulation and 0.73 in real-world experiments, and an F1 score of 0.58 at the maximum distance of 25 m between the drone and the building.
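The abstract describes selecting one of several candidate windows from an estimated 3D pointing ray. A minimal sketch of one plausible selection rule, choosing the window whose center subtends the smallest angle with the ray (the function name, inputs, and the angular criterion are illustrative assumptions, not the paper's published algorithm):

```python
import numpy as np

def select_target(ray_origin, ray_dir, window_centers):
    """Pick the window whose 3D center lies closest (in angle) to the pointing ray.

    ray_origin, ray_dir: 3D origin and direction of the estimated pointing gesture.
    window_centers: list of 3D window centers (e.g. from detection + SLAM depth),
    all expressed in the same coordinate frame as the ray.
    Returns (index of selected window, angular error in radians).
    """
    ray_dir = np.asarray(ray_dir, dtype=float)
    ray_dir = ray_dir / np.linalg.norm(ray_dir)

    best_idx, best_angle = None, np.inf
    for i, center in enumerate(window_centers):
        # Unit vector from the ray origin toward this window center.
        v = np.asarray(center, dtype=float) - np.asarray(ray_origin, dtype=float)
        v = v / np.linalg.norm(v)
        # Angle between the pointing direction and the direction to the window.
        angle = np.arccos(np.clip(ray_dir @ v, -1.0, 1.0))
        if angle < best_angle:
            best_idx, best_angle = i, angle
    return best_idx, best_angle
```

Because the comparison is angular rather than pixel-based, windows stacked on the same building plane at different depths remain distinguishable once their 3D positions are known, which is the ambiguity the abstract says depth information resolves.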

【 License 】

CC BY

【 Preview 】
Attachment list
File	Size	Format	View
RO202108090001427ZK.pdf	3725 KB	PDF	download
Document metrics
Downloads: 6	Views: 0