Journal Article Details
ROBOMECH Journal
3D pointing gestures as target selection tools: guiding monocular UAVs during window selection in an outdoor environment
Jason Orlosky [1], Haruo Takemura [1], Yuki Uranishi [1], Photchara Ratsamee [1], Manabu Higashida [1], Anna C. S. Medeiros [2]
[1] Cybermedia Center, Osaka University, Toyonaka, Osaka, Japan; [2] Graduate School of Information Science and Technology, Osaka University, Suita, Osaka, Japan
Keywords: Gestural interface; Human–Drone interaction; Gesture development process; Pointing gesture; Object selection
DOI: 10.1186/s40648-021-00200-w
Source: Springer
Abstract

Firefighters need to gain information from both inside and outside of buildings in first-response emergency scenarios, and drones are beneficial for this purpose. This paper presents an elicitation study that revealed firefighters' desire to collaborate with autonomous drones. We developed a Human–Drone Interaction (HDI) method for indicating a target to a drone using 3D pointing gestures estimated solely from a monocular camera. The participant points to a window without using any wearable or body-attached device; through its front-facing camera, the drone detects the gesture and computes the target window. This work describes the process of choosing the gesture, detecting and localizing objects, and carrying out the transformations between coordinate systems. Our proposed 3D pointing gesture interface improves on 2D interfaces by integrating depth information with SLAM, resolving the ambiguity among multiple objects aligned on the same plane in a large-scale outdoor environment. Experimental results showed that the interface achieved average F1 scores (the harmonic mean of precision and recall) of 0.85 and 0.73 in simulation and real-world experiments, respectively, and an F1 score of 0.58 at the maximum tested distance of 25 m between the drone and the building.
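Since the abstract only outlines the selection pipeline, below is a minimal sketch of ray-based target selection under the assumptions the abstract implies: the pointer's shoulder and wrist positions and the candidate window centers are all expressed in a common SLAM world frame (the depth information that lets the ray discriminate between windows that project close together in the drone's 2D image). The function and variable names (`select_target_window`, `shoulder`, `wrist`, `window_centers`) are hypothetical, not the authors' implementation.

```python
import numpy as np

def select_target_window(shoulder, wrist, window_centers):
    """Return the index of the window whose center lies closest in angle
    to the pointing ray running from the shoulder through the wrist.

    All points are 3D coordinates in the same world (SLAM map) frame.
    Illustrative sketch only; not the paper's actual implementation.
    """
    shoulder = np.asarray(shoulder, dtype=float)
    ray = np.asarray(wrist, dtype=float) - shoulder
    ray /= np.linalg.norm(ray)

    best_idx, best_angle = None, np.inf
    for i, center in enumerate(window_centers):
        to_center = np.asarray(center, dtype=float) - shoulder
        to_center /= np.linalg.norm(to_center)
        # Angle between the pointing ray and the direction to this window.
        angle = np.arccos(np.clip(np.dot(ray, to_center), -1.0, 1.0))
        if angle < best_angle:
            best_idx, best_angle = i, angle
    return best_idx, best_angle

# Hypothetical usage: three windows aligned on one facade, 5 m away.
windows = [(-2.0, 5.0, 1.5), (0.0, 5.0, 1.5), (2.0, 5.0, 1.5)]
idx, err = select_target_window((0.0, 0.0, 1.4), (0.2, 0.5, 1.5), windows)
print(f"selected window {idx}, angular error {np.degrees(err):.1f} deg")
```

Comparing angles from the pointer's shoulder, rather than 2D image-plane distances, is what makes the 3D interface unambiguous when several windows line up on the same facade plane, which matches the advantage over 2D interfaces claimed in the abstract.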
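For reference, the F1 scores quoted above are the standard harmonic mean of precision \(P\) and recall \(R\):

```latex
F_1 = \frac{2PR}{P + R}
```

For instance, a precision of 0.78 and recall of 0.69 (hypothetical values, not from the paper) would yield \(F_1 \approx 0.73\), the scale of the real-world result reported above.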

License

CC BY   
