| IEEE Access | |
| Appearance-Based Gaze Estimator for Natural Interaction Control of Surgical Robots | |
| Yunhui Liu¹, Hiuman Yip¹, Xuebin Hou², Peng Li², Xingguang Duan³, Guoli Song⁴ | |
| [1] Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, Hong Kong; [2] School of Mechanical Engineering and Automation, Harbin Institute of Technology (Shenzhen), Shenzhen, China; [3] School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China; [4] State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China | |
| Keywords: Deep learning; surgical robot; gaze estimation; convolutional neural network | |
| DOI: 10.1109/ACCESS.2019.2900424 | |
| Source: DOAJ | |
【 Abstract 】
Robots are playing an increasingly important role in modern surgery. However, conventional human-computer interaction methods, such as joystick control and voice control, have shortcomings, and medical personnel must undergo dedicated training to operate the robot. We propose a human-computer interaction model based on eye movement, with which medical staff can conveniently control the robot using their gaze. Our algorithm requires only an RGB camera and does not rely on expensive eye-tracking devices. Two eye-control modes are designed in this paper. The first is pick-and-place movement, in which the user's gaze specifies the point to which the robotic arm should move. The second is user-commanded movement, in which the user's gaze selects the direction in which the robot should move. The experimental results demonstrate the feasibility and convenience of these two modes.
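The second control mode, selecting a movement direction by gaze, can be sketched as follows. This is a minimal illustration only, assuming a hypothetical `gaze_to_direction` helper, screen resolution, and dead-zone threshold that are not specified in the abstract:

```python
# Hypothetical sketch: map an estimated 2D gaze point on the screen to a
# discrete direction command for the robot. Names and thresholds are
# illustrative assumptions, not taken from the paper.

def gaze_to_direction(gx, gy, width=1920, height=1080, dead_zone=0.15):
    """Map a gaze point (pixels) to 'left', 'right', 'up', 'down', or 'hold'.

    Gaze near the screen centre (within dead_zone of the half-extent)
    yields 'hold' so the robot does not drift while the user fixates.
    """
    # Normalize to [-1, 1] with the screen centre at the origin.
    nx = (gx - width / 2) / (width / 2)
    ny = (gy - height / 2) / (height / 2)
    if abs(nx) < dead_zone and abs(ny) < dead_zone:
        return "hold"
    # Issue the command along the dominant axis of displacement.
    if abs(nx) >= abs(ny):
        return "right" if nx > 0 else "left"
    return "down" if ny > 0 else "up"  # screen y grows downward

print(gaze_to_direction(1800, 540))  # gaze far to the right of centre
```

In practice the gaze point itself would come from the appearance-based CNN estimator the paper describes; the dead zone is one common way to keep fixation from being misread as a command.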
【 License 】
Unknown