Journal Article Details
Sensors
Depth-Camera-Aided Inertial Navigation Utilizing Directional Constraints
Usman Qayyum 1, Jonghyuk Kim 2
[1] Center of Excellence in Science & Applied Technology (CESAT), Islamabad 45550, Pakistan
[2] Robotics Institute, University of Technology Sydney, Sydney, NSW 2006, Australia
Keywords: integrated inertial navigation; depth camera; directional constraints; epipolar constraints
DOI: 10.3390/s21175913
Source: DOAJ
【 Abstract 】

This paper presents a practical yet effective solution for integrating an RGB-D camera and an inertial sensor to handle the depth dropouts that frequently occur in outdoor environments due to the sensor's short detection range and sunlight interference. During depth dropouts, only partial five-degree-of-freedom (5-DoF) pose information (attitude, and position with an unknown scale) is available from the RGB-D sensor. To enable continuous fusion with the inertial solution, the scale-ambiguous position is cast as a directional constraint on the vehicle motion, which is, in essence, an epipolar constraint in multi-view geometry. Unlike other visual navigation approaches, this can effectively reduce drift in the inertial solution without delay, even under small-parallax motion. When a depth image is available, a window-based feature map is maintained to compute the RGB-D odometry, which is then fused with the inertial outputs in an extended Kalman filter framework. Flight results from indoor and outdoor environments, as well as public datasets, demonstrate the improved navigation performance of the proposed approach.
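The core idea above, treating a scale-ambiguous camera translation as a directional constraint on the INS-predicted displacement and applying it as an EKF measurement update, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the residual (a cross product that vanishes when the predicted displacement is parallel to the measured direction), the numerical Jacobian, and the 3-state displacement vector are all simplifying assumptions made here for clarity.

```python
import numpy as np

def unit(v):
    """Normalize a vector to unit length."""
    return v / np.linalg.norm(v)

def direction_residual(delta_p, d_meas):
    """Directional (epipolar-style) constraint residual: the INS-predicted
    displacement delta_p should be parallel to the scale-ambiguous unit
    direction d_meas from the visual front end. The cross product is zero
    exactly when the two vectors are parallel."""
    return np.cross(d_meas, unit(delta_p))

def direction_jacobian(delta_p, d_meas, eps=1e-6):
    """Central-difference Jacobian of the residual w.r.t. delta_p
    (a sketch; an analytic Jacobian would be used in practice)."""
    J = np.zeros((3, 3))
    for i in range(3):
        dp = np.zeros(3)
        dp[i] = eps
        J[:, i] = (direction_residual(delta_p + dp, d_meas)
                   - direction_residual(delta_p - dp, d_meas)) / (2 * eps)
    return J

def ekf_direction_update(x, P, d_meas, R):
    """One EKF measurement update driving the directional residual to zero.
    Here the state x is just the 3-D displacement since the last keyframe;
    the real filter would carry attitude, velocity, and biases as well."""
    z = direction_residual(x, d_meas)          # innovation: 0 - h(x) = -z
    H = direction_jacobian(x, d_meas)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x - K @ z
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new
```

A single update with a confident measurement (small R) pulls the predicted displacement toward the measured bearing without ever needing the unknown scale, which is why the fusion can continue through depth dropouts.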

【 License 】

Unknown   
