Remote Sensing
Intensity/Inertial Integration-Aided Feature Tracking on Event Cameras
Feng Zhou [1], Zeyu Li [1], Yong Liu [2], Xiaowan Li [3]
[1] College of Geodesy and Geomatics, Shandong University of Science and Technology, Qingdao 266590, China; [2] Institute of Cyber-Systems and Control, Zhejiang University, Hangzhou 310027, China; [3] National Time Service Center, Chinese Academy of Sciences, Xi’an 710600, China
Keywords: event camera; feature tracking; intensity/inertial integration
DOI: 10.3390/rs14081773
Source: DOAJ
【 Abstract 】
Achieving efficient and accurate feature tracking on event cameras is a fundamental step for practical high-level applications, such as simultaneous localization and mapping (SLAM), structure from motion (SfM), and visual odometry (VO) in GNSS (Global Navigation Satellite System)-denied environments. Although many asynchronous tracking methods that use the event stream alone have been proposed, they suffer from high computational demand and drift. In this paper, event information is still processed in the form of synthetic event frames to better match practical demands. Weighted fusion of multiple hypothesis testing with batch processing (WF-MHT-BP) is proposed, based on loose integration of event, intensity, and inertial information. More specifically, with inertial information acting as a prior, multiple hypothesis testing with batch processing (MHT-BP) produces coarse feature-tracking solutions on event frames in a batch-processing manner. With a time-related stochastic model, a weighted fusion mechanism then fuses the feature-tracking solutions from the event and intensity frames. Compared with other state-of-the-art feature-tracking methods on event cameras, evaluation on public datasets shows significant improvements in accuracy and efficiency and comparable performance in terms of feature-tracking length.
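As a rough illustration of the weighted fusion step described above, the sketch below fuses a feature position estimated from an event frame with one estimated from an intensity frame using time-dependent weights. The abstract does not specify the form of the paper's time-related stochastic model, so the exponential decay, the time constant tau, and the function name fuse_tracks are illustrative assumptions rather than the authors' actual formulation.

import numpy as np

def fuse_tracks(p_event, p_intensity, dt_event, dt_intensity, tau=0.05):
    """Fuse two per-frame feature-tracking solutions for one feature.

    p_event, p_intensity : (2,) arrays, pixel coordinates from the event-frame
        and intensity-frame trackers.
    dt_event, dt_intensity : time (s) elapsed since each frame was captured;
        older measurements are trusted less.
    tau : illustrative time constant of the assumed exponential trust decay.
    """
    # Time-related weights: confidence decays as each measurement ages.
    w_event = np.exp(-dt_event / tau)
    w_intensity = np.exp(-dt_intensity / tau)
    # Weighted average of the two tracking solutions.
    return (w_event * p_event + w_intensity * p_intensity) / (w_event + w_intensity)

# Example: the event-frame solution is fresher, so it dominates the fused result.
p_fused = fuse_tracks(np.array([120.4, 63.1]), np.array([121.0, 62.5]),
                      dt_event=0.002, dt_intensity=0.02)
print(p_fused)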
【 License 】
Unknown