Robust Depth-Aided RGBD-Inertial Odometry for Indoor Localization
Published in Measurement, 2023
Recommended citation: Zhao X, Li Q, Wang C, Liu B. Robust Depth-Aided RGBD-Inertial Odometry for Indoor Localization[J]. Measurement, 2023, 209: 112487. https://www.sciencedirect.com/science/article/abs/pii/S0263224123000519
RGB-D cameras such as the RealSense and Structure Sensor are widely used in robotic systems. This paper presents a system for estimating the trajectory of an RGB-D camera and IMU in indoor environments. The system uses a novel relative pose estimation method that combines depth measurements with epipolar constraints for initialization. An adaptive depth estimation method is also proposed, which fuses a depth uncertainty model with multi-view triangulation. In the backend, a sliding-window framework optimizes the system state by minimizing the residuals of the pre-integrated IMU measurements, the re-projection of 3D features, and the epipolar constraints of 2D features. The effectiveness of the system is evaluated on publicly available datasets with ground-truth trajectories.
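The sliding-window objective described above can be sketched as a weighted sum of squared residual terms. The function below is a minimal, hypothetical illustration (the names, weights, and scalar residuals are assumptions, not the paper's implementation); in the actual system a nonlinear least-squares solver would adjust the window states to reduce this cost.

```python
# Hypothetical sketch of the sliding-window cost from the abstract:
# a weighted sum of squared residuals from IMU pre-integration,
# 3D-feature re-projection, and 2D-feature epipolar constraints.
# All names and values here are illustrative, not the paper's code.

def window_cost(imu_residuals, reproj_residuals, epipolar_residuals,
                w_imu=1.0, w_reproj=1.0, w_epi=1.0):
    """Weighted sum-of-squares cost over one sliding window."""
    cost = 0.0
    for r in imu_residuals:        # pre-integrated IMU residuals
        cost += w_imu * r * r
    for r in reproj_residuals:     # 3D-feature re-projection residuals
        cost += w_reproj * r * r
    for r in epipolar_residuals:   # 2D-feature epipolar residuals
        cost += w_epi * r * r
    return cost

# Toy usage: an optimizer (e.g. Gauss-Newton) would iterate to drive
# this cost down by perturbing the camera/IMU states in the window.
print(window_cost([0.1, -0.2], [0.05], [0.3], w_epi=0.5))
```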
Download the paper from the journal.