Robust Depth-Aided Visual-Inertial-Wheel Odometry for Mobile Robots

Published in IEEE Transactions on Industrial Electronics, 2023

Recommended citation: Zhao X, Li Q, Wang C, Liu B. Robust Depth-Aided Visual-Inertial-Wheel Odometry for Mobile Robots. IEEE Transactions on Industrial Electronics, 2023. https://ieeexplore.ieee.org/abstract/document/10297569

This article introduces visual-depth-inertial-wheel odometry (VDIWO), a robust approach for real-time localization of mobile robots in indoor and outdoor scenarios. Notably, VDIWO achieves accurate localization without relying on prior information. The approach integrates RGB-D camera, inertial measurement unit (IMU), and wheel-odometer measurements in a tightly coupled optimization framework. First, we introduce a depth measurement model based on a Gaussian mixture model to predict the depth uncertainty of feature points. Then, we propose a hybrid depth estimation method that combines depth measurement fusion with multiview triangulation to estimate the depth of landmarks while simultaneously identifying high-quality landmarks. Furthermore, we incorporate visual reprojection, depth measurement, and odometer preintegration constraints into the tightly coupled optimization framework to further improve pose estimation accuracy. We evaluate VDIWO on the OpenLORIS datasets and in real-world experiments. The results demonstrate the high accuracy and robustness of VDIWO for state estimation of mobile robots.
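To make the hybrid depth estimation idea concrete, here is a minimal sketch (not the paper's implementation) of one common way to realize it: fusing a direct RGB-D depth reading with a multiview-triangulated depth by inverse-variance weighting, and flagging a landmark as low-quality when the two estimates disagree. The function name, the gating threshold, and the use of inverse-variance fusion are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: inverse-variance fusion of two depth estimates for one
# landmark. `var_meas` would come from a depth-uncertainty model (e.g. the
# paper's GMM-based model) and `var_tri` from triangulation covariance; both
# are assumed given here.

def fuse_depths(d_meas, var_meas, d_tri, var_tri, gate=3.0):
    """Fuse a measured depth and a triangulated depth (both in meters).

    Returns (fused_depth, fused_variance, is_high_quality). The landmark is
    flagged low-quality when the two estimates differ by more than `gate`
    combined standard deviations.
    """
    # Consistency check between the two independent estimates.
    sigma = (var_meas + var_tri) ** 0.5
    consistent = abs(d_meas - d_tri) <= gate * sigma

    # Inverse-variance (information-weighted) fusion.
    w_meas = 1.0 / var_meas
    w_tri = 1.0 / var_tri
    fused_var = 1.0 / (w_meas + w_tri)
    fused = fused_var * (w_meas * d_meas + w_tri * d_tri)
    return fused, fused_var, consistent
```

In a full pipeline, landmarks that fail the consistency check could be excluded from the depth constraints and handled by triangulation alone, which is one simple way to interpret the "high-quality landmark" selection described above.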

Download the paper from IEEE Xplore.