Point-line-based RGB-D Visual Odometry in Indoor Environments
In this paper, a robust RGB-D visual odometry based on point and line features is proposed. It improves the localization accuracy of visual odometry by exploiting the geometric structure regularities of indoor scenes. A robust line extraction and matching method is constructed to improve the correctness of line matching. The scene is modeled as a Mixture of Manhattan Frames, and the multiple Manhattan frames present in the scene are identified from lines and surface normals. Depending on whether the Manhattan world assumption holds in the scene, one of two pose estimation methods is selected: a method that decouples rotation and translation, or a method based on feature tracking. The parallelism and orthogonality constraints between lines, together with the constraints between lines and the dominant directions of the Manhattan frames, are combined with the reprojection errors of points and lines to jointly optimize the camera pose. Experiments were carried out on synthetic and real-world datasets of small-scale indoor scenes. The average localization accuracy is 1.5 cm on the synthetic datasets and 1.7 cm on the real-world datasets, a twofold improvement over state-of-the-art methods. The experimental results show that the proposed method effectively improves the localization accuracy and robustness of visual odometry in indoor scenes.
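The joint point-line optimization described above minimizes reprojection errors of both feature types. A minimal sketch of the two residual terms is given below, assuming a pinhole camera and a 2-D line parameterized by normalized homogeneous coefficients (a, b, c); all function names and the parameterization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def project(K, R, t, Xw):
    """Project 3-D world points (N, 3) into pixels with pose (R, t) and intrinsics K."""
    Xc = Xw @ R.T + t                # transform into the camera frame
    uv = Xc @ K.T                    # homogeneous pixel coordinates
    return uv[:, :2] / uv[:, 2:3]    # perspective division

def point_residuals(K, R, t, Xw, obs):
    """Pixel reprojection error for point features against their observations."""
    return project(K, R, t, Xw) - obs

def line_residuals(K, R, t, endpoints_w, line_2d):
    """Signed point-to-line distances of projected 3-D line endpoints to an
    observed 2-D line (a, b, c) with a^2 + b^2 = 1 (assumed parameterization)."""
    p = project(K, R, t, endpoints_w)           # (M, 2) projected endpoints
    ph = np.hstack([p, np.ones((len(p), 1))])   # homogeneous pixels
    return ph @ line_2d                         # one distance per endpoint

# Toy usage: identity pose, one point on the optical axis
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
X = np.array([[0.0, 0.0, 2.0]])          # 3-D point 2 m in front of the camera
obs = np.array([[320.0, 240.0]])         # it should project to the principal point
print(point_residuals(K, R, t, X, obs))  # → [[0. 0.]]
```

In a full system, both residual vectors would be stacked (optionally with the Manhattan-direction constraint terms) and minimized over (R, t) with a nonlinear least-squares solver.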
Deng Zhongliang, Yin Jiahui
Computing and computer technology; automation technology and equipment; remote sensing technology
Keywords: SLAM; visual odometry; localization; RGB-D camera
Deng Zhongliang, Yin Jiahui. Point-line-based RGB-D Visual Odometry in Indoor Environments [EB/OL]. (2023-03-27) [2025-08-02]. http://www.paper.edu.cn/releasepaper/content/202303-277.