National Preprint Platform (国家预印本平台)

Multi-IMU Sensor Fusion for Legged Robots


Source: arXiv
Abstract

This paper presents a state-estimation solution for legged robots that uses a set of low-cost, compact, and lightweight sensors to achieve low-drift pose and velocity estimation under challenging locomotion conditions. The key idea is to leverage multiple inertial measurement units on different links of the robot to correct a major error source in standard proprioceptive odometry. We fuse the inertial sensor information and joint encoder measurements in an extended Kalman filter, then combine the velocity estimate from this filter with camera data in a factor-graph-based sliding-window estimator to form a visual-inertial-leg odometry method. We validate our state estimator through comprehensive theoretical analysis and hardware experiments performed using real-world robot data collected during a variety of challenging locomotion tasks. Our algorithm consistently achieves minimal position deviation, even in scenarios involving substantial ground impact, foot slippage, and sudden body rotations. A C++ implementation, along with a large-scale dataset, is available at https://github.com/ShuoYangRobotics/Cerberus2.0.
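The first stage described above fuses IMU and joint-encoder information in an extended Kalman filter to produce a low-drift velocity estimate. The following is a minimal one-dimensional sketch of that predict/update cycle: an IMU acceleration drives the prediction, and a leg-odometry velocity (derived from joint encoders under a stationary-foot assumption) serves as the measurement. All function names, symbols, and noise values here are illustrative placeholders, not the paper's actual filter, which operates on the full robot state with multiple IMUs.

```python
def kf_predict(v, P, accel, dt, q):
    """Propagate the scalar velocity estimate using an IMU acceleration."""
    v_pred = v + accel * dt          # state propagation
    P_pred = P + q                   # process noise inflates the covariance
    return v_pred, P_pred

def kf_update(v, P, v_leg, r):
    """Correct the estimate with a leg-odometry velocity measurement."""
    K = P / (P + r)                  # Kalman gain (scalar case)
    v_new = v + K * (v_leg - v)      # innovation-weighted correction
    P_new = (1.0 - K) * P
    return v_new, P_new

# One filter step: the IMU predicts an acceleration, and the
# encoder-based leg odometry pulls the estimate back slightly.
v, P = 0.0, 1.0
v, P = kf_predict(v, P, accel=2.0, dt=0.1, q=0.01)
v, P = kf_update(v, P, v_leg=0.15, r=0.05)
```

The velocity output of this filter stage is what the paper then feeds, together with camera data, into the factor-graph sliding-window estimator.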

Shuo Yang, Zixin Zhang, John Z. Zhang, Ibrahima Sory Sow, Zachary Manchester

Subjects: Automation technology and equipment; computing and computer technology

Shuo Yang, Zixin Zhang, John Z. Zhang, Ibrahima Sory Sow, Zachary Manchester. Multi-IMU Sensor Fusion for Legged Robots [EB/OL]. (2025-07-15) [2025-08-02]. https://arxiv.org/abs/2507.11447.
