
The Monado SLAM Dataset for Egocentric Visual-Inertial Tracking

Source: arXiv

Abstract

Humanoid robots and mixed reality headsets benefit from the use of head-mounted sensors for tracking. While advancements in visual-inertial odometry (VIO) and simultaneous localization and mapping (SLAM) have produced new, high-quality state-of-the-art tracking systems, we show that these still fail to gracefully handle many of the challenging settings that arise in head-mounted use cases. Common scenarios such as high-intensity motions, dynamic occlusions, long tracking sessions, low-textured areas, adverse lighting conditions, and sensor saturation remain poorly covered by existing datasets in the literature. As a result, systems may inadvertently overlook these essential real-world issues. To address this, we present the Monado SLAM dataset, a set of real sequences captured with multiple virtual reality headsets. We release the dataset under a permissive CC BY 4.0 license to drive advancements in VIO/SLAM research and development.

Mateo de Mayo, Daniel Cremers, Taihú Pire

Subjects: Computing Technology, Computer Technology

Mateo de Mayo, Daniel Cremers, Taihú Pire. The Monado SLAM Dataset for Egocentric Visual-Inertial Tracking [EB/OL]. (2025-07-31) [2025-08-11]. https://arxiv.org/abs/2508.00088.
