National Preprint Platform

EgoZero: Robot Learning from Smart Glasses

Source: arXiv
Abstract

Despite recent progress in general-purpose robotics, robot policies still lag far behind basic human capabilities in the real world. Humans interact constantly with the physical world, yet this rich data resource remains largely untapped in robot learning. We propose EgoZero, a minimal system that learns robust manipulation policies from human demonstrations captured with Project Aria smart glasses, $\textbf{and zero robot data}$. EgoZero enables: (1) extraction of complete, robot-executable actions from in-the-wild, egocentric, human demonstrations, (2) compression of human visual observations into morphology-agnostic state representations, and (3) closed-loop policy learning that generalizes morphologically, spatially, and semantically. We deploy EgoZero policies on a gripper Franka Panda robot and demonstrate zero-shot transfer with a 70% success rate over 7 manipulation tasks and only 20 minutes of data collection per task. Our results suggest that in-the-wild human data can serve as a scalable foundation for real-world robot learning, paving the way toward a future of abundant, diverse, and naturalistic training data for robots. Code and videos are available at https://egozero-robot.github.io.

Vincent Liu, Ademi Adeniji, Haotian Zhan, Siddhant Haldar, Raunaq Bhirangi, Pieter Abbeel, Lerrel Pinto

Subjects: Automation technology and equipment; Computing and computer technology

Vincent Liu, Ademi Adeniji, Haotian Zhan, Siddhant Haldar, Raunaq Bhirangi, Pieter Abbeel, Lerrel Pinto. EgoZero: Robot Learning from Smart Glasses [EB/OL]. (2025-05-26) [2025-06-09]. https://arxiv.org/abs/2505.20290.
