
UniTracker: Learning Universal Whole-Body Motion Tracker for Humanoid Robots


Source: arXiv

Abstract

Humanoid robots must achieve diverse, robust, and generalizable whole-body control to operate effectively in complex, human-centric environments. However, existing methods, particularly those based on teacher-student frameworks, often suffer from a loss of motion diversity during policy distillation and exhibit limited generalization to unseen behaviors. In this work, we present UniTracker, a simplified yet powerful framework that integrates a Conditional Variational Autoencoder (CVAE) into the student policy to explicitly model the latent diversity of human motion. By leveraging a learned CVAE prior, our method enables the student to retain expressive motion characteristics while improving robustness and adaptability under partial observations. The result is a single policy capable of tracking a wide spectrum of whole-body motions with high fidelity and stability. Comprehensive experiments in both simulation and real-world deployments demonstrate that UniTracker significantly outperforms MLP-based DAgger baselines in motion quality, generalization to unseen references, and deployment robustness, offering a practical and scalable solution for expressive humanoid control.
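The abstract's core idea, a student policy whose action decoder is conditioned on a CVAE latent, with a learned prior used when the reference motion is only partially observed, can be illustrated with a minimal forward-pass sketch. This is a hypothetical reconstruction, not the paper's implementation: all network sizes, names (`encoder`, `prior`, `decoder`), and dimensions are illustrative assumptions, and real training would add reconstruction and KL losses.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_init(sizes):
    # Small randomly initialized MLP (illustrative weights only).
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_forward(params, x):
    # tanh hidden layers, linear output layer
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

# Hypothetical dimensions for proprioception, reference motion,
# CVAE latent, and joint-level actions.
OBS_DIM, REF_DIM, LATENT_DIM, ACT_DIM = 48, 24, 16, 12

# Encoder: (observation, reference motion) -> latent mean / log-variance.
encoder = mlp_init([OBS_DIM + REF_DIM, 64, 2 * LATENT_DIM])
# Learned prior conditioned on observation alone, so the policy can still
# produce plausible latents under partial observations, as the abstract notes.
prior = mlp_init([OBS_DIM, 64, 2 * LATENT_DIM])
# Decoder (the control policy): (observation, latent) -> action.
decoder = mlp_init([OBS_DIM + LATENT_DIM, 64, ACT_DIM])

def reparameterize(stats):
    # Standard CVAE reparameterization trick: z = mu + sigma * eps.
    mu, logvar = np.split(stats, 2, axis=-1)
    return mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)

obs = rng.standard_normal(OBS_DIM)
ref = rng.standard_normal(REF_DIM)

# Training-time path: encode the full reference into a latent, then decode.
z = reparameterize(mlp_forward(encoder, np.concatenate([obs, ref])))
action = mlp_forward(decoder, np.concatenate([obs, z]))

# Deployment-time path: sample the latent from the learned prior instead,
# keeping motion diversity without requiring the full reference.
z_prior = reparameterize(mlp_forward(prior, obs))
action_prior = mlp_forward(decoder, np.concatenate([obs, z_prior]))
```

The two paths share the decoder, which is what lets diversity captured during training carry over to deployment when only the observation is available.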

Kangning Yin, Weishuai Zeng, Ke Fan, Zirui Wang, Qiang Zhang, Zheng Tian, Jingbo Wang, Jiangmiao Pang, Weinan Zhang

Subjects: automation technology and equipment; computing and computer technology

Kangning Yin, Weishuai Zeng, Ke Fan, Zirui Wang, Qiang Zhang, Zheng Tian, Jingbo Wang, Jiangmiao Pang, Weinan Zhang. UniTracker: Learning Universal Whole-Body Motion Tracker for Humanoid Robots [EB/OL]. (2025-07-10) [2025-07-25]. https://arxiv.org/abs/2507.07356.
