MediaPipe Hands: On-device Real-time Hand Tracking
We present a real-time on-device hand tracking pipeline that predicts a hand skeleton from a single RGB camera for AR/VR applications. The pipeline consists of two models: 1) a palm detector and 2) a hand landmark model. It is implemented via MediaPipe, a framework for building cross-platform ML solutions. The proposed model and pipeline architecture demonstrate real-time inference speed on mobile GPUs with high prediction quality. MediaPipe Hands is open sourced at https://mediapipe.dev.
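To make the two-model split concrete, the following is a minimal Python sketch of the control flow the abstract describes: a palm detector proposes a hand region, then a landmark model predicts the skeleton inside it. The model functions here are stubs (the real models are neural networks not shown in the abstract), and the sparse-detection scheduling — running the detector only when no hand is confidently tracked, which is what makes the pipeline fast — is assumed from the full paper.

```python
def palm_detector(frame):
    # Stub: a real palm detector would return bounding boxes from the RGB frame.
    return {"bbox": (0, 0, 64, 64)}

def hand_landmark_model(frame, region):
    # Stub: the real landmark model regresses 21 3D keypoints inside the
    # region, plus a confidence that a hand is still present there.
    return {"landmarks": [(0.0, 0.0, 0.0)] * 21, "confidence": 0.9}

def track(frames, min_confidence=0.5):
    """Run the palm detector only when there is no confidently tracked hand;
    otherwise reuse the previous region and run only the landmark model."""
    region = None
    detector_calls = 0
    results = []
    for frame in frames:
        if region is None:
            region = palm_detector(frame)["bbox"]
            detector_calls += 1
        out = hand_landmark_model(frame, region)
        if out["confidence"] < min_confidence:
            region = None  # hand lost: fall back to detection on the next frame
        results.append(out["landmarks"])
    return results, detector_calls

landmarks, calls = track(["frame"] * 5)
```

With a confidently tracked hand, the detector runs once for five frames while the landmark model runs every frame; this detector/tracker split is the main source of the pipeline's real-time speed on mobile GPUs.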
Fan Zhang, Andrei Tkachenka, George Sung, Andrey Vakunov, Chuo-Ling Chang, Valentin Bazarevsky, Matthias Grundmann
Fan Zhang, Andrei Tkachenka, George Sung, Andrey Vakunov, Chuo-Ling Chang, Valentin Bazarevsky, Matthias Grundmann. MediaPipe Hands: On-device Real-time Hand Tracking [EB/OL]. (2020-06-17) [2025-06-09]. https://arxiv.org/abs/2006.10214.