
Grasp Prediction based on Local Finger Motion Dynamics


Source: arXiv
Abstract

The ability to predict the object the user intends to grasp offers essential contextual information and may help to leverage the effects of point-to-point latency in interactive environments. This paper explores the feasibility and accuracy of real-time recognition of uninstrumented objects based on hand kinematics during reach-to-grasp actions. In a data collection study, we recorded the hand motions of 16 participants while reaching out to grasp and then moving real and synthetic objects. Our results demonstrate that even a simple LSTM network can predict the time point at which the user grasps an object with a precision better than 21 ms and the current distance to this object with a precision better than 1 cm. The target's size can be determined in advance with an accuracy better than 97%. Our results have implications for designing adaptive and fine-grained interactive user interfaces in ubiquitous and mixed-reality environments.
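The abstract does not specify the model beyond "a simple LSTM network", so the following is only a minimal illustrative sketch, not the authors' implementation. It assumes PyTorch, a per-frame feature vector of 3-D hand-joint positions (e.g., 21 joints x 3 coordinates), and a hypothetical model name GraspLSTM with three output heads: time until grasp, distance to the target, and target-size class. All layer sizes and input dimensions are placeholder assumptions.

# Minimal, illustrative sketch only -- not the authors' implementation.
# Each frame is assumed to be a flattened vector of 3-D hand-joint positions.
import torch
import torch.nn as nn

class GraspLSTM(nn.Module):
    """Hypothetical LSTM mapping a hand-motion sequence to three outputs:
    time until grasp, distance to the target, and a target-size class."""

    def __init__(self, input_dim=63, hidden_dim=128, num_size_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.time_head = nn.Linear(hidden_dim, 1)               # regress time-to-grasp
        self.dist_head = nn.Linear(hidden_dim, 1)               # regress distance to target
        self.size_head = nn.Linear(hidden_dim, num_size_classes)  # classify target size

    def forward(self, x):
        # x: (batch, time, input_dim) sequence of per-frame hand features
        out, _ = self.lstm(x)
        last = out[:, -1, :]  # use the most recent hidden state for prediction
        return self.time_head(last), self.dist_head(last), self.size_head(last)

if __name__ == "__main__":
    model = GraspLSTM()
    frames = torch.randn(4, 60, 63)  # 4 example sequences, 60 frames each
    t_hat, d_hat, size_logits = model(frames)
    print(t_hat.shape, d_hat.shape, size_logits.shape)

In such a setup one would presumably train on sliding windows of the recorded reach-to-grasp sequences, using regression losses (e.g., MSE) for the time and distance heads and cross-entropy for the size head; the paper's reported precisions (21 ms, 1 cm, 97%) refer to the authors' own model and data, not to this sketch.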

Dimitar Valkov, Pascal Kockwelp, Florian Daiber, Antonio Krüger

Subject: Computing Technology; Computer Technology

Dimitar Valkov, Pascal Kockwelp, Florian Daiber, Antonio Krüger. Grasp Prediction based on Local Finger Motion Dynamics [EB/OL]. (2025-06-12) [2025-07-02]. https://arxiv.org/abs/2506.10818.
