Real-Time Imitation of Human Head Motions, Blinks and Emotions by Nao Robot: A Closed-Loop Approach
This paper introduces a novel approach for enabling real-time imitation of human head motion by a Nao robot, with a primary focus on enhancing human-robot interaction. By leveraging MediaPipe as the computer vision library and DeepFace as the emotion recognition library, this research captures the subtleties of human head motion, including blink actions and emotional expressions, and seamlessly incorporates these cues into the robot's responses. The result is a comprehensive framework that facilitates precise head imitation within human-robot interactions, using a closed-loop approach that gathers real-time feedback on the robot's imitation performance. This feedback loop ensures a high degree of accuracy in modeling head motion, as evidenced by an R² score of 96.3 for pitch and 98.9 for yaw. Notably, the proposed approach holds promise for improving communication for children with autism, offering them a valuable tool for more effective interaction. In essence, the proposed work explores the integration of real-time head imitation and real-time emotion recognition to enhance human-robot interactions, with potential benefits for individuals with unique communication needs.
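The pipeline described in the abstract (MediaPipe for head pose and blinks, DeepFace for emotion, with pitch/yaw commands sent to the Nao head) could be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the landmark-to-angle mapping, the robot IP placeholder, and the commented-out NAOqi ALMotion calls are assumptions made for the sake of the example.

```python
# Minimal sketch (assumptions noted above): webcam frame -> MediaPipe Face Mesh
# for a crude pitch/yaw estimate -> DeepFace for emotion -> (optionally) NAOqi
# ALMotion to drive the Nao head.
import cv2
import mediapipe as mp
import numpy as np
from deepface import DeepFace
# from naoqi import ALProxy  # requires the NAOqi Python SDK (not shown running here)

mp_face_mesh = mp.solutions.face_mesh


def estimate_head_pose(landmarks):
    """Rough yaw/pitch estimate from a few face-mesh landmarks (illustrative mapping)."""
    # Landmark indices: 1 = nose tip, 33 = left eye outer corner, 263 = right eye outer corner.
    nose = landmarks[1]
    left_eye = landmarks[33]
    right_eye = landmarks[263]
    eye_center_x = (left_eye.x + right_eye.x) / 2.0
    eye_center_y = (left_eye.y + right_eye.y) / 2.0
    yaw = (nose.x - eye_center_x) * np.pi        # crude proportional mapping (assumption)
    pitch = (nose.y - eye_center_y - 0.15) * np.pi
    return float(pitch), float(yaw)


def main():
    cap = cv2.VideoCapture(0)
    # motion = ALProxy("ALMotion", "<NAO_IP>", 9559)  # hypothetical robot address
    with mp_face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as mesh:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            result = mesh.process(rgb)
            if result.multi_face_landmarks:
                lms = result.multi_face_landmarks[0].landmark
                pitch, yaw = estimate_head_pose(lms)
                # motion.setAngles(["HeadPitch", "HeadYaw"], [pitch, yaw], 0.2)
                # Emotion recognition on the current frame; recent DeepFace versions
                # return a list of result dicts.
                analysis = DeepFace.analyze(frame, actions=["emotion"],
                                            enforce_detection=False)
                print(pitch, yaw, analysis[0]["dominant_emotion"])
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cap.release()


if __name__ == "__main__":
    main()
```

In the paper's closed-loop setting, the robot's measured joint angles would additionally be fed back and compared with the commanded pitch/yaw; the sketch above only covers the open-loop sensing and command path.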
Keyhan Rayati, Amirhossein Feizi, Alireza Beigy, Pourya Shahverdi, Mehdi Tale Masouleh, Ahmad Kalhor
10.1109/ICRoM60803.2023.10412471
Computing and computer technology; automation technology and equipment; applications of electronic technology
Keyhan Rayati, Amirhossein Feizi, Alireza Beigy, Pourya Shahverdi, Mehdi Tale Masouleh, Ahmad Kalhor. Real-Time Imitation of Human Head Motions, Blinks and Emotions by Nao Robot: A Closed-Loop Approach [EB/OL]. (2025-04-28) [2025-05-06]. https://arxiv.org/abs/2504.19985