ERR@HRI 2024 Challenge: Multimodal Detection of Errors and Failures in Human-Robot Interactions

Source: arXiv

Abstract

Despite recent advancements in robotics and machine learning (ML), the deployment of autonomous robots in our everyday lives remains an open challenge. This is due to multiple reasons, among them their frequent mistakes, such as interrupting people or responding with delays, as well as their limited ability to understand human speech, i.e., failures in tasks like transcribing speech to text. These mistakes may disrupt interactions and negatively influence how humans perceive these robots. To address this problem, robots need the ability to detect human-robot interaction (HRI) failures. The ERR@HRI 2024 challenge tackles this by offering a benchmark multimodal dataset of robot failures during HRI, encouraging researchers to develop and benchmark multimodal machine learning models to detect these failures. We created a dataset featuring multimodal non-verbal interaction data, including facial, speech, and pose features extracted from video clips of interactions with a robotic coach, annotated with labels indicating the presence or absence of robot mistakes, user awkwardness, and interaction ruptures, allowing for the training and evaluation of predictive models. Challenge participants have been invited to submit their multimodal ML models for the detection of robot errors, to be evaluated against performance metrics such as accuracy, precision, recall, and F1 score, with and without a margin of error reflecting the time-sensitivity of these metrics. The results of this challenge will help the research field better understand robot failures in human-robot interactions and design autonomous robots that can mitigate their own errors after successfully detecting them.
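The abstract does not spell out how the time-tolerant variant of these metrics is computed, so the following is only a minimal sketch of one plausible interpretation: binary per-frame error labels, where a margin of error means a prediction counts as correct if a ground-truth error frame lies within a symmetric tolerance window around it. The function name and window convention are assumptions, not the challenge's official evaluation code.

```python
import numpy as np

def detection_metrics(y_true, y_pred, margin=0):
    """Precision, recall, and F1 for binary per-frame error labels.

    With margin > 0, a predicted positive counts as a true positive if any
    ground-truth positive lies within +/- `margin` frames of it, and a
    ground-truth positive counts as missed only if no prediction falls
    within +/- `margin` frames of it (a time-tolerant evaluation).
    """
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)

    if margin == 0:
        # Strict frame-level matching.
        tp = np.sum(y_true & y_pred)
        fp = np.sum(~y_true & y_pred)
        fn = np.sum(y_true & ~y_pred)
    else:
        true_idx = np.flatnonzero(y_true)
        pred_idx = np.flatnonzero(y_pred)
        # Predictions matched to some ground-truth frame within the window.
        tp = sum(np.any(np.abs(true_idx - p) <= margin) for p in pred_idx)
        fp = len(pred_idx) - tp
        # Ground-truth frames with no prediction within the window.
        fn = sum(not np.any(np.abs(pred_idx - t) <= margin) for t in true_idx)

    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Example: a prediction one frame late is wrong under strict matching
# but accepted with a one-frame margin.
y_true = [0, 0, 1, 0, 0, 0]
y_pred = [0, 0, 0, 1, 0, 0]
print(detection_metrics(y_true, y_pred, margin=0))  # (0.0, 0.0, 0.0)
print(detection_metrics(y_true, y_pred, margin=1))  # (1.0, 1.0, 1.0)
```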

Micol Spitale, Garima Kankariya, Neval Kara, Malte Jung, Chien-Ming Huang, Wendy Ju, Minja Axelsson, Hatice Gunes, Maia Stiber, Maria Teresa Parreira

Subjects: Automation Technology and Equipment; Computing and Computer Technology; Applications of Electronic Technology

Micol Spitale, Garima Kankariya, Neval Kara, Malte Jung, Chien-Ming Huang, Wendy Ju, Minja Axelsson, Hatice Gunes, Maia Stiber, Maria Teresa Parreira. ERR@HRI 2024 Challenge: Multimodal Detection of Errors and Failures in Human-Robot Interactions [EB/OL]. (2024-07-08) [2025-08-16]. https://arxiv.org/abs/2407.06094.