 

Err@hri 2024 Challenge: Multimodal Detection of Errors and Failures in Human-Robot Interactions


Date

2024

Publisher

Association for Computing Machinery

Open Access Color

HYBRID

Green Open Access

Yes


Publicly Funded

No
Impulse: Average
Influence: Average
Popularity: Top 10%

Abstract

Despite recent advances in robotics and machine learning (ML), deploying autonomous robots in everyday life remains an open challenge. This is due to several factors, including robots' frequent mistakes, such as interrupting people or responding with a delay, and their limited ability to understand human speech, e.g., failures in tasks like transcribing speech to text. These mistakes may disrupt interactions and negatively influence how humans perceive these robots. To address this problem, robots need the ability to detect human-robot interaction (HRI) failures. The ERR@HRI 2024 challenge tackles this by offering a benchmark multimodal dataset of robot failures during human-robot interactions, encouraging researchers to develop and benchmark multimodal machine learning models for detecting these failures. We created a dataset of multimodal non-verbal interaction data, including facial, speech, and pose features extracted from video clips of interactions with a robotic coach, annotated with labels indicating the presence or absence of robot mistakes, user awkwardness, and interaction ruptures, allowing predictive models to be trained and evaluated. Challenge participants were invited to submit multimodal ML models for detecting robot errors, evaluated against performance metrics such as accuracy, precision, recall, and F1 score, with and without a margin of error reflecting the time-sensitivity of these metrics. The results of this challenge will help the research community better understand robot failures in human-robot interactions and design autonomous robots that can mitigate their own errors after successfully detecting them.
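The abstract mentions evaluating error detection with and without a margin of error that reflects time-sensitivity. As an illustrative sketch only (the function name, margin semantics, and matching rule below are assumptions, not the challenge's official scoring code), a tolerance-based F1 might pair each predicted error frame with an unmatched ground-truth frame within a fixed number of frames:

```python
def f1_with_margin(pred, truth, margin=0):
    """Illustrative sketch, not the ERR@HRI official scorer.

    pred, truth: lists of frame indices flagged as robot errors.
    A prediction counts as a true positive if some not-yet-matched
    ground-truth frame lies within `margin` frames of it
    (margin=0 reduces to exact frame-level matching).
    Returns (precision, recall, f1).
    """
    matched = set()  # ground-truth frames already claimed by a prediction
    tp = 0
    for p in pred:
        hit = next((t for t in truth
                    if abs(p - t) <= margin and t not in matched), None)
        if hit is not None:
            matched.add(hit)
            tp += 1
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(truth) if truth else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

With exact matching, predictions at frames [10, 20, 35] against ground truth [10, 22, 40] score F1 ≈ 0.33 (only frame 10 matches); relaxing the margin to 5 frames lets all three pair up, and F1 rises to 1.0, which is how a tolerance window changes the apparent time-sensitivity of the same predictions.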


Keywords

Robot Failure, Error Detection, Human-Robot Interaction, Multimodal Interaction, Benchmarking, FOS: Computer and information sciences, Computer Science - Robotics (cs.RO), 46 Information and Computing Sciences, 4608 Human-Centred Computing, Networking and Information Technology R&D (NITRD), Machine Learning and Artificial Intelligence, Bioengineering


WoS Q

N/A

Scopus Q

N/A
OpenCitations Citation Count
1

Source

Companion Proceedings of the International Conference on Multimodal Interaction (ICMI Companion '24), Nov 4-8, 2024, San José, Costa Rica

Volume

Issue

Start Page

652

End Page

656
PlumX Metrics
Citations

Scopus : 6

Captures

Mendeley Readers : 15

SCOPUS™ Citations

6

checked on Feb 25, 2026

Web of Science™ Citations

5



OpenAlex FWCI
3.52483115
