Bibliographic Metadata

Systematic analysis of video data from different human-robot interaction studies : a categorisation of social signals during error situations / Manuel Giuliani, Nicole Mirnig, Gerald Stollnberger, Susanne Stadler, Roland Buchner, Manfred Tscheligi
Author: Giuliani, Manuel ; Mirnig, Nicole ; Stollnberger, Gerald ; Stadler, Susanne ; Buchner, Roland ; Tscheligi, Manfred
Published in: Frontiers in Psychology, 2015
Document type: Journal Article
Keywords (EN): social signals / error situation / social norm violation / technical failure / human-robot interaction / video analysis
URN: urn:nbn:at:at-ubs:3-165 (Persistent Identifier)
The work is publicly available.
Abstract (English)

Human-robot interactions are often affected by error situations caused by either the robot or the human. Robots would therefore profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. To that end, we analyzed 201 videos from five human-robot interaction user studies with varying tasks, drawn from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction; technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but often smile, when in an error situation with the robot. Another result is that participants sometimes stop moving at the beginning of an error situation. We also found that participants talked more during social norm violations and less during technical failures. Finally, participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking) when interacting with the robot alone, with no experimenter or other human present. These results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems: builders should consider adding modules for the recognition and classification of head movements to the robot's input channels, and evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.