In the Face of Uncertainty: How Robots’ Appearance Shapes Mind Perception
Mon—HZ_12—Talks2—1505
Presented by: Yasmina Giebeler
Understanding how humans perceive robots is essential for improving human-robot interaction. While users readily attribute agency-related capacities (the ability to act and plan) to robots, they are hesitant to explicitly ascribe experience (the ability to feel and sense). However, since humans have demonstrated social behaviour towards robots, this reluctance may stem not only from lower expectations that robots possess this capacity, but also from a subjective feeling of violating social norms. This study aims to disentangle the explicit and implicit processes underlying mind perception and to examine how robots’ physical features shape perceptions of agency versus experience.
Participants evaluated 20 robots grouped by body and face prominence across eight items measuring capacities to act, plan, feel, and sense. Mouse cursor tracking captured decision-making dynamics. Data collection will be finalized by December. We hypothesize that robots will generally be rated higher in agency than experience, with appearance influencing these ratings: robots with pronounced facial features are expected to receive higher experience ratings, while those with complex body features will score higher in agency. Greater uncertainty is anticipated in attributing experience compared to agency, with differences in uncertainty when affirming versus denying mind-related capacities.
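A common way to quantify decision uncertainty from cursor trajectories is the maximum absolute deviation (MAD) of the path from the straight line between start and response locations; the sketch below illustrates this metric on synthetic trajectories (the function name and example coordinates are hypothetical, not taken from the study's pipeline):

```python
# Hypothetical sketch of a standard mouse-tracking uncertainty index:
# the maximum absolute deviation (MAD) of the cursor path from the
# straight line connecting the start point and the final response.
import numpy as np

def max_abs_deviation(xy):
    """xy: (n_samples, 2) array of cursor coordinates from start to click."""
    start, end = xy[0], xy[-1]
    line = end - start
    # Perpendicular distance of each sample from the start-end line,
    # via the 2D cross product divided by the line's length.
    offsets = xy - start
    cross = offsets[:, 0] * line[1] - offsets[:, 1] * line[0]
    dist = np.abs(cross) / np.linalg.norm(line)
    return dist.max()

# A curved trajectory (a detour toward the unchosen response) yields a
# larger MAD than a direct movement, indexing greater uncertainty.
direct = np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0]])
curved = np.array([[0.0, 0.0], [0.9, 0.1], [1.0, 1.0]])
print(max_abs_deviation(direct), max_abs_deviation(curved))
```

Trajectory-level MAD values can then be averaged per condition and entered into the planned uncertainty analysis.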
A 2×4 repeated-measures ANOVA will assess explicit mind perception ratings, while a 2×2 ANOVA will analyze mouse trajectories to investigate decision-making uncertainty. Results will clarify how robot appearance influences mind perceptions, offering insights for designing robots that align with human expectations. This research will benefit the ethical and effective integration of robots into social contexts, improving interaction and acceptance.
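As a rough illustration of the hypothesized agency-over-experience effect in a 2 (dimension) × 4 (appearance group) within-subject design, a minimal simulation with synthetic ratings and entirely hypothetical effect sizes might look like:

```python
# Minimal sketch, assuming synthetic data: simulate within-subject ratings
# for 2 dimensions (agency, experience) x 4 appearance groups, then compute
# cell means and a paired contrast. Effect sizes are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_groups = 30, 4

# Hypothetical 0-100 ratings, with agency rated higher than experience.
agency = rng.normal(70, 10, size=(n_subjects, n_groups))
experience = rng.normal(40, 10, size=(n_subjects, n_groups))

# Cell means for the 2 x 4 design (rows: dimension, columns: group).
cell_means = np.stack([agency.mean(axis=0), experience.mean(axis=0)])

# Within-subject difference (agency - experience), averaged over groups,
# summarized as a paired t statistic.
diff = (agency - experience).mean(axis=1)
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(n_subjects))
print(cell_means.round(1))
print(round(t_stat, 2))
```

The full analysis would instead fit a repeated-measures ANOVA over all cells; this sketch only shows the shape of the data and the direction of the predicted main effect.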
Keywords: Human-Robot Interaction, Social Cognition, Mind Perception, Social Robots, Implicit Measure