11:00 - 12:30
Parallel sessions 2
11:00 - 12:30
Room: HSZ - N2
Chair/s:
Nadia Said
The introduction of new technologies has always shaped societies. Artificial intelligence (AI) applications, especially AI chatbots, are already part of everyday human life. Robots – for example in healthcare but also in other service areas – are also becoming more and more common. Generally, perceptions of these new technologies are mixed. Whereas some of them are widely accepted (e.g., use of generative AI tools like ChatGPT or DeepL), others are highly controversial (e.g., use of AI in classrooms or robots as companions in elderly homes). This raises the question of which factors influence human perceptions of, and ultimately human interaction with, AI and robots. The aim of this symposium is to present novel insights into human-AI and human-robot interaction by taking three different perspectives: (i) how far social perceptions extend to robots and how this influences interaction with them; (ii) which factors shape humans’ interaction with generative AI tools and how such interaction impacts them; and (iii) whether people differ in their perception of AI and robots. To provide answers to these questions, the first talk investigates the perception of robots as social actors. More specifically, the talk focuses on how similarly robots are perceived compared with, for example, human partners. The second talk then tackles the question of whether established social heuristics (such as the bystander effect) govern human behavior toward robots. Moving from embodied artificial actors to generative AI tools, the third talk focuses on the influence of external factors (explainability, content, culture) on the perception of an AI chatbot. The fourth talk investigates factors influencing the choice to use generative AI as a cognitive offloading tool and its consequences for human memory and performance. The final talk discusses whether artificial intelligence and robots are perceived differently.
Jointly, these talks provide a broad overview of human-AI and human-robot interaction by examining the topic from different perspectives.
Submission 578
ChatGPT and Machines? Understanding Mental Models and Preferences for AI and Robots in Germany
SymposiumTalk-05
Presented by: Asya Caroei
Asya Caroei 1, 2, Andreea Potinteu 1, 3, Nadia Said 1
1 University of Tübingen, Germany
2 University of Freiburg, Germany
3 Leibniz-Institut für Wissensmedien Tübingen (IWM), Germany
In Germany, artificial intelligence (AI) is becoming increasingly integrated into daily life, while the use of robots remains limited. Although AI and robots could perform similar tasks in many application areas, they fundamentally differ in that robots have a physical presence, i.e., are embodied, whereas AI is an abstract algorithm running in the background of an application. Prior research generally links embodiment with positive user perceptions; however, recent evidence suggests that abstract, non-embodied AI systems are preferred over physically embodied robots. To examine this discrepancy, the present study surveyed a quota-based sample of the German population (N = 395). Participants evaluated AI and robots across service, security, and medical contexts, providing free word associations used to assess mental models. Results revealed that although robots were, overall, perceived as more embodied and evaluated more positively, AI was consistently the preferred technology, particularly in service scenarios where human control is high. Robots were most associated with “Machine,” while AI was strongly linked to “ChatGPT,” indicating the influence of technological trends on public perception. Twelve percent of participants associated AI with “Danger,” representing a technology-skeptical subgroup that was older and had less formal education. These individuals showed no clear preference for AI or robots and tended to view both technologies as risky. Overall, Germans favor abstract AI systems over embodied robots despite holding more positive attitudes toward the latter. Context sensitivity and societal perceptions of control are crucial for the ethical and effective implementation of AI and robotics in Germany.