11:00 - 12:30
Parallel sessions 2
Room: HSZ - N1
Chairs: Stefan Brandenburg, Martin Baumann
The ever-easier integration of technology into people's lives highlights the importance of examining its impact on experience and behavior. Experimental approaches help to determine the psychological processes underlying this impact. This symposium summarizes experimental studies examining various contexts of technology use and psychological aspects of Engineering Psychology and Human Factors. Applying various experimental approaches, these talks address major concepts of Engineering Psychology and Human Factors, such as situation awareness, cognitive load, and technology adaptation, in classical domains such as human-AI interaction, human-automation interaction, and teleoperation, and highlight the value and feasibility of rigorous experimental approaches even in complex and applied settings. The first talk by Alexander Reisinger examines how much lead time remote drivers need to effectively regain situation awareness and safely take control of highly automated vehicles during event-based remote driving tasks, highlighting the benefits of providing augmented visual information from the vehicle. The second talk by Andreas Schrank explores how different camera perspectives and visual augmentations influence remote assistants’ performance and situation awareness when supervising highly automated vehicles, showing that the optimal perspective depends on the driving scenario and that augmentation can compensate for poor visibility in adverse weather. The third talk by Matthias Arend introduces and validates a new implicit measure of situation awareness called SAMBA, comparing it with established explicit methods and showing that combining SAMBA with the traditional SAGAT approach can provide a more comprehensive and less intrusive assessment of operator awareness during teleoperation tasks.
The fourth talk by Romy Müller examines how people evaluate AI image classifications using concept-based explainable AI, showing that participants preferred explanations with image snippets that precisely matched the original image and rated generalized or imprecise explanations significantly lower, indicating that users value precision over robustness in AI interpretations. The fifth talk by Judith Josupeit highlights the benefits of using virtual reality (VR) for rigorous experimental manipulations in applied contexts. In addition, the talk demonstrates how AI can be used in VR experiments.
Submission 229
Another Point of View: The Effects of Camera Perspective and Augmentation on the Performance of Remote Assistants of Highly Automated Vehicles
SymposiumTalk-02
Presented by: Andreas Schrank
Andreas Schrank 1, Anneke-Sophie Kaas 1, 2, Carsten Borchert 1, Stefan Brandenburg 2, Michael Oehl 1
1 Institute of Transportation Systems, German Aerospace Center, Braunschweig, Germany
2 Cognitive Psychology and Human Factors, Chemnitz University of Technology, Chemnitz, Germany
When operated under real-world conditions, highly automated vehicles (HAVs, SAE Level 4) face many traffic situations they cannot cope with, e.g., adverse weather. Remote assistance may aid HAVs in such situations, making HAV operations more robust. In this task context, human-machine interfaces (HMIs) for remote assistants of HAVs often present traffic situations from a camera perspective similar to the driver’s. However, this first-person view has been associated with shortcomings, including the occlusion of relevant objects on the road and the distortion of distance and angle perception. These shortcomings may affect the performance of the remote assistant. An experimental lab study with 37 participants was carried out to investigate whether three different camera perspectives affect the performance of remote assistants, their situation awareness, and other related variables. The task revolved around determining the point in time at which a complex left-turn maneuver could be executed by the supervised HAV at a busy urban intersection with mixed traffic. Additionally, the interplay of camera perspectives and video augmentation visualizing additional sensor data was investigated in environments with and without adverse weather resulting from fog. Results indicated that certain performance indicators, including decision time, were affected by camera perspective. The positive and compensatory impact of augmentation under poor visibility conditions in adverse weather found in prior studies was replicated. Findings suggest that the most suitable perspective highly depends on the specific scenario. The results will help design context-sensitive HMIs for remote assistance of HAVs.