09:00 - 10:30
Parallel sessions 1
Room: HSZ - N9
Chairs:
Maren Mayer, Tobias Rebholz
When making decisions or providing judgments, individuals often seek and receive advice from others. They may ask a friend whether they should spend their holidays in Japan and how much they should budget for such a stay. Advice taking is typically investigated in a judge-advisor system: the judge provides an initial judgment before receiving the advisor's advice, and afterward provides their final judgment. In this symposium, we combine advances in advice-taking research, outlining new perspectives for the field.

The first contribution demonstrates that individual differences can affect whether and how individuals take advice, and how much these influences have been overlooked to date. The second contribution presents a meta-analysis, based on a dual-hurdle model, of how advice taking varies across study contexts and designs. The third contribution compares advice taking based on aggregated versus non-aggregated advice from multiple advisors, investigating why judges heed aggregated advice more. The last two contributions focus on advice taking from algorithms. The fourth contribution investigates algorithmic advice, demonstrating that, even without explicit communication, advice can shape competition and collaboration among individuals. Finally, the fifth contribution examines algorithmic and hybrid advice, which combines human and algorithmic advice, demonstrating algorithm appreciation rather than algorithm aversion.
Submission 241
Strategic Algorithmic Advice Taking
SymposiumTalk-04
Presented by: Tobias R. Rebholz
Tobias R. Rebholz 1, 2, Maxwell Uphoff 1, 3, Christian H. R. Bernges 1, Florian Scholten 1
1 Department of Psychology, University of Tübingen, Germany
2 Fuqua School of Business, Duke University, United States
3 Department of Economics, University of Minnesota Twin Cities, United States
As algorithms increasingly mediate competitive decision-making, their influence extends beyond individual outcomes to shaping strategic market dynamics. In two preregistered experiments, we examined how algorithmic advice affects human behavior in classic economic games with unique, non-collusive, and analytically tractable equilibria. In Experiment 1 (N = 107), participants played a Bertrand price competition with individualized or collective algorithmic recommendations. Initially, collusively upward-biased advice increased prices, particularly when individualized, but prices gradually converged toward equilibrium over the course of the experiment. However, participants avoided setting prices above the algorithm’s recommendation throughout the experiment, suggesting that advice served as a soft upper bound for acceptable prices. In Experiment 2 (N = 129), participants played a Cournot quantity competition with equilibrium-aligned or strategically biased algorithmic recommendations. Here, individualized equilibrium advice supported stable convergence, whereas collusively downward-biased advice led to sustained underproduction and supracompetitive profits—hallmarks of tacit collusion. In both experiments, participants responded more strongly and consistently to individualized advice than to collective advice, potentially due to greater perceived ownership of the former. These findings demonstrate that algorithmic advice can function as a strategic signal, shaping coordination even without explicit communication. The results echo real-world concerns about algorithmic collusion and underscore the need for careful design and oversight of algorithmic decision-support systems in competitive environments.