09:00 - 10:30
Parallel sessions 4
Room: HSZ - N4
Chair: Nicola Schneider
The diffusion decision model (DDM) is a mathematical framework that jointly describes choice behavior and response time distributions, offering a process-level account of decision-making. Conceptualizing decisions as the accumulation of noisy evidence, the DDM has provided insights into the cognitive mechanisms underlying perception, attention, memory, and higher-order decision-making. Its flexibility and explanatory power have made it one of the most widely used tools in experimental psychology, bridging cognitive theory, mathematical modeling, and empirical research.
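The accumulation process described above can be made concrete with a minimal simulation. The sketch below is purely illustrative and not from any of the talks: it uses a simple Euler–Maruyama discretization of the Wiener diffusion, and the parameter names (v for drift rate, a for boundary separation, z for relative starting point) follow common DDM conventions.

```python
import random

def simulate_ddm(v, a, z, dt=0.001, s=1.0, max_t=10.0):
    """Euler-Maruyama simulation of a single Wiener diffusion trial.

    Evidence x starts at z * a, drifts at rate v, and diffuses with
    noise scale s until it hits the lower (0) or upper (a) boundary.
    Returns (choice, response_time); trials that exceed max_t without
    reaching a boundary are counted as lower-boundary responses here.
    """
    x = z * a
    t = 0.0
    noise_sd = s * dt ** 0.5  # diffusion noise scales with sqrt(dt)
    while 0.0 < x < a and t < max_t:
        x += v * dt + random.gauss(0.0, noise_sd)
        t += dt
    choice = 1 if x >= a else 0  # 1 = upper boundary, 0 = lower
    return choice, t
```

Running many such trials with a positive drift rate yields the characteristic DDM pattern: a majority of upper-boundary responses together with a right-skewed response time distribution.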
The increasing prominence of DDMs has spurred both conceptual and methodological developments. This symposium focuses on recent theoretical and computational advancements in the modeling of DDMs, including advances in estimation techniques, alternative stochastic dynamics to the Wiener process, and integrations with other modeling frameworks. Together, we aim to highlight new directions for enhancing theoretical and conceptual precision, modeling flexibility, and computational efficiency.
This symposium is the first part of a two-part series on DDMs at TeaP. While Part I emphasizes model development, theoretical extensions, and computational innovation, Part II turns to applied research, demonstrating how DDMs can help us better understand cognitive processes across different populations and domains. By being open to scholars from all areas of experimental psychology, the series offers a forum for presenting new ideas, establishing collaborations, and identifying future directions in the modeling of human cognition.
Submission 599
Using Pairwise Estimation for Diffusion Model Parameters: A Comparison Between Stan and BayesFlow
SymposiumTalk-01
Presented by: Shanqing Gao
Shanqing Gao 1, Yufei Wu 2, Andreas Voss 1, Francis Tuerlinckx 2
1 Heidelberg University, Germany
2 Leuven University, Belgium
The drift–diffusion model (DDM) is widely used in psychology and neuroscience to decompose the cognitive processes underlying decision making. Many research scenarios require rapid or repeated estimation of DDM parameters, such as analyzing large-scale datasets, conducting simulation-based power analyses, or implementing adaptive experimental designs. In these settings, the same model must be fitted many times across datasets or conditions, making amortized Bayesian inference methods (e.g., BayesFlow) an appealing alternative to traditional MCMC approaches. However, the high training cost and limited generalizability of amortized inference can restrict its applicability across different experimental designs.
Composite likelihood provides a promising strategy to increase generalizability and reduce computational burden by approximating the full likelihood with likelihoods defined over subsets of the data. A simple and effective variant is pairwise composite likelihood, where likelihoods are computed over every unique pair of conditions rather than jointly across all conditions. This reduces training effort while preserving the fast amortized inference that BayesFlow offers.
To evaluate this approach, we conducted parameter recovery studies using simulated data from simple DDMs, comparing composite likelihood with full likelihood estimation in both BayesFlow and Stan (as a benchmark). Results show that composite likelihood achieves parameter recovery comparable to full likelihood in both frameworks. Crucially, composite likelihood in BayesFlow greatly reduces inference time relative to Stan, offering a scalable and efficient solution for repeated DDM estimation.
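The pairwise scheme can be sketched as follows. This is a toy illustration, not the authors' implementation: a Gaussian per-condition likelihood stands in for the Wiener first-passage density, and all function names are my own.

```python
import math
from itertools import combinations

def norm_logpdf(x, mu, sigma):
    # Gaussian log density, used here as a stand-in for the DDM likelihood
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def cond_loglik(data, mu, sigma):
    # log-likelihood of one condition's observations
    return sum(norm_logpdf(x, mu, sigma) for x in data)

def full_loglik(data_by_cond, mus, sigma):
    # joint log-likelihood across all K conditions at once
    return sum(cond_loglik(d, m, sigma) for d, m in zip(data_by_cond, mus))

def pairwise_composite_loglik(data_by_cond, mus, sigma):
    # sum of log-likelihoods over every unique pair of conditions:
    # each likelihood term only ever involves two conditions, so an
    # amortized network trained on two-condition designs suffices
    total = 0.0
    for i, j in combinations(range(len(data_by_cond)), 2):
        total += cond_loglik(data_by_cond[i], mus[i], sigma)
        total += cond_loglik(data_by_cond[j], mus[j], sigma)
    return total
```

When conditions are independent, as in this toy case, each condition appears in K − 1 of the pairs, so the pairwise sum is simply (K − 1) times the full log-likelihood. The practical payoff arises when parameters are shared across conditions: the pairwise decomposition still approximates the full likelihood, but inference only ever has to handle two conditions at a time, which keeps the amortized training cost low and independent of the total number of conditions.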