Introduction
This project accompanies the book “Bayesian Cognitive Modeling: A Practical Course” by Lee & Wagenmakers (2013). It is not a substitute for the original book: you will need the book to understand the examples, and to read its theoretical parts to understand the machinery of Bayesian statistics. This project only implements the examples using BayesFlow in Python; it does not provide much theoretical background for the examples.
What is BayesFlow?
BayesFlow (https://bayesflow.org/) is a Python library for Amortized Bayesian Inference (ABI) using deep generative neural networks. As such, it sits at the crossroads of Bayesian statistics and machine learning.
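To make the idea of amortized inference concrete, here is a minimal sketch (plain NumPy, deliberately not BayesFlow's actual API) of the first ingredient ABI needs: simulated pairs of parameters and data. A neural network is then trained on many such pairs so that, once trained, it can approximate the posterior for any new data set without re-running inference from scratch.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_prior(n):
    # Draw parameter values from the prior, e.g. mu ~ Normal(0, 1)
    return rng.normal(0.0, 1.0, size=n)

def simulate_data(mu, n_obs=10):
    # Simulate a data set conditional on each parameter draw,
    # e.g. x ~ Normal(mu, 1) with n_obs observations per draw
    return rng.normal(mu[:, None], 1.0, size=(mu.shape[0], n_obs))

mu = sample_prior(1000)      # shape (1000,)
x = simulate_data(mu)        # shape (1000, 10)
# In ABI, a generative network is trained on (mu, x) pairs like these
# to approximate the posterior p(mu | x) for arbitrary future data.
```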
Who is this project for?
This project is mainly helpful as additional study material for those who want to expand their Bayesian statistics skills beyond R and MCMC sampling.
Prerequisites
You will need a computer with Python installed, and some programming/Python knowledge. It is not necessary to be a wizard in Python, but you should be able to understand basic concepts and data structures.
Familiarity with Bayesian statistics is recommended. If you are a complete novice in Bayesian statistics, this project is probably not for you. That said, you can get started by reading Lee & Wagenmakers (2013) from cover to cover, after which you will be ready for the materials in this project. Other introductory books on Bayesian statistics suitable for getting started include:
- Bayesian Data Analysis (Gelman et al., 1995)
- Statistical Rethinking (McElreath, 2018)
- Doing Bayesian Data Analysis (Kruschke, 2014)
- A Student’s Guide to Bayesian Statistics (Lambert, 2018)
Familiarity with machine learning and generative neural networks is recommended but not necessary. You should be able to understand the examples without deeply understanding how the generative neural networks work.
A note on notation
The examples in the original book are accompanied by Bayesian graphical models beautifully typeset using TikZ (Tantau, 2013). In this book, we only provide the symbolic representation of each model in terms of sampling statements such as
\[\begin{equation} \begin{aligned} \mu & \sim \text{Normal}(0, 1) \\ x & \sim \text{Normal}(\mu, 1). \end{aligned} \end{equation}\]
We will use the parametrization of probability distributions as implemented by the Python code, so that the notation in this book is consistent with the code shown in the examples. This means that we will diverge from the notation in the original book, which follows the parametrization conventions of JAGS (Plummer et al., 2003).
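For instance, the two sampling statements above can be written directly with scipy.stats, whose (loc, scale) parametrization matches the Normal(mean, standard deviation) notation used here. The snippet below is purely illustrative; the last two lines show how a standard deviation translates into the precision that JAGS' dnorm expects.

```python
from scipy import stats

# Normal(mu, sigma) in this book's notation means loc = mean,
# scale = standard deviation, as in scipy.stats and NumPy:
mu = stats.norm(loc=0.0, scale=1.0).rvs(random_state=1)  # mu ~ Normal(0, 1)
x = stats.norm(loc=mu, scale=1.0).rvs(random_state=2)    # x  ~ Normal(mu, 1)

# JAGS' dnorm(mu, tau) instead takes the precision tau = 1 / sigma**2,
# so Normal(mu, sigma) here corresponds to dnorm(mu, 1 / sigma**2) there.
sigma = 2.0
tau = 1.0 / sigma**2  # precision 0.25 for a standard deviation of 2
```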
Limitations
Currently, it is not easy to estimate many models using BayesFlow. For example, while multilevel and mixture models are feasible with ABI (Habermann et al., 2024; Kucharský & Bürkner, 2025), the current version of BayesFlow does not provide a convenient interface for estimating such models. This project provides solutions to examples that can be handled with BayesFlow without writing significant amounts of custom code. We may update this project once the BayesFlow interface expands to cover the kinds of models that are currently left out.