
Probabilistic Programming Languages and Bayesian Inference: An Overview


In this article, we explore a technique called Mixed Inference with Sequential Monte Carlo (SMC) for solving complex problems in Bayesian inference. As a running example, we use Gaussian graphical models, which describe the dependencies between random variables using a tree-like structure. The main challenge is that these models are often too complex to solve exactly, so inference becomes intractable.
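To make the model class concrete, here is a minimal Python sketch (ours, for illustration, not code from the paper) of a tree-structured Gaussian model: every non-root variable is a linear-Gaussian function of its single parent, so the joint distribution is Gaussian and the dependency graph is a tree.

    import numpy as np

    # Node name -> (parent, coefficient a, offset b, noise variance).
    # For the root, the parent is None and (b, variance) give its marginal.
    TREE = {
        "x0": (None, 0.0, 0.0, 1.0),   # root: x0 ~ N(0, 1)
        "x1": ("x0", 1.0, 0.0, 0.5),   # x1 | x0 ~ N(x0, 0.5)
        "x2": ("x0", 0.8, 0.2, 0.3),   # x2 | x0 ~ N(0.8*x0 + 0.2, 0.3)
        "y1": ("x1", 1.0, 0.0, 0.1),   # observed leaf attached to x1
    }

    def sample_joint(rng):
        """Ancestral sampling: parents are defined (and sampled) before children."""
        values = {}
        for name, (parent, a, b, var) in TREE.items():
            mean = b if parent is None else a * values[parent] + b
            values[name] = rng.normal(mean, np.sqrt(var))
        return values

    print(sample_joint(np.random.default_rng(0)))
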
To address this issue, we propose extending SMC with the ability to perform exact computations on subsets of random variables. This allows us to focus our computation on smaller sub-problems, making the algorithm more efficient and scalable. We also introduce a new algorithm called Symbolic Belief Propagation (SBP), which combines the benefits of both SMC and delayed sampling methods.
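As a rough illustration of what "exact computations on subsets of random variables" can mean in a Gaussian setting, the following sketch (a standard scalar Gaussian update, not the paper's code) conditions a Gaussian latent variable on a noisy observation in closed form and also returns the exact log marginal likelihood of that observation, so no sampling is needed for this piece of the model.

    import numpy as np

    def gaussian_observe(prior_mean, prior_var, obs, obs_var):
        """Condition x ~ N(prior_mean, prior_var) on y = x + noise, noise ~ N(0, obs_var).
        Returns the posterior (mean, var) of x and the exact log marginal likelihood log p(y)."""
        s = prior_var + obs_var                    # predictive variance of y
        k = prior_var / s                          # gain
        post_mean = prior_mean + k * (obs - prior_mean)
        post_var = (1.0 - k) * prior_var
        log_evidence = -0.5 * (np.log(2 * np.pi * s) + (obs - prior_mean) ** 2 / s)
        return post_mean, post_var, log_evidence

    print(gaussian_observe(0.0, 1.0, obs=0.7, obs_var=0.25))
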
The key insight behind SBP is that it divides the computation into two parts: exact computations on the leaves of the tree, and approximations for the remaining variables. By doing so, we avoid resorting to sampling for the parts of the model that can be handled exactly. Our experiments show that SBP outperforms existing methods in both computational efficiency and accuracy.
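The next sketch illustrates, in a deliberately simplified and hypothetical setting (it is not the SBP algorithm itself), what this split looks like: a non-Gaussian "regime" variable is sampled per particle, while the Gaussian leaf that depends on it is integrated out in closed form, so each particle's weight is computed exactly rather than by further simulation.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 1000                                  # number of particles
    y_obs = 2.3                               # observed value of the Gaussian leaf

    # Sampled part: a discrete regime that shifts the leaf's mean.
    regimes = rng.choice([0.0, 3.0], size=N, p=[0.5, 0.5])

    # Exact part: given the regime, the leaf latent x ~ N(regime, 1.0) and the
    # observation y | x ~ N(x, 0.5), so marginally y | regime ~ N(regime, 1.5)
    # and each particle's weight is an exact Gaussian density, with no inner sampling.
    pred_var = 1.0 + 0.5
    log_w = -0.5 * (np.log(2 * np.pi * pred_var) + (y_obs - regimes) ** 2 / pred_var)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    print("posterior P(regime = 3.0 | y) ~", w[regimes == 3.0].sum())
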
To better understand how SBP works, let’s consider an example of a runner’s position, speed, and altitude over time. We represent the relationship between these variables using a Gaussian graphical model, where each variable is connected to its neighboring variables through a probability distribution. The tree structure allows us to compute the posterior distribution of each variable at each time step, given the observations so far.
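Here is a small generative sketch of a runner-style model in the same spirit (the dynamics and noise levels are invented for illustration): position, speed, and altitude evolve over time with Gaussian noise, and a noisy position measurement is observed at every step.

    import numpy as np

    def simulate_runner(steps, dt=1.0, seed=0):
        """Simulate a runner's latent state over time and return noisy position observations."""
        rng = np.random.default_rng(seed)
        pos, speed, alt = 0.0, 3.0, 100.0
        observations = []
        for _ in range(steps):
            speed += rng.normal(0.0, 0.1)              # speed drifts slightly
            pos += speed * dt + rng.normal(0.0, 0.5)   # position follows the speed
            alt += rng.normal(-0.05 * speed, 0.2)      # altitude changes with the terrain
            observations.append(pos + rng.normal(0.0, 1.0))  # noisy GPS-like reading
        return observations

    print(simulate_runner(5))
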
However, for complex models like this one, we cannot solve the problem exactly and must resort to approximations. SMC methods, such as the one we propose in this article, overcome this limitation by running many simulations of the model in parallel (the particles) and using them to approximate the posterior distribution.
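For comparison, this is what a plain bootstrap particle filter looks like for a stripped-down version of the problem (a 1-D random-walk position with noisy observations; again a generic sketch, not the article's algorithm): propagate the particles, weight them by the observation likelihood, and resample.

    import numpy as np

    def particle_filter(observations, n_particles=500, proc_std=0.5, obs_std=1.0, seed=0):
        """Bootstrap SMC for a 1-D Gaussian random walk observed with Gaussian noise."""
        rng = np.random.default_rng(seed)
        particles = rng.normal(0.0, 1.0, size=n_particles)    # initial position guesses
        means = []
        for y in observations:
            # 1. Propagate each particle through the transition model.
            particles = particles + rng.normal(0.0, proc_std, size=n_particles)
            # 2. Weight particles by how well they explain the observation.
            log_w = -0.5 * ((y - particles) / obs_std) ** 2
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            # 3. Resample to concentrate computation on plausible particles.
            idx = rng.choice(n_particles, size=n_particles, p=w)
            particles = particles[idx]
            means.append(particles.mean())
        return means

    print(particle_filter([0.4, 1.1, 1.9, 2.4]))
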
In summary, our main contribution is a new algorithm, Symbolic Belief Propagation (SBP), that extends Sequential Monte Carlo (SMC) with exact computations on subsets of random variables, making it more efficient and scalable for complex problems in Bayesian inference. By combining the benefits of SMC and delayed sampling, SBP provides a more accurate and efficient solution for Gaussian graphical models.