Abstract:

Markov chain Monte Carlo (MCMC) has been the workhorse for sampling from difficult-to-compute probability distributions for several decades. A particularly interesting application is Bayesian inference, where the distribution to be sampled is the posterior distribution. Hamiltonian Monte Carlo (HMC) has recently received considerable attention for its ability to sample high-dimensional posterior distributions efficiently. In tandem, data subsampling has been used extensively to overcome the computational bottleneck that arises in MCMC when evaluating the likelihood, or its gradient, over the whole data set. However, while data subsampling has been successful in traditional MCMC algorithms such as Metropolis-Hastings, it has proven unsuccessful in the context of HMC, resulting in both poor sampling efficiency and highly biased inference. We show how to combine HMC with data subsampling to construct an algorithm that scales in both the number of observations and the number of parameters, while remaining highly accurate. Moreover, we show how to apply our ideas to another class of algorithms, Sequential Monte Carlo, which provides a parallelizable alternative to MCMC and simplifies model selection problems.
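To fix ideas, below is a minimal sketch of the generic subsampling idea the abstract refers to: estimating the full-data log-posterior gradient from a random minibatch of m out of n observations, scaled by n/m so the estimate is unbiased, and plugging it into a leapfrog step of HMC. The Gaussian toy model, the function names, and all tuning parameters here are illustrative assumptions, not the specific estimator developed in this work.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: n observations from N(theta_true, 1)
    n = 100_000
    theta_true = 2.0
    y = rng.normal(theta_true, 1.0, size=n)

    def grad_log_post_full(theta):
        # Gradient of the log posterior under a flat prior: sum_i (y_i - theta)
        return np.sum(y - theta)

    def grad_log_post_subsample(theta, m=1_000):
        # Unbiased estimate: scale a random minibatch gradient by n / m
        idx = rng.choice(n, size=m, replace=False)
        return (n / m) * np.sum(y[idx] - theta)

    def leapfrog_step(theta, p, eps, grad_fn):
        # One leapfrog step of HMC driven by the (possibly subsampled) gradient
        p = p + 0.5 * eps * grad_fn(theta)
        theta = theta + eps * p
        p = p + 0.5 * eps * grad_fn(theta)
        return theta, p

    theta, p = 0.0, rng.normal()
    theta, p = leapfrog_step(theta, p, eps=1e-4, grad_fn=grad_log_post_subsample)
    print(theta)

As the abstract notes, naively replacing the full gradient with such a noisy estimate degrades HMC; controlling that error is the subject of the talk.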
 

Time: 11 December 2019, 13:00-14:00. Place: B705

Welcome.