Title:

Wasserstein-based method for convergence complexity analysis of Markov chain Monte Carlo

Speaker:

Dr. Jin Zhumengmeng (University of Florida)

Time:

Venue:

Tencent Meeting ID: 155 818 578

Abstract:

Knowing how fast a Markov chain converges to its target distribution helps ensure that simulation results are reliable. Recently, the emergence of big data has led to growing interest in so-called convergence complexity analysis, which studies how the convergence rate of a Monte Carlo Markov chain (for an intractable Bayesian posterior distribution) scales as the underlying data set grows in size. Convergence complexity analysis on continuous state spaces is quite challenging, and traditional methods that analyze convergence in total variation distance tend to fail in high-dimensional settings. Recent works have shown that convergence analyses based on the Wasserstein distance are more robust to increasing dimension. This talk will discuss a Wasserstein-based method introduced by Qin and Hobert (2022), who studied a Gibbs sampler for a simple Bayesian random effects model. They showed that, under regularity conditions, the geometric convergence rate of this Gibbs sampler converges to zero as the data set grows in size (which indicates immediate convergence in the large-data limit). We then show that similar behavior is exhibited by Gibbs samplers for more general Bayesian models that contain both random effects and traditional continuous covariates, the so-called mixed models.
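
To give a concrete feel for the kind of behavior described above, the short Python sketch below (requiring NumPy and SciPy) runs a toy two-block Gibbs sampler for a one-way random effects model with known variances and empirically measures, with a one-dimensional Wasserstein distance, how close the marginal of the global mean is to stationarity after a couple of sweeps as the data set grows. This is a hypothetical illustration only: the model, priors, constants, and the function gibbs_chain are assumptions made for this sketch, not the model or sampler analyzed by Qin and Hobert (2022).

# Illustrative sketch only (assumed model, not that of Qin and Hobert, 2022):
# toy Gibbs sampler for y[i, j] ~ N(theta_i, sigma2_e), theta_i ~ N(mu, sigma2_t),
# flat prior on mu, variances treated as known for brevity.
import numpy as np
from scipy.stats import wasserstein_distance

def gibbs_chain(y, sigma2_e, sigma2_t, mu0, iters, rng):
    """Run the two-block Gibbs sampler and return the path of mu."""
    K, m = y.shape
    ybar = y.mean(axis=1)
    prec_t, prec_e = 1.0 / sigma2_t, m / sigma2_e
    mu, mus = mu0, np.empty(iters)
    for t in range(iters):
        # theta_i | mu, y : conjugate normal update, all groups at once
        post_prec = prec_t + prec_e
        post_mean = (prec_t * mu + prec_e * ybar) / post_prec
        theta = rng.normal(post_mean, np.sqrt(1.0 / post_prec))
        # mu | theta : normal update under the flat prior
        mu = rng.normal(theta.mean(), np.sqrt(sigma2_t / K))
        mus[t] = mu
    return mus

rng = np.random.default_rng(0)
sigma2_e = sigma2_t = 1.0
K, t_snap = 20, 2                      # number of groups; sweeps before snapshot
for m in (2, 20, 200):                 # observations per group ("data set size")
    y = rng.normal(0.0, np.sqrt(sigma2_e + sigma2_t), size=(K, m))
    # Reference draws from the stationary marginal of mu: tail of one long chain.
    ref = gibbs_chain(y, sigma2_e, sigma2_t, 0.0, 20_000, rng)[10_000:]
    # Marginal of mu after t_snap sweeps, over many chains started far from the target.
    snap = np.array([gibbs_chain(y, sigma2_e, sigma2_t, 50.0, t_snap, rng)[-1]
                     for _ in range(2_000)])
    print(f"m = {m:4d}   W1 after {t_snap} sweeps ~ "
          f"{wasserstein_distance(snap, ref):.3f}")

In this toy experiment the printed Wasserstein distances should shrink as the per-group sample size m grows, giving a purely empirical analogue of the phenomenon the talk discusses: the Gibbs sampler's convergence improving as the data set gets larger.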