Yuansi Chen (Duke)

Mar 30, 2022

Title and Abstract

Localization schemes: A framework for proving mixing bounds for Markov chains

Our work is motivated by two recent and seemingly unrelated techniques for proving mixing bounds for Markov chains: (i) the framework of spectral independence, introduced by Anari, Liu and Oveis Gharan, and its numerous extensions, which have given rise to several breakthroughs in the analysis of mixing times of discrete Markov chains, and (ii) the stochastic localization technique, which has proven useful in establishing mixing and expansion bounds both for log-concave measures and for measures on the discrete hypercube.

In this work, we present a framework which connects ideas from both techniques and allows us to unify proofs of mixing times of MCMC algorithms on high-dimensional distributions. At its center is the concept of a localization scheme, which assigns to every probability measure a martingale of probability measures that localize in space as time evolves. This viewpoint provides tools for deriving mixing bounds for the dynamics through the analysis of the corresponding localization process. Generalizations of concepts of spectral independence naturally arise from our definitions. In particular, we show via our framework that it is possible to recover the main theorems of the spectral independence frameworks via simple martingale arguments, while completely bypassing the theory of high-dimensional expanders. As applications, we discuss how to use it to obtain the first O(n log n) bound for the mixing time of the hardcore model (of arbitrary degree) in the tree-uniqueness regime under Glauber dynamics, and to prove a KL-divergence decay bound for log-concave sampling via the Restricted Gaussian Oracle, which achieves optimal mixing under any exp(n)-warm start.
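To make the measure-valued martingale concrete, here is a minimal sketch (not taken from the talk) of one simple localization scheme on the discrete hypercube: revealing the coordinates of a sample X ~ mu one at a time. Each step conditions the measure on the revealed coordinate, so the conditional measures localize, and averaging them against the marginal of the revealed coordinate recovers mu exactly; this is the martingale property E[mu_t] = mu. The function names (`pin`, `localization_step_average`) are illustrative, not from the paper.

```python
# Sketch of a coordinate-by-coordinate localization scheme on {0,1}^n.
# Revealing X_1, X_2, ... of a sample X ~ mu yields conditional measures
# mu_t = mu(. | X_1..X_t) forming a measure-valued martingale: the
# marginal-weighted average of the pinned measures equals mu itself.
import itertools
import random

def random_measure(n, rng):
    """A random probability measure on {0,1}^n, stored as a dict."""
    pts = list(itertools.product([0, 1], repeat=n))
    w = [rng.random() for _ in pts]
    s = sum(w)
    return {x: wi / s for x, wi in zip(pts, w)}

def pin(mu, coord, value):
    """Condition mu on X_coord == value (one localization step).
    Returns the conditional measure and the marginal mass of the event."""
    mass = sum(p for x, p in mu.items() if x[coord] == value)
    cond = {x: p / mass for x, p in mu.items() if x[coord] == value}
    return cond, mass

def localization_step_average(mu, coord):
    """Average the pinned measures, weighted by the marginal of the
    revealed coordinate; the martingale property says this equals mu."""
    avg = {x: 0.0 for x in mu}
    for v in (0, 1):
        cond, mass = pin(mu, coord, v)
        for x, p in cond.items():
            avg[x] += mass * p
    return avg

rng = random.Random(0)
mu = random_measure(3, rng)
avg = localization_step_average(mu, coord=0)
# Martingale property: averaging the localized measures recovers mu.
assert all(abs(avg[x] - mu[x]) < 1e-12 for x in mu)
```

Iterating the pinning over all n coordinates drives the measure to a point mass, i.e., full localization; analyzing how fast a functional (variance, entropy) decays along this process is what yields the mixing bounds in the spectral independence setting.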

Bio

A graduate of École Polytechnique in Paris, Dr. Yuansi Chen obtained his PhD in Statistics at UC Berkeley in 2019. After a postdoctoral fellowship in Data Science at ETH Zürich, Dr. Chen joined the faculty of the Department of Statistical Science at Duke University in 2021. While in Zürich, he made progress towards the resolution of a 25-year-old conjecture proposed by Kannan, Lovász and Simonovits. His research interests lie in statistical machine learning, convergence analysis of MCMC sampling algorithms, and high-dimensional convex geometry.