 Award ID(s):
 2007287
 NSFPAR ID:
 10467012
 Publisher / Repository:
 SODA 2022
 Date Published:
 Format(s):
 Medium: X
 Sponsoring Org:
 National Science Foundation
More Like this

Wootters, Mary; Sanita, Laura (Eds.)
The Swendsen–Wang algorithm is a sophisticated, widely-used Markov chain for sampling from the Gibbs distribution for the ferromagnetic Ising and Potts models. This chain has proved difficult to analyze, due in part to the global nature of its updates. We present optimal bounds on the convergence rate of the Swendsen–Wang algorithm for the complete d-ary tree. Our bounds extend to the non-uniqueness region and apply to all boundary conditions. We show that the spatial mixing conditions known as Variance Mixing and Entropy Mixing, introduced in the study of local Markov chains by Martinelli et al. (2003), imply Ω(1) spectral gap and O(log n) mixing time, respectively, for the Swendsen–Wang dynamics on the d-ary tree. We also show that these bounds are asymptotically optimal. As a consequence, we establish Θ(log n) mixing for the Swendsen–Wang dynamics for all boundary conditions throughout the tree uniqueness region; in fact, our bounds hold beyond the uniqueness threshold for the Ising model, and for the q-state Potts model when q is small with respect to d. Our proofs feature a novel spectral view of the Variance Mixing condition inspired by several recent rapid mixing results on high-dimensional expanders and utilize recent work on block factorization of entropy under spatial mixing conditions.
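As a rough illustration (not code from the paper), one Swendsen–Wang update for the ferromagnetic Ising model can be sketched as follows; the edge-list representation, function name, and union-find helper are our own choices:

```python
import math
import random

def swendsen_wang_step(edges, spins, beta, rng=random):
    """One Swendsen-Wang update for the Ising model at inverse temperature beta.

    edges: list of (u, v) pairs of an undirected graph
    spins: dict vertex -> +1/-1, updated in place and returned
    """
    # Step 1: keep each monochromatic edge independently with prob 1 - e^{-2*beta}.
    p = 1 - math.exp(-2 * beta)
    kept = [(u, v) for (u, v) in edges
            if spins[u] == spins[v] and rng.random() < p]

    # Step 2: connected components of the kept edges, via union-find.
    parent = {v: v for v in spins}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in kept:
        parent[find(u)] = find(v)

    # Step 3: assign every component a fresh uniform spin (the "global" move).
    new_spin = {}
    for v in spins:
        root = find(v)
        if root not in new_spin:
            new_spin[root] = rng.choice((-1, 1))
        spins[v] = new_spin[root]
    return spins
```

The global nature of the chain discussed above is visible in step 3: a single update can flip arbitrarily large clusters at once, which is what makes the dynamics hard to analyze with purely local arguments.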

We introduce a notion called entropic independence that is an entropic analog of spectral notions of high-dimensional expansion. Informally, entropic independence of a background distribution $\mu$ on $k$-sized subsets of a ground set of elements says that for any (possibly randomly chosen) set $S$, the relative entropy of a single element of $S$ drawn uniformly at random carries at most an $O(1/k)$ fraction of the relative entropy of $S$. Entropic independence is the analog of the notion of spectral independence, if one replaces variance by entropy. We use entropic independence to derive tight mixing time bounds, overcoming the lossy nature of spectral analysis of Markov chains on exponential-sized state spaces. In our main technical result, we show a general way of deriving entropy contraction, a.k.a. modified log-Sobolev inequalities, for down-up random walks from spectral notions. We show that spectral independence of a distribution under arbitrary external fields automatically implies entropic independence. We furthermore extend our theory to the case where spectral independence does not hold under arbitrary external fields. To do this, we introduce a framework for obtaining tight mixing time bounds for Markov chains based on what we call restricted modified log-Sobolev inequalities, which guarantee entropy contraction not for all distributions, but for those in a sufficiently large neighborhood of the stationary distribution. To derive our results, we relate entropic independence to properties of polynomials: $\mu$ is entropically independent exactly when a transformed version of the generating polynomial of $\mu$ is upper bounded by its linear tangent; this property is implied by concavity of the said transformation, which was shown by prior work to be locally equivalent to spectral independence.
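The informal statement above can be written symbolically; the normalization constant $\alpha$ and the operator name $\mathcal{D}_{k\to 1}$ below are notation we introduce for this sketch:

```latex
% mu is (1/alpha)-entropically independent if, for every distribution nu
% on k-sized subsets,
D_{\mathrm{KL}}\!\left( \mathcal{D}_{k\to 1}\,\nu \,\middle\|\, \mathcal{D}_{k\to 1}\,\mu \right)
\;\le\; \frac{1}{\alpha k}\, D_{\mathrm{KL}}\!\left( \nu \,\middle\|\, \mu \right),
```

where $\mathcal{D}_{k\to 1}$ maps a distribution on $k$-sized subsets to the law of a uniformly random single element of a sampled subset. The abstract's $O(1/k)$ fraction corresponds to $\alpha = \Omega(1)$.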
We apply our results to obtain (1) tight modified log-Sobolev inequalities and mixing times for multi-step down-up walks on fractionally log-concave distributions, (2) the tight mixing time of $O(n\log n)$ for Glauber dynamics on Ising models whose interaction matrix has eigenspectrum lying within an interval of length smaller than $1$, improving upon the prior quadratic dependence on $n$, and (3) nearly-linear time $\widetilde O_{\delta}(n)$ samplers for the hardcore and Ising models on $n$-node graphs that have $\delta$-relative gap to the tree-uniqueness threshold. In the last application, our bound on the running time does not depend on the maximum degree $\Delta$ of the graph, and is therefore optimal even for high-degree graphs, and in fact, is sublinear in the size of the graph for high-degree graphs.

Abstract: We establish rapid mixing of the random-cluster Glauber dynamics on random Δ-regular graphs for all q ≥ 1 and p < p_u(q, Δ), where the threshold p_u(q, Δ) corresponds to a uniqueness/non-uniqueness phase transition for the random-cluster model on the (infinite) Δ-regular tree. It is expected that this threshold is sharp, and for q > 2 the Glauber dynamics on random Δ-regular graphs undergoes an exponential slowdown at p_u(q, Δ). More precisely, we show that for every q ≥ 1, Δ ≥ 3, and p < p_u(q, Δ), with probability 1 − o(1) over the choice of a random Δ-regular graph on n vertices, the Glauber dynamics for the random-cluster model has mixing time Θ(n log n). As a corollary, we deduce fast mixing of the Swendsen–Wang dynamics for the Potts model on random Δ-regular graphs for every q ≥ 2, in the tree uniqueness region. Our proof relies on a sharp bound on the "shattering time", i.e., the number of steps required to break up any configuration into O(log n)-sized clusters. This is established by analyzing a delicate and novel iterative scheme to simultaneously reveal the underlying random graph, together with the clusters of the Glauber dynamics configuration on it, at a given time.
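For intuition (a sketch of ours, not from the paper), a single heat-bath Glauber update for the random-cluster model resamples one edge from its conditional distribution, which depends only on whether its endpoints are connected through the rest of the configuration:

```python
import random
from collections import deque

def glauber_step(edges, open_edges, p, q, rng=random):
    """One heat-bath update for the random-cluster model with parameters p, q.

    edges: list of (u, v) tuples; open_edges: set of currently open edges.
    Returns the new set of open edges.
    """
    e = rng.choice(edges)
    u, v = e
    rest = open_edges - {e}

    # BFS from u through the other open edges to test connectivity to v.
    nbrs = {}
    for (a, b) in rest:
        nbrs.setdefault(a, []).append(b)
        nbrs.setdefault(b, []).append(a)
    seen, queue = {u}, deque([u])
    while queue:
        x = queue.popleft()
        for y in nbrs.get(x, []):
            if y not in seen:
                seen.add(y)
                queue.append(y)
    connected = v in seen

    # Conditional probability that e is open given the rest: p if its endpoints
    # are already connected, p / (p + q(1-p)) otherwise (the q-dependent penalty).
    prob_open = p if connected else p / (p + q * (1 - p))
    return (rest | {e}) if rng.random() < prob_open else rest
```

The connectivity query in the middle is exactly why cluster structure governs the dynamics: once all clusters are O(log n)-sized, as in the shattering bound above, these queries become local and the chain mixes fast.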
Abstract: The Swendsen–Wang algorithm is a sophisticated, widely-used Markov chain for sampling from the Gibbs distribution for the ferromagnetic Ising and Potts models. This chain has proved difficult to analyze, due in part to its global nature. We present optimal bounds on the convergence rate of the Swendsen–Wang algorithm for the complete d-ary tree. Our bounds extend to the non-uniqueness region and apply to all boundary conditions. We show that the spatial mixing conditions known as variance mixing and entropy mixing imply Ω(1) spectral gap and O(log n) mixing time, respectively, for the Swendsen–Wang dynamics on the d-ary tree. We also show that these bounds are asymptotically optimal. As a consequence, we establish Θ(log n) mixing for the Swendsen–Wang dynamics for all boundary conditions throughout (and beyond) the tree uniqueness region. Our proofs feature a novel spectral view of the variance mixing condition and utilize recent work on block factorization of entropy.
We consider spin systems with nearest-neighbor interactions on an n-vertex d-dimensional cube of the integer lattice graph Z^d. We study the effects that the strong spatial mixing condition (SSM) has on the rate of convergence to equilibrium of nonlocal Markov chains. We prove that when SSM holds, the relaxation time (i.e., the inverse spectral gap) of general block dynamics is O(r), where r is the number of blocks. As a second application of our technology, it is established that SSM implies an O(1) bound for the relaxation time of the Swendsen–Wang dynamics for the ferromagnetic Ising and Potts models. We also prove that for monotone spin systems SSM implies that the mixing time of systematic scan dynamics is O(log n (log log n)^2). Our proofs use a variety of techniques for the analysis of Markov chains, including coupling, functional analysis, and linear algebra.
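The systematic scan dynamics mentioned above visits the sites in a fixed order and applies a heat-bath update at each; a minimal sketch for the Ising model (our own illustration, with unit coupling and no external field) is:

```python
import math
import random

def scan_sweep(adj, spins, beta, order, rng=random):
    """One systematic-scan sweep: heat-bath Ising updates in a fixed site order.

    adj: dict vertex -> list of neighbors; spins: dict vertex -> +1/-1.
    Unlike Glauber dynamics, the update order is deterministic, not random.
    """
    for v in order:
        # Local field at v; heat-bath conditional P(spin = +1 | neighbors).
        field = sum(spins[w] for w in adj[v])
        p_plus = 1 / (1 + math.exp(-2 * beta * field))
        spins[v] = 1 if rng.random() < p_plus else -1
    return spins
```

Block dynamics generalizes this by resampling whole blocks of sites from their conditional distribution instead of single sites, which is where the O(r) relaxation-time bound in terms of the number of blocks r applies.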