# Fastest Mixing Markov Chain

## Introduction

This example is derived from the results in Boyd, Diaconis, and Xiao (2004), section 2. Let \(\mathcal{G} = (\mathcal{V}, \mathcal{E})\) be a connected graph with vertices \(\mathcal{V} = \{1,\ldots,n\}\) and edges \(\mathcal{E} \subseteq \mathcal{V} \times \mathcal{V}\). Assume that \((i,i) \in \mathcal{E}\) for all \(i = 1,\ldots,n\), and \((i,j) \in \mathcal{E}\) implies \((j,i) \in \mathcal{E}\). Under these conditions, a discrete-time Markov chain on \(\mathcal{V}\) will have the uniform distribution as one of its equilibrium distributions. We are interested in finding the Markov chain, i.e., constructing the transition probability matrix \(P \in {\mathbf R}_+^{n \times n}\), that converges to the uniform distribution as fast as possible asymptotically. This is an important problem in Markov chain Monte Carlo (MCMC) simulations, as it directly affects the sampling efficiency of an algorithm.

The asymptotic rate of convergence is determined by the second largest eigenvalue of \(P\), which in our case is \(\mu(P) := \lambda_{\max}(P - \frac{1}{n}{\mathbf 1}{\mathbf 1}^T)\) where \(\lambda_{\max}(A)\) denotes the maximum eigenvalue of \(A\). As \(\mu(P)\) decreases, the mixing rate increases and the Markov chain converges faster to equilibrium. Thus, our optimization problem is

\[ \begin{array}{ll} \underset{P}{\mbox{minimize}} & \lambda_{\max}(P - \frac{1}{n}{\mathbf 1}{\mathbf 1}^T) \\ \mbox{subject to} & P \geq 0, \quad P{\mathbf 1} = {\mathbf 1}, \quad P = P^T \\ & P_{ij} = 0, \quad (i,j) \notin \mathcal{E}. \end{array} \]

The element \(P_{ij}\) of our transition matrix is the probability of moving from state \(i\) to state \(j\). Our assumptions imply that \(P\) is nonnegative, symmetric, and doubly stochastic. The last constraint ensures transitions do not occur between unconnected vertices.
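To make the definition of \(\mu(P)\) concrete, here is a quick base-R check using a hand-built feasible \(P\) for the path graph on four vertices (a simple reflecting walk chosen for illustration; nothing below uses `CVXR`):

```r
## A symmetric, doubly stochastic transition matrix respecting the
## 4-path 1-2-3-4, with holding probability at the two endpoints
P <- matrix(c(0.5, 0.5, 0,   0,
              0.5, 0,   0.5, 0,
              0,   0.5, 0,   0.5,
              0,   0,   0.5, 0.5), nrow = 4, byrow = TRUE)
n <- nrow(P)
stopifnot(all(P >= 0), all(rowSums(P) == 1), isSymmetric(P))

## mu(P) is the largest eigenvalue of P - (1/n) 11^T; subtracting the
## scalar 1/n from every entry is the same as subtracting (1/n) 11^T
mu <- max(eigen(P - 1/n, symmetric = TRUE)$values)
mu   ## 0.7071068, i.e., 1/sqrt(2)
```

Smaller values of \(\mu\) mean faster mixing; the optimization searches over all feasible \(P\) for the smallest one.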

The function \(\lambda_{\max}\) is convex, so this problem is solvable in `CVXR`. For instance, the code for the Markov chain in Figure 2 below (the triangle plus one edge) is

```
n <- 4
P <- Variable(n, n)
ones <- matrix(1, nrow = n, ncol = 1)
## P - 1/n subtracts 1/n from every entry, i.e., P - (1/n) 11^T
obj <- Minimize(lambda_max(P - 1/n))
constr1 <- list(P >= 0, P %*% ones == ones, P == t(P))
constr2 <- list(P[1,3] == 0, P[1,4] == 0)
prob <- Problem(obj, c(constr1, constr2))
result <- solve(prob)
```

where we have set \(n = 4\). We could also have specified \(P{\mathbf 1} = {\mathbf 1}\) with `sum_entries(P, 1) == 1`, which uses the `sum_entries` atom to represent the row sums.

## Example

In order to reproduce some of the examples from Boyd, Diaconis, and Xiao (2004), we create functions to build up the graph, solve the optimization problem and finally display the chain graphically.

```
## Boyd, Diaconis, and Xiao. SIAM Rev. 46 (2004) pgs. 667-689 at pg. 672
## Form the complementary graph
antiadjacency <- function(g) {
    n <- max(as.numeric(names(g)))   ## Assumes names are integers starting from 1
    a <- lapply(1:n, function(i) c())
    names(a) <- 1:n
    for(x in names(g)) {
        for(y in 1:n) {
            if(!(y %in% g[[x]]))
                a[[x]] <- c(a[[x]], y)
        }
    }
    a
}

## Fastest mixing Markov chain on graph g
FMMC <- function(g, verbose = FALSE) {
    a <- antiadjacency(g)
    n <- length(names(a))
    P <- Variable(n, n)
    o <- rep(1, n)
    ## For symmetric stochastic P, the spectral norm of P - (1/n) 11^T is the
    ## second largest eigenvalue modulus (SLEM) of P, the mixing measure in
    ## Boyd, Diaconis, and Xiao (2004)
    objective <- Minimize(norm(P - 1.0/n, "2"))
    constraints <- list(P %*% o == o, t(P) == P, P >= 0)
    for(i in names(a)) {
        for(j in a[[i]]) {           ## (i, j) is a non-edge of g!
            idx <- as.numeric(i)
            if(idx != j)
                constraints <- c(constraints, P[idx, j] == 0)
        }
    }
    prob <- Problem(objective, constraints)
    result <- solve(prob)
    if(verbose)
        cat("Status: ", result$status, ", Optimal Value = ", result$value,
            ", Solver = ", result$solver, "\n")
    list(status = result$status, value = result$value, P = result$getValue(P))
}

## Display the chain graphically if the markovchain package is available,
## otherwise print the transition matrix
disp_result <- function(states, P, tol = 1e-3) {
    if(!("markovchain" %in% rownames(installed.packages()))) {
        rownames(P) <- states
        colnames(P) <- states
        print(P)
    } else {
        P[P < tol] <- 0
        P <- P / apply(P, 1, sum)    ## Normalize so rows sum to exactly 1
        mc <- new("markovchain", states = states, transitionMatrix = P)
        plot(mc)
    }
}
```
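Before solving anything, we can sanity-check the complement construction on a small example. The base-R one-liner below mirrors what `antiadjacency` computes; each vertex lists itself among its non-neighbors because the adjacency lists omit self-loops, and `FMMC` skips those diagonal entries:

```r
g <- list("1" = 2, "2" = c(1, 3), "3" = c(2, 4), "4" = 3)   ## the 4-path
n <- length(g)
## Non-neighbors of each vertex, self included (see note above)
a <- lapply(g, function(nbrs) setdiff(1:n, nbrs))
a[["1"]]   ## 1 3 4
```

So the only free entries of \(P\) for vertex 1 are \(P_{11}\) and \(P_{12}\), as expected for the path.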

### Results

We reproduce the results for several rows of Table 1 from Boyd, Diaconis, and Xiao (2004).

```
g <- list("1" = 2, "2" = c(1,3), "3" = c(2,4), "4" = 3)
result1 <- FMMC(g, verbose = TRUE)
```

`## Status: optimal , Optimal Value = 0.7071073 , Solver = SCS`

`disp_result(names(g), result1$P)`

```
g <- list("1" = 2, "2" = c(1,3,4), "3" = c(2,4), "4" = c(2,3))
result2 <- FMMC(g, verbose = TRUE)
```

`## Status: optimal , Optimal Value = 0.636361 , Solver = SCS`

`disp_result(names(g), result2$P)`

```
g <- list("1" = c(2,4,5), "2" = c(1,3), "3" = c(2,4,5), "4" = c(1,3), "5" = c(1,3))
result3 <- FMMC(g, verbose = TRUE)
```

`## Status: optimal , Optimal Value = 0.4285698 , Solver = SCS`

`disp_result(names(g), result3$P)`

```
g <- list("1" = c(2,3,5), "2" = c(1,4,5), "3" = c(1,4,5), "4" = c(2,3,5), "5" = c(1,2,3,4,5))
result4 <- FMMC(g, verbose = TRUE)
```

`## Status: optimal , Optimal Value = 0.25 , Solver = SCS`

`disp_result(names(g), result4$P)`

## Extensions

It is easy to extend this example to other Markov chains. To change the number of vertices, we would simply modify `n`, and to add or remove edges, we need only alter the constraints in `constr2`. For instance, the bipartite chain in Figure 3 is produced by setting \(n = 5\) and

`constr2 <- list(P[1,3] == 0, P[2,4] == 0, P[2,5] == 0, P[4,5] == 0)`
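Putting these pieces together, the complete problem for that chain can be written as follows (a sketch; it simply re-uses the formulation from the earlier chunk with the new size and non-edge constraints, and symmetry takes care of the mirrored zero entries):

```r
library(CVXR)

n <- 5
P <- Variable(n, n)
ones <- matrix(1, nrow = n, ncol = 1)
obj <- Minimize(lambda_max(P - 1/n))
constr1 <- list(P >= 0, P %*% ones == ones, P == t(P))
constr2 <- list(P[1,3] == 0, P[2,4] == 0, P[2,5] == 0, P[4,5] == 0)
prob <- Problem(obj, c(constr1, constr2))
result <- solve(prob)
result$status
```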

## Session Info

`sessionInfo()`

```
## R version 4.0.2 (2020-06-22)
## Platform: x86_64-apple-darwin19.5.0 (64-bit)
## Running under: macOS Catalina 10.15.7
##
## Matrix products: default
## BLAS/LAPACK: /usr/local/Cellar/openblas/0.3.10_1/lib/libopenblasp-r0.3.10.dylib
##
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
##
## attached base packages:
## [1] stats graphics grDevices datasets utils methods base
##
## other attached packages:
## [1] markovchain_0.8.5 CVXR_1.0-9
##
## loaded via a namespace (and not attached):
## [1] gmp_0.6-0 Rcpp_1.0.5 highr_0.8 compiler_4.0.2
## [5] tools_4.0.2 digest_0.6.25 bit_1.1-15.2 evaluate_0.14
## [9] lattice_0.20-41 pkgconfig_2.0.3 rlang_0.4.7 Matrix_1.2-18
## [13] igraph_1.2.5 gurobi_9.0.3.1 Rglpk_0.6-4 yaml_2.2.1
## [17] parallel_4.0.2 expm_0.999-5 blogdown_0.19 xfun_0.15
## [21] cccp_0.2-4 Rmpfr_0.8-1 stringr_1.4.0 knitr_1.28
## [25] stats4_4.0.2 bit64_0.9-7 grid_4.0.2 R6_2.4.1
## [29] rmarkdown_2.3 bookdown_0.19 matlab_1.0.2 magrittr_1.5
## [33] scs_1.3-3 rcbc_0.1.0.9001 codetools_0.2-16 htmltools_0.5.0
## [37] assertthat_0.2.1 Rcplex_0.3-3 stringi_1.4.6 Rmosek_9.2.3
## [41] RcppParallel_5.0.1 slam_0.1-47
```


## References

Boyd, S., P. Diaconis, and L. Xiao. 2004. “Fastest Mixing Markov Chain on a Graph.” *SIAM Review* 46 (4): 667–89.