Gradient flow on graphs via Metropolis

The problem of finding minimizers of functions on graphs arises naturally in many domains, for example, in exponential random graph models and extremal combinatorics. In the Euclidean setting, there are two widely used approaches for minimizing a given objective function. The first uses gradient-based algorithms such as gradient descent and stochastic gradient descent. The second runs MCMC algorithms, such as Metropolis, to sample from a suitable Gibbs measure that is concentrated around the minimizers. We develop both approaches for minimizing functions on large graphs, or more generally on the space of graphons. In particular, we introduce gradient flows on the space of graphons and show that the gradient flow of suitable functions can be approximated by Euclidean gradient descent algorithms. For the second approach, we introduce a Metropolis chain on stochastic block models and show that, in a suitable limiting regime, the trajectory of the Metropolis chain is close to a deterministic curve on the space of graphons, which can be interpreted as the gradient flow.
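To illustrate the second approach in the simplest discretized setting, the following is a minimal sketch of a Metropolis chain targeting a Gibbs measure proportional to exp(-beta * F) over symmetric k x k block matrices (a finite-dimensional stand-in for stochastic block models). The objective F, the single-entry Gaussian proposal, and the fixed inverse temperature beta are all illustrative assumptions, not the construction analyzed in the paper; for large beta the samples concentrate near minimizers of F.

```python
import numpy as np

def metropolis_gibbs(F, k=4, beta=50.0, steps=20000, step_size=0.05, seed=0):
    """Metropolis chain targeting the Gibbs measure ~ exp(-beta * F(B))
    over symmetric k x k block matrices B with entries in [0, 1].

    F : callable mapping a symmetric (k, k) array to a real number.
    """
    rng = np.random.default_rng(seed)
    # start from the constant-1/2 block matrix (an arbitrary initial state)
    B = np.full((k, k), 0.5)
    energy = F(B)
    for _ in range(steps):
        # propose a Gaussian perturbation of one (symmetric pair of) block entry
        i, j = rng.integers(k, size=2)
        new_val = B[i, j] + step_size * rng.normal()
        if not (0.0 <= new_val <= 1.0):
            continue  # reject moves leaving [0, 1]; keeps the proposal symmetric
        proposal = B.copy()
        proposal[i, j] = new_val
        proposal[j, i] = new_val  # keep the block matrix symmetric
        new_energy = F(proposal)
        # Metropolis acceptance rule for the target exp(-beta * F)
        if np.log(rng.random()) < -beta * (new_energy - energy):
            B, energy = proposal, new_energy
    return B

if __name__ == "__main__":
    # toy objective: squared distance of the overall edge density to 0.3
    F = lambda B: (B.mean() - 0.3) ** 2
    B_hat = metropolis_gibbs(F)
    print("edge density of the sampled near-minimizer:", B_hat.mean())
```

In this toy example the chain settles near block matrices with edge density close to 0.3; the paper's limiting regime, in which the rescaled trajectory of such a chain tracks a deterministic gradient-flow curve on graphon space, requires the joint scaling of the graph size and temperature described in the text.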