Random graphs

 

Unimodular Random Measured Metric Spaces

Unimodular random measured metric spaces are a common generalization of various other notions. These include discrete cases such as unimodular graphs and stationary point processes, as well as non-discrete cases such as stationary random measures and the continuum metric spaces arising as scaling limits of graphs. Several general results are proved on such spaces, e.g., on weak limits, re-rooting invariance, random walks, ergodic decomposition, amenability, and balancing transport kernels. In addition, the Palm theory of point processes and random measures generalizes to such unimodular spaces.
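
For background, the common thread behind all of these notions is the mass transport principle; the following is the standard statement for a unimodular random rooted graph (G, o), recalled here as classical context rather than quoted from the work above. For every measurable function g of a graph with an ordered pair of vertices that is invariant under isomorphisms,

    \mathbb{E}\Big[\sum_{v \in V(G)} g(G, o, v)\Big] \;=\; \mathbb{E}\Big[\sum_{v \in V(G)} g(G, v, o)\Big],

i.e., the expected mass sent out of the root equals the expected mass received by the root. In the measured metric space setting, the sums over vertices are replaced by integrals against the measure carried by the space.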

Coupling from the past for the null recurrent Markov chain

The Doeblin Graph of a countable state space Markov chain describes the joint pathwise evolutions of the Markov dynamics starting from all possible initial conditions, with two paths coalescing when they reach the same point of the state space at the same time. Its Bridge Doeblin subgraph contains only the paths starting from a tagged point of the state space at all possible times. In the irreducible, aperiodic, and positive recurrent case, the following results are known: the Bridge Doeblin Graph is an infinite tree that is unimodularizable; moreover, it contains a single bi-infinite path, which allows one to build a perfect sample of the stationary state of the Markov chain. The present paper focuses on the null recurrent case. It is shown that, under the same irreducibility and aperiodicity assumptions, the Bridge Doeblin Graph is either an infinite tree or a forest made of a countable collection of infinite trees. In the first case, the infinite tree in question has a single end and is not unimodularizable in general, but it is always locally unimodular. These key properties are used to study the stationary regime of several measure-valued random dynamics on this Bridge Doeblin Tree. The most important ones are the taboo random dynamics, which admits as steady state a random measure whose mean measure is the invariant measure of the Markov chain, and the potential random dynamics, which is a random extension of the classical potential measure, with a mean measure equal to infinity at every point of the state space.

  • F. Baccelli, M.-O. Haji-Mirsadeghi, and S. Khaniha, ‘Coupling from the Past for the Null Recurrent Markov Chain’, arXiv preprint, April 2022. https://arxiv.org/abs/2203.13585
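
To fix ideas, here is a minimal Python sketch of classical Propp-Wilson coupling from the past for a finite-state chain, i.e., the perfect sampling construction that the bi-infinite path of the Bridge Doeblin Graph provides in the positive recurrent case. The toy transition matrix and the grand-coupling update rule are illustrative choices and are not taken from the paper.

    import numpy as np

    def cftp(P, rng, max_doublings=20):
        """Propp-Wilson coupling from the past for a finite-state chain.

        P is a row-stochastic transition matrix.  One copy of the chain is
        started from every state at time -T (a "grand coupling"), all copies
        reuse the same uniform random numbers, and T is doubled until every
        copy has coalesced by time 0.  The common value at time 0 is then a
        perfect sample of the stationary distribution.
        """
        n = P.shape[0]
        cum = np.cumsum(P, axis=1)
        us = []                                   # shared uniforms; us[t-1] drives time -t
        T = 1
        for _ in range(max_doublings):
            while len(us) < T:
                us.append(rng.random())
            states = np.arange(n)                 # one copy per initial state
            for t in range(T, 0, -1):             # evolve from time -T up to time 0
                u = us[t - 1]                     # the same uniform for every copy at time -t
                states = np.array([np.searchsorted(cum[s], u) for s in states])
            if np.all(states == states[0]):       # coalescence: perfect sample
                return int(states[0])
            T *= 2
        raise RuntimeError("no coalescence; increase max_doublings")

    # Toy 3-state chain (illustrative only).
    P = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.3, 0.3, 0.4]])
    samples = [cftp(P, np.random.default_rng(i)) for i in range(2000)]
    print(np.bincount(samples, minlength=3) / len(samples))   # close to the stationary law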

Unimodular Hausdorff and Minkowski Dimensions

This work introduces two new notions of dimension, namely the unimodular Minkowski and Hausdorff dimensions, which are inspired by their classical analogues.

These dimensions are defined for unimodular discrete spaces, introduced in this work, which provide a common generalization of stationary point processes under their Palm version and of unimodular random rooted graphs. The use of unimodularity in the definitions of dimension is novel. In addition, a toolbox of results is presented for the analysis of these dimensions; in particular, analogues of Billingsley’s lemma and Frostman’s lemma are established. These lemmas are instrumental in deriving upper bounds on dimensions, whereas lower bounds are obtained from specific coverings. The notions of unimodular Hausdorff measure and unimodular dimension function are also introduced. This toolbox is used to analyze the dimensions of a set of examples pertaining to point processes, branching processes, random graphs, random walks, and self-similar discrete random spaces.

The companion paper focuses on the connections between these notions and the polynomial growth rate of the underlying space. It is shown that bounding the dimension is closely related to finding suitable equivariant weight functions (i.e., measures) on the underlying discrete space. The main results are unimodular versions of the mass distribution principle, Billingsley’s lemma, and Frostman’s lemma, which allow one to derive upper bounds on the unimodular Hausdorff dimension from the growth rate of suitable equivariant weight functions. These results make it possible to compute or bound both types of unimodular dimensions in a large set of examples from the theory of point processes, unimodular random graphs, and self-similarity. Further results of independent interest are also presented, such as a version of the max-flow min-cut theorem for unimodular one-ended trees and a weak form of pointwise ergodic theorems for all unimodular discrete spaces.

  • F. Baccelli, M.-O. Haji-Mirsadeghi, and A. Khezeli, ‘Unimodular Hausdorff and Minkowski Dimensions’, Electronic Journal of Probability, 26, 1–64, 2021. https://arxiv.org/abs/1807.02980
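
For background, the classical mass distribution principle that these unimodular results parallel states that if a measure mu on a set E in R^d satisfies

    \mu(B(x, r)) \le C\, r^{s} \quad \text{for all } x \in E \text{ and all } 0 < r \le r_0,

then \dim_H E \ge s whenever \mu(E) > 0. This is the classical statement, not the unimodular version of the papers above; note also that, as described above, the unimodular counterparts are used in the reverse direction, to derive upper bounds on the unimodular Hausdorff dimension from the growth of equivariant weight functions.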

Random Binary Trees

It is known that the size of the largest common subtree (i.e., the maximum agreement subtree) of two independent uniform random binary trees on the same n labeled leaves is of order between n^0.366 and n^(1/2). A. Khezeli improved the lower bound to order n^0.4464 by constructing a common subtree recursively and by proving a lower bound on its asymptotic growth. The construction is a modification of an algorithm proposed by D. Aldous, which splits the tree at its centroid and proceeds recursively.
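
To illustrate the two basic ingredients involved (generating a uniform random binary tree with labeled leaves and splitting it at its most balanced edge), here is a short Python sketch; it is not Khezeli's or Aldous's actual algorithm, and the uniform cladogram construction and the balance criterion below are standard choices made only for this example.

    import random

    def random_binary_tree(n, rng):
        """Uniform random rooted binary tree with leaves labeled 0..n-1.

        Internal nodes are 2-element lists [left, right]; leaves are ints.
        Leaf k is grafted onto one of the 2k-1 possible positions (any existing
        edge, or above the root), chosen uniformly, which is the standard
        construction of a uniform leaf-labeled binary tree (cladogram).
        """
        tree = 0
        for k in range(1, n):
            positions = [None]                    # None means: graft above the root
            stack = [tree] if isinstance(tree, list) else []
            while stack:
                node = stack.pop()
                for i in (0, 1):
                    positions.append((node, i))   # the edge above node[i]
                    if isinstance(node[i], list):
                        stack.append(node[i])
            pos = rng.choice(positions)
            if pos is None:
                tree = [tree, k]
            else:
                node, i = pos
                node[i] = [node[i], k]
        return tree

    def leaf_count(node):
        return 1 if isinstance(node, int) else leaf_count(node[0]) + leaf_count(node[1])

    def most_balanced_edge(tree):
        """Edge whose removal splits the leaf set as evenly as possible."""
        n, best = leaf_count(tree), None
        stack = [tree]
        while stack:
            node = stack.pop()
            if isinstance(node, int):
                continue
            for child in node:
                below = leaf_count(child)
                score = max(below, n - below)
                if best is None or score < best[0]:
                    best = (score, child, below)
                stack.append(child)
        _, subtree, below = best
        return subtree, below, n - below

    rng = random.Random(0)
    t = random_binary_tree(10, rng)
    _, below, above = most_balanced_edge(t)
    print(t)
    print("most balanced edge splits the 10 leaves as", below, "vs", above)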

Matrix reconstruction and directed random graphs

This work by Charles Bordenave (Aix-Marseille University), Simon Coste, and Raj Rao Nadakuditi (University of Michigan, Ann Arbor) introduces a new and promising spectral method for statistical reconstruction tasks such as matrix completion, based on the analysis of the asymptotic behaviour of the eigenvalues of random graphs in the difficult, very sparse regime.

The underlying random graph is the simplest possible example of a directed random graph G: on n vertices, each of the n^2 possible directed edges is present independently with probability d/n, where d is a fixed parameter equal to the mean degree of each vertex. This graph converges in a weak sense towards a possibly infinite directed random tree: the directed Poisson-Galton-Watson tree. We are interested in the eigenvalues of the adjacency matrix A of such a graph, with possible weights arising from a hidden low-rank Hermitian matrix P. The eigenvalues of A alone do not carry much information about the model, since they are mostly polluted by the highest degrees of G. However, if A is randomly split as A = A_1 + A_2, it turns out that the eigenvalues of the (non-Hermitian) matrix A_1*A_2 capture a substantial amount of information related to the SVD of P. More precisely, we show the existence of an asymptotic threshold t, which depends only on P and d, such that every eigenvalue of A_1*A_2 greater than t is asymptotically equal to an eigenvalue of P*P, and the associated eigenvectors are aligned as well. This allows one to perform statistical reconstruction with weak recovery even in the regime where d is much smaller than log n, which is the well-known information-theoretic limit for exact reconstruction.
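
The following NumPy sketch illustrates this split-and-multiply pipeline on synthetic data; the parameters, the rank-2 choice of P, and the rescaling of each half by 2n/d (so that each half is an unbiased estimate of P) are assumptions made for the illustration, and no claim is made about the exact normalization or the threshold t used in the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative parameters: n vertices, mean degree d, and a hidden rank-2
    # symmetric matrix P with eigenvalues close to 10 and 7.
    n, d = 2000, 15.0
    U = rng.standard_normal((n, 2)) / np.sqrt(n)       # nearly orthonormal columns
    P = U @ np.diag([10.0, 7.0]) @ U.T

    # Sparse directed observation: each of the n^2 ordered pairs is kept
    # independently with probability d/n, as in the random digraph above.
    mask = rng.random((n, n)) < d / n

    # Random split of the observed entries into two halves A1, A2, each rescaled
    # so that it is an unbiased estimate of P (assumed normalization).
    half = rng.random((n, n)) < 0.5
    A1 = np.where(mask & half, P, 0.0) * (2 * n / d)
    A2 = np.where(mask & ~half, P, 0.0) * (2 * n / d)

    # The eigenvalues of the non-Hermitian product A1 A2 that emerge above the
    # bulk should sit near the nonzero eigenvalues of P P (here about 100 and 49).
    evals = np.linalg.eigvals(A1 @ A2)
    top = sorted(evals, key=abs, reverse=True)[:4]
    print("largest eigenvalues of A1 A2:", np.round(np.real(top), 1))
    print("nonzero eigenvalues of P P:  ", np.round(np.linalg.eigvalsh(P @ P)[-2:], 1))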

A key element of this analysis is a fine concentration analysis for functionals of random graphs: certain observables defined on the graph G can be efficiently coupled with their counterparts on the limiting Galton-Watson tree and can be formulated as Kesten-Stigum martingales. While computing the expectation or variance of such observables directly on G is intractable, the corresponding computations on the limiting Galton-Watson tree are exactly solvable.

Finding similar procedures for higher-order random graphs (e.g., random tensors and tensor completion) is a promising future direction; the extension of our results to inhomogeneous random graphs was recently carried out in a work by Massoulié and Stephan.

  • C. Bordenave, S. Coste, and R. R. Nadakuditi, ‘Detection thresholds in very sparse matrix completion’, to appear in Foundations of Computational Mathematics. https://arxiv.org/abs/2005.06062
