Spectral Graph Matching and Regularized Quadratic Relaxations II: Erdős-Rényi Graphs and Universality

07/20/2019
by   Zhou Fan, et al.
Duke University
Yale University

We analyze a new spectral graph matching algorithm, GRAph Matching by Pairwise eigen-Alignments (GRAMPA), for recovering the latent vertex correspondence between two unlabeled, edge-correlated weighted graphs. Extending the exact recovery guarantees established in the companion paper for Gaussian weights, in this work, we prove the universality of these guarantees for a general correlated Wigner model. In particular, for two Erdős-Rényi graphs with edge correlation coefficient 1-σ^2 and average degree at least polylog(n), we show that GRAMPA exactly recovers the latent vertex correspondence with high probability when σ≲ 1/polylog(n). Moreover, we establish a similar guarantee for a variant of GRAMPA, corresponding to a tighter quadratic programming relaxation of the quadratic assignment problem. Our analysis exploits a resolvent representation of the GRAMPA similarity matrix and local laws for the resolvents of sparse Wigner matrices.


1 Introduction

Given two (weighted) graphs, graph matching aims at finding a bijection between the vertex sets that maximizes the total edge weight correlation between the two graphs. It reduces to the graph isomorphism problem when the two graphs can be matched perfectly. Let $A$ and $B$ be the (weighted) adjacency matrices of the two graphs on $n$ vertices. Then the graph matching problem can be formulated as solving the following quadratic assignment problem (QAP) [PRW94, BCPP98]:

$\max_{\Pi \in \mathcal{S}_n} \langle A, \Pi B \Pi^\top \rangle$ (1)

where $\mathcal{S}_n$ denotes the set of permutation matrices in $\{0,1\}^{n \times n}$ and $\langle \cdot, \cdot \rangle$ denotes the matrix inner product. The QAP is NP-hard to solve, or even to approximate within a growing factor [MMS10].

In the companion paper [FMWX19], we proposed a computationally efficient spectral graph matching method, called GRAph Matching by Pairwise eigen-Alignments (GRAMPA). Let us write the spectral decompositions of $A$ and $B$ as

$A = \sum_{i=1}^n \lambda_i u_i u_i^\top, \qquad B = \sum_{j=1}^n \mu_j v_j v_j^\top.$ (2)

Given a tuning parameter $\eta > 0$, GRAMPA first constructs an $n \times n$ similarity matrix

$X = \sum_{i,j=1}^n \frac{\eta}{(\lambda_i - \mu_j)^2 + \eta^2}\, u_i u_i^\top\, \mathbf{J}\, v_j v_j^\top,$ (3)

where $\mathbf{J}$ is the all-ones matrix. (In [FMWX19], $X$ is defined without the factor $\eta$ in the numerator. We include $\eta$ here for convenience in the proof; this does not affect the algorithm, as the rounded solution is invariant to a rescaling of $X$.) Then it outputs a permutation matrix $\hat\Pi$ by "rounding" $X$ to a permutation matrix, for example, by solving the following linear assignment problem (LAP)

$\hat\Pi = \arg\max_{\Pi \in \mathcal{S}_n} \langle X, \Pi \rangle.$ (4)
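To make the procedure concrete, here is a minimal sketch (ours, not from the paper) that implements (2)-(4), assuming dense symmetric NumPy arrays and using SciPy's assignment solver; the function name grampa is our own choice.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def grampa(A, B, eta):
    """Build the similarity matrix X of eq. (3) from the spectral
    decompositions (2), then round X to a permutation via the LAP (4)."""
    n = A.shape[0]
    lam, U = np.linalg.eigh(A)   # A = sum_i lam_i u_i u_i^T
    mu, V = np.linalg.eigh(B)    # B = sum_j mu_j v_j v_j^T
    # Weights eta / ((lam_i - mu_j)^2 + eta^2) for all eigenvalue pairs.
    K = eta / ((lam[:, None] - mu[None, :]) ** 2 + eta ** 2)
    # X = sum_{ij} K_ij u_i u_i^T J v_j v_j^T; the all-ones matrix J enters
    # only through s = U^T 1 and t = V^T 1, so X = U (K * s t^T) V^T.
    one = np.ones(n)
    X = U @ (K * np.outer(U.T @ one, V.T @ one)) @ V.T
    row, col = linear_sum_assignment(-X)   # LAP: maximize <X, Pi>
    return X, col                          # col[i] = estimated match of vertex i
```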

Let $\pi^*$ be the latent true matching, and denote the entries of $A$ and $B$ by $A_{ij}$ and $B_{ij}$. A Gaussian Wigner model is studied in [FMWX19], where $\{(A_{ij}, B_{\pi^*(i)\pi^*(j)}) : i \le j\}$ are i.i.d. pairs of correlated Gaussian variables such that $B_{\pi^*(i)\pi^*(j)} = A_{ij} + \sigma Z_{ij}$ for a noise level $\sigma$, and $A_{ij}$ and $Z_{ij}$ are independent standard Gaussian. It is shown that GRAMPA exactly recovers the vertex correspondence with high probability when $\sigma \lesssim 1/\log n$. Simulation results in [FMWX19, Section 4.1] further show that the empirical performance of GRAMPA under the Gaussian Wigner model is very similar to that under the Erdős-Rényi model where $\{(A_{ij}, B_{\pi^*(i)\pi^*(j)}) : i \le j\}$ are i.i.d. pairs of correlated centered Bernoulli random variables, suggesting that the performance of GRAMPA enjoys universality.

In this paper, we prove a universal exact-recovery guarantee for GRAMPA, under a general Wigner matrix model for the weighted adjacency matrix: Let $W$ be a symmetric random matrix in $\mathbb{R}^{n \times n}$, where the entries $\{W_{ij} : i \le j\}$ are independent. Suppose that

$\mathbb{E}[W_{ij}] = 0, \qquad \mathbb{E}[W_{ij}^2] = \frac{1}{n},$ (5)

and

$\mathbb{E}\big[|W_{ij}|^k\big] \le \frac{(Ck)^{Ck}}{n\, q^{k-2}} \quad \text{for all } k \ge 2,$ (6)

where $q = q(n)$ is an $n$-dependent sparsity parameter and $C$ is an absolute positive constant.

Of particular interest are the following special cases:

  • Bounded case: The entries are bounded in magnitude by $1/q$. Then (6) is fulfilled for this $q$ and all $k \ge 2$.

  • Sub-Gaussian case: The sub-Gaussian norm of each entry satisfies

    $\|W_{ij}\|_{\psi_2} \le \frac{C}{\sqrt{n}}.$ (7)

    It is easily checked that (6) is satisfied for $q = \sqrt{n}$ and all large $n$.

  • Erdős-Rényi graphs with edge probability $p$: We may center and rescale the adjacency matrix so that (5) holds, in which case (6) is satisfied for $q \asymp \sqrt{np}$ (cf. Lemma 2.3).

With the moment conditions (5) and (6) specified, we are ready to introduce the correlated Wigner model, which encompasses the correlated Erdős-Rényi graph model proposed in [PG11] as a special case.

Definition 1.1 (Correlated Wigner model).

Let $n$ be a positive integer, $\sigma = \sigma(n)$ an ($n$-dependent) noise parameter, $\pi^*$ a latent permutation on $[n]$, and $\Pi^*$ the corresponding permutation matrix such that $\Pi^*_{ij} = \mathbf{1}\{\pi^*(i) = j\}$. Suppose that $\{(A_{ij}, B_{\pi^*(i)\pi^*(j)}) : i \le j\}$ are independent pairs of random variables such that both $A$ and $B$ satisfy (5) and (6),

$\mathbb{E}\big[(A_{ij} - B_{\pi^*(i)\pi^*(j)})^2\big] \le \frac{\sigma^2}{n},$ (8)

and, for a constant $C > 0$, any $D > 0$, and all $n \ge n_0(D)$,

$\mathbb{P}\big[\|A - \Pi^* B (\Pi^*)^\top\| \le C\sigma\big] \ge 1 - n^{-D},$ (9)

where $\|\cdot\|$ denotes the spectral norm.

The parameter $\sigma$ measures the effective noise level in the model. In the special case of the sparse Erdős-Rényi model, $A$ and $B$ are the centered and normalized adjacency matrices of two Erdős-Rényi graphs which differ by approximately a $\sigma^2$ fraction of their edges.

In this paper, we prove the following exact recovery guarantee for GRAMPA:

Theorem (Informal statement).

For the correlated Wigner model, if $q \ge \operatorname{polylog}(n)$ and $\sigma \le 1/\operatorname{polylog}(n)$ for suitable polylogarithmic factors and a sufficiently small constant prefactor, then GRAMPA with $\eta = 1/\operatorname{polylog}(n)$ recovers $\pi^*$ exactly with high probability for large $n$. If furthermore the entries of $A$ and $B$ are sub-Gaussian and satisfy (7), then this holds with $q = \sqrt{n}$.

This theorem generalizes the exact recovery guarantee for GRAMPA proved in [FMWX19] for the Gaussian Wigner model, albeit at the expense of a slightly stronger requirement on $\sigma$ than in the Gaussian case. The requirement that $\sigma \le 1/\operatorname{polylog}(n)$ with average degree at least $\operatorname{polylog}(n)$ matches the state of the art for polynomial-time algorithms on sparse Erdős-Rényi graphs [DMWX18], although we note that the recent work [BCL18] provided an algorithm with super-polynomial runtime that achieves exact recovery under a much weaker condition on $\sigma$.

The analysis in [FMWX19] relies heavily on the rotational invariance of Gaussian Wigner matrices and does not extend to non-Gaussian models. Here, instead, our universality analysis uses a resolvent representation of the GRAMPA similarity matrix (3) via a contour integral (cf. Proposition 3.2). Capitalizing on local laws for the resolvents of sparse Wigner matrices [EKYY13a, EKYY13b], we show that the similarity matrix (3) is with high probability diagonally dominant, in the sense that $X_{i\pi^*(i)} > X_{ij}$ for all $i \in [n]$ and $j \ne \pi^*(i)$. This enables rounding procedures as simple as thresholding to succeed, as in the sketch below.
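Concretely, on the diagonal-dominance event the rounding step needs no assignment solver at all; the following sketch (our illustration, with X as produced by the grampa function above) recovers the matching by a row-wise maximum.

```python
import numpy as np

def round_by_rowmax(X):
    """Match each row i to its largest entry. When
    X[i, pi(i)] > X[i, j] for every i and every j != pi(i),
    this returns pi exactly."""
    pi_hat = np.argmax(X, axis=1)
    # On the diagonal-dominance event this is a valid permutation.
    assert len(np.unique(pi_hat)) == len(pi_hat)
    return pi_hat
```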

From an optimization point of view, GRAMPA can also be interpreted as solving a regularized quadratic programming (QP) relaxation of the QAP. More precisely, the QAP (1) can be equivalently written as

$\min_{\Pi \in \mathcal{S}_n} \|A\Pi - \Pi B\|_F^2,$ (10)

and the similarity matrix $X$ in (3) is a positive scalar multiple of the solution to

$\min_{X \in \mathbb{R}^{n \times n}} \|AX - XB\|_F^2 + \eta^2 \|X\|_F^2$ s.t. $\mathbf{1}^\top X \mathbf{1} = n.$ (11)

(See [FMWX19, Corollary 2.2].) This is a convex relaxation of the program (10) with an additional ridge regularization term. As a result, our analysis immediately yields the same exact recovery guarantees for algorithms that round the solution of (11) instead of $X$. In Section 6, we study a tighter relaxation of the QAP (10) that imposes row-sum constraints, and we establish the same exact recovery guarantees (up to universal constants) by employing similar technical tools.
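To see why (1) and (10) are equivalent, note that any permutation matrix $\Pi$ satisfies $\Pi^\top \Pi = I$, so

$\|A\Pi - \Pi B\|_F^2 = \|A\Pi\|_F^2 + \|\Pi B\|_F^2 - 2\langle A\Pi, \Pi B \rangle = \|A\|_F^2 + \|B\|_F^2 - 2\langle A, \Pi B \Pi^\top \rangle,$

where the first two terms do not depend on $\Pi$; hence minimizing $\|A\Pi - \Pi B\|_F^2$ over $\mathcal{S}_n$ is the same as maximizing $\langle A, \Pi B \Pi^\top \rangle$.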

Organization

The rest of the paper is organized as follows. In Section 2, we state the main exact recovery guarantees for GRAMPA under the correlated Wigner model, as well as the results specialized to the (sparse) Erdős-Rényi model. We start the analysis by introducing the key resolvent representation of the GRAMPA similarity matrix in Section 3. As a preparation for the main proof, Section 4 provides the needed tools from random matrix theory. The proof of correctness for GRAMPA is then presented in Section 5. In Section 6, we extend the theoretical guarantees to a tighter QP relaxation. Finally, Section 7 is devoted to proving the resolvent bounds which form the main technical ingredient to our proofs.

Notation

Let $[n] = \{1, \ldots, n\}$. Let $\mathcal{S}_n$ denote the set of $n \times n$ permutation matrices. In a Euclidean space $\mathbb{R}^n$ or $\mathbb{C}^n$, let $e_i$ be the $i$-th standard basis vector, and let $\mathbf{1}$ be the all-ones vector. Let $\mathbf{J} = \mathbf{1}\mathbf{1}^\top$ denote the all-ones matrix, and let $I$ denote the identity matrix. The subscripts are often omitted when there is no ambiguity.

The inner product of vectors $u, v$ is defined as $\langle u, v \rangle = \sum_i u_i \bar{v}_i$. Similarly, for matrices, $\langle M, N \rangle = \operatorname{tr}(M^\top N)$. Let $\|v\| = \langle v, v \rangle^{1/2}$ and $\|v\|_\infty = \max_i |v_i|$ for vectors. Let $\|M\|$, $\|M\|_F$, and $\|M\|_{\max} = \max_{i,j} |M_{ij}|$ denote the spectral norm, the Frobenius norm, and the entrywise maximum norm for matrices.

Let $a \vee b = \max(a, b)$ and $a \wedge b = \min(a, b)$. We use $C, c$ to denote positive constants that may change from line to line. For sequences of positive real numbers $a_n$ and $b_n$, we write $a_n \lesssim b_n$ (resp. $a_n \gtrsim b_n$) if there is a constant $C > 0$ such that $a_n \le C b_n$ (resp. $a_n \ge b_n / C$) for all $n$, $a_n \asymp b_n$ if both relations $a_n \lesssim b_n$ and $a_n \gtrsim b_n$ hold, and $a_n = o(b_n)$ if $a_n / b_n \to 0$ as $n \to \infty$. We write $a_n \ll b_n$ if $a_n = o(b_n)$ and $a_n \gg b_n$ if $b_n = o(a_n)$.

2 Exact recovery guarantees for GRAMPA

In this section, we state the exact recovery guarantees for GRAMPA, making the earlier informal statement precise.

Theorem 2.1.

Fix constants $\delta > 0$ and $D > 0$, and let $\eta = 1/\operatorname{polylog}(n)$ be suitably chosen. Consider the correlated Wigner model with $q \ge \operatorname{polylog}(n)$, where $\sigma \le 1/\operatorname{polylog}(n)$ with sufficiently large polylogarithmic factors depending on $\delta$ and $D$. Then there exist $(\delta, D)$-dependent constants $C, c > 0$ and a deterministic quantity $\rho_n$ satisfying $\rho_n \to 0$ as $n \to \infty$, such that for all $n \ge C$, with probability at least $1 - n^{-D}$, the matrix $X$ in (3) satisfies

$\max_{i \in [n],\, j \ne \pi^*(i)} X_{ij} \le \rho_n \min_{i \in [n]} X_{i\pi^*(i)}.$ (12)

If there is a universal constant $C$ for which the entries of $A$ and $B$ are sub-Gaussian satisfying (7), then the above holds also with $q = \sqrt{n}$.

As an immediate corollary, we obtain the following exact recovery guarantee for GRAMPA.

Corollary 2.2 (Universal graph matching).

Under the conditions of Theorem 2.1, there exist constants $C, c > 0$ such that for all $n \ge C$, if

$\sigma \le \frac{c}{\operatorname{polylog}(n)},$ (13)

then with probability at least $1 - n^{-D}$,

$X_{i\pi^*(i)} > X_{ij}$ for all $i \in [n]$ and $j \ne \pi^*(i),$ (14)

and hence $\hat\Pi$ which solves the linear assignment problem (4) equals $\Pi^*$.

Proof.

Let $\eta$ and the noise bound be chosen as in Theorem 2.1, where $c$ is the constant given in Theorem 2.1. Then under assumption (13), the noise condition of Theorem 2.1 is satisfied, so Theorem 2.1 applies. We also have $\rho_n \to 0$, so that $\rho_n < 1$ for all large $n$. This implies (14). ∎

An important application of the above universality result is matching two correlated sparse Erdős-Rényi graphs. Let $G$ be an Erdős-Rényi graph with $n$ vertices and edge probability $p$, denoted by $\mathcal{G}(n, p)$. Let $G_1$ and $G_2$ be two copies of Erdős-Rényi graphs that are i.i.d. conditional on $G$, each of which is obtained from $G$ by deleting every edge of $G$ with probability $1 - s$ independently, where $s \in (0, 1]$. Then we have that marginally $G_1, G_2 \sim \mathcal{G}(n, ps)$. Equivalently, we may first sample an Erdős-Rényi graph $G_1 \sim \mathcal{G}(n, ps)$, and then define $G_2$ by independently retaining each edge of $G_1$ with probability $s$ and adding an edge between each non-adjacent pair of $G_1$ with probability $\frac{ps(1-s)}{1-ps}$.

Suppose that we observe a pair of graphs $G_1$ and $G_2^{\pi^*}$, where $G_2^{\pi^*}$ is obtained by relabeling the vertices of $G_2$ according to an unknown permutation $\pi^*$. We then wish to recover the permutation matrix $\Pi^*$.
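As an illustration of this model (a sketch under our own parameter choices; correlated_erdos_renyi and the values of n, p, s, eta are ours, not from the paper), one can sample a correlated pair by subsampling a common parent graph, center and rescale as in (15) below, and run the grampa sketch from Section 1:

```python
import numpy as np

def correlated_erdos_renyi(n, p, s, rng):
    """Subsample a common parent G ~ G(n, p) twice, independently:
    each edge of G is kept in G1 (resp. G2) with probability s.
    Marginally G1, G2 ~ G(n, p*s)."""
    iu = np.triu_indices(n, k=1)
    parent = rng.random(iu[0].size) < p
    G1, G2 = np.zeros((n, n)), np.zeros((n, n))
    G1[iu] = parent & (rng.random(iu[0].size) < s)
    G2[iu] = parent & (rng.random(iu[0].size) < s)
    return G1 + G1.T, G2 + G2.T

rng = np.random.default_rng(0)
n, p, s, eta = 500, 0.2, 0.99, 0.2       # sigma^2 = 2(1-s)/(1-p*s), about 0.025
G1, G2 = correlated_erdos_renyi(n, p, s, rng)
pi = rng.permutation(n)                   # latent correspondence
G2rel = G2[np.ix_(pi, pi)]                # G2 with relabeled vertices
scale = lambda M: (M - p * s) / np.sqrt(n * p * s * (1 - p * s))  # cf. (15)
X, match = grampa(scale(G1), scale(G2rel), eta)
inv = np.empty(n, dtype=int); inv[pi] = np.arange(n)
print("fraction correctly matched:", np.mean(match == inv))
```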

We transform the adjacency matrices $A'$ of $G_1$ and $B'$ of $G_2^{\pi^*}$ so that they satisfy the moment conditions (5) and (6): Define the centered, rescaled versions of $A'$ and $B'$ by

$A = \frac{A' - ps\,\mathbf{J}}{\sqrt{nps(1 - ps)}}, \qquad B = \frac{B' - ps\,\mathbf{J}}{\sqrt{nps(1 - ps)}}.$ (15)

Then (5) clearly holds, and we check the following additional properties.

Lemma 2.3.

For all large $n$, the matrices $A$ and $B$ in (15) satisfy (6), (8), and (9) with

$q = \sqrt{nps(1 - ps)}$ and $\sigma^2 = \frac{2(1 - s)}{1 - ps}.$

Proof.

Assume without loss of generality that $\Pi^*$ is the identity matrix. For any $k \ge 2$ we have

$\mathbb{E}\big[|A_{ij}|^k\big] = \frac{ps(1 - ps)^k + (1 - ps)(ps)^k}{(nps(1 - ps))^{k/2}} \le \frac{(Ck)^{Ck}}{n\, q^{k-2}}.$

Thus, the moment condition (6) is satisfied. In addition, we have that for all $i \le j$,

$\mathbb{E}\big[(A_{ij} - B_{ij})^2\big] = \frac{2ps(1 - s)}{nps(1 - ps)} = \frac{2(1 - s)}{n(1 - ps)} = \frac{\sigma^2}{n},$

where the last equality holds by the choice of $\sigma$. Thus, (8) is satisfied. Moreover, let $W := (A - B)/\sigma$. It follows that $\mathbb{E}[W_{ij}] = 0$, $\mathbb{E}[W_{ij}^2] = \frac{1}{n}$, and $W$ satisfies a moment condition of the form (6), where the last inequality is due to the choice of $q$ above. Thus, by applying Lemma 4.1 to $W$, there exists a constant $C$ such that for any $D > 0$, with probability at least $1 - n^{-D}$ for all large $n$, we have $\|W\| \le C$ and hence $\|A - B\| \le C\sigma$. Thus (9) is satisfied. ∎

Combining Lemma 2.3 with Corollary 2.2 immediately yields a sufficient condition for GRAMPA to exactly recover $\Pi^*$ in the correlated Erdős-Rényi graph model.

Corollary 2.4 (Erdős-Rényi graph matching).

Suppose that either

  1. (dense case)

    $ps \ge n^{-\epsilon}$ and $\sigma \le \frac{c_0}{\operatorname{polylog}(n)}$ for constants $\epsilon \in (0, 1)$ and $c_0 > 0$, or

  2. (sparse case)

    $nps \ge \operatorname{polylog}(n)$ and $\sigma \le \frac{c_0}{\operatorname{polylog}(n)}$ for suitable polylogarithmic factors and a constant $c_0 > 0$.

For any constant $D > 0$, there exist constants $C, c > 0$ depending on $D$ and the constants above such that if $\eta$ is chosen as in Theorem 2.1 and $n \ge C$, then with probability at least $1 - n^{-D}$,

$X_{i\pi^*(i)} > X_{ij}$ for all $i \in [n]$ and $j \ne \pi^*(i),$

and hence the solution to the linear assignment problem (4) coincides with $\Pi^*$.

Proof.

For (a), Lemma 2.3 applies with $q = \sqrt{nps(1 - ps)} \gtrsim n^{(1-\epsilon)/2}$, which grows polynomially in $n$. For (b), Lemma 2.3 applies with $q = \sqrt{nps(1 - ps)} \gtrsim \operatorname{polylog}(n)$. In either case, all conditions of Theorem 2.1 and Corollary 2.2 are satisfied for large $n$, and the result follows. ∎

3 Resolvent representation

For a real symmetric matrix $H$ with spectral decomposition as in (2), its resolvent is defined by

$G(z) = (H - zI)^{-1} = \sum_{i=1}^n \frac{u_i u_i^\top}{\lambda_i - z}$

for $z \in \mathbb{C}$ that is not an eigenvalue of $H$. Then we have the matrix symmetry $G(z)^\top = G(z)$, the conjugate symmetry $G(\bar{z}) = \overline{G(z)}$, and the following Ward identity.

Lemma 3.1 (Ward identity).

For any $z = E + i\eta$ with $\eta \ne 0$ and any real symmetric matrix $H$,

$\sum_{j=1}^n |G_{ij}(z)|^2 = \frac{\operatorname{Im} G_{ii}(z)}{\eta}$ for all $i \in [n].$

Proof.

By the definition of $G$ and conjugate symmetry, it holds that

$\sum_{j=1}^n |G_{ij}(z)|^2 = \big(G(z)G(\bar{z})\big)_{ii} = \Big(\frac{G(z) - G(\bar{z})}{z - \bar{z}}\Big)_{ii} = \frac{\operatorname{Im} G_{ii}(z)}{\eta},$

where the middle step uses the resolvent identity $G(z)G(\bar{z}) = \frac{G(z) - G(\bar{z})}{z - \bar{z}}$. ∎
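As a quick sanity check (our illustration), the identity can be verified numerically for a random symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n, E, eta = 50, 0.7, 0.2
H = rng.standard_normal((n, n)); H = (H + H.T) / np.sqrt(2 * n)
G = np.linalg.inv(H - (E + 1j * eta) * np.eye(n))  # G(z), z = E + i*eta
# Ward identity: sum_j |G_ij(z)|^2 = Im(G_ii(z)) / eta, for every i.
assert np.allclose((np.abs(G) ** 2).sum(axis=1), G.diagonal().imag / eta)
```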

The following resolvent representation of $X$ is central to our analysis.

Proposition 3.2.

Consider symmetric matrices $A$ and $B$ with spectral decompositions (2), and suppose that $\|A\| \vee \|B\| \le 2 + \delta/2$. Then the matrix $X$ defined in (3) admits the following representation

$X = -\frac{1}{2\pi i} \oint_\Gamma f(z)\, \mathbf{J}\, G_B(z)\, dz,$ (16)

where $G_A$ and $G_B$ denote the resolvents of $A$ and $B$,

$f(z) = \frac{1}{2i}\big(G_A(z + i\eta) - G_A(z - i\eta)\big),$ (17)

and $\Gamma$ is the rectangular contour with vertices $\pm(2 + \delta) \pm i\eta/2$, traversed counterclockwise.

Proof.

We have

$X = \sum_{j=1}^n \operatorname{Im}\big[G_A(\mu_j + i\eta)\big]\, \mathbf{J}\, v_j v_j^\top$ (18)

by Lemma 3.1, since $\sum_{i=1}^n \frac{\eta}{(\lambda_i - \mu_j)^2 + \eta^2}\, u_i u_i^\top = \eta\, G_A(\mu_j + i\eta)\, G_A(\mu_j - i\eta) = \operatorname{Im} G_A(\mu_j + i\eta)$. Consider the function $f$ defined by (17). Then each entry of $f$ is analytic in the region $\{z : |\operatorname{Im} z| < \eta\}$, and $f(\mu_j) = \operatorname{Im}[G_A(\mu_j + i\eta)]$ for real $\mu_j$. Since $\Gamma$ encloses each eigenvalue $\mu_j$ of $B$, the Cauchy integral formula yields the entrywise equality

$f(\mu_j) = \frac{1}{2\pi i} \oint_\Gamma \frac{f(z)}{z - \mu_j}\, dz.$ (19)

Substituting this into (18), we obtain

$X = \frac{1}{2\pi i} \oint_\Gamma f(z)\, \mathbf{J} \sum_{j=1}^n \frac{v_j v_j^\top}{z - \mu_j}\, dz = -\frac{1}{2\pi i} \oint_\Gamma f(z)\, \mathbf{J}\, G_B(z)\, dz,$ (20)

which completes the proof in view of the definition of $G_B$. ∎
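The representation can also be verified numerically; the following sketch (ours; the matrix sizes and quadrature resolution are arbitrary choices) compares (3) with a trapezoidal discretization of the contour integral (16):

```python
import numpy as np

rng = np.random.default_rng(1)
n, eta, delta = 40, 0.4, 0.5
sym = lambda M: (M + M.T) / np.sqrt(2 * n)
A, B = sym(rng.standard_normal((n, n))), sym(rng.standard_normal((n, n)))
A *= 2 / np.linalg.norm(A, 2)   # enforce ||A|| = 2 <= 2 + delta/2
B *= 2 / np.linalg.norm(B, 2)
one, J, I = np.ones(n), np.ones((n, n)), np.eye(n)

lam, U = np.linalg.eigh(A); mu, V = np.linalg.eigh(B)
K = eta / ((lam[:, None] - mu[None, :]) ** 2 + eta ** 2)
X_direct = U @ (K * np.outer(U.T @ one, V.T @ one)) @ V.T     # eq. (3)

G = lambda M, z: np.linalg.inv(M - z * I)
f = lambda z: (G(A, z + 1j * eta) - G(A, z - 1j * eta)) / 2j  # eq. (17)

# Rectangle with vertices +-(2 + delta) +- i*eta/2, counterclockwise.
a, b, m = 2 + delta, eta / 2, 1500
corners = [a - 1j * b, a + 1j * b, -a + 1j * b, -a - 1j * b, a - 1j * b]
X_contour = np.zeros((n, n), dtype=complex)
for z0, z1 in zip(corners, corners[1:]):
    zs = z0 + (z1 - z0) * np.linspace(0, 1, m)
    vals = np.stack([f(z) @ J @ G(B, z) for z in zs])
    # Trapezoid rule for the edge integral of f(z) J G_B(z) dz.
    X_contour += (z1 - z0) * (vals[0] / 2 + vals[1:-1].sum(0) + vals[-1] / 2) / (m - 1)
X_contour *= -1 / (2j * np.pi)
print(np.abs(X_contour - X_direct).max())   # ~0 up to quadrature error
```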

4 Tools from random matrix theory

Before proving our main results, we introduce the relevant tools from random matrix theory. In particular, the resolvent bounds in Theorem 4.5 constitute an important technical ingredient in our analysis.

4.1 Concentration inequalities

We start with some known concentration inequalities in the literature.

Lemma 4.1 (Norm bounds).

For any constant $D > 0$ and a universal constant $C > 0$, if $q \ge \log n$, then with probability at least $1 - n^{-D}$,

$\|W\| \le 2 + \frac{(\log n)^C}{q}.$

Proof.

See [EKYY13b, Lemma 4.3], where we fix the parameter $\xi$ in [EKYY13b, Eq. (2.4)] to be a sufficiently large constant depending on $D$. The notational identification is that the sparse matrix and sparsity parameter there correspond to our $W$ and $q$. ∎

Lemma 4.2 (Concentration inequalities).

Let $w \in \mathbb{R}^n$ be a random vector with independent entries satisfying

$\mathbb{E}[w_i] = 0, \quad \mathbb{E}[w_i^2] = \frac{1}{n}, \quad \mathbb{E}\big[|w_i|^k\big] \le \frac{(Ck)^{Ck}}{n\, q^{k-2}}$ for all $k \ge 2.$ (21)

For any constant $D > 0$ and universal constants $C, c > 0$, if $q \ge \log n$, then:

  1. For each $i$, with probability at least $1 - n^{-D}$,

    $|w_i| \le \frac{(\log n)^C}{q}.$ (22)
  2. For any deterministic vector $v \in \mathbb{R}^n$, with probability at least $1 - n^{-D}$,

    $|\langle v, w \rangle| \le (\log n)^C \Big( \frac{\|v\|_\infty}{q} + \frac{\|v\|}{\sqrt{n}} \Big).$ (23)

    Furthermore, for any even integer $k$,

    $\mathbb{E}\big[\langle v, w \rangle^k\big] \le (Ck)^{Ck} \Big( \frac{\|v\|_\infty}{q} + \frac{\|v\|}{\sqrt{n}} \Big)^k.$ (24)
  3. For any deterministic matrix $M \in \mathbb{R}^{n \times n}$, with probability at least $1 - n^{-D}$,

    $\Big| w^\top M w - \frac{1}{n} \operatorname{tr} M \Big| \le (\log n)^C \Big( \frac{\|M\|_{\max}}{q^2} + \frac{\|M\|_F}{q\sqrt{n}} \Big)$ (25)

    and

    $\Big| \sum_{i=1}^n M_{ii} \big( w_i^2 - \tfrac{1}{n} \big) \Big| \le (\log n)^C \Big( \frac{\max_i |M_{ii}|}{q^2} + \frac{(\sum_i M_{ii}^2)^{1/2}}{q\sqrt{n}} \Big).$ (26)
Proof.

See [EKYY13b, Lemma 3.7, Lemma 3.8, and Lemma A.1(i)], where again we fix $\xi$ to be a sufficiently large constant depending on $D$. ∎

Next, based on the above lemma, we state concentration inequalities for a bilinear form that apply to our setting directly.

Lemma 4.3 (Concentration of bilinear form).

Let $u, v \in \mathbb{R}^n$ be random vectors such that the pairs $(u_i, v_i)$ for $i \in [n]$ are independent, with

$\mathbb{E}[u_i] = \mathbb{E}[v_i] = 0$ and $\mathbb{E}[u_i^2] = \mathbb{E}[v_i^2] = \frac{1}{n}.$

Let $M$ be any deterministic matrix.

  1. For any constant $D > 0$, suppose (21) holds for both $u$ and $v$. Then there are universal constants $C, c > 0$ such that with probability at least $1 - n^{-D}$,

    $\Big| u^\top M v - \sum_{i=1}^n \mathbb{E}[u_i v_i]\, M_{ii} \Big| \le (\log n)^C \Big( \frac{\|M\|_{\max}}{q^2} + \frac{\|M\|_F}{q\sqrt{n}} \Big).$ (27)
  2. Suppose that $\sqrt{n}\,u_i$ and $\sqrt{n}\,v_i$ are sub-Gaussian with $\|\sqrt{n}\,u_i\|_{\psi_2} \vee \|\sqrt{n}\,v_i\|_{\psi_2} \le C_0$ for a constant $C_0$. Then for any $D > 0$, there exists a constant $C$ only depending on $C_0$ and $D$ such that with probability at least $1 - n^{-D}$,

    $\Big| u^\top M v - \sum_{i=1}^n \mathbb{E}[u_i v_i]\, M_{ii} \Big| \le C\, \frac{\log n}{n}\, \|M\|_F.$ (28)
Proof.

In view of the polarization identity

$u^\top M v = \frac{1}{4} \Big[ (u + v)^\top M (u + v) - (u - v)^\top M (u - v) \Big],$

it suffices to analyze the two quadratic forms separately. Note that

$\mathbb{E}\big[(u \pm v)^\top M (u \pm v)\big] = \sum_{i=1}^n \mathbb{E}\big[(u_i \pm v_i)^2\big] M_{ii},$

which yields the desired expectation $\mathbb{E}[u^\top M v] = \sum_{i=1}^n \mathbb{E}[u_i v_i]\, M_{ii}$. Thus it remains to study the deviation.

To prove the concentration bound (27), we obtain from (25) (applied after an appropriate rescaling) that there is a universal constant $C$ such that with probability at least $1 - n^{-D}$, each of the two quadratic forms deviates from its expectation by at most the right-hand side of (27), from which (27) easily follows.

The sub-Gaussian concentration bound (28) follows from the Hanson-Wright inequality [HW71, RV13]. More precisely, note that $\|M\| \le \|M\|_F$, so taking $t \asymp \frac{\log n}{n}\|M\|_F$ in [FMWX19, Lemma A.2] yields that with probability at least $1 - n^{-D}$, each quadratic form deviates from its expectation by at most $C \frac{\log n}{n}\|M\|_F$, which completes the proof. ∎

4.2 The Stieltjes transform

Denote the semicircle density and its Stieltjes transform by

$\rho(x) = \frac{1}{2\pi}\sqrt{(4 - x^2)_+}$ and $m(z) = \int \frac{\rho(x)}{x - z}\, dx = \frac{-z + \sqrt{z^2 - 4}}{2},$ (29)

respectively, where $\rho$ is defined for $x \in \mathbb{R}$, and $\sqrt{z^2 - 4}$ is defined with a branch cut on $[-2, 2]$ so that $\sqrt{z^2 - 4} \sim z$ as $z \to \infty$. We have the conjugate symmetry $m(\bar{z}) = \overline{m(z)}$.

We record the following basic facts about the Stieltjes transform.

Proposition 4.4.

For each $z \in \mathbb{C} \setminus [-2, 2]$, the Stieltjes transform $m(z)$ is the unique value satisfying

$m(z)^2 + z\, m(z) + 1 = 0, \qquad |m(z)| < 1.$ (30)

Setting $z = E + i\eta$, uniformly over $z$ with $0 < |\eta| \le 1$,

$|m(z)| \le 1 - c|\eta|$ (31)

for an absolute constant $c > 0$. For $E \in \mathbb{R}$, the continuous extensions

$m(E + i0) := \lim_{\eta \downarrow 0} m(E + i\eta)$ and $m(E - i0) := \lim_{\eta \downarrow 0} m(E - i\eta)$

from the upper and lower half planes both exist. For all $E \in \mathbb{R}$, these satisfy

$m(E \pm i0)^2 + E\, m(E \pm i0) + 1 = 0, \quad m(E - i0) = \overline{m(E + i0)}, \quad \operatorname{Im} m(E + i0) = \pi \rho(E), \quad |m(E \pm i0)| = 1$ for $|E| \le 2.$ (32)
Proof.

(30) follows from the definition of $m$. (31) follows from [EKYY13a, Lemma 4.3] and continuity and conjugate symmetry of $m$. For the existence of $m(E + i0)$ (and hence also $m(E - i0)$), see e.g. the more general statement of [Bia97, Corollary 1]. The first claim of (32) follows from continuity and (30), the second from conjugate symmetry, the third from the Stieltjes inversion formula, and the last from the fact that the two roots of (30) at $z = E$ are $m(E + i0)$ and $m(E - i0)$, whose product is $1$, so that $|m(E \pm i0)| = 1$. ∎
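These formulas are easy to sanity-check numerically; the sketch below (ours) realizes the branch of $\sqrt{z^2 - 4}$ as $\sqrt{z - 2}\,\sqrt{z + 2}$ with principal square roots, which places the cut on $[-2, 2]$:

```python
import numpy as np

def m(z):
    """Stieltjes transform of the semicircle law, eq. (29):
    m(z) = (-z + sqrt(z^2 - 4)) / 2 with sqrt(z^2 - 4) ~ z at infinity."""
    return (-z + np.sqrt(z - 2) * np.sqrt(z + 2)) / 2

z = 0.5 + 0.01j
assert abs(m(z) ** 2 + z * m(z) + 1) < 1e-12   # self-consistent equation (30)
assert abs(m(z)) < 1                            # the root with |m| < 1

# Stieltjes inversion: Im m(E + i0) = pi * rho(E) recovers the density.
E = np.linspace(-3, 3, 7)
rho = np.sqrt(np.maximum(4 - E ** 2, 0)) / (2 * np.pi)
assert np.allclose(m(E + 1e-9j).imag / np.pi, rho, atol=1e-4)
```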

4.3 Resolvent bounds

For a fixed constant $\delta > 0$ and all large $n$, we bound the resolvent $G(z) = (W - zI)^{-1}$ over the spectral domain

$\mathbf{D} = \mathbf{D}_1 \cup \mathbf{D}_2, \qquad \mathbf{D}_1 = \Big\{E + i\eta : |E| \le 2 + \delta,\ \frac{(\log n)^{C_1}}{n} \le |\eta| \le 1\Big\}, \qquad \mathbf{D}_2 = \big\{E + i\eta : 2 + \delta \le |E| \le 4,\ |\eta| \le 1\big\}.$

Here, $\mathbf{D}_1$ is the union of two strips in the upper and lower half planes, and $\mathbf{D}_2$ is the union of two strips in the left and right half planes.

Theorem 4.5 (Resolvent bounds).

Suppose $W$ has independent entries satisfying (5) and (6). Fix a constant $\delta > 0$ which defines the domain $\mathbf{D}$, fix $D > 0$, and set

$\Phi(z) = (\log n)^{C_0} \Big( \frac{1}{q} + \frac{1}{\sqrt{n\,\kappa(z)}} \Big), \qquad \kappa(z) = |\operatorname{Im} z| \vee \operatorname{dist}(\operatorname{Re} z, [-2, 2]).$

Suppose $q \ge (\log n)^{C_0}$. Then for some constants $C_0, C, c > 0$ depending on $\delta$ and $D$, and for all $n \ge C$, with probability at least $1 - n^{-D}$, the following hold simultaneously for every $z \in \mathbf{D}$:

  1. (Entrywise bound) For all $i \ne j$,

    $|G_{ij}(z)| \le C\, \Phi(z).$ (33)

    For all $i$,

    $|G_{ii}(z) - m(z)| \le C\, \Phi(z).$ (34)
  2. (Row sum bound) For all $i$,

    $\Big| \sum_{j=1}^n G_{ij}(z) \Big| \le \frac{(\log n)^{C_0}}{\sqrt{\kappa(z)}}.$ (35)
  3. (Total sum bound)

    $\Big| \frac{1}{n} \sum_{i,j=1}^n G_{ij}(z) - m(z) \Big| \le C\, \Phi(z).$ (36)

A small numerical illustration of these scales follows.
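The following quick simulation (ours, for a dense Wigner matrix at a moderate spectral parameter, rather than the sparse matrices and small $\operatorname{Im} z$ of the theorem) illustrates the different magnitudes of the four quantities:

```python
import numpy as np

rng = np.random.default_rng(0)
n, z = 4000, 0.3 + 0.5j
W = rng.standard_normal((n, n)); W = (W + W.T) / np.sqrt(2 * n)
G = np.linalg.inv(W - z * np.eye(n))
m = (-z + np.sqrt(z - 2) * np.sqrt(z + 2)) / 2    # Stieltjes transform (29)
offdiag = G - np.diag(np.diag(G))
print(np.abs(offdiag).max())            # (33): O(1/sqrt(n Im z)) up to logs
print(np.abs(np.diag(G) - m).max())     # (34): same small scale
print(np.abs(G.sum(axis=1)).max())      # (35): row sums are only polylog-sized
print(abs(G.sum() / n - m))             # (36): back down to the entry scale
```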

The proof follows ideas of [EKYY13b], and we defer it to Section 7. As the spectral parameter $z$ is allowed to approach the interval $[-2, 2]$ with increasing $n$, this type of result is often called a "local law" in the random matrix theory literature. The focus of the above is a bit different from that of the results stated in [EKYY13b], as we wish to obtain explicit logarithmic bounds for