1 Introduction
In [Kho02], Subhash Khot put forward a family of conjectures known as the “d-to-d games conjectures”. A binary constraint φ(x, y), where x and y take values in an alphabet Σ, is said to be d-to-d if for every value of x there are exactly d values for y that satisfy φ, and vice versa. For any d, the “d-to-d games conjecture” roughly says that for every ε > 0, there is some finite alphabet Σ such that it is NP-hard to distinguish, given a constraint satisfaction problem with d-to-d constraints, whether it is possible to satisfy at least a 1 − ε fraction of the constraints, or if every assignment satisfies at most an ε fraction of the constraints.¹ ¹For d ≥ 2, the conjectures are often stated in their perfect completeness variant, where we replace 1 − ε with 1 in the first case. In this work (as well as the entire line of works following [KMS17]), we refer to the imperfect completeness version as stated above. The case of d = 1 corresponds to the more famous Unique Games Conjecture, but until recently there was no constant d for which the corresponding d-to-d conjecture was known to be true.
Dinur, Khot, Kindler, Minzer, and Safra [DKK16], building on ideas of Khot, Minzer, and Safra [KMS17], recently initiated an approach towards proving the 2-to-2 games conjecture, based on a certain combinatorial hypothesis positing the soundness of the “Grassmann agreement test”.
In this work, we show that their hypothesis follows from a certain natural hypothesis characterizing the structure of nonexpanding sets in the degree-2 shortcode graph [BGH15]. Following our work, Khot, Minzer, and Safra [KMS18] proved the latter hypothesis, thus completing the proof of the 2-to-2 games conjecture. This has several implications for hardness of approximation, including an improvement on the known NP-hardness of approximating Vertex Cover, along with a host of other improved NP-hardness results. Perhaps more importantly, this also gives strong evidence for the truth of the Unique Games Conjecture itself. We defer to [DKK16, DKK18, KMS18] for a detailed discussion of the importance of the 2-to-2 games conjecture, as well as the reduction of this conjecture to showing the soundness of the Grassmann agreement tester.
1.1 Our Results
Our main result reduces the task of proving the “Grassmann agreement hypothesis” of Dinur et al [DKK16, Hypothesis 3.6] to characterizing the structure of nonexpanding sets in the associated Grassmann graph.

We describe the related shortcode test and the associated agreement and expansion hypotheses, and relate them to the Grassmann versions above.
The above, combined with the work of [DKK16, KMS18], suffices to prove the 2-to-2 conjecture. However, we note that it is possible to directly obtain a proof of the 2-to-2 conjecture (see the recent exposition in [BCS18]) using the “Inverse Shortcode Hypothesis” without going through the Grassmann graph at all. We think the shortcode view provides a natural way to understand the reduction and suggests potential extensions; see Section 1.6.
1.2 Grassmann Graph and DKKMS Consistency Test
To state our results formally, we need to define the Grassmann and shortcode graphs, which we now do. The Grassmann graph with parameters n, ℓ has vertices given by all ℓ-dimensional subspaces (its vertex set is denoted by Gr(n, ℓ)) of F_2^n. Two subspaces V, V' of F_2^n have an edge between them if dim(V ∩ V') = ℓ − 1.
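For intuition, adjacency in the Grassmann graph can be checked with a few lines of GF(2) linear algebra, using dim(V ∩ V') = dim V + dim V' − dim(V + V'). The following Python sketch is our own illustration (not from the paper); subspaces are given by bases encoded as n-bit integers:

```python
def rank_gf2(vectors):
    # Gaussian elimination over GF(2); each vector is an n-bit integer
    pivots = {}
    for v in vectors:
        while v:
            p = v.bit_length() - 1
            if p in pivots:
                v ^= pivots[p]  # eliminate the leading bit
            else:
                pivots[p] = v
                break
    return len(pivots)

def grassmann_edge(V, W):
    # V, W: bases of two l-dimensional subspaces of F_2^n.
    # (V, W) is an edge iff dim(V ∩ W) = l - 1, i.e. dim(V + W) = l + 1.
    l = rank_gf2(V)
    return rank_gf2(W) == l and rank_gf2(V + W) == l + 1
```

For example, in F_2^4 the planes span{e_1, e_2} and span{e_1, e_3} intersect in a line and are therefore adjacent, while span{e_1, e_2} and span{e_3, e_4} intersect only in 0 and are not.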
Let L be the set of all linear functions f : F_2^n → F_2. For every f ∈ L, let F_f be the map that assigns to every V ∈ Gr(n, ℓ) the restriction f|_V of the linear function f to the subspace V. Let {F_f : f ∈ L} be the set of all such maps.
The Grassmann consistency test is a two-query test for such maps, described below:

Test 0: Grassmann Consistency Test
Given: a map F that assigns to every V ∈ Gr(n, ℓ) a linear function F(V) on V.
Operation:
1. Pick an edge (V, V') of Gr(n, ℓ) uniformly at random.
2. Receive F(V), F(V').
3. Accept if F(V)|_{V∩V'} = F(V')|_{V∩V'}; otherwise reject.

It is easy to see the following completeness of the Grassmann consistency test.
Fact 1.1 (Completeness).
For every linear function f ∈ L, the map F_f passes the Grassmann consistency test with probability 1.
The DKKMS hypothesis conjectures a precise version of soundness of the Grassmann Consistency Test.
Hypothesis 1.2 (DKKMS Soundness Hypothesis).
For every δ > 0, there exist ε > 0 and an integer q such that the following holds for sufficiently large n, ℓ:
Let F be a map that assigns to every V ∈ Gr(n, ℓ) a linear function on V, such that F passes the Grassmann consistency test with probability at least δ. Then, there exist subspaces Q ⊆ W ⊆ F_2^n of dimension q and codimension q respectively, and a linear function f ∈ L such that Pr_{V : Q ⊆ V ⊆ W}[F(V) = f|_V] ≥ ε.
1.3 Shortcode Graph and Consistency Test
We now define the closely related degree-2 shortcode graph and an immediate analog of the Grassmann consistency test on this graph. For parameters n, ℓ as before, the vertices of the degree-2 shortcode graph are elements of F_2^{ℓ×(n−ℓ)}, that is, all matrices over F_2 with dimensions ℓ × (n − ℓ). Two vertices A and A' have an edge between them if A − A' is a rank-1 matrix over the field F_2. The 2-query codeword test on this graph is entirely analogous to the one above for the Grassmann graph:
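Since a nonzero matrix over F_2 has rank 1 exactly when all of its nonzero rows coincide, adjacency in the degree-2 shortcode graph is very easy to test. The following Python sketch is our own illustration (not from the paper):

```python
def is_rank_one_gf2(M):
    # A matrix over F_2 has rank 1 iff it is nonzero and all nonzero rows coincide.
    nonzero_rows = [tuple(r) for r in M if any(r)]
    return bool(nonzero_rows) and all(r == nonzero_rows[0] for r in nonzero_rows)

def shortcode_edge(A, B):
    # Edge in the degree-2 shortcode graph iff A - B (= A + B over F_2) has rank 1.
    diff = [[x ^ y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]
    return is_rank_one_gf2(diff)
```

For instance, adding the outer product of x = (1, 1) and y = (1, 0, 1) to a matrix yields a neighbor, while a rank-2 (or zero) difference does not.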
Test 1: Degree-2 Shortcode Consistency Test
Given: a map F from F_2^{ℓ×(n−ℓ)} to F_2^ℓ.
Operation:
1. Pick an edge (A, A') of the degree-2 shortcode graph uniformly at random; write A' − A = xy^⊤ for nonzero x ∈ F_2^ℓ, y ∈ F_2^{n−ℓ}.
2. Receive F(A), F(A').
3. Accept if F(A') − F(A) ∈ {0, x}.
Just as for the Grassmann consistency test, the above shortcode consistency test is a “2-to-2” constraint, and the following completeness is easy to establish.
Fact 1.3 (Completeness).
Let f be any affine linear function on F_2^{n−ℓ}. Let F be the map that assigns to a matrix A the vector in F_2^ℓ obtained by evaluating f on each row of the input matrix. Then, F passes the shortcode consistency test with probability 1.
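Fact 1.3 can be checked exhaustively for tiny parameters. The sketch below is our own; it assumes the 2-to-2 acceptance rule that F(A') − F(A) ∈ {0, x} when A' = A + xy^⊤, and verifies that the row-wise evaluation strategy always produces one of the two admissible answers:

```python
from itertools import product

l, m = 2, 2                 # matrices in F_2^{l x m} (toy parameters)
a, c = (1, 0), 1            # affine function f(v) = <a, v> + c over F_2

def f(v):
    return (sum(ai * vi for ai, vi in zip(a, v)) + c) % 2

def F(A):
    return tuple(f(row) for row in A)   # evaluate f on each row

ok = True
for A in product(product((0, 1), repeat=m), repeat=l):
    for x in product((0, 1), repeat=l):
        if not any(x):
            continue
        for y in product((0, 1), repeat=m):
            # neighbor A' = A + x y^T over F_2
            Ap = tuple(tuple(A[i][j] ^ (x[i] & y[j]) for j in range(m))
                       for i in range(l))
            d = tuple(u ^ v for u, v in zip(F(Ap), F(A)))
            ok = ok and (d == (0,) * l or d == x)
assert ok  # F(A') - F(A) is always 0 or x, so the test accepts with probability 1
```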
The analogous soundness hypothesis can now be stated as:
Hypothesis 1.4 (Degree 2 Shortcode Soundness Hypothesis).
For every δ > 0, there exist ε > 0 and an integer q such that the following holds for sufficiently large n, ℓ:
Let F : F_2^{ℓ×(n−ℓ)} → F_2^ℓ be such that F passes the degree-2 shortcode consistency test with probability at least δ. Then, there exist affine linear constraints Ay_j = z_j and x_i^⊤A = w_i^⊤ (at most q in total, defining a nice set T in the sense of Definition 1.7 below) and an affine linear strategy G such that Pr_{A ∈ T}[F(A) = G(A)] ≥ ε.
1.4 Soundness vs Small-Set Expansion in Grassmann/Shortcode Graphs
Recall that for a regular graph G, the expansion Φ(S) of a set S of vertices is the probability that a single-step random walk beginning at a uniformly random vertex in S steps out of S. That is, Φ(S) = Pr_{u ∼ S, v ∼ N(u)}[v ∉ S], where v ∼ N(u) denotes a uniformly random neighbor of u.
The DKKMS Soundness Hypothesis implies a natural characterization of small nonexpanding sets in the Grassmann graph, noted below as Hypothesis 1.6. Similarly, the degree-2 shortcode soundness hypothesis implies a natural characterization of nonexpanding sets in the degree-2 shortcode graph. We include a brief overview of the argument here and refer the reader to the more extensive commentary in Section 1.3 of [DKK16] for further details.
Suppose S_1, …, S_k are “nonexpanding” sets that cover a constant fraction of the vertices in Gr(n, ℓ). We construct a labeling strategy F by choosing uniformly random linear functions f_1, …, f_k and setting F(V) = f_i|_V if V ∈ S_i, and F(V) to be a random linear function otherwise. Clearly, F doesn’t agree with any single linear function on significantly more than a 1/k fraction of the vertices in Gr(n, ℓ). On the other hand, if the S_i are sufficiently nonexpanding, then a random edge will lie inside one of the S_i with a nontrivially large probability, and thus F will satisfy the Grassmann consistency test. In this case, we would hope that there are subspaces Q, W of constant dimension and codimension, respectively, such that restricting to the subspaces V with Q ⊆ V ⊆ W implies that F(V) = f|_V for some fixed global linear function f. This can happen in the above example only if there are Q, W as above such that one of the S_i has large density inside {V : Q ⊆ V ⊆ W} (so that on these subspaces F(V) equals f_i|_V for a fixed i, i.e., independent of V). Thus, Hypothesis 1.2 forces the nonexpanding sets to be “structured” (in the sense of having a large density inside {V : Q ⊆ V ⊆ W} for some Q, W of constant dimension and codimension, respectively). This can be interpreted as saying that the nonexpansion of any set of vertices in Gr(n, ℓ) can be “explained away” by a more-than-typical density in one of the canonical nonexpanding sets (i.e., those sets of subspaces that contain a fixed subspace Q and are contained inside a fixed subspace W of constant dimension and codimension, respectively).
To formally state the Grassmann Expansion Hypothesis, we define the special nonexpanding sets (referred to as “zoom-ins” and “zoom-outs” in [DKK18]):
Definition 1.5 (Nice Sets in Grassmann Graph).
A subset S of vertices in Gr(n, ℓ) is said to be q-nice if there are subspaces Q, W of F_2^n of dimension q and codimension q respectively such that S = {V ∈ Gr(n, ℓ) : Q ⊆ V and V ⊆ W}.
Hypothesis 1.6 (Grassmann Expansion Hypothesis).
For every η > 0, there exist ε > 0 and an integer q depending only on η such that if S ⊆ Gr(n, ℓ) satisfies Φ(S) ≤ 1 − η, then there are subspaces Q, W over F_2^n of dimension and codimension at most q respectively such that, for the nice set S_{Q,W} = {V ∈ Gr(n, ℓ) : Q ⊆ V ⊆ W}, we have |S ∩ S_{Q,W}| ≥ ε |S_{Q,W}|.
Analogously, we can define nice sets in the degree-2 shortcode graph and state the corresponding expansion hypothesis. We call T a right affine subspace of matrices in F_2^{ℓ×(n−ℓ)} if there are pairs (y_1, z_1), …, (y_r, z_r) ∈ F_2^{n−ℓ} × F_2^ℓ such that every A ∈ T satisfies Ay_j = z_j. We define a left affine subspace analogously, via constraints of the form x_i^⊤A = w_i^⊤.
Definition 1.7 (Nice Sets in Degree 2 Shortcode Graph).
A subset T ⊆ F_2^{ℓ×(n−ℓ)} is said to be q-nice if it is an intersection of a left and a right affine subspace in F_2^{ℓ×(n−ℓ)} with the total number of constraints equal to q.
Hypothesis 1.8 (Inverse Shortcode Hypothesis).
For every η > 0, there exist ε > 0 and an integer q depending only on η such that for every subset S ⊆ F_2^{ℓ×(n−ℓ)}, if Φ(S) ≤ 1 − η, then there exists a q-nice set T such that |S ∩ T| ≥ ε|T|.
While Hypotheses 1.2 and 1.4 posit the soundness of a specific “codeword consistency” test associated with the Grassmann/shortcode graphs, Hypotheses 1.6 and 1.8 ask for a purely graph-theoretic property: a characterization of nonexpanding sets in the Grassmann and degree-2 shortcode graphs. As such, the latter appear easier to attack, and [DKK16] thus suggested understanding the structure of nonexpanding sets in the Grassmann graph as a natural first step. As we show in this note, proving Hypothesis 1.8 is in fact enough to show Hypothesis 1.2. In a follow-up work [KMS18], this result was used to complete the proof of the DKKMS soundness hypothesis.
1.5 Our Results
We are now ready to state our main results formally.
First, we show that the soundness of the shortcode consistency test follows from the expansion hypothesis for the shortcode graph.
Theorem 1.9.
The Inverse Shortcode Hypothesis 1.8 implies the Degree 2 Shortcode Soundness Hypothesis 1.4.
Second, we show that the soundness hypothesis for the shortcode consistency test implies the soundness hypothesis for the Grassmann consistency test. This reduces the DKKMS soundness hypothesis to establishing the expansion hypothesis for the Shortcode graph.
Theorem 1.10.
The Degree 2 Shortcode Soundness Hypothesis 1.4 implies the Grassmann Soundness Hypothesis 1.2.
Finally, we relate the expansion hypothesis for the Grassmann graph to the expansion hypothesis for the degree-2 shortcode graph.
Theorem 1.11.
The Grassmann Expansion Hypothesis 1.6 holds if and only if the Inverse Shortcode Hypothesis 1.8 holds.
1.6 Discussion
Working with the shortcode consistency test (and consequently, the shortcode expansion hypothesis) makes an approach to proving Hypothesis 1.2 somewhat more tractable. This is because, unlike the Grassmann graph, the degree-2 shortcode graph is a Cayley graph on F_2^{ℓ×(n−ℓ)} (isomorphic to F_2^{ℓ(n−ℓ)}) under the group operation of addition, with the set of all rank-1 matrices forming the set of generators. Thus, studying the expansion of sets of vertices can be approached via powerful methods from Fourier analysis. Indeed, this is the route taken by the recent breakthrough [KMS18] that proves the shortcode expansion hypothesis and completes the proof of the 2-to-2 games conjecture (with imperfect completeness).
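To illustrate the Fourier-analytic angle: the eigenvectors of any Cayley graph on F_2^N are the characters χ_M(A) = (−1)^{⟨M,A⟩}, with eigenvalue equal to the average of χ_M over the generators. The sketch below (our own illustration, not from the paper) computes these averages over the rank-1 generators for toy parameters and confirms that the eigenvalue attached to χ_M depends only on rank(M) over F_2:

```python
from itertools import product

l, m = 2, 3  # vertices: F_2^{l x m}; generators: x y^T with x, y nonzero

def rank_gf2(rows):
    # Gaussian elimination over GF(2); rows are m-bit integers
    pivots = {}
    for v in rows:
        while v:
            p = v.bit_length() - 1
            if p in pivots:
                v ^= pivots[p]
            else:
                pivots[p] = v
                break
    return len(pivots)

def eigenvalue(M):
    # eigenvalue of chi_M: average of (-1)^{x^T M y} over the generators
    vals = [(-1) ** (sum(x[i] * M[i][j] * y[j]
                         for i in range(l) for j in range(m)) % 2)
            for x in product((0, 1), repeat=l) if any(x)
            for y in product((0, 1), repeat=m) if any(y)]
    return sum(vals) / len(vals)

by_rank = {}
for M in product(product((0, 1), repeat=m), repeat=l):
    r = rank_gf2([int("".join(map(str, row)), 2) for row in M])
    by_rank.setdefault(r, set()).add(eigenvalue(M))

assert by_rank[0] == {1.0}                          # trivial character
assert all(len(v) == 1 for v in by_rank.values())   # eigenvalue depends only on rank(M)
```

This rank-dependence is what makes the Fourier approach clean: the spectral behavior of a character is governed by a single integer invariant of M.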
Perhaps equally importantly, the shortcode consistency test suggests immediate extensions (higher degree shortcode graphs) that provide a natural path to proving the Unique Games Conjecture. We discuss this approach here.
First, the Grassmann/shortcode consistency tests as stated above are “2-to-2” tests. That is, for any reply for the first query, there are two admissible replies for the other query. However, it is simple to modify the tests and make them unique or “1-to-1” at the cost of making the completeness 1/2 instead of 1. For concreteness, we describe this simple modification below.
Test 2: Unique Degree-2 Shortcode Consistency Test
Given: a map F from F_2^{ℓ×(n−ℓ)} to F_2^ℓ.
Operation:
1. Pick A ∈ F_2^{ℓ×(n−ℓ)} and a rank-1 matrix B = xy^⊤ for vectors x ∈ F_2^ℓ, y ∈ F_2^{n−ℓ}, all uniformly at random from their respective domains. Let A' = A + B.
2. Receive F(A), F(A').
3. Accept if F(A') = F(A).

Test 3: Unique Degree-3 Shortcode Consistency Test
Given: a map F from the set of 3-tensors to F_2^ℓ.
Operation:
1. Pick a 3-tensor T and a rank-1 tensor R = x ⊗ y ⊗ z for vectors x, y, and z, all uniformly at random from their respective domains. Let T' = T + R.
2. Receive F(T), F(T').
3. Accept if F(T') = F(T).

It is easy to check that any strategy that passes the 2-to-2 test can be modified to obtain a success probability of half the original in passing the “unique” test above (see the proof of Lemma 2.2 below). This is one of the several ways that the NP-hardness of “2-to-2” games implies the NP-hardness of unique games with completeness 1/2; that is, of distinguishing between instances where at least a 1/2 − ε fraction of the constraints are satisfiable from those where at most an ε fraction of the constraints are satisfiable.
A natural strategy, thus, to try to show NP-hardness of unique games is to use some variant of the shortcode consistency test above that has completeness close to 1 instead of 1/2. Indeed, the degree-2 shortcode consistency test suggests natural analogs with higher completeness, obtained by moving to higher-degree shortcode graphs. For concreteness, consider Test 3 above, on the degree-3 shortcode graph, where it is easy to argue a completeness of 3/4.
Let T be the set of all 3-tensors over F_2 of dimensions ℓ × m × m (with m = n − ℓ). Recall that a rank-1 tensor R is defined by 3 vectors x, y, and z and can be written as R = x ⊗ y ⊗ z, i.e., R_{ijk} = x_i y_j z_k.
To see why there’s a natural analog of the strategy in the case of the degree-2 shortcode consistency test that gives a completeness of 3/4, we show:
Lemma 1.12 (Completeness).
Let g and h be nonzero linear functions on F_2^m. Let F be the map that assigns to any tensor T the vector in F_2^ℓ whose ith coordinate is Σ_{j,k} g_j h_k T_{ijk}. Then, F passes the test with probability 3/4 − o(1).
Proof.
Let R = x ⊗ y ⊗ z be the rank-1 tensor picked by the test, so that T' − T = R. Then, F(T')_i = F(T)_i + x_i g(y)h(z), so F passes the test whenever g(y)h(z) = 0. Since y and z are independently chosen in the test, the probability that g(y)h(z) = 1 is 1/4 + o(1). ∎
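The 3/4 count can be verified exhaustively for toy parameters. The sketch below is our own; it samples y, z from all of F_2^m (which makes the 3/4 figure exact) and uses the fact that F is linear in T, so taking T = 0 is without loss of generality:

```python
from itertools import product

l, m = 2, 2
g, h = (1, 0), (1, 1)   # nonzero linear functions on F_2^m

def F(T):
    # i-th coordinate: sum_{j,k} g_j h_k T[i][j][k] (mod 2)
    return tuple(sum(g[j] * h[k] * T[i][j][k]
                     for j in range(m) for k in range(m)) % 2
                 for i in range(l))

def add_rank1(T, x, y, z):
    # T + x (tensor) y (tensor) z over F_2
    return tuple(tuple(tuple(T[i][j][k] ^ (x[i] & y[j] & z[k])
                             for k in range(m)) for j in range(m))
                 for i in range(l))

T0 = tuple(tuple((0,) * m for _ in range(m)) for _ in range(l))
passes = total = 0
for x in product((0, 1), repeat=l):
    if not any(x):
        continue
    for y in product((0, 1), repeat=m):
        for z in product((0, 1), repeat=m):
            total += 1
            passes += F(add_rank1(T0, x, y, z)) == F(T0)
assert passes * 4 == total * 3  # acceptance probability is exactly 3/4
```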
Thus, the degree-3 shortcode consistency test gives a natural analog of the degree-2 shortcode consistency test with higher completeness. Indeed, the degree-r version gives a test with completeness 1 − 2^{−(r−1)}, as expected. One can also frame expansion hypotheses similar to the ones for the degree-2 case that posit a characterization of the nonexpanding sets in higher-degree shortcode graphs.
While our current efforts to compose this test with the “outer PCP” in order to get a reduction to the Unique Games problem (with higher completeness) have not succeeded, it seems a natural avenue for launching an attack on the UGC.² ²There are indeed very serious obstacles that must be overcome before carrying this out. Specifically, the reduction of [DKK16] uses a careful interplay between smoothness properties of the outer PCP and efficiency or “blow up” properties of the test (i.e., the number of potential queries by the verifier as a function of the number of honest strategies). The tensor-based test has too much of a blowup to be simply “plugged in” to the outer PCP used by [DKK16].
2 Small-Set Expansion vs Soundness
In this section, we establish that the Inverse Shortcode Hypothesis (Hypothesis 1.8) implies the soundness of the degree-2 shortcode consistency test (Hypothesis 1.4).
From 2-to-2 to Unique Tests
For the sake of exposition, it will be easier to work with Test 2, the “unique” version of the degree-2 shortcode consistency test. Thus, we restate the soundness hypothesis for Test 2 and show that it is enough to establish Hypothesis 1.4.
Hypothesis 2.1 (Soundness of Test 2).
For every δ > 0, there exist ε > 0 and an integer q such that the following holds for sufficiently large n, ℓ:
Let F : F_2^{ℓ×(n−ℓ)} → F_2^ℓ be such that F passes Test 2 with probability at least δ. Then, there exist affine linear constraints (at most q in total, defining a nice set T in the sense of Definition 1.7) and an affine linear strategy G such that Pr_{A ∈ T}[F(A) = G(A)] ≥ ε.
Proof.
Let F be the labeling strategy for Test 1. We will first obtain a good labeling strategy F' for Test 2 by modifying F slightly.
Choose w uniformly at random from F_2^{n−ℓ}. For any A, let F'(A) = F(A) + Aw. We claim that if F passes Test 1 with probability δ, then F' passes Test 2 with probability at least δ/2.
To see this, take any A, A' such that (A, A') is an edge in the degree-2 shortcode graph. That is, A' = A + xy^⊤ for nonzero vectors x ∈ F_2^ℓ, y ∈ F_2^{n−ℓ}. We will argue that F'(A') = F'(A) with probability 1/2 over the choice of w. This will imply that, in expectation over the choice of w, F' satisfies at least half of the constraints satisfied by F in Test 1, completing the proof.
This is simple to see: since F passes the test on this edge, F(A') − F(A) = 0 or F(A') − F(A) = x. Observe that F'(A') − F'(A) = (F(A') − F(A)) + ⟨y, w⟩x. Thus, in either case, F' passes the unique test on (A, A') for exactly one of the two values of ⟨y, w⟩ ∈ {0, 1}, which happens with probability 1/2 since y ≠ 0.
∎
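The 1/2 factor in the argument above can be sanity-checked for the honest strategy. The sketch below is our own code (a is the linear part of an assumed honest affine strategy, and F'(A) = F(A) + Aw is our reconstruction of the modified strategy); it verifies that for every edge direction y ≠ 0, exactly half of the shifts w make the unique test pass:

```python
from itertools import product

l, m = 2, 3
a = (1, 0, 1)   # linear part of the honest strategy f(v) = <a, v>

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v)) % 2

# On the edge A' = A + x y^T, the honest F(A) = (f(row_1), ..., f(row_l))
# shifts by <a, y> x, while F'(A) = F(A) + A w shifts by (<a, y> + <w, y>) x,
# so F' passes the unique test iff <a + w, y> = 0.
for y in product((0, 1), repeat=m):
    if not any(y):
        continue
    passing = sum(dot(tuple((ai + wi) % 2 for ai, wi in zip(a, w)), y) == 0
                  for w in product((0, 1), repeat=m))
    assert passing * 2 == 2 ** m   # exactly half of all w in F_2^m
```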
Expansion to Soundness
We will now show that Hypothesis 1.8 implies Hypothesis 2.1. This completes the proof of Theorem 1.9. A similar argument can be used to directly establish that Hypothesis 1.6 implies Hypothesis 1.2; we do not include it here explicitly. Instead, we relate the expansion and soundness hypotheses for the degree-2 shortcode test to their analogs for the Grassmann test, as we believe this could shed light on showing expansion hypotheses for the higher-degree shortcode tests discussed in Section 1.6.
Proof.
Let F be the labeling function as in the assumption of Hypothesis 2.1. Then, we know that Pr_{(A,A') edge}[F(A) = F(A')] ≥ δ. For any b ∈ F_2^ℓ, let S_b be the set of all matrices A with F(A) = b. Then, by an averaging argument, there must be a b such that Φ(S_b) ≤ 1 − δ.
Apply Hypothesis 1.8 to S_b to obtain a q-nice subset T of F_2^{ℓ×(n−ℓ)} such that |S_b ∩ T| ≥ ε|T|. Let the affine constraints defining T be as in Definition 1.7. Consider the constant (and thus affine linear) strategy G that maps any A to b. Observe that G(A) = b = F(A) for every A ∈ S_b ∩ T by this choice. As a result, Pr_{A ∈ T}[F(A) = G(A)] ≥ ε. Thus, G is the “decoded” strategy that satisfies the requirements of Hypothesis 2.1, as required. This completes the proof.
∎
3 Relating Grassmann Graphs to Degree 2 Shortcode Graphs
In this section, we show a formal relationship between the Grassmann and the degree-2 shortcode tests. In particular, we will prove Theorems 1.10 and 1.11.
3.1 A homomorphism from the degree-2 shortcode graph into the Grassmann graph
Key to the relationship between the two tests is an embedding of the degree-2 shortcode graph into Gr(n, ℓ). We describe this embedding first. As justified in the previous section, it is without loss of generality to work with the “unique” versions of both tests.
To describe the above embedding, we need the notion of the projection of a subspace of F_2^n to a set of coordinates.
Definition 3.1 (Projection of a Subspace).
Given a subspace V ⊆ F_2^n, the projection of V to a set of coordinates I ⊆ [n], written as proj_I(V), is the subspace of F_2^{|I|} defined by taking the vectors obtained by keeping only the coordinates indexed by I for every vector v ∈ V.
Let B(n) be the set of tuples (b_1, …, b_n) of linearly independent elements of F_2^n, i.e., each such tuple forms a basis for the vector space F_2^n. We will use (e_1, …, e_n) to denote the standard basis.
We will now describe a class of graph homomorphisms from the degree-2 shortcode graph into Gr(n, ℓ). Each element of this class can be described by a basis B of F_2^n.
For each basis B, let X_B be the set of all subspaces V ∈ Gr(n, ℓ) such that the projection of V to the first ℓ coordinates, when written w.r.t. the basis B, is full-dimensional. Our map will identify each element of X_B with a distinct element of F_2^{ℓ×(n−ℓ)} such that the edge structure within X_B in Gr(n, ℓ) is preserved under this identification.
Definition 3.2 (Homomorphism from X_B into F_2^{ℓ×(n−ℓ)}).
Let φ_B : X_B → F_2^{ℓ×(n−ℓ)} be defined as follows. Write every vector in the basis B. For any V ∈ X_B and for i ∈ [ℓ], let v_i be the unique vector in V such that proj_{[ℓ]}(v_i) = e_i. We call (v_1, …, v_ℓ) the canonical basis for V.
Define φ_B(V) to be the ℓ × (n − ℓ) matrix with the ith row given by the projection of v_i on the last n − ℓ coordinates, for each i ∈ [ℓ]. When the basis B is clear from the context, we will omit the subscript and write φ.
It is easy to confirm that φ is a bijection from X_B onto F_2^{ℓ×(n−ℓ)}. This is because the canonical basis for a subspace is unique.
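The bijection claim can be verified by brute force for tiny parameters. The following sketch (our own, not from the paper) enumerates all 2-dimensional subspaces of F_2^4 (there are 35), keeps those with a full-dimensional projection to the first two coordinates, and checks that φ, computed from the canonical basis, maps them bijectively onto the 16 matrices in F_2^{2×2}:

```python
from itertools import combinations

n, l = 4, 2  # vectors in F_2^4 as 4-bit integers, most significant bit first

def span(vs):
    s = {0}
    for v in vs:
        s |= {x ^ v for x in s}
    return frozenset(s)

# over F_2, any two distinct nonzero vectors are linearly independent,
# so each such pair spans a 2-dimensional subspace (4 elements)
subspaces = {span(p) for p in combinations(range(1, 2 ** n), 2)}

def proj_first(v):          # projection to the first l coordinates
    return v >> (n - l)

X = [V for V in subspaces if {proj_first(v) for v in V} == set(range(2 ** l))]

def phi(V):
    rows = []
    for i in range(l):
        e = 1 << (l - 1 - i)                         # e_i on the first l coords
        (u,) = [v for v in V if proj_first(v) == e]  # unique canonical basis vector
        rows.append(u & ((1 << (n - l)) - 1))        # last n - l coordinates
    return tuple(rows)

assert len(subspaces) == 35 and len(X) == 16
assert len({phi(V) for V in X}) == 16   # phi is a bijection onto F_2^{2 x 2}
```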
Next, we prove some important properties of the homomorphism that will be useful in the proof of Theorem 1.10.
First, we show that the map is indeed a homomorphism as promised and thus, preserves edge structure.
Lemma 3.3 (φ is a homomorphism).
For φ = φ_B defined above and any V, V' ∈ X_B, (V, V') is an edge in Gr(n, ℓ) iff (φ(V), φ(V')) is an edge in the degree-2 shortcode graph.
Proof.
Let x ∈ F_2^ℓ, y ∈ F_2^{n−ℓ} be arbitrary nonzero vectors that define a rank-1 matrix xy^⊤. Consider the matrix A' = A + xy^⊤. Then A' − A = xy^⊤, and thus (A, A') is an edge in the shortcode graph. We claim that dim(φ^{−1}(A) ∩ φ^{−1}(A')) = ℓ − 1. Suppose a_1, …, a_ℓ are the rows of A. Then the rows of A' are given by a_i + x_i y. Thus, φ^{−1}(A') is spanned by the vectors (e_i, a_i + x_i y), where e_i is the standard basis element on the first ℓ coordinates and the notation (u, v) indicates the concatenation of the vectors in the ordered pair to get an n-dimensional vector. In particular, every element of φ^{−1}(A') can be written as Σ_{i ∈ C}(e_i, a_i + x_i y) for some C ⊆ [ℓ], and any such vector is contained in φ^{−1}(A) if Σ_{i ∈ C} x_i = 0, implying that dim(φ^{−1}(A) ∩ φ^{−1}(A')) = ℓ − 1.

On the other hand, let V' ≠ V be a subspace in X_B such that dim(V ∩ V') = ℓ − 1, and let A and A' be the matrices obtained from V and V' via the map φ. Then A and A' must differ in at least one row; say, WLOG, that the last rows a_ℓ of A and a'_ℓ of A' differ. Notice that since the vector with e_ℓ in the first ℓ coordinates is unique in each of V and V', neither of (e_ℓ, a_ℓ), (e_ℓ, a'_ℓ) belongs to the intersection V ∩ V'. Further, for every vector u ∈ V, either u or u + (e_ℓ, a_ℓ) + (e_ℓ, a'_ℓ) = u + (0, a_ℓ + a'_ℓ) must be contained in V' (as the extra linear equation that V ∩ V' satisfies is satisfied by exactly one of u and u + (0, a_ℓ + a'_ℓ)). Thus, by adding (0, a_ℓ + a'_ℓ) to every one of the canonical basis elements of V that are not in V', we get a set of ℓ elements that are all 1) contained in V' and 2) of the form (e_i, ·) for every i ∈ [ℓ]. This then has to be the canonical basis of V' (by uniqueness of the canonical basis), and further, the corresponding A' can be written as A + x(a_ℓ + a'_ℓ)^⊤, where x ∈ F_2^ℓ is the indicator vector of the set of i such that (e_i, a_i) is not in V'. Thus A' − A has rank 1. ∎
Next, we want to argue that the expansion of sets is preserved up to constant factors under the map φ. Towards this, we first show that X_B contains a constant fraction of the vertices of Gr(n, ℓ).
Lemma 3.4 (Projections of Subspaces).
Let X_B ⊆ Gr(n, ℓ) for a basis B. Then, for large enough n and ℓ, |X_B| ≥ 0.28 |Gr(n, ℓ)|.
Further, let V ∈ X_B for some basis B. Then, at least a 1/2 − o(1) fraction of the neighbors of V in Gr(n, ℓ) are contained in X_B.
Proof.
We can sample a random subspace of dimension ℓ as follows: choose ℓ uniformly random and independent points v_1, …, v_ℓ from F_2^n. If they are linearly independent, let V be the subspace spanned by them.
We can estimate the probability that the sampled points are linearly independent as ∏_{i=0}^{ℓ−1}(1 − 2^{i−n}) ≥ 1 − 2^{ℓ−n}.
Next, we estimate the probability that the projection to the first ℓ coordinates of the sampled vectors is linearly independent. By a similar reasoning as above, this probability is at least ∏_{i=1}^{∞}(1 − 2^{−i}) ≥ 0.288 (the limit of this product for large ℓ).
By a union bound, thus, a random subspace has a full-dimensional projection on the first ℓ coordinates with probability at least 0.28 for any large enough n, ℓ.
For the remaining part, assume that B = (e_1, …, e_n), the standard basis. Notice that a random neighbor of V can be sampled as follows: choose a uniformly random basis for V, say (v_1, …, v_ℓ). Replace v_ℓ by a uniformly random vector v outside of V in F_2^n. Since V ∈ X_B, the projection of (v_1, …, v_ℓ) to the first ℓ coordinates is linearly independent. The new subspace would thus satisfy the same property whenever v is such that the projection of v to the first ℓ coordinates is not in the span of the projections to the first ℓ coordinates of (v_1, …, v_{ℓ−1}). The chance of this happening is 1/2 − o(1). This completes the proof. ∎
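The linear-independence probability used in the proof can be double-checked exactly. The sketch below (our own) verifies the product formula ∏_{i<ℓ}(1 − 2^{i−n}) against a direct counting argument, and checks that the corresponding product for ℓ uniform vectors in F_2^ℓ (the projection step) stays above 0.288:

```python
from fractions import Fraction

def prob_li(n, l):
    # Pr[l uniform vectors in F_2^n are linearly independent]:
    # the i-th vector must avoid the span of the previous i (size 2^i)
    p = Fraction(1)
    for i in range(l):
        p *= 1 - Fraction(2 ** i, 2 ** n)
    return p

def prob_li_count(n, l):
    # cross-check: ordered independent l-tuples divided by all l-tuples
    num = 1
    for i in range(l):
        num *= 2 ** n - 2 ** i
    return Fraction(num, 2 ** (n * l))

assert prob_li(10, 3) == prob_li_count(10, 3)
# projections to the first l coordinates are l uniform vectors in F_2^l
assert float(prob_li(8, 8)) > 0.288
```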
As a consequence of the above, we can now conclude that the preimages of nonexpanding sets under φ are nonexpanding in Gr(n, ℓ).
Lemma 3.5.
Let S ⊆ F_2^{ℓ×(n−ℓ)} be a subset satisfying Φ(S) ≤ 1 − η. Then, φ^{−1}(S) ⊆ Gr(n, ℓ) satisfies Φ(φ^{−1}(S)) ≤ 1 − η/4.
Proof.
Let B be the basis used to construct φ. Then, φ^{−1}(S) ⊆ X_B. By Lemma 3.4, at least a 1/2 − o(1) ≥ 1/4 fraction of the neighbors of any vertex of X_B are contained in X_B, and by Lemma 3.3 these correspond under φ to neighbors in the shortcode graph. By assumption, an η fraction of these neighbors (on average over the vertices of φ^{−1}(S)) are contained inside φ^{−1}(S). This finishes the proof. ∎
Via a similar application of Lemma 3.4, we can establish an appropriate converse.
Lemma 3.6.
Let S ⊆ Gr(n, ℓ) be a subset satisfying Φ(S) ≤ 1 − η. Then, for a uniformly random choice of basis B for F_2^n, in expectation, the image T = φ_B(S ∩ X_B) has density Ω(μ(S)) in F_2^{ℓ×(n−ℓ)} and satisfies Φ(T) ≤ 1 − Ω(η), where μ denotes the density of a set in its ambient vertex set.
Finally, we show that nice sets in Gr(n, ℓ) get mapped to nice sets in the degree-2 shortcode graph, and vice versa.
Lemma 3.7.
Let S be a q-nice set in Gr(n, ℓ). Then, φ(S ∩ X_B) is a q-nice set in F_2^{ℓ×(n−ℓ)}. Conversely, if T is a q-nice set in F_2^{ℓ×(n−ℓ)}, then T = φ(S ∩ X_B) for some q-nice set S in Gr(n, ℓ).
Proof.
WLOG, assume that B = (e_1, …, e_n). We will assume that S is the set of all subspaces in Gr(n, ℓ) contained in a subspace W of codimension q; the general case, which also allows zoom-in constraints Q ⊆ V (yielding left affine constraints), is analogous. Equivalently, if w_1, …, w_q form a basis for W^⊥, then ⟨v, w_j⟩ = 0 for every v ∈ V and every j ∈ [q], for every V ∈ S.
Consider the canonical basis (v_1, …, v_ℓ) for V ∈ S ∩ X_B; recall that this means that the projection of v_i to the first ℓ coordinates equals e_i. Thus, for every i, we can write v_i = (e_i, a_i) for some vectors a_i of n − ℓ dimensions.
Then, φ(V) is the matrix A with rows a_1, …, a_ℓ by our construction. In particular, writing w_j = (w'_j, w''_j) with w'_j ∈ F_2^ℓ and w''_j ∈ F_2^{n−ℓ}, this means that A satisfies the affine constraint Aw''_j = z_j, where z_j is the vector with ith coordinate equal to ⟨e_i, w'_j⟩. Thus, we have shown that for every V ∈ S ∩ X_B, φ(V) satisfies a set of q affine linear equations.
Conversely, observe that if any A satisfies the affine linear equations as above, the set of all (e_i, a_i) for i ∈ [ℓ], where a_i is the ith row of A, must span a subspace in S ∩ X_B. This yields that φ(S ∩ X_B) is a q-nice set.
The converse follows from entirely similar ideas. Suppose T is a q-nice set. WLOG, we restrict to the case where T is the set of all matrices A satisfying Ay_j = z_j for some choice of q linearly independent constraints (y_1, z_1), …, (y_q, z_q). Letting a_1, …, a_ℓ be the rows of A, this implies that every vector in the span of (e_i, a_i) for i ∈ [ℓ] satisfies the linear equation ⟨v, w_j⟩ = 0, where w_j = (z_j, y_j). This immediately yields that φ^{−1}(A) is contained in a subspace W of codimension q. Conversely, it is easy to check that for every subspace V ∈ X_B of dimension ℓ contained in W, φ(V) satisfies the affine linear constraints above.
This completes the proof.
∎
3.2 Shortcode Test vs Grassmann Test
We now employ the homomorphism constructed in the previous subsection to relate the soundness and expansion hypotheses for the shortcode and Grassmann tests.
First, we show that the soundness hypothesis for the degree-2 shortcode consistency test implies the soundness hypothesis for the Grassmann consistency test, completing the proof of Theorem 1.10.
Lemma 3.8.
The soundness hypothesis for the unique shortcode test (Hypothesis 2.1) implies the DKKMS Soundness Hypothesis 1.2.
Proof.
Let F be the assumed labeling strategy in Hypothesis 1.2, i.e., one that passes the Grassmann consistency test with probability at least δ. We will construct a labeling strategy G for the shortcode graph from F, so that we can apply the conclusion of Hypothesis 2.1. We will first choose an embedding of the type we constructed before in order to construct G.
Let B be chosen uniformly at random and let φ = φ_B be as in the previous subsection. For any V ∈ X_B, F(V) is a linear function restricted to V. Let (v_1, …, v_ℓ) be the canonical basis for V, i.e., the projection of v_i to the first ℓ coordinates (when written in basis B) equals e_i for every i. Set G(A) = (F(V)(v_1), …, F(V)(v_ℓ)) where V = φ^{−1}(A). Since φ is onto, this defines a labeling strategy for all of F_2^{ℓ×(n−ℓ)}.
Next, we claim that if F passes the Grassmann consistency test with probability δ, then G passes the degree-2 shortcode consistency test with probability Ω(δ).
Before going on to the proof of this claim, observe that this completes the proof of the lemma. To see this, we first apply Hypothesis 2.1 to conclude that there’s a q-nice set T in F_2^{ℓ×(n−ℓ)} and an affine strategy G' such that G(A) = G'(A) with probability ε for A in T. It is easy to construct an analogous linear strategy f for the Grassmann consistency test: for any V ∈ X_B with the canonical basis (v_1, …, v_ℓ) defined above, set f(v_i) = G'(φ(V))_i. Extend f linearly to the span of all such vectors. Finally, extend f to all vectors by taking any linear extension. From Lemma 3.4, at least (roughly) half the neighbors of vertices in X_B are contained in X_B. From Lemma 3.7, T = φ(S ∩ X_B) for some q-nice set S in Gr(n, ℓ). Finally, by an argument similar to the one in Lemma 3.4, |φ^{−1}(T)| = Ω(|S|) with high probability over the draw of B. Combining the above three observations yields that F agrees with f when restricted to the nice set S with probability Ω(ε), as required by Hypothesis 1.2.
We now complete the proof of the claim. This follows immediately if we show that for a uniformly random edge (A, A') of the shortcode graph, the pair (φ^{−1}(A), φ^{−1}(A')) is close, in distribution, to a uniformly random edge of Gr(n, ℓ).
Without loss of generality, we assume that B is the standard basis (e_1, …, e_n); a uniformly random basis then corresponds to applying a uniformly random invertible change of basis. First, notice that V ∩ V' is of dimension ℓ − 1 for all but an o(1) fraction of pairs V, V'. Thus, we can assume dim(V ∩ V') = ℓ − 1.
Let C be the basis change matrix corresponding to B, let c_i be the ith row of C, and let C' be the matrix formed by taking the first ℓ rows of C. Fix an edge (V, V') with dim(V ∩ V') = ℓ − 1. Let u_1, …, u_{ℓ−1} be a basis for V ∩ V', and write V = span(u_1, …, u_{ℓ−1}, u) and V' = span(u_1, …, u_{ℓ−1}, u') for some u, u' that are linearly independent of each other and of any vector in V ∩ V'. We estimate the probability that both V and V' lie in X_B; this is the probability that u_1, …, u_{ℓ−1}, u and u_1, …, u_{ℓ−1}, u' are mapped by C' into tuples of linearly independent vectors. It is easy to check that this probability, over the random choice of B, is Ω(1); by taking n large enough (compared to ℓ), it can be made larger than, say, 1/10. This proves the claim and finishes the proof.
∎
Next, we show that the Grassmann Expansion Hypothesis (Hypothesis 1.6) is equivalent to the Inverse Shortcode Hypothesis (Hypothesis 1.8) and complete the proof of Theorem 1.11.
Lemma 3.9.
The Grassmann Expansion Hypothesis 1.6 holds if and only if the Inverse Shortcode Hypothesis 1.8 holds.
Proof.
Let S ⊆ F_2^{ℓ×(n−ℓ)} be such that Φ(S) ≤ 1 − η. Then, by Lemma 3.5, φ^{−1}(S) has an expansion of at most 1 − η/4 in Gr(n, ℓ). Applying Hypothesis 1.6 to φ^{−1}(S) yields a nice set in Gr(n, ℓ) in which φ^{−1}(S) has density at least ε, and by Lemma 3.7 its image under φ is a nice set in the shortcode graph in which S has density Ω(ε). This shows that Hypothesis 1.6 implies Hypothesis 1.8; the converse direction follows similarly using Lemmas 3.6 and 3.7. ∎
References
 [BCS18] Mitali Bafna, Chi-Ning Chou, and Zhao Song, An exposition of the Dinur-Khot-Kindler-Minzer-Safra proof for the 2-to-2 games conjecture, http://boazbarak.org/dkkmsnotes.pdf.
 [BGH15] Boaz Barak, Parikshit Gopalan, Johan Håstad, Raghu Meka, Prasad Raghavendra, and David Steurer, Making the long code shorter, SIAM J. Comput. 44 (2015), no. 5, 1287–1324. MR 3416138
 [DKK16] Irit Dinur, Subhash Khot, Guy Kindler, Dor Minzer, and Muli Safra, Towards a proof of the 2-to-1 games conjecture?, Electronic Colloquium on Computational Complexity (ECCC) 23 (2016), 198.
 [DKK18] ———, On non-optimally expanding sets in Grassmann graphs, Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing, STOC 2018.
 [Kho02] Subhash Khot, On the power of unique 2-prover 1-round games, Proceedings of the 17th Annual IEEE Conference on Computational Complexity, Montréal, Québec, Canada, May 21–24, 2002, p. 25.

 [KMS17] Subhash Khot, Dor Minzer, and Muli Safra, On independent sets, 2-to-2 games, and Grassmann graphs, Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing, STOC 2017, Montreal, QC, Canada, June 19–23, 2017, pp. 576–589.
 [KMS18] ———, Pseudorandom sets in Grassmann graph have near-perfect expansion, Electronic Colloquium on Computational Complexity (ECCC) 25 (2018), 6.