Larger Corner-Free Sets from Combinatorial Degenerations

11/16/2021
by Matthias Christandl, et al.
University of Amsterdam

There is a large and important collection of Ramsey-type combinatorial problems, closely related to central problems in complexity theory, that can be formulated in terms of the asymptotic growth of the size of the maximum independent sets in powers of a fixed small (directed or undirected) hypergraph, also called the Shannon capacity. An important instance of this is the corner problem studied in the context of multiparty communication complexity in the Number On the Forehead (NOF) model. Versions of this problem and the NOF connection have seen much interest (and progress) in recent works of Linial, Pitassi and Shraibman (ITCS 2019) and Linial and Shraibman (CCC 2021). We introduce and study a general algebraic method for lower bounding the Shannon capacity of directed hypergraphs via combinatorial degenerations, a combinatorial kind of "approximation" of subgraphs that originates from the study of matrix multiplication in algebraic complexity theory (and which play an important role there) but which we use in a novel way. Using the combinatorial degeneration method, we make progress on the corner problem by explicitly constructing a corner-free subset in F_2^n × F_2^n of size Ω(3.39^n/poly(n)), which improves the previous lower bound Ω(2.82^n) of Linial, Pitassi and Shraibman (ITCS 2019) and which gets us closer to the best upper bound 4^{n - o(n)}. Our new construction of corner-free sets implies an improved NOF protocol for the Eval problem. In the Eval problem over a group G, three players need to determine whether their inputs x_1, x_2, x_3 ∈ G sum to zero. We find that the NOF communication complexity of the Eval problem over F_2^n is at most 0.24n + O(log n), which improves the previous upper bound 0.5n + O(log n).


1 Introduction

This paper is about constructing special combinatorial objects, namely “corner-free sets” in F_2^n × F_2^n, motivated (besides their inherent interest) by central problems in communication complexity, specifically in the study of the number on the forehead (NOF) model of communication introduced by Chandra, Furst and Lipton [15]. There has been much interest in (and progress on) the corner problem, variations of the problem, and connections to NOF communication, in particular in the recent works of Shraibman [45], Linial, Pitassi and Shraibman [34], Viola [50], Alon and Shraibman [5], and Linial and Shraibman [35, 36]. In the recent work of Linial and Shraibman [35] a construction of large corner-free sets was obtained in an elegant manner by designing efficient NOF communication protocols for a specific communication problem (much like the upcoming Eval problem). We take a different, algebraic approach to the corner problem, and make progress on the corner problem over F_2^n by introducing in this area a new algebraic method via combinatorial degeneration.

NOF communication complexity

The NOF model is very rich in terms of connections to Ramsey theory and additive combinatorics [9, 45, 34, 35, 36], as well as applications to boolean models of computation such as branching programs and boolean circuits [15, 10]. The goal in the NOF model is for k players to compute a fixed given function on k inputs, where player i has access to all inputs except its own input i. For k = 2, this model coincides with the standard two-party communication model of Yao [51], but when k ≥ 3, the shared information between the players makes this model surprisingly powerful [29, 6, 1, 16], and fundamental problems remain open. For instance, a sufficiently strong lower bound for an explicit function for polylogarithmically many players would imply a breakthrough result in complexity theory, namely a lower bound for the complexity class ACC^0.

NOF complexity of the Eval problem

A central open problem in the theory of NOF communication is to construct an explicit function for which randomized protocols are significantly more efficient than deterministic ones [8]. A well-studied candidate for this separation (for three players) is the Eval function over a group G, which outputs 1 if and only if the three inputs x_1, x_2, x_3 ∈ G sum to zero, where the additions are all in G. Thus the Eval problem naturally generalizes the equality problem for two players. It is known that in the randomized setting, the standard protocol for the two-party equality problem works in the same way for three parties for the Eval problem. However, in the deterministic setting, the communication complexity remains wide open: the best known lower bound follows from the work of Lacey and McClain [33] and, before this work, the best upper bound was the one of [1].

Corner problem in combinatorics, and connection to the Eval problem

Chandra, Furst and Lipton [15] found that the deterministic communication complexity of many problems in the NOF model can be recast as Ramsey theory problems. In particular, and this leads to the problem of interest in this paper, the (deterministic) communication complexity of Eval over a group G can be characterized in terms of corner-free subsets of G × G, as follows. We call any triple of elements (x, y), (x + λ, y), (x, y + λ) with x, y, λ ∈ G a corner. A subset A ⊆ G × G is called corner-free if it does not contain any nontrivial corners (where nontrivial means that λ ≠ 0). The communication complexity of Eval over G is determined, up to an additive lower-order term, by the size of the largest corner-free set in G × G, which provides the close connection between the Eval problem in NOF communication and the corner problem in combinatorics. In particular, large corner-free sets in G × G correspond to efficient protocols for Eval.
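To make the corner-freeness condition concrete, the following small sketch (ours, not from the paper) checks whether a given subset of F_2^n × F_2^n is corner-free and brute-forces the largest corner-free set for tiny n; group elements are encoded as integers with XOR as addition, and all function names are our own.

```python
# Minimal illustrative sketch (not from the paper): corner-freeness over G = F_2^n,
# with group elements encoded as integers 0..2^n-1 and addition given by XOR.
from itertools import combinations

def is_corner_free(A, n):
    """A is a collection of pairs (x, y) with x, y in range(2**n)."""
    A = set(A)
    for (x, y) in A:
        for lam in range(1, 2 ** n):            # nontrivial corners have lambda != 0
            if (x ^ lam, y) in A and (x, y ^ lam) in A:
                return False                     # found the corner (x,y), (x+lam,y), (x,y+lam)
    return True

def max_corner_free(n):
    """Brute-force the largest corner-free subset of F_2^n x F_2^n (tiny n only)."""
    points = [(x, y) for x in range(2 ** n) for y in range(2 ** n)]
    for size in range(len(points), 0, -1):
        if any(is_corner_free(A, n) for A in combinations(points, size)):
            return size
    return 0

if __name__ == "__main__":
    print(max_corner_free(1))   # the 2x2 grid over F_2 admits at most 2 corner-free points
```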

General paradigm: Shannon capacity of hypergraphs

The point of view we will take (and the general setting in which the methods we introduce will apply) is to regard the corner problem as a Shannon capacity problem of directed hypergraphs. Namely, the size of the largest corner-free set in G^n × G^n can be characterized as the independence number of a (naturally defined) directed 3-uniform hypergraph on |G|^{2n} vertices. (As usual, an independent set of a hypergraph is a subset S of vertices such that no hyperedge has all its vertices in S.) This hypergraph has a recursive form: it is obtained by taking the n-th power of a fixed (directed) hypergraph on |G|^2 vertices. (We discuss this in more detail in Section 2.) The asymptotic growth of the largest corner-free set as n grows is characterized by the Shannon capacity of the corner hypergraph. (In the setting of directed graphs, the term Sperner capacity, typically applied to the complement graph [27, 26], is also used for what we call Shannon capacity.) In this way, proving an upper bound on this capacity strictly below the trivial one is equivalent to proving a linear lower bound on the communication complexity of Eval. Many other Ramsey type problems can be expressed as the Shannon capacity of some fixed hypergraph, such as the Cap Set problem that saw a recent breakthrough by Ellenberg and Gijswijt [25] following Croot, Lev and Pach [22], and the Uniquely Solvable Puzzle (USP) problems that were put forward in the “group-theoretic approach” to the matrix multiplication problem [20, 4].

1.1 Is the complexity of the Eval problem maximal?

Let us discuss the open problem that motivates our work, and that is central in NOF communication complexity and combinatorics (through the aforementioned connections). This problem asks whether or not the complexity of the Eval problem is “maximal”, or in other words, whether or not the largest corner-free sets have “sub-maximal” size:

Problem 1.1.

Are the following three statements (which we know are equivalent; the equivalence among the three formulations is standard and follows from Lemma 2.1, Proposition 2.1 and Lemma 2.1 further on in the paper, and we will mainly use the formulation in terms of Shannon capacity, see Definition 2.1 for a precise definition) true?

  • .

Here the best capacity lower bound before our work was 2.82, by Linial, Pitassi and Shraibman [34, Cor. 24 in the ITCS version], obtained by explicit construction of an independent set in the second power of the relevant hypergraph, which in turn leads to corner-free sets in F_2^n × F_2^n of size Ω(2.82^n) and an NOF protocol for Eval over F_2^n communicating 0.5n + O(log n) bits.

In the above we may naturally generalize F_2 to F_q, or even to an arbitrary finite abelian group G, so that Problem 1.1 is a special case of the more general problem:

Problem 1.2.

Are the following three statements (which we know are equivalent) true?

  • .

Our goal in this paper, motivated by the connections remarked on earlier, is to make progress on the above problems via new algebraic methods.

1.2 Lower bounds for the corner problem (and other problems) from combinatorial degeneration

Our main result is progress on Problem 1.1 by proving new lower bounds for the corner problem, in particular over F_2^n, which we arrive at via a new method to lower bound the Shannon capacity of directed hypergraphs. Equivalently, in the language of communication complexity, we obtain improved protocols for the Eval problem.

The lower bound of Linial, Pitassi and Shraibman [34] for the corner problem was obtained by explicit construction of an independent set (i.e. a set that does not contain edges) in the second power of a hypergraph, which is the natural approach for such lower bounds. We improve on this bound by observing that it is actually sufficient to construct a set which does not contain “cycles”. For graphs, the notion of cycle is clear, but for hypergraphs there are many possible definitions, and we initiate a careful study of this (and believe that this will be a worthwhile avenue for further study independently). Here, to get new bounds we use the notion of combinatorial degeneration to model such a “cycle”. We will say more about this in a moment.

Using the combinatorial degeneration method on the corner hypergraphs that characterize the corner problem, we find new bounds for Problem 1.1 for two groups, including F_2. These are as follows (in the three equivalent forms):

[Thm. 2.3] For the corner and Eval problem over F_2^n we have:

  • There are corner-free sets in F_2^n × F_2^n of size Ω(3.39^n/poly(n)).

  • The Shannon capacity of the corner hypergraph over F_2 is at least 3.39.

  • The deterministic NOF communication complexity of Eval over F_2^n is at most 0.24n + O(log n).

[Thm. 2.3] An analogous result, in the same three equivalent forms, holds for the corner and Eval problem over the second group we consider.

Let us discuss on a high level the history and ideas behind the combinatorial degeneration method. Combinatorial degeneration is an existing concept from algebraic complexity theory. It was (in a slightly different form) introduced and studied by Strassen in [46, Section 6]. (Degeneration of tensors is a powerful approximation notion in the theory of tensors. Combinatorial degeneration is the “combinatorial” or “torus” version of this kind of approximation. Combinatorial degeneration was introduced by Bürgisser, Clausen and Shokrollahi [14, Definition 15.29] based on the notion of M-degeneration for tensors defined and studied by Strassen in [46].) (For the formal definition of combinatorial degeneration, see Definition 7.) Strassen’s original application of combinatorial degeneration was to study matrix multiplication, namely to prove the fundamental result that surprisingly many independent scalar multiplications can be reduced (in an appropriate algebraic manner) to matrix multiplication [46, Theorem 6.6]. (Strassen’s result is asymptotically optimal. Strassen’s proof resembles Behrend’s construction of arithmetic-progression-free sets. Also note that this is precisely the opposite of the problem of reducing matrix multiplication to as few independent scalar multiplications as possible; the latter corresponds precisely to the arithmetic complexity of matrix multiplication.) Strassen then used this result to prove his Laser method [46, Section 7], vastly generalizing the method that Coppersmith and Winograd had introduced in their construction of matrix multiplication algorithms [21]. (The book [14, Definition 15.29 and Lemma 15.31] gives a different proof of the Laser method which relies even more strongly on combinatorial degeneration.)

Combinatorial degeneration was used more broadly to construct large induced matchings in the setting of important combinatorial problems, namely the Sunflower problem by Alon, Shpilka and Umans [4, Lemma 3.9] and the Cap Set problem by Kleinberg, Sawin and Speyer [31]. These results are often referred to as the “multicolored” versions of the problem at hand, as opposed to the “single color” version. These ideas were developed further in the context of matrix multiplication barriers by Alman and Williams [2, Lemma 6] and in the study of tensors by Christandl, Vrana and Zuiddam [18, Theorem 4.11].

Crucially, all of the above applications use combinatorial degeneration to construct induced matchings in (-uniform -partite) hypergraphs. However, we use combinatorial degeneration in a novel manner to construct independent sets in hypergraphs instead of induced matchings. In this context an independent set should be thought of as a symmetric induced matching. Constructing large independent sets is a much harder task than constructing large induced matchings, as witnessed by the fact that the “multicolored” cap set problem is solved [31] while its “single color” version is not. Similarly, for the corner problem, as we will discuss in Section 1.4, the asymptotic growth of the largest induced matching can be shown to be maximal, whereas the main question of study of this paper is whether the same holds for the largest independent set. We expect our new way of using combinatorial degeneration to be useful in the study of other problems besides the corner problem as well, and thus think it is of independent interest.

On a more technical level, combinatorial degeneration is a notion that compares sets of k-tuples by means of algebraic conditions. Our “universe” is a product I_1 × ... × I_k where the I_i are finite sets. Then for sets Ψ ⊆ Φ ⊆ I_1 × ... × I_k we say that Ψ is a combinatorial degeneration of Φ, and write Ψ ⊴ Φ, if there are maps u_i : I_i → Z such that for every x ∈ Φ, if x ∈ Ψ, then u_1(x_1) + ... + u_k(x_k) = 0, and if x ∈ Φ \ Ψ, then u_1(x_1) + ... + u_k(x_k) > 0. Thus the maps u_i together are able to distinguish between the elements in the set Ψ (which may be thought of as our “goal” set, i.e. a set with good properties) and the elements in the difference Φ \ Ψ. As a quick example of a combinatorial degeneration, one can take a small pair of sets and suitable maps u_i; a concrete toy instance is worked out in the sketch below.
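To illustrate the algebraic conditions on a concrete, self-made toy instance (not the example from the paper), the sketch below checks the degeneration condition for given maps u_i; the sets and maps are our own.

```python
# Minimal illustrative sketch (not from the paper): verify a combinatorial degeneration,
# i.e. sum_i u_i(x_i) = 0 for x in Psi and sum_i u_i(x_i) > 0 for x in Phi \ Psi.

def is_combinatorial_degeneration(Phi, Psi, maps):
    """Phi, Psi are sets of k-tuples with Psi a subset of Phi; maps is a list of dicts u_1..u_k."""
    assert Psi <= Phi
    for x in Phi:
        s = sum(u[xi] for u, xi in zip(maps, x))
        if (x in Psi and s != 0) or (x not in Psi and s <= 0):
            return False
    return True

if __name__ == "__main__":
    # Toy instance: Psi is the support of the "W tensor", Phi adds the all-zero point.
    Psi = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}
    Phi = Psi | {(0, 0, 0)}
    u = {0: 1, 1: -2}          # on Psi: 1 + 1 - 2 = 0; on (0,0,0): 1 + 1 + 1 = 3 > 0
    print(is_combinatorial_degeneration(Phi, Psi, [u, u, u]))   # True
```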

We apply the idea of combinatorial degeneration in the following fashion to get Shannon capacity lower bounds:

[Combinatorial degeneration method, Theorem 2.3] Let H be a directed k-uniform hypergraph. Let S be a subset of its vertices. Define the set Φ consisting of the edges of H contained in S together with the diagonal elements (s, ..., s) for s ∈ S, and the set Ψ consisting of the diagonal elements (s, ..., s) for s ∈ S. Suppose that Ψ ⊴ Φ is a combinatorial degeneration. Then we get that the Shannon capacity of H is at least |S|. In other words, whereas S in the statement of Theorem 1.2 may not be an independent set, we can, via the algebraic conditions of combinatorial degeneration, construct an independent set in the n-th power of the hypergraph of size |S|^{n - o(n)}. Namely, the algebraic conditions allow us to select such an independent set using a natural type analysis of the labels given by the maps u_i. Thus we may think of a set S as above as an approximative independent set, which asymptotically we can turn into an actual independent set by means of Theorem 1.2.

We note that, whereas it is relatively simple to verify via linear programming that a given pair of sets forms a combinatorial degeneration (with the notation of Theorem 1.2), it seems much harder to find a large set S for which the required combinatorial degeneration exists. We obtain our best lower bounds via an integer linear programming approach. The resulting combinatorial degenerations that we find are explicit and checkable by hand.
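As an aside, the verification step mentioned above can be phrased as a small linear program: given Φ and a candidate Ψ, one looks for rational maps u_i (which can then be scaled to integers) satisfying the degeneration conditions. The sketch below is our own illustration of this feasibility check using scipy; the integer-programming search used in the paper to find large sets is more involved.

```python
# Minimal illustrative sketch (not from the paper): decide by LP feasibility whether maps u_i
# exist witnessing Psi as a combinatorial degeneration of Phi.
# Constraints: sum_i u_i(x_i) == 0 for x in Psi and sum_i u_i(x_i) >= 1 for x in Phi \ Psi
# (strict positivity is normalized to ">= 1"; a rational solution can be scaled to integers).
import numpy as np
from scipy.optimize import linprog

def degeneration_exists(Phi, Psi, alphabets):
    index = {}                                  # (position, symbol) -> variable index
    for i, alph in enumerate(alphabets):
        for a in sorted(alph):
            index[(i, a)] = len(index)
    def row(x):
        r = np.zeros(len(index))
        for i, xi in enumerate(x):
            r[index[(i, xi)]] += 1
        return r
    rest = [x for x in Phi if x not in Psi]
    res = linprog(np.zeros(len(index)),
                  A_ub=-np.array([row(x) for x in rest]) if rest else None,
                  b_ub=-np.ones(len(rest)) if rest else None,
                  A_eq=np.array([row(x) for x in Psi]) if Psi else None,
                  b_eq=np.zeros(len(Psi)) if Psi else None,
                  bounds=(None, None), method="highs")
    return res.success

if __name__ == "__main__":
    Psi = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}
    Phi = Psi | {(0, 0, 0)}
    print(degeneration_exists(Phi, Psi, [{0, 1}] * 3))   # True
```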

We have yet to develop structural understanding of how the above combinatorial degenerations that exhibit the new capacity lower bounds arise (and we feel that deeper understanding of this may lead to more progress or even solve the corner problem), and leave the investigation of further generalizations and improvements to future work. As a partial remedy to our limited understanding, we introduce the acyclic method as a tool to construct combinatorial degenerations. While the acyclic method does not recover the bounds of Theorem 1.2 and Theorem 1.2, it has the merits of being transparent and simple to apply. The acyclic method involves another notion of a set without “cycles”, which implies a combinatorial degeneration, but whose conditions are simpler to check.

1.3 Lower bounds for the corner problem from the probabilistic method

We employ the probabilistic method to find the following very general bound for the corner problem over arbitrary abelian groups.

[Prop. 2.2] For the corner and Eval problem over an arbitrary abelian group G we have:

  • The Shannon capacity of the corner hypergraph of G is at least |G|^{3/2}; equivalently, there are corner-free sets in G^n × G^n of size |G|^{(3/2)n - o(n)}, with a corresponding upper bound on the NOF communication complexity of Eval over G^n.

This general bound applied to the two special cases above does not quite match the bounds in Theorem 1.2 and Theorem 1.2, respectively. However, applied to the special case G = F_2 we do recover the lower bound Ω(2.82^n) of [34, Cor. 24 in the ITCS version].

Using the same techniques we gain insight about the high-dimensional version of the corner problem and Eval problem and what happens when the number of players grows. For an arbitrary abelian group G, a k-dimensional corner over G is naturally defined as a set of k + 1 points in G^k of the form

(x_1, ..., x_k), (x_1 + λ, x_2, ..., x_k), (x_1, x_2 + λ, ..., x_k), ..., (x_1, ..., x_{k-1}, x_k + λ),

where x_1, ..., x_k, λ ∈ G. A subset of G^k is called corner-free if it does not contain any nontrivial corners (where nontrivial again means λ ≠ 0). We consider the size of the largest (k-dimensional) corner-free set. Just like in the case k = 2, corner-free sets correspond to independent sets in a naturally defined directed (k + 1)-uniform hypergraph. With the probabilistic method (extending Theorem 1.3), we find that when k goes to infinity, the capacity of this hypergraph becomes essentially maximal. As a consequence, if k grows with n, we find that the NOF complexity of the corresponding (k + 1)-player Eval problem becomes sub-linear.

[Rem. 2.2] Let G be a finite abelian group. Then the Shannon capacity of the k-dimensional corner hypergraph of G tends to its maximal possible value as k goes to infinity.

Thus we learn that to prove a linear lower bound for any given number of players (say three players) it is important to keep the number of players constant.

1.4 Limitations of tensor methods for proving upper bounds for the corner problem

Our second result is a strong limitation on the ability of current tensor methods to effectively upper bound the Shannon capacity of hypergraphs. This limitation is caused by induced matchings and applies to various combinatorial problems including the corner problem. We use a method of Strassen to show that these limitations are indeed very strong for the corner problem.

In order to elaborate on these results let us first give an overview of upper bound methods. The general question of upper bounds on the Shannon capacity of hypergraphs is particularly well-studied in the special setting of undirected graphs, from which the name “Shannon capacity” comes: it in fact corresponds to the zero-error capacity of a channel [42]. Even for undirected graphs, it is not clear how to compute the Shannon capacity in general, but some methods were developed to give upper bounds. The difficulty is to find a good upper bound on the largest independent set that behaves well under the strong product. For undirected graphs, the best known methods are the Lovász theta function [37] and the Haemers bound, which is based on the matrix rank [30]. For hypergraphs, we only know of algebraic methods that are based on various notions of tensor rank, and in particular the slice rank [49] (which was used and studied extensively in combinatorics, in the context of cap sets [48, 31], sunflowers [41] and right-corners [40]), and similar notions like the analytic rank [28, 38, 13], the geometric rank [32], and the G-stable rank [23]. Even though the slice rank is not multiplicative under the tensor product, it is possible to give good upper bounds on the asymptotic slice rank via an asymptotic analysis [49], which is closely related to the Strassen support functionals [47] or the more recent quantum functionals [18].

Most of the rank-based bounds actually give upper bounds on the size of induced matchings and not only on the size of independent sets. It is simple and instructive to see this argument in the setting of undirected graphs. For a given graph G, let A be the adjacency matrix in which we set all the diagonal coefficients to 1. Then for any independent set S, the submatrix of A indexed by S is the identity matrix, and as a result α(G) ≤ rank(A). As the matrix rank is multiplicative under the tensor product, we get α(G^⊠n) ≤ rank(A)^n. Observe that this argument works equally well if we consider an induced matching instead of an independent set. An induced matching of size m of the graph G can be defined by two lists of vertices (a_1, ..., a_m) and (b_1, ..., b_m) of size m such that for any i, j we have A[a_i, b_j] = 1 if and only if i = j. In other words, the submatrix of A on these two lists is an identity matrix, which also implies that m ≤ rank(A). As such, the matrix rank is an upper bound on the asymptotic maximum induced matching. Tensor rank methods such as the subrank, slice rank, analytic rank, geometric rank and G-stable rank also provide upper bounds on the asymptotic maximum induced matching.
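The following self-contained sketch (ours, purely illustrative) demonstrates the rank bound on a toy graph, two disjoint edges: the adjacency matrix with unit diagonal has rank 2, so alpha(G^⊠n) ≤ 2^n, and brute force confirms equality for n = 1, 2.

```python
# Minimal illustrative sketch (not from the paper): the bound alpha(G^{strong n}) <= rank(A)^n,
# where A is the adjacency matrix with all diagonal entries set to 1.  For such
# "adjacency-or-equal" matrices, the strong product corresponds to the Kronecker product.
import itertools
import numpy as np

def independence_number(A):
    """Brute force: S is independent iff the submatrix A[S, S] is the identity."""
    n, best = A.shape[0], 0
    for k in range(1, n + 1):
        if not any(np.array_equal(A[np.ix_(S, S)], np.eye(k))
                   for S in itertools.combinations(range(n), k)):
            break
        best = k
    return best

if __name__ == "__main__":
    edge = np.ones((2, 2), dtype=int)                    # one edge, diagonal already set to 1
    Z = np.zeros((2, 2), dtype=int)
    A = np.block([[edge, Z], [Z, edge]])                 # two disjoint edges
    print("rank(A)       =", np.linalg.matrix_rank(A))   # 2
    print("alpha(G)      =", independence_number(A))     # 2 <= rank(A)
    print("alpha(G ** 2) =", independence_number(np.kron(A, A)))   # 4 <= rank(A)^2
```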

Using a result of Strassen [47], we show that the n-th power of the corner hypergraph has an induced matching of essentially maximal size. This establishes a barrier on many existing tensor methods (such as slice rank, subrank, analytic rank, etc.) to make progress on Problem 1.1. In fact, this result holds more generally for any abelian group G:

[Cor. 3.3] For any abelian group G, the corner hypergraph of G asymptotically admits induced matchings of essentially maximal size: for every n, there exist lists of vertices of the n-th power, of length |G|^{(2-o(1))n}, satisfying the induced matching condition (1).

We prove this result by establishing in Theorem 3.3 that the adjacency tensor of the corner hypergraph is tight (see Definition 3.2). Strassen showed in [47] that for tight sets, the asymptotic induced matching number is characterized by the support functionals. By computing the support functionals for the relevant tensors, we establish the claimed result in Corollary 3.3. Note that if we could ensure that the lists coincide (so that the induced matching is in fact an independent set), this would solve Problem 1.1. We computed the maximum independent set and maximum induced matching for small powers of the corner hypergraph (see Table 1) and we found that the maximum independent set is strictly smaller than the maximum induced matching for the second and third power. This motivates the search for methods that go beyond the maximum induced matching barrier. For comparison, we also give the analogous numbers for the cap set hypergraph (which is an undirected hypergraph), where, interestingly, the maximum independent set and the maximum induced matching are equal.

Table 1: Independence number and induced matching number for small powers of the cap set hypergraph and of the corner hypergraph. Interestingly, the independence number and induced matching number of powers of the cap set hypergraph are exactly equal for the powers we computed. For the corner hypergraph we see that they differ already for the second and third power.
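The smallest entries in the cap set column are easy to reproduce by brute force. The sketch below (ours) computes the maximum size of a three-term-progression-free subset of F_3^n for n = 1, 2, which equals the independence number of the n-th power of the cap set hypergraph; it is meant only as a sanity check on the definitions.

```python
# Minimal illustrative sketch (not from the paper): maximum 3-AP-free subset of F_3^n,
# i.e. the independence number of the n-th power of the cap set hypergraph.
from itertools import combinations, product

def is_progression_free(S):
    S = set(S)
    for x in S:
        for y in S:
            if x != y:
                z = tuple((2 * b - a) % 3 for a, b in zip(x, y))   # completes x, y to a 3-AP
                if z in S:
                    return False
    return True

def max_cap(n):
    points = list(product(range(3), repeat=n))
    for size in range(len(points), 0, -1):
        if any(is_progression_free(S) for S in combinations(points, size)):
            return size
    return 0

if __name__ == "__main__":
    print(max_cap(1), max_cap(2))   # 2 and 4
```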

2 Lower bounds from the combinatorial degeneration method

In this section we discuss three methods to prove lower bounds on the Shannon capacity of directed k-uniform hypergraphs: the probabilistic method, the combinatorial degeneration method and the acyclic set method. We apply these methods to the corner problem, the problem of constructing large corner-free sets, which as a consequence gives new NOF communication protocols for the Eval problem. We begin by discussing the corner problem and its relation to NOF communication complexity.

2.1 Corner problem, cap set problem and number on the forehead communication

Hypergraphs

We recall the definition of directed k-uniform hypergraphs and basic properties of the Shannon capacity of directed k-uniform hypergraphs. A directed k-uniform hypergraph is a pair H = (V, E) where V is a finite set of elements called vertices, and E is a set of k-tuples of elements of V which are called hyperedges or edges. If the set of edges is invariant under permuting the coefficients of its elements, then we may also think of H as an undirected k-uniform hypergraph.

Let H = (V, E) be a directed k-uniform hypergraph. The adjacency tensor of H is the 0/1 tensor with k indices ranging over V that is 1 on the edges of H and on the diagonal elements (v, ..., v), and 0 elsewhere.

The strong product of a pair of directed k-uniform hypergraphs H = (V, E) and H′ = (V′, E′) is denoted H ⊠ H′ and defined as follows. It is a directed k-uniform hypergraph with vertex set V × V′ and the following edge set: vertices (u_1, u′_1), ..., (u_k, u′_k) form an edge if one of the following three conditions holds:

  1. (u_1, ..., u_k) ∈ E and (u′_1, ..., u′_k) ∈ E′

  2. (u_1, ..., u_k) ∈ E and u′_1 = ... = u′_k

  3. u_1 = ... = u_k and (u′_1, ..., u′_k) ∈ E′

An independent set in a directed k-uniform hypergraph H = (V, E) is a subset S ⊆ V of the vertices that induces no edges, meaning that for every edge (e_1, ..., e_k) ∈ E there is an index i such that e_i ∉ S. The independence number of H, denoted by α(H), is the maximal size of an independent set in H.

If S and S′ are independent sets in two directed k-uniform hypergraphs H and H′, respectively, then S × S′ is an independent set in the strong product H ⊠ H′. Therefore, we have the supermultiplicativity property α(H ⊠ H′) ≥ α(H) α(H′). For any directed k-uniform hypergraph H, let H^⊠n denote the n-fold product of H with itself. The Shannon capacity of a directed k-uniform hypergraph H is defined as the limit of α(H^⊠n)^{1/n} as n goes to infinity. By Fekete’s lemma we can write this limit as the supremum of α(H^⊠n)^{1/n} over n. The following proposition can be deduced directly from the definition of Shannon capacity: if H^⊠m contains an independent set of size a, then the Shannon capacity of H is at least a^{1/m}.
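To make these definitions concrete, here is a small sketch (ours, using the product convention stated above) that forms the strong product of directed 3-uniform hypergraphs and checks supermultiplicativity on the corner hypergraph of F_2; the brute-force search is only feasible for very small instances.

```python
# Minimal illustrative sketch (not from the paper): strong products of directed 3-uniform
# hypergraphs and brute-force independence numbers, on the corner hypergraph of F_2.
from itertools import combinations, product

def corner_hypergraph(q=2):
    """Vertices are pairs over Z_q; edges are the ordered corners ((x,y), (x+l,y), (x,y+l)), l != 0."""
    V = [(x, y) for x in range(q) for y in range(q)]
    E = {((x, y), ((x + l) % q, y), (x, (y + l) % q))
         for x in range(q) for y in range(q) for l in range(1, q)}
    return V, E

def strong_product(H1, H2):
    (V1, E1), (V2, E2) = H1, H2
    full1 = E1 | {(v, v, v) for v in V1}          # edge-or-constant triples
    full2 = E2 | {(v, v, v) for v in V2}
    E = {tuple(zip(e1, e2)) for e1 in full1 for e2 in full2 if e1 in E1 or e2 in E2}
    return list(product(V1, V2)), E

def independence_number(H):
    V, E = H
    for k in range(len(V), 0, -1):
        for S in combinations(V, k):
            S = set(S)
            if not any(all(v in S for v in e) for e in E):
                return k
    return 0

if __name__ == "__main__":
    H = corner_hypergraph(2)
    H2 = strong_product(H, H)
    a1, a2 = independence_number(H), independence_number(H2)   # H2 takes a few seconds
    print(a1, a2, a2 >= a1 ** 2)   # supermultiplicativity; a2 ** (1/2) lower-bounds the capacity
```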

Corner problem

Let G be a finite Abelian group. A corner in G × G is a three-element set of the form {(x, y), (x + λ, y), (x, y + λ)} for some x, y ∈ G and λ ∈ G with λ ≠ 0. The element (x, y) is called the center of this corner. The corner problem asks to determine, given G, the size of the largest subset of G × G such that no three of its elements form a corner.

Trivially, we have the upper bound |G|^2. The best-known general upper bound comes from [43, 44]. In the finite field setting, a better upper bound was obtained in [33].

We may phrase the corner problem as a hypergraph independence problem. We define the corner hypergraph of G to be the directed 3-uniform hypergraph whose vertices are the elements of G × G and whose edges are the corners. Then, by construction, the size of the largest corner-free subset of G × G equals the independence number of the corner hypergraph of G. Moreover, under the natural labeling of G^n × G^n by (G × G)^n, the corner hypergraph of G^n coincides with the n-th strong power of the corner hypergraph of G. Closely related to the corner problem is the minimum number of colors needed to color G × G so that no corner is monochromatic. Then the following holds: [[15, 34]] Let G be a finite Abelian group. There is a constant c such that, for every n, the deterministic NOF communication complexity of Eval over G^n equals the logarithm of this coloring number for G^n up to the additive constant c and lower-order terms.

For F_2^n, the best bound in the literature corresponds to communication 0.5n + O(log n) [34], which we will improve on.

Number on the forehead communication

The corner problem is closely related to the Number On the Forehead (NOF) communication model [15]. In this model, k players wish to evaluate a function F on a given input (x_1, ..., x_k). The input is distributed among the players in such a way that player i sees every x_j for j ≠ i. This scenario is visualized as x_i being written on the forehead of player i. The computational power of everyone is unlimited, but the number of exchanged bits has to be minimized. The deterministic NOF communication complexity of F is the minimum number of bits the players need to communicate to compute F in the NOF model with k players. Many questions that have been thoroughly analyzed for the two-player case remain open in the general case of three or more players, where lower bounds on communication complexity are much more difficult to prove. The difficulty in proving lower bounds arises from the overlap in the inputs known to different players.

One interesting function in this context is the family of Eval functions. The function Eval over a group G outputs 1 on inputs x_1, ..., x_k ∈ G if and only if x_1 + ... + x_k = 0 in G. The trivial protocol, in which one player announces the sum of all the inputs they see and another player answers, shows that the complexity is at most log_2 |G| + 1. For two players Yao [51] proved that this is essentially optimal (for nontrivial G). But for three players it is an open problem whether the complexity is linear in log |G|.

[[9]]

From Lemma 2.1 and Proposition 2.1 it follows that an upper bound on the Shannon capacity of the corner hypergraph strictly below the trivial one would imply a linear lower bound on the deterministic NOF complexity of Eval, and also that lower bounds on the size of corner-free sets give upper bounds on this communication complexity. For F_2^n, the best-known upper bound before our work is the one of [1], which we improve on.

Three-term arithmetic progressions and the cap set problem

A three-term arithmetic progression in a group G is a three-element set of the form {x, x + λ, x + 2λ} for some x ∈ G and λ ∈ G. Consider the size of the largest subset S ⊆ G such that no three elements in S form a three-term arithmetic progression.

Following [52, Corollary 3.24] there is a simple relation between corner-free sets and three-term-arithmetic-progression-free sets:

Proof.

Let S ⊆ G be a subset that is free of three-term arithmetic progressions. Define the subset A of G × G consisting of the pairs whose coordinates differ by an element of S. Then A is a corner-free set of size |G| · |S|. Indeed, if the three points (x, y), (x + λ, y), (x, y + λ) of a corner are elements of A, then the three differences y − x, y − x − λ and y − x + λ are in S, and these elements form a three-term arithmetic progression. ∎

A three-term-arithmetic-progression-free subset of F_3^n is also called a cap set. The notorious cap set problem is to determine how the maximum size of a cap set grows when n goes to infinity. A priori we only have the trivial upper bound 3^n. Using Fourier methods and the density increment argument of Roth, the upper bound O(3^n/n) was obtained by Meshulam [39], and improved only as late as 2012 to O(3^n/n^{1+ε}) for a positive constant ε by Bateman and Katz in [7]. Until recently it was not known whether the maximum size grows like 3^{n-o(n)} or like c^n for some c < 3. Gijswijt and Ellenberg solved this question in 2017, showing that cap sets have size at most O(2.756^n) [25]. The best lower bound is Ω(2.217^n), by Edel [24]. In particular, using Lemma 2.1, this implies a lower bound for the corner problem over F_3^n. We will improve this lower bound in Theorem 2.3.
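To illustrate the reduction from progression-free sets to corner-free sets, the sketch below uses the construction A = {(x, y) : y - x ∈ S}, which we assume is the one intended in the lemma; note that the argument needs 2λ ≠ 0 for λ ≠ 0, so we demonstrate it over Z_3^n (the cap set setting) rather than over Z_2^n.

```python
# Minimal illustrative sketch (not from the paper): from a 3-AP-free set S in G = Z_3^n
# build A = {(x, y) : y - x in S} in G x G and verify that A is corner-free of size |G|*|S|.
from itertools import product

def add(a, b, q=3):
    return tuple((x + y) % q for x, y in zip(a, b))

def has_3ap(S, q=3):
    S = set(S)
    return any(add(x, l) in S and add(x, add(l, l)) in S
               for x in S for l in product(range(q), repeat=len(x)) if any(l))

def corner_set(S, n, q=3):
    G = list(product(range(q), repeat=n))
    return {(x, add(x, s)) for x in G for s in S}       # pairs (x, y) with y - x in S

def is_corner_free(A, n, q=3):
    A = set(A)
    return not any((add(x, l), y) in A and (x, add(y, l)) in A
                   for (x, y) in A for l in product(range(q), repeat=n) if any(l))

if __name__ == "__main__":
    n = 2
    S = [(0, 0), (0, 1), (1, 0), (1, 1)]                # a maximum cap (3-AP-free set) in Z_3^2
    assert not has_3ap(S)
    A = corner_set(S, n)
    print(len(A), is_corner_free(A, n))                 # 36 = 9 * 4, True
```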

We may phrase the cap set problem as a hypergraph independence problem by defining the cap set hypergraph: the undirected 3-uniform hypergraph with the three vertices 0, 1, 2 and a single edge {0, 1, 2}. The independence number of its n-th power equals the maximum size of a cap set in F_3^n, and thus the Shannon capacity of the cap set hypergraph determines the rate of growth of the maximum cap set size.

2.2 Probabilistic method

We start off with a simple and general method for obtaining lower bounds on the Shannon capacity. For any element c ∈ G, the set {(x, y) ∈ G × G : x + y = c} is an independent set of the corner hypergraph of G, and therefore its Shannon capacity is at least |G|, which we think of as the trivial lower bound. By using a simple probabilistic argument (which does not use much of the structure of the corner hypergraph), we show the following nontrivial lower bound. For any finite Abelian group G, the Shannon capacity of the corner hypergraph of G is at least |G|^{3/2}.

Proof.

Consider the corner hypergraph of G^n: its vertices are the elements of G^n × G^n and its edges are the corners in G^n × G^n. Let 0 < p < 1 and choose a random subset A of the vertices by including every vertex in A independently with probability p. Let H′ be the directed subhypergraph induced by A. The expected size of A is p times the number of vertices. Any edge is of the form (x, y), (x + λ, y), (x, y + λ) for some x, y ∈ G^n and nonzero λ ∈ G^n. Since these three vertices are different and each belongs to A independently with probability p, the probability that the edge survives in H′ is p^3, so the expected number of edges of H′ is p^3 times the number of edges. On the other hand, for any hypergraph the independence number is at least the number of vertices minus the number of edges, since removing one vertex from every edge leaves an independent set. Therefore the corner hypergraph of G^n has an independent set of size at least p times the number of vertices minus p^3 times the number of edges. Optimizing over p and letting n tend to infinity, we find the claimed lower bound on the Shannon capacity. ∎
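The pruning step in this argument is easy to simulate. The sketch below (ours, illustrative only) keeps each vertex of a hypergraph with probability p and then deletes one vertex from every edge that survives, which always yields an independent set; when every edge has three distinct vertices, as is the case for corners, the expected size is at least p|V| - p^3|E|.

```python
# Minimal illustrative sketch (not from the paper): the "sample and prune" step behind
# the probabilistic lower bound, run on the corner hypergraph of Z_5.
import random

def sample_and_prune(V, E, p, seed=0):
    rng = random.Random(seed)
    A = {v for v in V if rng.random() < p}
    for e in E:
        if all(v in A for v in e):
            A.discard(e[0])                  # remove one vertex of every surviving edge
    return A                                  # no edge keeps all of its vertices in A

def is_independent(S, E):
    return not any(all(v in S for v in e) for e in E)

if __name__ == "__main__":
    q = 5
    V = [(x, y) for x in range(q) for y in range(q)]
    E = [((x, y), ((x + l) % q, y), (x, (y + l) % q))    # the corners of Z_5 x Z_5
         for x in range(q) for y in range(q) for l in range(1, q)]
    S = sample_and_prune(V, E, p=0.35)
    print(len(S), is_independent(S, E))
```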

The idea in the proof of Proposition 2.2 to apply the probabilistic method to lower bound the number of remaining elements after a “pruning” procedure (in this case, pruning vertices that induce edges) goes back to [21]. A similar probabilistic method construction is the driving component in the recent new upper bound on the matrix multiplication exponent [3].

In terms of the corner problem, the lower bound on the Shannon capacity in Proposition 2.2, applied to F_2, corresponds to the upper bound 0.5n + O(log n) on the NOF communication complexity of Eval over F_2^n (via Proposition 2.1). This upper bound is similar to the bound provided in [34, Corollary 26 in the ITCS version]. The proof of Proposition 2.2 directly extends from 2-dimensional corners to k-dimensional corners, which are sets of the form

(x_1, ..., x_k), (x_1 + λ, x_2, ..., x_k), ..., (x_1, ..., x_{k-1}, x_k + λ), with x_1, ..., x_k, λ ∈ G.

Just like the Eval problem on 3 players is closely related to 2-dimensional corners in G × G, the Eval function on k + 1 players is closely related to k-dimensional corners in G^k. By a similar argument as in the proof of Lemma 2.1, the (k + 1)-player NOF complexity of Eval is upper bounded in terms of the minimum number of colors needed to color G^k such that no k-dimensional corner is monochromatic. This coloring number is in turn controlled by the size of the largest k-dimensional corner-free set in G^k, similarly to Proposition 2.1; this relation is proved in [34].

which is proved in [34]. From a similar probabilistic method argument as in the proof of Proposition 2.2, choosing each independently at random with probability , we get

as a consequence one has , where is directed -uniform hypergraph that construct for the -dimensional corner. Furthermore from the lower bound of , we have

If we take (for instance), then , that is, we obtain a sublinear upper bound for in .

2.3 Combinatorial degeneration method

We now introduce the combinatorial degeneration method for lower bounding Shannon capacity. Combinatorial degeneration is an existing concept from algebraic complexity theory introduced by Strassen in [46, Section 6, in particular Theorem 6.1]. (The precise connection to [46] is as follows. Strassen defines the notion of M-degeneration of tensors. In our terminology, a tensor is an M-degeneration of another tensor if the support of the first is a combinatorial degeneration of the support of the second. The terminology “combinatorial degeneration”, which does not refer to tensors, but rather directly to their supports (hence the adjective “combinatorial”), was introduced in [14, Definition 15.29].) In that original setting it was used as part of the construction of fast matrix multiplication algorithms [14, Definition 15.29 and Lemma 15.31], and, in a broader setting, combinatorial degeneration was used to construct large induced matchings in [4, Lemma 3.9], [2, Lemma 5.1] and [18, Theorem 4.11]. However, we will be using it in a novel manner in order to construct independent sets instead of induced matchings. We will subsequently apply the combinatorial degeneration method to get new bounds for the corner problem. We expect the method to be useful in the study of other problems besides the corner problem as well. First we must define combinatorial degeneration. [Combinatorial degeneration] Let I_1, ..., I_k be finite sets. Let Ψ ⊆ Φ ⊆ I_1 × ... × I_k. We say that Ψ is a combinatorial degeneration of Φ, and write Ψ ⊴ Φ, if there are maps u_i : I_i → Z (for i = 1, ..., k) such that for every x ∈ Φ, if x ∈ Ψ, then u_1(x_1) + ... + u_k(x_k) = 0, and if x ∈ Φ \ Ψ, then u_1(x_1) + ... + u_k(x_k) > 0.

As a quick example of a combinatorial degeneration, one can take the same small pair of sets as in the introduction together with suitable maps u_i.

We apply combinatorial degeneration in the following fashion to get Shannon capacity lower bounds:

[Combinatorial degeneration method] Let H be a directed k-uniform hypergraph. Let S ⊆ V(H). Let Φ be the set of edges of H contained in S together with the diagonal elements (s, ..., s) for s ∈ S, let Ψ be the set of diagonal elements (s, ..., s) for s ∈ S, and suppose that Ψ ⊴ Φ. Then the Shannon capacity of H is at least |S|.

Proof.

Let u_1, ..., u_k be the maps given by the combinatorial degeneration Ψ ⊴ Φ. Let n be any multiple of |S|. Consider the vertices of H^⊠n given by strings in S^n in which the elements of S are uniformly distributed, so that every element of S appears exactly n/|S| times. Then, using that u_1(s) + ... + u_k(s) = 0 for every s ∈ S and the uniformity of such strings, we have

(2) for any k strings chosen in this way, the sum of the values u_i over all positions of all k strings equals 0.

For every coordinate, since the sum of the u_i values on any element of Φ is nonnegative, we have