1 Introduction
Hyperfinite and expander graph sequences are perhaps the two most fundamental concepts studied in the theory of sparse graph limits. Hyperfinite graph sequences were explicitly introduced in [MR2455943] (implicitly they are present in earlier works, e.g. [MR584516]). Expander graph sequences (frequently referred to informally as “expander graphs”) have been studied since at least the 1970s in many different branches of mathematics and computer science (see [MR2247919] for a survey with some historical information). Both notions (or their close relatives) are widely used in combinatorics, group theory, ergodic theory, and operator algebras.
In this article we study the analogues of hyperfinite and expander graph sequences in the context of oriented graphs, particularly directed acyclic graphs. We call these analogues “hypershallow” and “extender” graph sequences, respectively. Our main result (see Theorem 5 below) is a stochastic construction of graph sequences which are not hypershallow (we do not know of any deterministic construction of such graph sequences). As a side note, let us mention that the question of whether nonhypershallow graph sequences exist was partially motivated by the techniques presented in [afshani_et_al:LIPIcs:2019:10586] and in [MR584516] for obtaining conditional lower bounds in circuit complexity. We will discuss this in Section 4.
Let us now precisely define hypershallow graph sequences and state our main result.
Basic conventions
The set of natural numbers is . The cardinality of a set is denoted by . We use the shorthand for denoting a sequence . “Either… or…” is nonexclusive.
A graph is a pair where is a nonempty finite set, and is a subset which is disjoint from the diagonal. We say that is undirected if is a symmetric subset of . A path of length in a graph is a tuple , such that for we have either or .
A path is simple if for . It is a directed path if for all we have . A cycle is a path such that . We say that is a dag (which stands for directed acyclic graph) if it does not have directed cycles.
If , then we define , . If and are graphs such that and , then we say that is a subgraph of . Furthermore, for we let , and if is a sequence of graphs, then we say that has bounded indegree if for some and all we have .
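Since the displayed formulas above were lost in extraction, the following Python sketch records one standard reading of these conventions: a graph as a pair of a vertex set and an edge set of ordered pairs, the indegree of a vertex, and acyclicity (the dag condition). The function names and the representation are ours, not the authors'.

```python
from collections import defaultdict

def indegrees(vertices, edges):
    """Indegree of each vertex: the number of edges (u, v) entering v."""
    deg = {v: 0 for v in vertices}
    for (_, v) in edges:
        deg[v] += 1
    return deg

def is_dag(vertices, edges):
    """Check for directed cycles via iterative DFS with three colors."""
    succ = defaultdict(list)
    for (u, v) in edges:
        succ[u].append(v)
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on stack / finished
    color = {v: WHITE for v in vertices}
    for root in vertices:
        if color[root] != WHITE:
            continue
        color[root] = GRAY
        stack = [(root, iter(succ[root]))]
        while stack:
            node, it = stack[-1]
            advanced = False
            for nxt in it:
                if color[nxt] == GRAY:
                    return False  # back edge: a directed cycle
                if color[nxt] == WHITE:
                    color[nxt] = GRAY
                    stack.append((nxt, iter(succ[nxt])))
                    advanced = True
                    break
            if not advanced:
                color[node] = BLACK
                stack.pop()
    return True
```

With this representation, a sequence of graphs has bounded indegree precisely when the values of `indegrees` stay below a single constant along the whole sequence.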
We are now ready to define hypershallow graph sequences.
Definition 1.

Let be a graph and let be a proper subset. We define as the maximal such that there exists a directed simple path in disjoint from .

Let be a sequence of dags with bounded indegree. We say that is hypershallow if with and , such that we have .
Remark 2.
Let us take a moment to explicitly state the analogy between the definitions of hypershallow and hyperfinite graph sequences.

We first recall the definition of hyperfinite graph sequences. If is an undirected graph and , then we note that is the maximum of lengths of simple paths disjoint from .
We define a sequence of bounded degree undirected graphs to be hyperfinite if with and , such that we have .
This is easily seen to be equivalent to the definition of hyperfiniteness in [MR2455943].
From this point of view, and with our convention that undirected graphs form a subclass of all graphs, within the class of bounded degree undirected graphs the hypershallow sequences are exactly the same as hyperfinite sequences.

Let us explain the choice of the word “hypershallow”, again by analogy with the word “hyperfinite”. One of the simplest classes of undirected graph sequences consists of those sequences which have uniformly finite connected components, i.e. such that we have that the connected components of are of size at most . We recall that the expression “hyperfinite graph sequence” is meant to suggest that we are dealing with “the next simplest thing”: informally, a sequence is hyperfinite if it is possible to obtain from a sequence with uniformly finite connected components by removing an arbitrarily small proportion of vertices from .
The motivation to use the word “hypershallow” is similar. For a dag let denote the maximum of lengths of directed paths in . One of the simplest classes of bounded indegree dag sequences consists of the “uniformly shallow” sequences, i.e. such that we have . The name “hypershallow graph sequence” is meant to suggest that we are dealing with “the next simplest thing”: after removing a small proportion of vertices we get a sequence which is uniformly shallow.
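The displayed formulas in Definition 1 did not survive extraction; based on the informal description above, the quantifier structure of hypershallowness is presumably of the following shape. This is a hedged reconstruction from the surrounding prose, and the symbols $\varepsilon$, $D$, $S_n$ are our placeholders rather than the authors' notation.

```latex
% Hedged reconstruction of the quantifier structure of Definition 1:
% a bounded-indegree sequence of dags (G_n)_n, G_n = (V_n, E_n),
% is hypershallow if
\forall \varepsilon > 0 \;\;
\exists D \in \mathbb{N} \;\;
\exists (S_n)_n, \; S_n \subseteq V_n, \; |S_n| \le \varepsilon\,|V_n|,
\quad \text{such that} \quad
\operatorname{depth}_{S_n}(G_n) \le D \text{ for all } n,
```

where $\operatorname{depth}_S(G)$ denotes the maximal length of a directed simple path disjoint from $S$, as in Definition 1. This mirrors the hyperfinite case, with “components of size at most $K$” replaced by “directed paths of length at most $D$”.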
The following definition allows us, informally speaking, to capture “how badly” a sequence of graphs fails at being hypershallow.
Definition 3.

Let be a dag, let . We say that is an extender if for every with we have .

Let be a bounded indegree sequence of dags and let be a sequence of positive real numbers with . We say that is a extender sequence if and , , is an extender.
Remark 4.
It is easy to check that a bounded indegree sequence of dags is not hypershallow if and only if it contains a subsequence which is a extender for some with .
We are now ready to state our main theorem.
Theorem 5.
There exists a sequence of bounded degree dags which is an extender, with .
Our proof of this theorem is probabilistic. The most important part of the proof consists of studying the random graphs which will be introduced in Section 3. We do not know of a nonprobabilistic way of constructing a bounded indegree nonhypershallow sequence of dags.
On the other hand, we can ask how fast the sequence can grow, provided that there exists a extender sequence. In this direction we have the following result.
Theorem 6.
Let be a sequence of numbers in such that . If is a sequence of bounded indegree dags, then is not an extender sequence.
Remark 7.
Theorem 6 implies, for example, that there are no extender sequences. However, we do not know whether there exists an extender sequence for every . This is an interesting question also because of the following “linguistic” reason.
It seems to the authors that the phrase extender sequence should ultimately be reserved only for those extender sequences where grows “as fast as possible” (similarly to the fact that not every nonhyperfinite sequence of graphs is an expander sequence). Alas, we do not know any more about the restrictions on the growth of than what is stated in Theorem 6. Until we get to know more about it, we only use the phrase “extender sequence”.
In Section 2 we list some standard definitions and conventions, and we state and prove a variant of Pinsker’s inequality which involves the Shannon entropy (Proposition 12). This is the most important external result in our analysis of the random graphs .
2 Preliminaries
We use the following conventions. If , then . If is a set, then is the set of all functions from to . This leads to the following notational clash: for , the symbol can either denote a number (and hence a set of numbers) or the set of all functions from to . We believe that resolving this ambiguity will never cause any difficulty for the reader.
If is a set, then denotes the power set of , i.e. the set of all subsets of .
2.1 Dags
Definition 8.
Let be a graph, and let .

, ,

, ,

,

, ,

, , ,

we write when there exists a directed path from to in and we write if or .
Definition 9.
Let be a sequence of graphs. We say that has, respectively, bounded degree, bounded indegree, or bounded outdegree, if, respectively, , , or .
Definition 10.

If
is a probability measure on
, then we also use the symbol for the function which sends to (so in particular we can write instead of ), and we let where by convention .

A random variable on a standard probability space with values in a standard Borel space is a Borel function . The law of is the pushforward measure on , i.e. for we let .

If is an valued random variable and is its law, then we define .

If and are random variables with values in a standard Borel space , then we define a new random variable with values in by, informally speaking, choosing between and with probability .
Formally, suppose that and are defined on and , respectively. The probability space on which is defined is , where is the unique measure on such that when and when . We let for and for .

For we let .
Lemma 11.
If and are random variables with values in the same space , with laws and respectively, then the law of is .
Proof.
Follows directly from the definitions. ∎
The main point of the following proposition is contained in its second item. Informally, it allows us to say the following: if and are valued random variables with laws and respectively, and is chosen according to the law , then either it is roughly as probable that as it is that , or the entropy of is substantially larger than the average of the entropies of and .
Proposition 12.
Let and be valued random variables with laws and , respectively.

We have .

We have
(1)
Proof.
By the previous lemma we have that the law of is . As such the first item follows from Jensen’s inequality.
The second item is a simple corollary of Pinsker’s inequality (see e.g. [MR2319879, Theorem 2.16] for the statement and proof of Pinsker’s inequality). To derive it, we start by stating the following two special cases of Pinsker’s inequality:
and
where
and similarly for . By convention we set in the definitions of and .
Noting that , summing the two inequalities above gives
A direct computation shows that , so together with the triangle inequality we deduce that
(2) 
3 Existence of nonhypershallow sequences
In this section we will describe a probabilistic construction of nonhypershallow sequences of dags. In fact, they will be extender sequences for .
We will construct a sequence of random graphs which asymptotically almost surely forms, after small modifications, an extender sequence. The graphs will be essentially defined as follows. The vertices are and for every , we add an edge independently with probability proportional to . In order to simplify the proof, we will slightly change the probabilities when we define in Subsection 3.2.
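The exact edge probabilities were lost with the displayed formulas, so the following sketch must be read with that caveat: it assumes the vertex set is $\{0,\dots,n-1\}$ and that, for $i<j$, the edge $(i,j)$ is present independently with probability $\min(1, c/(j-i))$. The decay rate $1/(j-i)$ and the constant $c$ are our stand-ins for the missing expression “proportional to …”, not the paper's actual formula.

```python
import random

def random_long_range_dag(n, c=1.0, seed=None):
    """Sample a random dag on vertices 0..n-1.

    Each pair i < j gets the edge (i, j) independently with
    probability min(1, c / (j - i)) -- a guessed stand-in for
    the lost formula "proportional to ..." in the text.
    """
    rng = random.Random(seed)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < min(1.0, c / (j - i)):
                edges.append((i, j))
    return list(range(n)), edges
```

Since every edge points from a smaller to a larger index, any graph sampled this way is automatically acyclic; the issues of bounded degree and of the precise probabilities are exactly what the modifications in Subsection 3.2 address.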
We start with the definition and discussion of depth functions in Subsection 3.1, as they provide a convenient way of characterising the property of being an extender, which will be crucial in the analysis of the random graphs in Subsection 3.3.
3.1 Depth functions
Given a graph and , we can associate to it a function which “measures the maximal distance to ”. More precisely we define by setting to be the maximal for which there exists a directed simple path with , , and when . Let us start by abstracting some properties of into the notion of a depth function as follows.
Definition 13.
Let be a graph.

A depth function for is a function such that the following conditions hold:

For every we have either or

For every such that there exists such that and .


Let and let . An depth function for is a depth function for such that for all we have and .
We will also make use of the following functions.
Definition 14.
For we define by setting to be equal to the maximal length of a directed simple path in the graph which connects to a vertex in .
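For dags this function can be computed by dynamic programming over a topological order. The sketch below assumes one plausible reading of the lost formulas: the value at a vertex $v$ is the length of the longest directed path in the graph with the set $S$ removed that ends at $v$, with value $0$ on $S$ itself. Both the reading and the helper name are our assumptions.

```python
from collections import defaultdict

def depth_function(vertices, edges, removed):
    """Longest directed path ending at each vertex, avoiding `removed`.

    Assumes the input graph is a dag.  Vertices in `removed` keep the
    value 0, since all edges incident to them are skipped -- one
    plausible reading of the lost displayed formulas.
    """
    removed = set(removed)
    succ = defaultdict(list)
    indeg = {v: 0 for v in vertices}
    for (u, v) in edges:
        if u in removed or v in removed:
            continue
        succ[u].append(v)
        indeg[v] += 1
    # Kahn's algorithm: relax edges in topological order.
    f = {v: 0 for v in vertices}
    queue = [v for v in vertices if indeg[v] == 0]
    while queue:
        u = queue.pop()
        for w in succ[u]:
            f[w] = max(f[w], f[u] + 1)
            indeg[w] -= 1
            if indeg[w] == 0:
                queue.append(w)
    return f
```

On a directed path $0 \to 1 \to 2 \to 3$ with nothing removed, the values are $0,1,2,3$; removing the vertex $1$ resets the count, illustrating how deleting a small set can collapse the depth.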
The following lemma is straightforward to verify directly from the definitions.
Lemma 15.
Let be a graph and let . Then is a
depth function for . ∎
On the other hand, it is not true that for every we have that is a depth function for . Condition b) of Definition 13.1 is fulfilled for every , but Condition a) does not have to be fulfilled. However, the next lemma shows in particular that for every depth function we can find such that .
Lemma 16.
Let be a graph and be a depth function for . Then there exists such that the following properties hold true.

,

, and

.
Proof.
By Condition b) of Definition 13.1, there exists a function such that for every we have . We let . It is straightforward to check that has the desired properties. ∎
Corollary 17.
If is an depth function, then .
Proof.
By the previous lemma we can find such that and . Clearly we have
and it is evident from the definition of that is equal to the maximal value of , which is bounded by since . ∎
Let us finish this subsection with the characterisation of extender graphs which we will use to establish the existence of extender sequences.
Lemma 18.
Let and let be a dag. Then is an extender if and only if there is no depth function for .
Proof.
Let us first assume that is an extender, so if with , then . Let and suppose that is an depth function for . By the previous corollary we have , and therefore . But by the definition of being an depth function we have , which shows .
In the other direction, let us assume that there are no depth functions, and let us show that is an extender. Let be a set with . Then by Lemma 15 we have that is an depth function, so we have . This finishes the proof. ∎
We finish by restating the above lemma for the case of graph sequences.
Corollary 19.
Let be a bounded indegree sequence of dags and let be a sequence of positive real numbers with . The following conditions are equivalent.

The sequence is a extender sequence

There exists , such that for all we have that does not admit a compatible depth function.
∎
3.2 Definition and basic properties of the random graphs
In this paper a random graph is a pair where is a nonempty finite set and is a random variable with values in such that is disjoint from the diagonal in almost surely.
For let . For we write if , , where and . We also let .
We start by defining a random variable with values in , as follows. We first choose uniformly at random, then we choose uniformly at random, and we choose uniformly at random in
The law of will be denoted by .
Now for we define a random graph as follows: we let , and the random variable with values in is defined by choosing elements of independently at random according to the law . This finishes the definition of .
Let us note that is typically neither a dag nor of bounded degree, but the following lemma implies that, with high probability, becomes a bounded degree dag after removing a small number of vertices.
Lemma 20.
Let , , and let . We have
(3) 
and
(4) 
Proof.
Note that we have the following alternative description of the law of choosing a random edge in : we choose uniformly at random, then we choose uniformly at random and we choose the edge . Therefore, if we fix , then
where the last inequality is obtained by writing
Similarly for a fixed we have
and hence
Now by linearity of expectation we have
and the righthand side is bounded from above by . Thus, by Markov’s inequality we have
which finishes the proof of (3).
In order to prove (4), we start by bounding from above. By the description of above, the only way in which might take a value with is when we start by choosing such that . As such we have
which is bounded from above by
Therefore, we have
and Markov’s inequality again gives us the desired bound. ∎
3.3 Construction of an extender sequence from
The key lemma which we need is the following.
Lemma 21.
Let , let , let , let , and let . We have
(5) 
The intuition behind the proof of Lemma 21 is the following. If the distribution of , , …, is very close to the distribution of , , …, , then for a random edge between the two vertex sets, increases or decreases with approximately the same probability. But if the two distributions are not very close, then the entropy of the distribution of the union , , …, is larger than the average of the two entropies. As the entropy is bounded from above by , this latter case must rarely happen. This intuition is formalised by Proposition 12 above.
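The mechanism described above can be checked numerically: for the equal-weight mixture of two laws, concavity of Shannon entropy gives $H(\tfrac12\mu+\tfrac12\nu) \ge \tfrac12 H(\mu)+\tfrac12 H(\nu)$, and the surplus is strictly positive unless $\mu=\nu$, growing with the distance between the two laws (this is the Pinsker-type control referred to above). The helper below is our illustration of this effect, not the paper's Proposition 12.

```python
import math

def entropy(p):
    """Shannon entropy in nats, with the convention 0 * log 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

def mixture_surplus(mu, nu):
    """H((mu + nu)/2) - (H(mu) + H(nu))/2: nonnegative by concavity."""
    mix = [(a + b) / 2 for a, b in zip(mu, nu)]
    return entropy(mix) - (entropy(mu) + entropy(nu)) / 2

# Two far-apart laws on two points give a large entropy surplus.
mu = [0.9, 0.1]
nu = [0.1, 0.9]
tv = sum(abs(a - b) for a, b in zip(mu, nu)) / 2  # total variation
print(mixture_surplus(mu, nu), tv)
```

Since the entropy of a law on a fixed finite set is bounded above, a large surplus can occur only rarely along the process, which is exactly how the dichotomy in the intuition above is exploited.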
Proof of Lemma 21.
For , , let denote the restriction of to , and let denote the restriction of to .
Note that As such, by the first item of Proposition 12, for all we have
On the other hand, we have , where is chosen uniformly at random from . Hence
and so
Now Markov’s inequality shows that
By the second item of Proposition 12, if for some we have , then . Thus by definition of , we have
which finishes the proof. ∎
Proposition 22.
Let , , let , let , and let ,
(6) 
where for we set .
Proof.
Clearly it is enough to show that
(7) 
Since each depth function compatible with a given graph is of the form for some with , we have that (7) is bounded above by
(8)  
Given , let be defined by
and let . Furthermore if is a graph and is such that is an depth function, then let us say that is an depth set for .
Recall that the law of is the pushforward of through the map . As such, we deduce that (8) is bounded above by
(9)  
Let us first estimate the number of summands in (9). Recall that for and we have (see e.g. [galvin2014tutorial, Theorem 3.1]). Since and , we see that the number of summands in (9) is at most .

We are now ready to prove Theorem 5. Clearly it follows from the following theorem.
Theorem 23.
Let . Then there exists a bounded degree sequence of dags which is an extender sequence.