The Shannon capacity of a graph $G$ is [Sha56]
$$\Theta(G)=\lim_{k\to\infty}\sqrt[k]{\alpha(G^{\boxtimes k})},$$
where $\alpha$ denotes the independence number and $G^{\boxtimes k}$ is the $k$th strong power (see Section 2 for definitions). In the context of information theory, the optimal rate of zero-error communication over a noisy classical channel is equal to the Shannon capacity of its confusability graph.
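For small graphs these quantities can be computed directly. The following sketch is our own illustration (the helper names and the graph representation, a vertex list together with a set of frozenset edges, are ad hoc); it recovers the classical values $\alpha(C_5)=2$ and $\alpha(C_5\boxtimes C_5)=5$, hence the lower bound $\Theta(C_5)\ge\sqrt{5}$.

```python
from itertools import combinations

def strong_product(g1, g2):
    """Strong product: distinct vertex pairs are adjacent iff every
    coordinate is equal or adjacent in the respective factor."""
    (v1, e1), (v2, e2) = g1, g2
    verts = [(a, b) for a in v1 for b in v2]
    def near(x, y, e):
        return x == y or frozenset((x, y)) in e
    edges = {frozenset((u, v)) for u, v in combinations(verts, 2)
             if near(u[0], v[0], e1) and near(u[1], v[1], e2)}
    return verts, edges

def independence_number(g):
    """Brute-force maximum independent set, adequate for small graphs."""
    verts, edges = g
    adj = {v: set() for v in verts}
    for e in edges:
        u, v = tuple(e)
        adj[u].add(v)
        adj[v].add(u)
    def mis(avail):
        if not avail:
            return 0
        v = next(iter(avail))
        # branch: either exclude v, or include v and discard its neighbours
        return max(mis(avail - {v}), 1 + mis(avail - {v} - adj[v]))
    return mis(frozenset(verts))

# C5, the 5-cycle: alpha(C5) = 2 while alpha(C5 boxtimes C5) = 5,
# so Theta(C5) >= sqrt(5); Lovász [Lov79] proved this is an equality.
c5 = (list(range(5)), {frozenset((i, (i + 1) % 5)) for i in range(5)})
print(independence_number(c5))                      # 2
print(independence_number(strong_product(c5, c5)))  # 5
```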
In [Zui18] Zuiddam introduced the asymptotic spectrum of graphs as follows. Let $\mathcal{G}$ denote the set of isomorphism classes of finite undirected simple graphs. The asymptotic spectrum of graphs $\Delta(\mathcal{G})$ is the set of functions $f:\mathcal{G}\to\mathbb{R}_{\ge 0}$ which satisfy, for all $G,H\in\mathcal{G}$:
$f(G\sqcup H)=f(G)+f(H)$ (additive under disjoint union)
$f(G\boxtimes H)=f(G)f(H)$ (multiplicative under the strong product)
if there is a homomorphism $\overline{G}\to\overline{H}$ between the complements then $f(G)\le f(H)$
$f(K_1)=1$ (normalization)
Elements of $\Delta(\mathcal{G})$ are also called spectral points. Using the theory of asymptotic spectra, developed by Strassen in [Str88], he found the following characterization of the Shannon capacity:
$$\Theta(G)=\min_{f\in\Delta(\mathcal{G})}f(G).$$
A number of well-studied graph parameters turn out to be spectral points: the Lovász theta number $\vartheta$ [Lov79], the fractional clique cover number $\overline{\chi}_f$, the complement of the projective rank [CMR14], and the fractional Haemers bound over any field [Hae78, Bla13, BC18]. The latter gives rise to an infinite family of distinct points. The fractional clique cover number $\overline{\chi}_f$ is also the maximum of the spectral points. In fact, both this and eq. 2 remain true if we allow optimization over the larger set of functions subject only to properties 4, 3 and 2 [Fri17, Example 8.1].
In [CK81] Csiszár and Körner introduced a refinement of the Shannon capacity, imposing that the independent set consist of sequences in which every vertex of $G$ occurs with the same frequency, in the limit approaching a prescribed probability distribution $P$ on the vertex set $V(G)$. Their definition is equivalent to
$$\Theta(G,P)=\lim_{\epsilon\to 0}\lim_{n\to\infty}\sqrt[n]{\alpha\big(G^{\boxtimes n}[\mathrm{T}^n_{B_\epsilon(P)}]\big)},$$
where $\mathrm{T}^n_{B_\epsilon(P)}\subseteq V(G)^n$ is the set of those sequences whose type (empirical distribution) is $\epsilon$-close to $P$ and $G^{\boxtimes n}[\mathrm{T}^n_{B_\epsilon(P)}]$ is the subgraph induced by this subset. Some properties are more conveniently expressed in terms of $C(G,P)=\log\Theta(G,P)$, which is also called the Shannon capacity.
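As a toy illustration of these induced subgraphs (our own example, not from the paper; all helper names are ad hoc): take $G=C_5$, $n=4$ and the $4$-type $P=(\tfrac12,\tfrac14,\tfrac14,0,0)$. Its type class consists of the $4!/2!=12$ arrangements of the multiset $\{0,0,1,2\}$, and the independence number of the induced subgraph of $C_5^{\boxtimes 4}$ can be found by brute force.

```python
from itertools import permutations

def adjacent(a, b):
    """Adjacency in the 5-cycle C5 on vertices 0..4."""
    return a != b and abs(a - b) % 5 in (1, 4)

def confusable(x, y):
    """Adjacency in the strong power: every coordinate equal or adjacent."""
    return x != y and all(a == b or adjacent(a, b) for a, b in zip(x, y))

def independence_number(vertices):
    """Brute-force maximum independent set (fine for a dozen vertices)."""
    best = 0
    def extend(chosen, rest):
        nonlocal best
        best = max(best, len(chosen))
        for i, v in enumerate(rest):
            if all(not confusable(v, u) for u in chosen):
                extend(chosen + [v], rest[i + 1:])
    extend([], list(vertices))
    return best

# Type class of the 4-type P = (1/2, 1/4, 1/4, 0, 0) on V(C5):
# all arrangements of the multiset {0, 0, 1, 2}.
type_class = sorted(set(permutations((0, 0, 1, 2))))
print(len(type_class))                  # 12
print(independence_number(type_class))  # 4
```

Here two sequences are distinguishable exactly when some coordinate carries the non-adjacent pair $\{0,2\}$, which forces the position of the symbol $2$ to differ across an independent set; this is why the answer is $4$ rather than $12$.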
In information theory, the independent sets in a type class are constant composition codes for zero-error communication, while similar notions in graph theory are sometimes called probabilistic refinements or “within a type” versions. To avoid proliferation of notations, we adopt the convention that graph parameters and their probabilistic refinements (defined using strong products) are denoted with the same symbol, even if alternative notation is in use elsewhere.
The aim of this paper is to gain a better understanding of $\Theta(G,P)$ by studying the probabilistic refinements of spectral points, focusing on those properties which follow from properties 4, 3, 2 and 1 and are thus shared by all of them. Some of these properties were already known to hold for specific spectral points.
Before stating the main results we introduce some terminology. A probabilistic graph is a nonempty graph $G$ together with a probability measure $P$ on its vertex set (notation: $(G,P)$). Two probabilistic graphs are isomorphic if there is an isomorphism between the underlying graphs that is measure preserving. Let $\mathcal{G}_{\mathrm{prob}}$ denote the set of isomorphism classes of probabilistic graphs.
Let $f\in\Delta(\mathcal{G})$. Then for any probabilistic graph $(G,P)$ the limit
$$F(G,P)=\lim_{n\to\infty}\frac{1}{n}\log f\big(G^{\boxtimes n}[\mathrm{T}^n_{P_n}]\big)$$
exists, where $(P_n)_{n\in\mathbb{N}}$ is any sequence of types with $P_n\in\mathcal{P}_n(V(G))$ and $P_n\to P$. Consider $F$ as a function $F:\mathcal{G}_{\mathrm{prob}}\to\mathbb{R}_{\ge 0}$. It satisfies the following properties:
for any graph $G$ the map $P\mapsto F(G,P)$ is concave
if $G$ and $H$ are graphs and $P\in\mathcal{P}(V(G)\times V(H))$ then
$$F(G,P_1)+F(H,P_2)-I(P)\le F(G\boxtimes H,P)\le F(G,P_1)+F(H,P_2),$$
where $P_1$, $P_2$ denote the marginals of $P$ on $V(G)$ and $V(H)$, $I(P)=H(P_1)+H(P_2)-H(P)$ is the mutual information with $H$ the Shannon entropy
if $G$ and $H$ are graphs, $P\in\mathcal{P}(V(G))$, $Q\in\mathcal{P}(V(H))$, and $\lambda\in[0,1]$ then
$$F\big(G\sqcup H,\lambda P\oplus(1-\lambda)Q\big)=\lambda F(G,P)+(1-\lambda)F(H,Q)+h(\lambda)$$
if $\varphi:\overline{G}\to\overline{H}$ is a homomorphism and $P\in\mathcal{P}(V(G))$ then $F(G,P)\le F(H,\varphi_*P)$
and $f$ can be recovered as $f(G)=\max_{P\in\mathcal{P}(V(G))}2^{F(G,P)}$.
Unsurprisingly, it turns out that the following counterpart of eq. 2 for probabilistic graphs is true:
We prove the following converse to Theorem 1.1.
Theorems 1.2 and 1.1 set up a bijection between $\Delta(\mathcal{G})$ and the set of functions $F:\mathcal{G}_{\mathrm{prob}}\to\mathbb{R}_{\ge 0}$ satisfying properties 4, 3, 2 and 1. The inequalities defining the latter are affine, therefore it is a convex subset of the space of all functions on $\mathcal{G}_{\mathrm{prob}}$. Translating back to functions on $\mathcal{G}$, it follows that e.g. the graph parameter
$$G\mapsto\max_{P\in\mathcal{P}(V(G))}2^{\lambda F_1(G,P)+(1-\lambda)F_2(G,P)}$$
(where $F_1,F_2$ are the probabilistic refinements of spectral points $f_1,f_2$ and $\lambda\in[0,1]$) is an element of $\Delta(\mathcal{G})$. Moreover, for any fixed graph $G$ the function $F\mapsto\max_{P\in\mathcal{P}(V(G))}F(G,P)$ is the maximum of affine functions, therefore it is convex. This allows us to find examples of graphs where a combined function like in eq. 9 gives a strictly better bound than the two spectral points involved.
In addition, we prove analogues of some of the properties that were previously known for specific spectral points. These include subadditivity with respect to the intersection; the value on the join of two graphs; and a characterization of multiplicativity under the lexicographic product. We introduce for each spectral point a complementary function and find a characterization of the Witsenhausen rate [Wit76] and of the complementary graph entropy [KL73].
The probabilistic refinement of the fractional clique cover number $\overline{\chi}_f$ (also known as the graph entropy of the complement) is the entropy with respect to the vertex packing polytope [CKL90]. Similarly, the probabilistic refinement of the Lovász number $\vartheta$ is also the entropy with respect to a convex corner [Mar93], called the theta body [GLS86]. We show that this property is shared by every spectral point, and give another characterization of the probabilistic refinements as the entropy functions associated with certain convex corner-valued functions on $\mathcal{G}$.
1.2 Organization of this paper
In Section 2 we collect basic definitions and facts from graph theory and information theory, in particular those which are central to the method of types. Section 3 contains the proof of Theorems 1.2 and 1.1. In Section 4 we discuss a number of properties that have been known for specific spectral points and are true for all (or at least a large subset) of them. These include subadditivity under intersection of graphs with common vertex set and the behaviour under graph join and lexicographic product. We also put some notions related to graph entropy and complementary graph entropy into our more general context. In Section 5 we connect our results to the theory of convex corners.
Every graph in this paper is assumed to be a finite simple undirected graph. The vertex set of a graph $G$ is $V(G)$ and its edge set is $E(G)$. The complement $\overline{G}$ is the graph with the same vertex set and edge set $\binom{V(G)}{2}\setminus E(G)$. Given a graph $G$ and a subset $S\subseteq V(G)$ the induced subgraph $G[S]$ is the graph with vertex set $S$ and edge set $\{e\in E(G)\mid e\subseteq S\}$. We write $H\le G$ when $V(H)=V(G)$ and $E(H)\subseteq E(G)$. This is a partial order and the complement operation is order reversing. The complete graph on a set $V$ is $K_V$; for $V=\{1,2,\ldots,n\}$ the notation is simplified to $K_n$.
For graphs $G$ and $H$ the disjoint union $G\sqcup H$ has vertex set $V(G)\sqcup V(H)$ and edge set $E(G)\sqcup E(H)$. The strong product $G\boxtimes H$ has vertex set $V(G)\times V(H)$ and $\{(g_1,h_1),(g_2,h_2)\}\in E(G\boxtimes H)$ iff ($g_1=g_2$ or $\{g_1,g_2\}\in E(G)$) and ($h_1=h_2$ or $\{h_1,h_2\}\in E(H)$) but $(g_1,h_1)\ne(g_2,h_2)$. The join and the costrong product are
$$G+H=\overline{\overline{G}\sqcup\overline{H}},\qquad G\ast H=\overline{\overline{G}\boxtimes\overline{H}}.$$
We use the notation $G^{\boxtimes k}=G\boxtimes\cdots\boxtimes G$ ($k$ operands), and similarly for other associative binary operations. The lexicographic product $G\circ H$ has vertex set $V(G)\times V(H)$ and $\{(g_1,h_1),(g_2,h_2)\}\in E(G\circ H)$ iff $\{g_1,g_2\}\in E(G)$ or ($g_1=g_2$ and $\{h_1,h_2\}\in E(H)$). The lexicographic product satisfies $\overline{G\circ H}=\overline{G}\circ\overline{H}$ and the three types of products are ordered as $G\boxtimes H\le G\circ H\le G\ast H$.
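The identity between complements and lexicographic products is easy to check mechanically on small examples. The snippet below is our own illustration (graph representation and helper names are ad hoc); it verifies $\overline{G\circ H}=\overline{G}\circ\overline{H}$ and the edge-set ordering between the strong and lexicographic products on two small graphs.

```python
from itertools import combinations

def complement(g):
    verts, edges = g
    return verts, {frozenset(p) for p in combinations(verts, 2)} - edges

def strong(g1, g2):
    (v1, e1), (v2, e2) = g1, g2
    verts = [(a, b) for a in v1 for b in v2]
    def near(x, y, e):
        return x == y or frozenset((x, y)) in e
    return verts, {frozenset((u, v)) for u, v in combinations(verts, 2)
                   if near(u[0], v[0], e1) and near(u[1], v[1], e2)}

def lexicographic(g1, g2):
    (v1, e1), (v2, e2) = g1, g2
    verts = [(a, b) for a in v1 for b in v2]
    return verts, {frozenset((u, v)) for u, v in combinations(verts, 2)
                   if frozenset((u[0], v[0])) in e1
                   or (u[0] == v[0] and frozenset((u[1], v[1])) in e2)}

# Small test graphs: a path on 3 vertices and an edge plus an isolated vertex.
g = ([0, 1, 2], {frozenset((0, 1)), frozenset((1, 2))})
h = (['a', 'b', 'c'], {frozenset(('a', 'b'))})

lhs = complement(lexicographic(g, h))[1]
rhs = lexicographic(complement(g), complement(h))[1]
print(lhs == rhs)                                 # True
print(strong(g, h)[1] <= lexicographic(g, h)[1])  # True: E(G x H) subset E(G o H)
```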
A graph homomorphism $\varphi:G\to H$ is a function $\varphi:V(G)\to V(H)$ such that $\{\varphi(g_1),\varphi(g_2)\}\in E(H)$ for all $\{g_1,g_2\}\in E(G)$. An isomorphism is a homomorphism which is a bijection between the vertex sets and whose inverse is also a homomorphism. $G$ and $H$ are isomorphic if there is an isomorphism between them. The set of isomorphism classes of graphs is denoted by $\mathcal{G}$. The set of isomorphisms $G\to H$ is denoted $\mathrm{Iso}(G,H)$. We write $G\preccurlyeq H$ if there is a homomorphism $\overline{G}\to\overline{H}$. In particular, $G[S]\preccurlyeq G$ for any $S\subseteq V(G)$, because the inclusion of an induced subgraph is a homomorphism and passing to induced subgraphs commutes with complementation.
A probability distribution on a finite set $X$ is a function $P:X\to[0,1]$ satisfying $\sum_{x\in X}P(x)=1$. The support of $P$ is $\operatorname{supp}P=\{x\in X\mid P(x)>0\}$. For $n\in\mathbb{N}$, $P$ is said to be an $n$-type if $nP(x)\in\mathbb{Z}$ for all $x\in X$. The set of probability distributions on $X$ will be denoted by $\mathcal{P}(X)$, the set of $n$-types by $\mathcal{P}_n(X)$, and the set of all types by
$$\bigcup_{n=1}^{\infty}\mathcal{P}_n(X).$$
The latter is a dense subset of $\mathcal{P}(X)$, equipped with the subspace topology from the Euclidean space $\mathbb{R}^X$. $B_\epsilon(P)$ denotes the open $\epsilon$-ball in $\mathcal{P}(X)$ centered at $P$ with respect to the total variation distance.
For an $n$-type $P$ the type class $\mathrm{T}^n_P$ is the set of strings in $X^n$ in which $x$ occurs exactly $nP(x)$ times for all $x\in X$. More generally, for a subset $A\subseteq\mathcal{P}(X)$ we define
$$\mathrm{T}^n_A=\bigcup_{P\in A\cap\mathcal{P}_n(X)}\mathrm{T}^n_P.$$
The number of type classes satisfies $|\mathcal{P}_n(X)|\le(n+1)^{|X|}$ [CK11, Lemma 2.2].
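The polynomial bound on the number of types can be checked directly by enumeration. The snippet below is our own illustration (the generator name is ad hoc); it lists all $n$-types on a $k$-letter alphabet as integer count vectors.

```python
def n_types(n, k):
    """Generate all n-types on a k-letter alphabet as integer count vectors
    (compositions of n into k nonnegative parts)."""
    if k == 1:
        yield (n,)
        return
    for first in range(n + 1):
        for rest in n_types(n - first, k - 1):
            yield (first,) + rest

n, k = 6, 3
types = list(n_types(n, k))
print(len(types))                  # 28 = C(8, 2) compositions of 6 into 3 parts
print(len(types) <= (n + 1) ** k)  # True: the polynomial bound (n+1)^|X|
```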
The (Shannon) entropy of a probability distribution $P\in\mathcal{P}(X)$ is
$$H(P)=-\sum_{x\in X}P(x)\log P(x),$$
where $\log$ is to base $2$ and by convention $0\log 0=0$ (justified by continuous extension). A special case is the entropy of a Bernoulli distribution, $h(p)=H\big((p,1-p)\big)$. When $P\in\mathcal{P}_n(X)$ we have the cardinality estimates [CK11, Lemma 2.3]
$$(n+1)^{-|X|}2^{nH(P)}\le|\mathrm{T}^n_P|\le 2^{nH(P)}.$$
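These cardinality estimates can be verified numerically for a concrete type; the helper names below are our own. The type class size is a multinomial coefficient, compared against $2^{nH(P)}$ and its polynomially damped lower bound.

```python
from math import comb, log2

def entropy_of_counts(counts):
    """Shannon entropy (bits) of the distribution counts / n."""
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c)

def type_class_size(counts):
    """|T_P^n|: the multinomial coefficient for the given counts."""
    n, size = sum(counts), 1
    for c in counts:
        size *= comb(n, c)
        n -= c
    return size

counts = (5, 3, 2)  # the 10-type (0.5, 0.3, 0.2) on a 3-letter alphabet
n, k = sum(counts), len(counts)
size = type_class_size(counts)
h = entropy_of_counts(counts)
print(size)  # 2520
print((n + 1) ** (-k) * 2 ** (n * h) <= size <= 2 ** (n * h))  # True
```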
The relative entropy between two Bernoulli distributions is
$$d(p\|q)=p\log\frac{p}{q}+(1-p)\log\frac{1-p}{1-q},$$
which satisfies $d(p\|q)\ge\frac{2}{\ln 2}(p-q)^2$ (Pinsker's inequality).
When $X$ and $Y$ are finite sets and $P\in\mathcal{P}(X\times Y)$, the distributions $P_X$ and $P_Y$ given by
$$P_X(x)=\sum_{y\in Y}P(x,y),\qquad P_Y(y)=\sum_{x\in X}P(x,y)$$
are called the marginals of $P$. The mutual information is $I(P)=H(P_X)+H(P_Y)-H(P)$. For $P\in\mathcal{P}(X)$ and $Q\in\mathcal{P}(Y)$, $P\otimes Q$ denotes the probability distribution on $X\times Y$ given by $(P\otimes Q)(x,y)=P(x)Q(y)$, while for $\lambda\in[0,1]$, $\lambda P\oplus(1-\lambda)Q$ denotes the distribution on $X\sqcup Y$ defined as
$$\big(\lambda P\oplus(1-\lambda)Q\big)(z)=\begin{cases}\lambda P(z)&\text{if }z\in X,\\(1-\lambda)Q(z)&\text{if }z\in Y.\end{cases}$$
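Marginals and mutual information are straightforward to compute; the following is our own small illustration (distribution values and helper names chosen for the example). It also checks that $I(P)$ vanishes when $P$ is replaced by the product of its marginals.

```python
from math import log2

def entropy(p):
    """Shannon entropy (bits) of a distribution given as a dict of probabilities."""
    return -sum(v * log2(v) for v in p.values() if v > 0)

# A joint distribution P on X x Y with correlated coordinates.
P = {('x1', 'y1'): 0.4, ('x1', 'y2'): 0.1,
     ('x2', 'y1'): 0.1, ('x2', 'y2'): 0.4}

PX, PY = {}, {}
for (x, y), v in P.items():
    PX[x] = PX.get(x, 0.0) + v
    PY[y] = PY.get(y, 0.0) + v

I = entropy(PX) + entropy(PY) - entropy(P)
print(round(I, 4))  # 0.2781

# I vanishes exactly when P is the product of its marginals:
prod = {(x, y): PX[x] * PY[y] for x in PX for y in PY}
I_prod = entropy(PX) + entropy(PY) - entropy(prod)
print(abs(I_prod) < 1e-9)  # True
```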
If $\varphi:X\to Y$ is a function between finite sets and $P\in\mathcal{P}(X)$, then the pushforward is the distribution $\varphi_*P\in\mathcal{P}(Y)$ defined as
$$(\varphi_*P)(y)=\sum_{x\in\varphi^{-1}(y)}P(x).$$
The probabilistic refinement of a graph parameter $f:\mathcal{G}\to\mathbb{R}_{\ge 0}$ is
$$F(G,P)=\lim_{n\to\infty}\frac{1}{n}\log f\big(G^{\boxtimes n}[\mathrm{T}^n_{P_n}]\big)$$
whenever the limit exists, where $G$ is a nonempty graph, $P\in\mathcal{P}(V(G))$ and $(P_n)_n$ is any sequence of $n$-types converging to $P$. In particular, existence is guaranteed if $f$ is $\boxtimes$-supermultiplicative and nonincreasing under taking induced subgraphs. In all the examples in this paper, when $f\in\Delta(\mathcal{G})$, this quantity is the same as the function $F$ of Theorem 1.1.
3 Probabilistic refinement of spectral points
Let $G$ be a vertex-transitive graph and $S\subseteq V(G)$ nonempty. Then there is a homomorphism $\overline{G}\to\overline{G[S]\sqcup\cdots\sqcup G[S]}$ ($N$ copies of $G[S]$) with
$$N\le\left\lceil\frac{|V(G)|}{|S|}\ln|V(G)|\right\rceil+1.$$
The proof is essentially a folklore argument. Draw $\sigma_1,\ldots,\sigma_N$ at random, independently and uniformly, from the automorphism group of $G$. Define $\Sigma:\{1,\ldots,N\}\times S\to V(G)$ as $\Sigma(i,s)=\sigma_i(s)$. For any $v\in V(G)$ and $i$, $\sigma_i^{-1}(v)$ is uniformly distributed on $V(G)$ by vertex-transitivity. For fixed $v$ and varying $i$ the events $\{\sigma_i^{-1}(v)\in S\}$ are independent, therefore
$$\Pr\big(\forall i:\sigma_i^{-1}(v)\notin S\big)=\left(1-\frac{|S|}{|V(G)|}\right)^N<\frac{1}{|V(G)|}.$$
Thus, by the union bound, $\Sigma$ is surjective for some choice of the permutations. Fix such a choice and let $\iota$ be an arbitrary right inverse of $\Sigma$. Suppose that $u,v\in V(G)$ are such that $\{u,v\}\in E(\overline{G})$ and let $(i,s)=\iota(u)$, $(j,t)=\iota(v)$. If $i\ne j$ then $\{(i,s),(j,t)\}$ is not an edge in the disjoint union, therefore it is an edge in its complement. Otherwise $\{s,t\}\in E(\overline{G[S]})$ since $\sigma_i$ is an automorphism, therefore $\{(i,s),(i,t)\}$ is again an edge in the complement. This proves that $\iota$ is a homomorphism. ∎
Let $G$ be a graph, $m,n\in\mathbb{N}$, and $P\in\mathcal{P}_m(V(G))\cap\mathcal{P}_n(V(G))$. Then
$$G^{\boxtimes m}[\mathrm{T}^m_P]\boxtimes G^{\boxtimes n}[\mathrm{T}^n_P]\preccurlyeq G^{\boxtimes(m+n)}[\mathrm{T}^{m+n}_P]\preccurlyeq\underbrace{G'\sqcup\cdots\sqcup G'}_{N},$$
where $G'$ denotes the graph on the left hand side, for some $N$ satisfying
$$N\le\left\lceil(m+1)^{|V(G)|}(n+1)^{|V(G)|}(m+n)\ln|V(G)|\right\rceil+1.$$
We start with the first inequality. Both sides can be represented as induced subgraphs of $G^{\boxtimes(m+n)}$, on the vertex sets $\mathrm{T}^m_P\times\mathrm{T}^n_P$ and $\mathrm{T}^{m+n}_P$, respectively. Since $\mathrm{T}^m_P\times\mathrm{T}^n_P\subseteq\mathrm{T}^{m+n}_P$, the left hand side is an induced subgraph of the right hand side.
For the second inequality we apply Lemma 3.1 to the graph $G^{\boxtimes(m+n)}[\mathrm{T}^{m+n}_P]$, which is vertex-transitive, and the subset $\mathrm{T}^m_P\times\mathrm{T}^n_P$ of its vertex set. The upper bound on the resulting $N$ follows from the (crude) estimate
$$\frac{|\mathrm{T}^{m+n}_P|}{|\mathrm{T}^m_P|\,|\mathrm{T}^n_P|}\le(m+1)^{|V(G)|}(n+1)^{|V(G)|}.$$
For every nonempty graph $G$, spectral point $f\in\Delta(\mathcal{G})$ and type $P\in\bigcup_n\mathcal{P}_n(V(G))$ the limit
$$F(G,P)=\lim_{\substack{n\to\infty\\P\in\mathcal{P}_n(V(G))}}\frac{1}{n}\log f\big(G^{\boxtimes n}[\mathrm{T}^n_P]\big)$$
exists. This expression defines a uniformly continuous function on $\bigcup_n\mathcal{P}_n(V(G))$, therefore has a unique continuous extension to $\mathcal{P}(V(G))$, which we denote by the same symbol. Moreover,
$P\mapsto F(G,P)$ is concave (property 1)
$F$ satisfies the continuity estimate
It is enough to establish existence of the limit and verify properties 4, 3, 2 and 1 on the set of types. Property 4 will then imply uniform continuity, hence existence of the continuous extension, which is unique since $\bigcup_n\mathcal{P}_n(V(G))$ is dense in $\mathcal{P}(V(G))$.
Let $m$ and $n$ be such that $P\in\mathcal{P}_m(V(G))\cap\mathcal{P}_n(V(G))$. By the first inequality of Lemma 3.2, $G^{\boxtimes m}[\mathrm{T}^m_P]\boxtimes G^{\boxtimes n}[\mathrm{T}^n_P]\preccurlyeq G^{\boxtimes(m+n)}[\mathrm{T}^{m+n}_P]$. Apply $f$ to both sides. Using that $f$ is monotone, multiplicative under the strong product, and taking logarithms, we get
By Fekete’s lemma $\frac{1}{n}\log f\big(G^{\boxtimes n}[\mathrm{T}^n_P]\big)$ converges to its supremum, which is in the interval $[0,H(P)]$ (property 1). If $P\in\mathcal{P}_m(V(G))$ and $P\in\mathcal{P}_n(V(G))$ then also $P\in\mathcal{P}_{\mathrm{lcm}(m,n)}(V(G))$ and the limits along multiples of $m$, of $\mathrm{lcm}(m,n)$ and of $n$ coincide,
because the sequence in the middle is a subsequence of the other two. Therefore the limit defines a function on $\bigcup_n\mathcal{P}_n(V(G))$.
Let $P,Q\in\bigcup_m\mathcal{P}_m(V(G))$. Choose $n$ such that $P,Q\in\mathcal{P}_n(V(G))$. By Lemma 3.2 we have
Apply $f$, take the logarithm and divide by $n$ to get
and take the limit $n\to\infty$:
Let , and define
and . Then . By concavity of and and using we get the estimates
We add the two inequalities and rearrange:
Finally, divide by and use the definition of :
The expression on the right hand side is symmetric in $P$ and $Q$, therefore it is also an upper bound on the absolute value of the left hand side, which proves property 4. ∎
The probabilistic refinement of the Lovász theta number was defined and studied by Marton in [Mar93] via a non-asymptotic formula. The probabilistic refinement of the fractional clique cover number is related to the graph entropy as $\log\overline{\chi}_f(G,P)=H(\overline{G},P)$ [Kör73].
Clearly, $F(G,P)$ only depends on the induced subgraph $G[\operatorname{supp}P]$ and the restriction of $P$ to its support.
We remark that the upper bound in eq. (26) is close to optimal among the expressions depending only on the total variation distance and $|V(G)|$: if we omit the last term and specialise to complete graphs then it becomes sharp, see [Pet07, Theorem 3.8] and [Aud07].
The route we followed is not the only way to arrive at the probabilistic refinement. We now state its equivalence with other common definitions.
Let $G$ be a graph and $P\in\mathcal{P}(V(G))$. Then
$$F(G,P)=\lim_{n\to\infty}\frac{1}{n}\log f\big(G^{\boxtimes n}[\mathrm{T}^n_{P_n}]\big)=\lim_{n\to\infty}\frac{1}{n}\log f\big(G^{\boxtimes n}[\mathrm{T}^n_{B_\epsilon(P)}]\big)$$
for any sequence $(P_n)_{n\in\mathbb{N}}$ such that $P_n\in\mathcal{P}_n(V(G))$ and $P_n\to P$, and any $\epsilon>0$.
For the proof see Appendix A.
For any graph $G$ we have $f(G)=\max_{P\in\mathcal{P}(V(G))}2^{F(G,P)}$.
For the proof see Appendix A.
Let $G,H$ be graphs and $P\in\mathcal{P}_n(V(G)\times V(H))$. Let $P_1$ and $P_2$ denote its marginals on $V(G)$ and $V(H)$, respectively. Then
$$(G\boxtimes H)^{\boxtimes n}[\mathrm{T}^n_P]\preccurlyeq G^{\boxtimes n}[\mathrm{T}^n_{P_1}]\boxtimes H^{\boxtimes n}[\mathrm{T}^n_{P_2}]\preccurlyeq\underbrace{G'\sqcup\cdots\sqcup G'}_{N}$$
holds in $\mathcal{G}$, where $G'$ denotes the graph on the left hand side, for some $N$ satisfying
$$N\le\left\lceil(n+1)^{|V(G)|\,|V(H)|}\,2^{nI(P)}\,n\ln\big(|V(G)|\,|V(H)|\big)\right\rceil+1.$$
The marginal types of any sequence in $\mathrm{T}^n_P$ are $P_1$ and $P_2$, therefore $\mathrm{T}^n_P\subseteq\mathrm{T}^n_{P_1}\times\mathrm{T}^n_{P_2}$ and $(G\boxtimes H)^{\boxtimes n}[\mathrm{T}^n_P]$ is an induced subgraph of $G^{\boxtimes n}[\mathrm{T}^n_{P_1}]\boxtimes H^{\boxtimes n}[\mathrm{T}^n_{P_2}]$.
For the second inequality we apply Lemma 3.1 to the graph $G^{\boxtimes n}[\mathrm{T}^n_{P_1}]\boxtimes H^{\boxtimes n}[\mathrm{T}^n_{P_2}]$, which comes equipped with a transitive action of $S_n\times S_n$. The upper bound on $N$ can be seen from
$$\frac{|\mathrm{T}^n_{P_1}|\,|\mathrm{T}^n_{P_2}|}{|\mathrm{T}^n_P|}\le(n+1)^{|V(G)|\,|V(H)|}\,2^{nI(P)}.$$
$F$ satisfies property 2.
By continuity, it is enough to verify the inequalities for distributions with rational probabilities. Let $G,H$ be graphs and $P\in\bigcup_n\mathcal{P}_n(V(G)\times V(H))$. Lemma 3.6 implies
where $n$ is such that $P\in\mathcal{P}_n(V(G)\times V(H))$. Apply $f$ and then divide by $n$ and take the limit $n\to\infty$ to get