1 Introduction
Let $G$ be an undirected graph. An independent set in $G$ is a subset of pairwise nonadjacent vertices. The independence number of $G$, denoted by $\alpha(G)$, is the largest possible size of an independent set in $G$. For two graphs $G$ and $H$, their strong product $G \boxtimes H$ is a graph such that

the vertex set of $G \boxtimes H$ is the Cartesian product $V(G) \times V(H)$; and

any two distinct vertices $(u_1, u_2)$ and $(v_1, v_2)$ are adjacent in $G \boxtimes H$ if $u_1 = v_1$ and $u_2 v_2 \in E(H)$, or $u_1 v_1 \in E(G)$ and $u_2 = v_2$, or $u_1 v_1 \in E(G)$ and $u_2 v_2 \in E(H)$.
The graph $G^{\boxtimes n}$ is defined inductively by $G^{\boxtimes n} = G \boxtimes G^{\boxtimes (n-1)}$, with $G^{\boxtimes 1} = G$. The Shannon capacity of a graph $G$ is defined by
(1) $\Theta(G) = \lim_{n \to \infty} \sqrt[n]{\alpha(G^{\boxtimes n})} = \sup_{n} \sqrt[n]{\alpha(G^{\boxtimes n})},$
where the limit exists by the supermultiplicativity of $\alpha(G^{\boxtimes n})$ and Fekete's Lemma.
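To make the definitions above concrete, the following brute-force sketch (illustrative code, not from the paper; all names are ours) builds the strong product and checks the classical facts $\alpha(C_5) = 2$ but $\alpha(C_5 \boxtimes C_5) = 5$, whence $\Theta(C_5) \ge \sqrt{5}$:

```python
from itertools import combinations, product

def strong_product(g, h):
    """Strong product of two graphs, each given as a (vertex_count, edge_set) pair."""
    n1, e1 = g
    n2, e2 = h
    verts = list(product(range(n1), range(n2)))
    idx = {v: i for i, v in enumerate(verts)}
    adj = lambda e, u, v: (u, v) in e or (v, u) in e
    edges = set()
    for (u1, u2), (v1, v2) in combinations(verts, 2):
        # adjacent iff both coordinates are equal-or-adjacent (and the pairs differ)
        if (u1 == v1 or adj(e1, u1, v1)) and (u2 == v2 or adj(e2, u2, v2)):
            edges.add((idx[(u1, u2)], idx[(v1, v2)]))
    return len(verts), edges

def has_independent_set(graph, k):
    """True iff the graph contains an independent set of size k (brute force)."""
    n, edges = graph
    adj = lambda u, v: (u, v) in edges or (v, u) in edges
    return any(all(not adj(u, v) for u, v in combinations(s, 2))
               for s in combinations(range(n), k))

c5 = (5, {(i, (i + 1) % 5) for i in range(5)})
c5_squared = strong_product(c5, c5)
```

A size-5 independent set in $C_5 \boxtimes C_5$ is the classical construction $\{(i, 2i \bmod 5)\}$; brute force over all 6-subsets confirms that no larger one exists.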
This graph quantity was introduced by Shannon [18] as the zero-error capacity in the context of channel coding. In this setup, a transmitter would like to communicate a message to a receiver through the channel, and the receiver must decode the message without error. This problem can be equivalently cast in terms of the confusion graph associated with the channel. The vertices of the confusion graph $G$ are the input symbols, and two vertices are adjacent if the corresponding inputs can result in the same output. It is easy to check that $G^{\boxtimes n}$ is the confusion graph for $n$ uses of the channel, and that $\alpha(G^{\boxtimes n})$ is the maximum number of messages that can be transmitted without error over $n$ uses of the channel.
Despite the apparent simplicity of the problem, a general characterization of $\Theta(G)$ remains elusive. Several lower and upper bounds were obtained by Shannon [18], Lovász [13] and Haemers [9]. These bounds are briefly reviewed in Section 2. In Section 3 we present a new bound on the Shannon capacity via a variation on the linear program pertaining to the fractional independence number of the graph. Next, we show that the new bound can simultaneously outperform both the Lovász theta number and the Haemers minimum rank bound. In Section 4, we leverage our bound to prove a new upper bound for the broadcast rate of Index Coding. It should be noted that a fractional version of the Haemers minimum rank bound was introduced independently by Blasiak [6] and Shanmugam et al. [16], and investigated in more detail by Bukh and Cox [8] very recently. This fractional bound is at least as good as one of our new bounds. Nevertheless, it is very difficult to compute, whereas our bound is more tractable and provides a feasible way to approach it (see Remark 2 below for more details).
2 Upper Bounds on the Shannon Capacity
In this section, we give a brief overview of three well-known upper bounds on the Shannon capacity. Throughout this section let $G$ be a graph with vertex set $V$.
2.1 Fractional Independence Number
The fractional independence number $\alpha^*(G)$ is the linear programming relaxation of the 0-1 integer linear program that computes the independence number. More precisely, the fractional independence number is defined as the optimal value of the following linear program:
(2) $\alpha^*(G) = \max \sum_{v \in V} x_v \quad \text{subject to} \quad \sum_{v \in Q} x_v \le 1 \text{ for every clique } Q \text{ in } G, \quad x_v \ge 0 \text{ for every } v \in V.$
(A clique in $G$ is a subset of the vertices such that every two distinct vertices in it are adjacent in $G$.) From the duality theorem of linear programming, $\alpha^*(G)$ can also be computed as follows:
(3) $\alpha^*(G) = \min \sum_{Q} y_Q \quad \text{subject to} \quad \sum_{Q \ni v} y_Q \ge 1 \text{ for every } v \in V, \quad y_Q \ge 0 \text{ for every clique } Q \text{ in } G.$
(The optimal value of (3) is also called the fractional clique-cover number of $G$, and denoted $\bar{\chi}_f(G)$.)
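As a sanity check on the primal-dual pair (2)-(3), the following sketch (ours, not from the paper) certifies $\alpha^*(C_5) = 5/2$ by exhibiting feasible primal and dual solutions with equal objective value; since $C_5$ is triangle-free, its cliques are exactly its vertices and edges:

```python
# Certify alpha*(C5) = 5/2 via LP duality: exhibit feasible solutions of the
# primal (2) and the dual (3) with matching objective values.
edges = [(i, (i + 1) % 5) for i in range(5)]

# Primal (2): put weight 1/2 on every vertex; every edge-clique constraint holds.
x = {v: 0.5 for v in range(5)}
primal_ok = all(x[u] + x[v] <= 1 for u, v in edges) and all(0 <= x[v] <= 1 for v in x)
primal_value = sum(x.values())

# Dual (3): put weight 1/2 on every edge-clique; every vertex is covered with weight 1.
y = {e: 0.5 for e in edges}
dual_ok = all(sum(w for e, w in y.items() if v in e) >= 1 for v in range(5))
dual_value = sum(y.values())
```

By weak duality, equal feasible primal and dual values force both solutions to be optimal, so the common value $5/2$ is exactly $\alpha^*(C_5)$.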
The following bound was first given by Shannon [18], and was also discussed in detail by Rosenfeld [15].
Theorem 1.
[18, Theorem 7] $\Theta(G) \le \alpha^*(G)$.
2.2 Lovász Theta Number
In his seminal paper [13], Lovász solved the longstanding problem of the Shannon capacity of the pentagon graph by introducing an important new graph invariant, called the Lovász theta number. An orthonormal representation of $G$ is a system of unit vectors $u_1, \dots, u_n$ in some Euclidean space such that if $i$ and $j$ are nonadjacent then $u_i$ and $u_j$ are orthogonal (all vectors will be column vectors). The Lovász theta number of $G$ is defined as
$\vartheta(G) = \min_{c,\, (u_i)} \max_{i \in V} \frac{1}{(c^T u_i)^2},$
where the minimum is taken over all unit vectors $c$ and all orthonormal representations $(u_i)$ of $G$. The following bound is the main result of [13].
Theorem 2.
[13, Theorem 1] $\Theta(G) \le \vartheta(G)$.
In the sequel we will also need the following results from [13]. The theta number of odd cycles was calculated by Lovász [13].
Proposition 1.
[13, Corollary 5] For odd $n$, $\vartheta(C_n) = \dfrac{n \cos(\pi/n)}{1 + \cos(\pi/n)}.$
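Proposition 1 is easy to evaluate numerically; a one-line sketch (ours, for illustration):

```python
import math

def lovasz_theta_odd_cycle(n):
    """theta(C_n) for odd n, per Proposition 1: n*cos(pi/n) / (1 + cos(pi/n))."""
    assert n % 2 == 1 and n >= 3
    c = math.cos(math.pi / n)
    return n * c / (1 + c)
```

For $n = 5$ this evaluates to exactly $\sqrt{5}$, matching the pentagon discussion below; for $n = 7$ it lies strictly between $\alpha(C_7) = 3$ and $\alpha^*(C_7) = 7/2$.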
In particular, for the pentagon graph, Lovász proved that $\Theta(C_5) = \sqrt{5}$, which meets the lower bound $\Theta(C_5) \ge \sqrt{5}$ given by Shannon [18]. Also, there exists the following duality between $\vartheta(G)$ and the theta number of its complementary graph $\overline{G}$.
Proposition 2.
[13, Theorem 5] Let $c$ range over all unit vectors and let $(u_i)$ range over all orthonormal representations of $\overline{G}$. Then
(4) $\vartheta(G) = \max_{c,\, (u_i)} \sum_{i \in V} (c^T u_i)^2.$
For two graphs $G$ and $H$, their disjoint union, denoted $G \sqcup H$, is the graph whose vertex set is the disjoint union of $V(G)$ and $V(H)$ and whose edge set is the disjoint union of $E(G)$ and $E(H)$. The Lovász theta number is multiplicative with respect to the strong product and additive with respect to the disjoint union: $\vartheta(G \boxtimes H) = \vartheta(G)\,\vartheta(H)$ and $\vartheta(G \sqcup H) = \vartheta(G) + \vartheta(H)$.
2.3 Haemers Minimum Rank Bound
Haemers [9, 10] proved a very useful upper bound based on matrix rank, as follows. An $n \times n$ matrix $M = (m_{ij})$ over some field $\mathbb{F}$ is said to fit $G$ if $m_{ii} \neq 0$ for every $i \in V$, and $m_{ij} = 0$ whenever the vertices $i$ and $j$ are nonadjacent, for $i \neq j$. Let $M^{\otimes n}$ denote the Kronecker product of $n$ copies of $M$. It is easy to verify that if $M$ fits $G$, then $M^{\otimes n}$ fits $G^{\boxtimes n}$.
Theorem 3.
[10] If a matrix $M$ fits a graph $G$, then $\Theta(G) \le \operatorname{rank}(M)$.
For a graph $G$, Haemers [10] introduced the following graph invariant:
$\mathrm{minrk}(G) = \min\{\operatorname{rank}(M) : M \text{ fits } G\},$
where the minimization is taken over all fields. By Theorem 3 it follows that $\Theta(G) \le \mathrm{minrk}(G)$. Moreover, for a fixed field $\mathbb{F}$, define
$\mathrm{minrk}_{\mathbb{F}}(G) = \min\{\operatorname{rank}_{\mathbb{F}}(M) : M \text{ fits } G \text{ over } \mathbb{F}\}.$
It is easy to verify that $\mathrm{minrk}_{\mathbb{F}}$ is submultiplicative with respect to the strong product and additive with respect to the disjoint union, i.e., for any two graphs $G$ and $H$,
$\mathrm{minrk}_{\mathbb{F}}(G \boxtimes H) \le \mathrm{minrk}_{\mathbb{F}}(G)\,\mathrm{minrk}_{\mathbb{F}}(H) \quad \text{and} \quad \mathrm{minrk}_{\mathbb{F}}(G \sqcup H) = \mathrm{minrk}_{\mathbb{F}}(G) + \mathrm{minrk}_{\mathbb{F}}(H).$
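To make the fitting-matrix machinery concrete, here is a small sketch (ours, not from the paper) over $\mathbb{F}_2$ for the 5-cycle; the rank-3 fitting matrix below is our own illustrative construction, and the code also lets one check that Kronecker products multiply ranks, which is the mechanism behind Theorem 3:

```python
def gf2_rank(matrix):
    """Rank over GF(2): Gaussian elimination on rows encoded as integer bitmasks."""
    basis = {}  # highest set bit -> reduced row stored in the basis
    rank = 0
    for row in matrix:
        x = int("".join(str(b % 2) for b in row), 2)
        while x:
            h = x.bit_length() - 1
            if h not in basis:
                basis[h] = x
                rank += 1
                break
            x ^= basis[h]  # eliminate the leading bit and keep reducing
    return rank

def kron(a, b):
    """Kronecker product of two 0/1 matrices, entries reduced mod 2."""
    return [[a[i][j] * b[k][l] % 2 for j in range(len(a[0])) for l in range(len(b[0]))]
            for i in range(len(a)) for k in range(len(b))]

# Two matrices fitting C5 (vertices 0..4, edges i ~ i+1 mod 5) over GF(2):
# nonzero diagonal, zeros on nonadjacent pairs, free entries on edges.
M_naive = [[1, 1, 0, 0, 1],  # I + adjacency matrix: rank 5 over GF(2)
           [1, 1, 1, 0, 0],
           [0, 1, 1, 1, 0],
           [0, 0, 1, 1, 1],
           [1, 0, 0, 1, 1]]
M_rank3 = [[1, 1, 0, 0, 0],  # a better fitting matrix, rank 3
           [1, 1, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 1, 1],
           [0, 0, 0, 1, 1]]
```

Since rank is multiplicative under Kronecker products, `M_rank3` tensored with itself $n$ times fits the $n$-th strong power of $C_5$ with rank $3^n$, recovering $\Theta(C_5) \le 3$ from Theorem 3 (weaker than $\vartheta(C_5) = \sqrt{5}$ here, but the same mechanism gives Haemers' 27-vertex improvement).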
The following example was provided by Haemers [9] to answer some problems raised in [13].
Example 1.
[9] Let $G$ be the complement of the Schläfli graph, which is the unique strongly regular graph with parameters $(27, 10, 1, 5)$. (A strongly regular graph with parameters $(n, k, \lambda, \mu)$ is a regular graph with $n$ vertices and degree $k$ such that every two adjacent vertices have $\lambda$ common neighbours, and every two nonadjacent vertices have $\mu$ common neighbours.) Let $A$ be the adjacency matrix of $G$, and let $I$ be the identity matrix of order $27$. Then the matrix $A + I$ fits the graph $G$, and its rank over $\mathbb{F}_2$ is equal to $7$. Hence $\Theta(G) \le 7$. This improves the bound $\vartheta(G) = 9$ given by the Lovász theta number. Moreover, Tims [20, Example 3.8] proved that $\mathrm{minrk}_{\mathbb{F}}(G) = 7$ over any field $\mathbb{F}$, and therefore $\mathrm{minrk}(G) = 7$. Similarly, the rank of the matrix $A - I$ over the field $\mathbb{F}_3$ is also equal to $7$, hence $\mathrm{minrk}_{\mathbb{F}_3}(G) = 7$ (this fact will be used in Example 7 and Remark 3).
3 A Linear Programming Variation
In this section we will prove our main result, providing a new upper bound on the Shannon capacity via a variation of the linear programming bounds (2)-(3). For a subset $S \subseteq V$, the induced subgraph $G[S]$ is the graph whose vertex set is $S$ and whose edge set consists of all of the edges in $G$ that have both endpoints in $S$.
Let $f$ be a real-valued function defined on graphs, and let $f^*(G)$ be the optimal value of the following linear program:
(5) $f^*(G) = \max \sum_{v \in V} x_v \quad \text{subject to} \quad \sum_{v \in S} x_v \le f(G[S]) \text{ for every } S \subseteq V, \quad x_v \ge 0 \text{ for every } v \in V.$
By duality, $f^*(G)$ can also be computed as follows:
(6) $f^*(G) = \min \sum_{S \subseteq V} f(G[S])\, y_S \quad \text{subject to} \quad \sum_{S \ni v} y_S \ge 1 \text{ for every } v \in V, \quad y_S \ge 0 \text{ for every } S \subseteq V.$
Remark 1.
The nonnegative real-valued function $y$ in (6), satisfying $\sum_{S \ni v} y_S \ge 1$ for each vertex $v$ in $G$, is called a fractional cover of $G$ by Körner, Pilotto and Simonyi [12]. The fractional cover is used to generalize the local chromatic number to provide an upper bound for the Sperner capacity of directed graphs (the Sperner capacity of directed graphs is a natural generalization of the Shannon capacity of undirected graphs; see [12] for definitions of the local chromatic number and the Sperner capacity), cf. [12, Theorem 6]. Note that for undirected graphs, the bounds in [12] are always no stronger than the fractional independence number, and hence are not useful upper bounds for the Shannon capacity.
By taking $f(H) = 1$ for complete graphs $H$ and $f(H) = +\infty$ otherwise, it is readily verified that $f^*(G) = \alpha^*(G)$. In the next two lemmas, we show that if the function $f$ satisfies certain properties, then these properties are also inherited by $f^*$. We say that $f$ is an upper bound on the independence number if $\alpha(G) \le f(G)$ for any graph $G$.
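As a small worked instance of the linear program (5) (our own illustration, not from the paper), take $f = \alpha$ on the pentagon. The subset constraints then include $\sum_{v \in V} x_v \le \alpha(C_5) = 2$, so the LP value drops to $2$, strictly below the fractional independence number $\alpha^*(C_5) = 5/2$; matching primal and dual certificates confirm this without any LP solver:

```python
from itertools import combinations

edges = {(i, (i + 1) % 5) for i in range(5)}

def alpha(vertices):
    """Brute-force independence number of the subgraph of C5 induced on `vertices`."""
    adj = lambda u, v: (u, v) in edges or (v, u) in edges
    for k in range(len(vertices), 0, -1):
        for s in combinations(vertices, k):
            if all(not adj(u, v) for u, v in combinations(s, 2)):
                return k
    return 0

# Primal solution of (5) with f = alpha: weight 1 on the independent set {0, 2}.
x = [1, 0, 1, 0, 0]
subsets = [s for k in range(1, 6) for s in combinations(range(5), k)]
primal_ok = all(sum(x[v] for v in s) <= alpha(s) for s in subsets)
primal_value = sum(x)

# Dual solution of (6): y_S = 1 on S = V and 0 elsewhere; its value is alpha(C5) = 2.
dual_value = alpha(tuple(range(5)))
```

The matching values certify that the LP optimum is exactly $2$ here (with $f = \alpha$ the LP simply recovers $\alpha$ itself); the interesting cases below take $f$ to be a computable upper bound such as $\vartheta$ or $\mathrm{minrk}_{\mathbb{F}}$.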
Lemma 1.
If $f$ is an upper bound on the independence number, then so is $f^*$.
Proof.
Let $S_0$ be a maximum independent set in $G$, and set $x_v = 1$ for $v \in S_0$ and $x_v = 0$ otherwise. For any $S \subseteq V$, the set $S \cap S_0$ is independent in $G[S]$, hence $\sum_{v \in S} x_v = |S \cap S_0| \le \alpha(G[S]) \le f(G[S])$. So $x$ is a feasible solution of (5), and $f^*(G) \ge |S_0| = \alpha(G)$. ∎
We say that $f$ is submultiplicative (with respect to the strong product) if for any two graphs $G$ and $H$, $f(G \boxtimes H) \le f(G)\,f(H)$.
Lemma 2.
If $f$ is submultiplicative, then so is $f^*$.
Proof.
Let $y$ and $z$ be optimal solutions of the linear program (6) for $G$ and $H$ respectively. Now we assign weights to each subset of $V(G) \times V(H)$ as follows: if $T = S_1 \times S_2$ for some $S_1 \subseteq V(G)$ and $S_2 \subseteq V(H)$, then we set $w_T = y_{S_1} z_{S_2}$; otherwise we set $w_T = 0$. Then for each vertex $(u, v)$ in $G \boxtimes H$, we have
$\sum_{T \ni (u,v)} w_T \ge \Big(\sum_{S_1 \ni u} y_{S_1}\Big)\Big(\sum_{S_2 \ni v} z_{S_2}\Big) \ge 1.$
So $w$ is a feasible solution for (6), and since $(G \boxtimes H)[S_1 \times S_2] = G[S_1] \boxtimes H[S_2]$,
$f^*(G \boxtimes H) \le \sum_{S_1, S_2} f(G[S_1] \boxtimes H[S_2])\, y_{S_1} z_{S_2} \le \sum_{S_1, S_2} f(G[S_1])\, f(H[S_2])\, y_{S_1} z_{S_2} = f^*(G)\, f^*(H),$
in which the second inequality follows from the submultiplicativity of $f$. This proves the result. ∎
Now we can prove the following upper bound on the Shannon capacity.
Theorem 4.
Let $f$ be a submultiplicative upper bound on the independence number. Then $\Theta(G) \le f^*(G)$.
Any function $f$ that is a submultiplicative upper bound on the independence number forms by itself an upper bound on the Shannon capacity, i.e., $\Theta(G) \le f(G)$. Combining this with Theorem 4 and the fact that $f^*(G) \le f(G)$ (take $y_V = 1$ in (6)), we get $\Theta(G) \le f^*(G) \le f(G)$. Simply put, this chain of inequalities shows that $f^*$ is a bound that is at least as good as the bound $f$ that we started with in the first place. An immediate question is, can we get the strict inequality $f^*(G) < f(G)$? In other words, can we improve the bound on the Shannon capacity by solving the corresponding linear programming problem? In the sequel, we give an affirmative answer to this question by providing several explicit examples where a strict inequality holds. Furthermore, we answer the following two natural questions: 1) which functions $f$ should we use in Theorem 4? and 2) do we always get a tighter upper bound for any function $f$?
Before we proceed to answer those questions, we show some simple properties of $f^*$, which are used later. We say that $f$ is superadditive with respect to the disjoint union if $f(G \sqcup H) \ge f(G) + f(H)$ for any two graphs $G$ and $H$.
Proposition 4.

If $f(Q) \le 1$ for each clique $Q$ in $G$, then $f^*(G) \le \alpha^*(G)$. In particular, $\mathrm{minrk}_{\mathbb{F}}^*(G) \le \alpha^*(G)$ and $\vartheta^*(G) \le \alpha^*(G)$.

$f^*(G \sqcup H) \le f^*(G) + f^*(H)$.

If $f$ is superadditive, then $f^*(G \sqcup H) = f^*(G) + f^*(H)$. In particular, $\mathrm{minrk}_{\mathbb{F}}^*(G \sqcup H) = \mathrm{minrk}_{\mathbb{F}}^*(G) + \mathrm{minrk}_{\mathbb{F}}^*(H)$.
Proof.
1) Follows directly from (5). 2) Follows directly from (6). 3) Let $x$ and $x'$ be optimal solutions of the primal linear program (5) for $G$ and $H$ respectively. We define an assignment $z$ for $G \sqcup H$ as follows: $z_v = x_v$ if $v \in V(G)$ and $z_v = x'_v$ if $v \in V(H)$. By the superadditivity of $f$, we can verify that $z$ is a feasible solution of (5) for $G \sqcup H$, and thus $f^*(G \sqcup H) \ge f^*(G) + f^*(H)$. Combining it with 2) proves that $f^*(G \sqcup H) = f^*(G) + f^*(H)$. The second equality follows from the fact that $\mathrm{minrk}_{\mathbb{F}}$ is additive, hence in particular superadditive, with respect to the disjoint union. ∎
3.1 A New Bound
Now we take $f = \mathrm{minrk}_{\mathbb{F}}$, which is a submultiplicative upper bound on the Shannon capacity, and show that there exist graphs for which our new bound $\mathrm{minrk}_{\mathbb{F}}^*$ can outperform both $\mathrm{minrk}$ and the Lovász theta number. The following three examples show several instances of this. Example 2 shows a family of graphs where our bound outperforms $\mathrm{minrk}$ but not the Lovász theta number.
Example 2.
Example 3 provides a family of graphs where our bound simultaneously outperforms both $\mathrm{minrk}$ and the Lovász theta number; however, it might seem a bit artificial since it is a disjoint union of two graphs.
Example 3.
In Example 4 we construct a connected graph for which our bound also outperforms both $\mathrm{minrk}$ and the Lovász theta number.
Example 4.
Let be the graph as plotted in Fig. 1. Note that , the induced subgraph of on the vertices , is the complement of the Schläfli graph, and the vertex of is connected to vertices and . Using Sagemath [19] one can verify that . From Proposition 7 in Appendix we see that . Take , and consider the following linear program:
(7) 
Using Sagemath [19] one can compute that the optimal value of (7) is equal to . Comparing (7) with (5), we have .
The following result shows that we cannot always get a tighter bound through this linear programming variation.
Proposition 5.
Fix a field . Let be a graph such that and for any subset we have . Then .
Proof.
The following example shows that there exist graphs satisfying the conditions of Proposition 5.
Example 5.
Fix a field . Let be a graph such that . If for any subset , then by Proposition 5. Otherwise, let be a subset of with the smallest size among those subsets such that . Obviously, the induced subgraph satisfies the conditions of Proposition 5, hence . (Note that there are many graphs for which , e.g. the complement of the Schläfli graph for .)
3.2 Bounds for Disjoint Union of Graphs
For the Shannon capacity of the disjoint union of two graphs, we have the following simple observation.
Corollary 1.
Proof.
Next, we shall combine the Lovász theta number and $\mathrm{minrk}_{\mathbb{F}}$ through a weighted geometric mean to get another upper bound on the Shannon capacity of the disjoint union. Fix a field $\mathbb{F}$ and suppose $0 \le \lambda \le 1$. Then we can easily verify that
$f_\lambda(G) = \vartheta(G)^{\lambda}\, \mathrm{minrk}_{\mathbb{F}}(G)^{1-\lambda}$
is also a submultiplicative upper bound on the independence number.
Corollary 2.
For a fixed field $\mathbb{F}$ and a number $0 \le \lambda \le 1$,
$\Theta(G \sqcup H) \le \vartheta(G)^{\lambda}\, \mathrm{minrk}_{\mathbb{F}}(G)^{1-\lambda} + \vartheta(H)^{\lambda}\, \mathrm{minrk}_{\mathbb{F}}(H)^{1-\lambda}.$
Proof.
Example 6.
Let be the complement of the Schläfli graph. Consider the graph . It is not hard to verify that and . By Corollary 2,
For the term achieving its minimum value on . Note that this value is strictly better than () and .
Lastly, if we take $f$ to be the Lovász theta number, our new bound cannot improve on it.
Proposition 6.
$\vartheta^*(G) = \vartheta(G)$.
4 A New Upper Bound for Index Coding
In this section we show that our technique also allows us to derive a new bound for the Index Coding problem, to be defined next. In the Index Coding problem, a sender holds a set of messages to be broadcast to a group of receivers. Each receiver is interested in one of the messages, and has some prior side information comprising some subset of the other messages. This variant of the source coding problem was first proposed in [5] by Birk and Kol, and later investigated in [4] by Bar-Yossef et al.
The Index Coding problem can be formalized as follows: the sender holds messages $x_1, \dots, x_n \in \Sigma$, where $\Sigma$ is the set of possible messages, and wishes to send them to $n$ receivers $R_1, \dots, R_n$. Receiver $R_i$ wants to receive the message $x_i$, and knows some subset $\{x_j : j \in N(i)\}$ of the other messages. The goal is to construct an efficient encoding scheme $E \colon \Sigma^n \to \Sigma'$, where $\Sigma'$ is a finite alphabet to be transmitted by the sender, such that for any $x_1, \dots, x_n$, every receiver $R_i$ is able to decode the message $x_i$ from the value $E(x_1, \dots, x_n)$ together with his own side information $\{x_j : j \in N(i)\}$. We associate a directed graph $G$ with the side-information subsets, whose vertex set is $\{1, \dots, n\}$, and whose edge set consists of all ordered pairs $(i, j)$ such that $j \in N(i)$. Here and in what follows, we further assume that the side-information graph is undirected, that is, $(i, j) \in E(G)$ if and only if $(j, i) \in E(G)$. For messages that are $t$ bits long, i.e. $\Sigma = \{0, 1\}^t$, we use $\beta_t(G)$ to denote the corresponding minimum possible encoding length $\log_2 |\Sigma'|$. The broadcast rate of the side-information graph $G$ is defined as
$\beta(G) = \lim_{t \to \infty} \frac{\beta_t(G)}{t} = \inf_{t} \frac{\beta_t(G)}{t},$
where the limit exists by the subadditivity of $\beta_t(G)$ and Fekete's Lemma. That is to say, $\beta(G)$ is the average asymptotic number of broadcast bits needed per bit of input. This quantity has received significant interest, and in this section we prove a new upper bound for it. In [5, 4, 14], it was proved that
(8) $\beta(G) \le \mathrm{minrk}_{\mathbb{F}}(G) \le \bar{\chi}(G)$
(here $\mathbb{F}$ is an arbitrary finite field and $\bar{\chi}(G)$ is the clique-cover number of $G$). On the other hand, Blasiak et al. [7] proved that $\beta(G) \le \bar{\chi}_f(G) = \alpha^*(G)$. For more background and details on the Index Coding problem, see [5, 4, 2, 7] and references therein.
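As a concrete illustration of the minimum-rank upper bound in (8), here is a toy linear index code over $\mathbb{F}_2$ for the 5-cycle side-information graph (the rank-3 fitting matrix `M` is our own construction for illustration). The sender broadcasts the inner products of a row-space basis of `M` with the message vector; receiver $i$ reconstructs row $i$ of `M` times the messages and cancels the contribution of his known neighbors:

```python
from itertools import product

# Vertices 0..4 of C5; receiver i knows x_{i-1 mod 5} and x_{i+1 mod 5}.
M = [
    [1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 1, 1],
    [0, 0, 0, 1, 1],
]
BASIS = [0, 1, 3]  # rows 0, 1, 3 span the row space of M (rank 3 over GF(2))
COMBO = {0: [0], 1: [1], 2: [0, 1], 3: [3], 4: [0, 1, 3]}  # each row as a sum of basis rows

def encode(x):
    """Broadcast 3 bits: inner products of the basis rows with the message vector."""
    return [sum(M[r][j] * x[j] for j in range(5)) % 2 for r in BASIS]

def decode(i, broadcast, side):
    """Receiver i recovers x_i from the broadcast and side information {j: x_j}."""
    row_dot_x = sum(broadcast[BASIS.index(r)] for r in COMBO[i]) % 2
    known = sum(M[i][j] * side[j] for j in side) % 2  # neighbors' contribution to row i
    return (row_dot_x + known) % 2  # over GF(2), subtraction equals addition
```

Three broadcast bits suffice for five one-bit messages, matching $\mathrm{minrk}_{\mathbb{F}_2}(C_5) = 3$; the fractional bound $\bar{\chi}_f(C_5) = \alpha^*(C_5) = 5/2$ shows longer messages can do even better.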
Similarly as in Section 3, let $f$ be a real-valued function defined on graphs, and let $f^*(G)$ be the optimal value of (5). Now we show that if $f$ is an upper bound on the broadcast rate, that is, $\beta(G) \le f(G)$ for any graph $G$, then $f^*$ is also an upper bound on the broadcast rate. The proof is a simple extension of [7, Claim 2.8].
Theorem 5.
If $f$ is an upper bound on the broadcast rate, then so is $f^*$.
Proof.
Let $y$ be an optimal solution of the linear program (6). Without loss of generality, we can assume that each $y_S$ is a nonnegative rational number; otherwise we can choose a rational number arbitrarily close to it. By (6) we get $f^*(G) = \sum_{S} f(G[S])\, y_S$ and $\sum_{S \ni v} y_S \ge 1$ for every vertex $v$ in $G$. Let $N$ be a positive integer such that all the numbers $N y_S$ are integers, and let $k_S = N y_S$ for each $S \subseteq V$. Then
$\sum_{S \ni v} k_S \ge N \text{ for every vertex } v \in V.$
Namely, we cover the graph $G$ using a collection of $k_S$ copies of $G[S]$ for each $S \subseteq V$. Set $K = \sum_S k_S$. Then, altogether we have a sequence of subsets $S_1, \dots, S_K$, in which each $S$ appears $k_S$ times, such that every vertex in $G$ appears in at least $N$ of these subsets. By assumption, for each induced side-information graph $G[S_i]$, the average asymptotic number of broadcast bits needed per bit of input is upper bounded by $f(G[S_i])$. We split each message into $N$ parts and let the index code for $G[S_i]$ handle one part of the message of each vertex in $S_i$. Concatenating these individual index codes for the graph $G$ (if a vertex appears in more than $N$ of the subsets, then we may ignore the extra parts), we see that the average asymptotic number of broadcast bits needed per bit of input for the graph $G$ is upper bounded by
$\frac{1}{N} \sum_{i=1}^{K} f(G[S_i]) = \frac{1}{N} \sum_{S} k_S\, f(G[S]) = \sum_{S} y_S\, f(G[S]) = f^*(G).$
This concludes the proof. ∎
Let us now consider the function $f(G) = \inf_{\mathbb{F}} \mathrm{minrk}_{\mathbb{F}}(G)$ for any graph $G$, where the infimum ranges over all finite fields. By (8) we see that $f$ is an upper bound on the broadcast rate, hence so is $f^*$ by Theorem 5. Note that for different graphs, the value of $f$ may be obtained as the minimum rank over different fields. Therefore, the achievable scheme given by an optimal solution of the corresponding linear program (6) might yield a scheme that uses several different fields simultaneously. More simply, we can take $f = \mathrm{minrk}_{\mathbb{F}}$ for some fixed finite field $\mathbb{F}$. As $\mathrm{minrk}_{\mathbb{F}}$ is an upper bound for the broadcast rate by (8), we can get the following result directly from Theorem 5.
Corollary 3.
For any graph $G$ and any finite field $\mathbb{F}$, $\beta(G) \le \mathrm{minrk}_{\mathbb{F}}^*(G)$.
By 1) of Proposition 4, $\mathrm{minrk}_{\mathbb{F}}^*(G) \le \alpha^*(G)$. Hence the bound $\mathrm{minrk}_{\mathbb{F}}^*(G)$ is at least as good as $\alpha^*(G)$ and $\mathrm{minrk}_{\mathbb{F}}(G)$. The following example shows that sometimes $\mathrm{minrk}_{\mathbb{F}}^*(G)$ can simultaneously outperform both $\alpha^*(G)$ and $\mathrm{minrk}_{\mathbb{F}}(G)$.
Example 7.
Remark 2.
Blasiak [6] and Shanmugam et al. [16] independently obtained expressions for the infimum of the broadcast rate of vector linear broadcasting schemes over all finite fields, as follows. (Here we adopt the notation of Blasiak [6], which is slightly different from that of Shanmugam et al. [16]. A vector linear broadcasting scheme over a finite field $\mathbb{F}$ is a scheme in which the message alphabet is a finite-dimensional vector space over $\mathbb{F}$ and the encoding and decoding functions are linear.) Let $G$ be a graph with vertex set $\{1, \dots, n\}$, and let $M = (M_{ij})$ be an $n \times n$ matrix whose entries $M_{ij}$ are $t \times t$ matrices over some field $\mathbb{F}$. We say that $M$ fractionally represents the side-information graph $G$ over $\mathbb{F}$ if $M_{ii}$ is the identity matrix of size $t$ for every $i$, and $M_{ij}$ is the zero matrix of size $t$ whenever $i$ and $j$ are nonadjacent. The fractional minrank of $G$ is defined by
(9) $\mathrm{minrk}_{\mathbb{F}}^{f}(G) = \inf_{t}\, \min \Big\{ \frac{\operatorname{rank}(M)}{t} : M \text{ fractionally represents } G \text{ over } \mathbb{F} \text{ with blocks of size } t \Big\},$
and $\mathrm{minrk}^{f}(G) = \inf_{\mathbb{F}} \mathrm{minrk}_{\mathbb{F}}^{f}(G)$.
It is shown in [6, 16] that $\mathrm{minrk}_{\mathbb{F}}^{f}(G)$ is the infimum of the broadcast rate of all vector linear broadcasting schemes over $\mathbb{F}$. On the other hand, we can obtain a vector linear broadcasting scheme over $\mathbb{F}$ of rate $\mathrm{minrk}_{\mathbb{F}}^*(G)$, by using a vector linear broadcasting scheme for each induced subgraph in the proof of Theorem 5. Hence $\mathrm{minrk}_{\mathbb{F}}^{f}(G) \le \mathrm{minrk}_{\mathbb{F}}^*(G)$. Note that it is very difficult to compute $\mathrm{minrk}_{\mathbb{F}}^{f}(G)$ via (9). But our graph invariant $\mathrm{minrk}_{\mathbb{F}}^*(G)$ provides a way to approach $\mathrm{minrk}_{\mathbb{F}}^{f}(G)$, since we can always get an upper bound for $\mathrm{minrk}_{\mathbb{F}}^*(G)$, and thus for $\mathrm{minrk}_{\mathbb{F}}^{f}(G)$, by solving the linear programming problem (6) or its subproblems obtained by removing some constraints from (6). Blasiak [6] and Shanmugam et al. [16] also proved that $\beta(G) \le \mathrm{minrk}^{f}(G)$. See [8] for more properties of $\mathrm{minrk}_{\mathbb{F}}^{f}$.
Remark 3.
Remark 4.
Shanmugam, Dimakis and Langberg [17] presented an upper bound for the broadcast rate of general side-information graphs using the local chromatic number. Later, this bound was further extended by Arbabjolfaei and Kim [3, Theorems 3-4] and Agarwal and Mazumdar [1, Theorems 3-5] via linear programming. Similarly as in Remark 1, for undirected graphs, it is not hard to check that those bounds are always no stronger than the fractional independence number.
Appendix
Lemma 3.
[20, Theorem 3.6] Let $G$ be a graph, let $S$ be a maximum independent set in $G$, and let $u$ be a vertex not in $S$. Set $N_S(u) = N(u) \cap S$, which is nonempty since $S$ is maximum. If there exists another vertex $v \notin S$ that is adjacent to $u$ but not adjacent to any vertex of $N_S(u)$, then delete the edge $uv$, and let