## 1 Introduction

Throughout this paper, we write $[n] = \{1, 2, \dots, n\}$ for an integer $n \geq 1$. It is well known that any function $f \colon \{-1,1\}^n \to \mathbb{R}$ can be expressed as

$$f = \sum_{S \subseteq [n]} \hat{f}(S) \chi_S,$$

where $\chi_S(x) = \prod_{i \in S} x_i$ for $S \subseteq [n]$ are the *Fourier basis functions* and

$$\hat{f}(S) = \mathbb{E}_{x \in \{-1,1\}^n}\left[f(x) \chi_S(x)\right]$$

for $S \subseteq [n]$ are called the *Fourier coefficients* of $f$. When $f$ is a Boolean function, i.e., $f \colon \{-1,1\}^n \to \{-1,1\}$, we have

$$\sum_{S \subseteq [n]} \hat{f}(S)^2 = 1$$

by Parseval's identity, so we can treat the squared Fourier coefficients as a probability distribution

$$\Pr[\boldsymbol{S} = S] = \hat{f}(S)^2$$

on the subsets of $[n]$, which we call the *spectral distribution* of $f$.
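These definitions are easy to verify by brute force on small examples. The following sketch (the 3-bit majority function is our own illustration, not an example from the paper) computes every Fourier coefficient directly from $\hat{f}(S) = \mathbb{E}[f \chi_S]$ and checks Parseval's identity:

```python
from itertools import product
from math import prod

def fourier_coefficients(f, n):
    """All 2^n coefficients: hat_f(S) = average of f(x) * chi_S(x) over {-1,1}^n."""
    inputs = list(product([-1, 1], repeat=n))
    coeffs = {}
    for S in product([0, 1], repeat=n):          # S encoded as an indicator vector
        chi = [prod(x[i] for i in range(n) if S[i]) for x in inputs]
        coeffs[S] = sum(f(x) * c for x, c in zip(inputs, chi)) / 2 ** n
    return coeffs

maj3 = lambda x: 1 if sum(x) > 0 else -1         # 3-bit majority, a Boolean function
coeffs = fourier_coefficients(maj3, 3)

# Parseval: the squared coefficients of a Boolean function sum to 1,
# so they form a probability distribution on the subsets of [n].
spectral = {S: c ** 2 for S, c in coeffs.items() if c != 0}
print(sum(spectral.values()))                    # 1.0 up to rounding
```

For majority the spectral distribution is uniform on the four odd-size sets, each with weight $1/4$.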

The following two parameters of the function can be defined in terms of its spectral distribution.

###### Definition.

The *total influence* (also called average sensitivity) of a Boolean function $f$ is

$$I(f) = \sum_{S \subseteq [n]} |S| \cdot \hat{f}(S)^2.$$

###### Definition.

The *spectral entropy* of a Boolean function $f$ is the (Shannon) entropy of its spectral distribution:

$$H(f) = \sum_{S \subseteq [n]} \hat{f}(S)^2 \log_2 \frac{1}{\hat{f}(S)^2}.$$
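To make the two parameters concrete, here is a small computation, again using the 3-bit majority as a stand-in example of our own choosing: total influence weights each squared coefficient by $|S|$, and spectral entropy is the Shannon entropy of the squared coefficients.

```python
from itertools import product
from math import prod, log2

def fourier_coefficients(f, n):
    inputs = list(product([-1, 1], repeat=n))
    return {S: sum(f(x) * prod(x[i] for i in range(n) if S[i]) for x in inputs) / 2 ** n
            for S in product([0, 1], repeat=n)}

def total_influence(f, n):
    # sum of |S| * hat_f(S)^2, with |S| recovered from the indicator vector
    return sum(sum(S) * c ** 2 for S, c in fourier_coefficients(f, n).items())

def spectral_entropy(f, n):
    # Shannon entropy of the squared coefficients (zero coefficients contribute nothing)
    return sum(c ** 2 * log2(1 / c ** 2) for c in fourier_coefficients(f, n).values() if c != 0)

maj3 = lambda x: 1 if sum(x) > 0 else -1
I, H = total_influence(maj3, 3), spectral_entropy(maj3, 3)
print(I, H)   # 1.5 2.0, so majority alone already forces the constant to be at least 4/3
```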

In 1996 Friedgut and Kalai raised the following conjecture, known as the Fourier Entropy/Influence (FEI) conjecture:

###### Conjecture 1.1 ([4]).

There exists a universal constant $C$ such that for every Boolean function $f$ with total influence $I(f)$ and spectral entropy $H(f)$ we have $H(f) \leq C \cdot I(f)$.

Conjecture 1.1 was verified for various families of Boolean functions (e.g., symmetric functions [10], random functions [3], read-once formulas [1, 9]

, decision trees of constant average depth

[11], read-$k$ decision trees for constant $k$ [11]) but is still open for the class of general Boolean functions.

The rest of this paper is organized as follows. In the remainder of Section 1 we describe past results and some rudimentary improvements. In Section 2 we introduce lexicographic functions and give a formal proof of the approach described in Section 1.3. In Section 3 we generalize Proposition 1.2 to biased functions and obtain an improved lower bound. In Section 4 we build a limit-of-limits function that achieves an even better bound. In Section 5 we prove a Lipschitz-type condition used throughout the paper, namely that a small change in a Boolean function cannot result in a substantial change to its total influence or spectral entropy.

### 1.1 A baby example and two definitions

Here is an example of providing a lower bound on $C$. Consider the function

It satisfies and (more precisely, for all ), so any constant in Conjecture 1.1 must satisfy

This is true for every , so by taking we establish that .

###### Definition.

A Boolean function is called *monotone* if changing an input bit from $-1$ to $1$ cannot change the output from $1$ to $-1$.

###### Fact.

A Boolean function is monotone if and only if it can be expressed as a formula combining variables using conjunctions () and disjunctions () only, with no negations.

###### Definition.

Let $f$ be a Boolean function on $n$ variables. The *dual* function of $f$, denoted $f^{\dagger}$, is defined as

$$f^{\dagger}(x) = -f(-x).$$

###### Fact.

For all $S \subseteq [n]$ we have $\widehat{f^{\dagger}}(S) = (-1)^{|S|+1} \hat{f}(S)$.

###### Corollary.

The spectral distributions of $f$ and $f^{\dagger}$ are identical; in particular, $I(f^{\dagger}) = I(f)$ and $H(f^{\dagger}) = H(f)$. But $\widehat{f^{\dagger}}(\emptyset) = -\hat{f}(\emptyset)$, i.e., $\mathbb{E}[f^{\dagger}] = -\mathbb{E}[f]$.

###### Remark.

If is monotone then is monotone too. Furthermore, given a monotone formula computing , the formula obtained by swapping conjunctions and disjunctions computes .

###### Example.

The dual of $x_1 \wedge x_2$ is $x_1 \vee x_2$.
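These duality facts can be confirmed numerically. A minimal sketch, assuming the convention $f^{\dagger}(x) = -f(-x)$ and writing AND/OR in $\pm 1$ form with $1$ playing the role of "true":

```python
from itertools import product
from math import prod

def dual(f):
    # the dual negates the function evaluated on the negated input
    return lambda x: -f(tuple(-xi for xi in x))

AND2 = lambda x: 1 if x[0] == 1 and x[1] == 1 else -1
OR2  = lambda x: 1 if x[0] == 1 or  x[1] == 1 else -1

inputs = list(product([-1, 1], repeat=2))

# The dual of AND is OR (swap conjunctions and disjunctions):
print(all(dual(AND2)(x) == OR2(x) for x in inputs))           # True

# A function and its dual have identical spectral distributions:
def squared_spectrum(f, n):
    pts = list(product([-1, 1], repeat=n))
    return sorted((sum(f(x) * prod(x[i] for i in range(n) if S[i]) for x in pts) / 2 ** n) ** 2
                  for S in product([0, 1], repeat=n))

print(squared_spectrum(AND2, 2) == squared_spectrum(OR2, 2))  # True
```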

### 1.2 Past results and preliminary improvements

The current best lower bound on was achieved by O’Donnell and Tan [9]. Using recursive composition they showed the following bound:

###### Proposition 1.2.

Let be a balanced Boolean function such that . Then any constant in Conjecture 1.1 satisfies

###### Remark.

Any balanced Boolean function has since ; in case of equality we must have for some and thus is supported on a single set and its spectral entropy is zero.

By presenting a function on variables with total influence and entropy , they established that . Although the specific function presented in [9] happens to be biased, their result stands as there exists a balanced Boolean function on 6 variables with the same total influence and entropy:

A slight improvement can be achieved by modifying the last clause of . Indeed,

is balanced too, with the same total influence and a slightly higher entropy , so we have .

Moving to balanced functions on 8 variables, we find a monotone function that provides a better lower bound:

with and yields .

A further search discovers a slightly superior function:

with and achieves .

### 1.3 Sequences of balanced monotone functions

Staring at , and for a moment (but not ), we may see a common property: appears in all clauses except the first. Let us rewrite and in a slightly different form:

This generalizes easily to a sequence of balanced (to be shown below) monotone Boolean functions:

whose first two members are

Denote by the lower bound on implied by . The first fifteen members of the sequence are explored in Table 1. Note how even is much better than the bound of Subsection 1.1.

| $n$ | variables | total influence | spectral entropy | lower bound |
|---|---|---|---|---|
| 1 | 2 | 1 | 0 | (not defined) |
| 2 | 4 | 3 | 6 | |
| 3 | 6 | | | |
| 4 | 8 | | | |
| 5 | 10 | | | |
| 6 | 12 | | | |
| 7 | 14 | | | |
| 8 | 16 | | | |
| 9 | 18 | | | |
| 10 | 20 | | | |
| 11 | 22 | | | |
| 12 | 24 | | | |
| 13 | 26 | | | |
| 14 | 28 | | | |
| 15 | 30 | | | |

The three sequences seem to be increasing and bounded, so let us denote their respective hypothetical limits by , and . If indeed for all then . A prescient guess for the value of could be

for which we would get

as a lower bound for . We will verify this guess in Section 2.

Recall that and gave rise to better lower bounds, respectively, than and . It is tempting perhaps to consider a generalization , define accordingly and examine the hypothetical limits , and . It is indeed possible to do so, and we get while , making . Nevertheless, and seem to converge towards the same and , respectively, so there is no real benefit in pursuing this further.

It remains to verify that is indeed balanced for all . Let us write it as

where is defined recursively via

###### Remark.

The function belongs to a class of monotone Boolean functions called lexicographic functions, as we will see in Section 2.1.

For simplicity of notation, we abbreviate and write or even to denote . Since

to prove , it suffices to verify the following (see Appendix A for the calculation):

###### Claim 1.3.

For all we have .

## 2 A Tale of Two Thirds

Although each of from Table 1 is a valid, explicit lower bound on , the asymptotic discussion in Subsection 1.3 was more wishful thinking than a mathematically sound statement.

In this section we explore the class of lexicographic functions, develop tools to compute total influence and spectral entropy, and then rigorously calculate , and .

### 2.1 Lexicographic functions

###### Definition.

Fix integers and . Denote by the initial segment of cardinality (with respect to the lexicographic order on ), and denote by

###### Fact.

We have and .

###### Fact.

The function is monotone and its dual is .

###### Example.

and .

###### Fact.

If is even then is isomorphic to (when the latter is extended from to variables by adding an influenceless variable).
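This fact is easy to check under a concrete encoding. In the sketch below (an illustration of ours; we render the parameters as `n`, `k`), a lexicographic function takes the value $1$ on the $k$ lexicographically largest points of $\{-1,1\}^n$, reading $x_1$ as the most significant bit, which makes it monotone; for even $k$ the last variable is influenceless:

```python
from itertools import product

def lex(n, k):
    """+1 on the k lexicographically largest inputs, -1 elsewhere (x1 = MSB)."""
    def f(x):
        val = sum(1 << (n - 1 - i) for i in range(n) if x[i] == 1)
        return 1 if val >= 2 ** n - k else -1
    return f

# For even k = 2m the function never looks at the last bit,
# and agrees with the (n-1)-variable lexicographic function with parameter m:
f, g = lex(4, 6), lex(3, 3)
print(all(f(x) == g(x[:3]) for x in product([-1, 1], repeat=4)))   # True
```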

Let

be an odd integer, and let

be its binary representation, where is the most significant bit and is the least significant bit. Denote the corresponding representation of by .

By definition, to determine the value of for an input , we need to compare with element by element. This gives a neat formula for :

(1)

where

###### Remark.

The formula (1) shows that every monotone decision list, i.e., a monotone decision tree consisting of a single path, is isomorphic to a lexicographic function.

From (1) we derive an important property of lexicographic functions.

###### Fact 2.1.

For , the value of only depends on with probability ; that is, when for all .

###### Remark.

This can be interpreted as saying that the average decision tree complexity of is .
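This averaging effect is visible directly: evaluating the decision list left to right, the number of bits actually read before the output is forced has expectation below 2. The sketch below (using our own threshold encoding of a lexicographic function, with `n`, `k` as illustrative parameters) measures, for each input, the shortest prefix that already determines the value; for odd $k$ the average over all inputs works out to $2 - 2^{1-n}$:

```python
from itertools import product

def prefix_read(n, k, x):
    """Length of the shortest prefix of x that already forces the output."""
    t = 2 ** n - k
    val = sum(1 << (n - 1 - i) for i in range(n) if x[i] == 1)
    for i in range(1, n + 1):
        lo = (val >> (n - i)) << (n - i)       # smallest completion of this prefix
        hi = lo + (1 << (n - i)) - 1           # largest completion of this prefix
        if lo >= t or hi < t:                  # the output is the same either way
            return i
    return n

n, k = 5, 21                                   # k odd, so every level can matter
avg = sum(prefix_read(n, k, x) for x in product([-1, 1], repeat=n)) / 2 ** n
print(avg)                                     # 1.9375 == 2 - 2**(1 - 5)
```

As $n$ grows the average tends to two bits, matching Fact 2.1's interpretation.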

We extend the definition of lexicographic functions by writing for some . Note that is not necessarily odd, so the effective number of variables can be smaller.

###### Example.

For any we have ; that is, .

###### Example.

For , we have . Observe that the binary representation of the odd integer has for and thus

that is, .

Fix and consider the sequence . Whenever is a dyadic rational (that is, a rational number of the form ), converges to a fixed function (e.g., in the example above). We would like to consider the limit object for other values of as well.

It may sound intimidating; after all,
is a Boolean function on variables, which is quite a
lot. Nevertheless, by Fact 2.1,
only reads *two* input
bits on average.

Moreover, we care about the total influence and spectral entropy of functions. By Lemmata 5.1 and 5.2 from Section 5, and . Indeed, differs from (when considering the latter as a function on variables by adding an influenceless variable) in at most one place, and thus and are Cauchy sequences.

Needless to say, .

An even stronger statement holds (but will not be used or proved here):
the spectral distributions of
converge in distribution to a limit distribution , which
we call the spectral distribution of .
Note that is supported on *finite* subsets of .
The expected cardinality and the entropy of are
and respectively.

### 2.2 Total influence and lexicographic functions

The edge isoperimetric inequality in the discrete cube (by Harper [5], with an addendum by Bernstein [2], and independently Lindsey [7]) gives a lower bound on the total influence of Boolean functions.

###### Theorem 2.2.

Let be a Boolean function with . Then .

In fact, they proved that lexicographic functions are the minimizers of total influence.

###### Theorem 2.3.

Fix integers and and let be a Boolean function on variables with . Then .

###### Remark.

Theorem 2.3 explains our interest in lexicographic functions: when seeking a function with large entropy/influence ratio , it makes sense to minimize .

In [6], Hart exactly computed the total influence of lexicographic functions:

###### Proposition 2.4 ([6, Theorem 1.5]).

Fix integers and . Then

where is the Hamming weight of .

Let us rephrase Proposition 2.4 a bit.

###### Claim 2.5.

Let , where are the locations of in the binary representation of . Then .

###### Proof.

By induction on . For details see Appendix A. ∎

###### Example.

For we get , demonstrating the tightness of Theorem 2.2.

###### Corollary 2.6.

Let , where are the locations of in the binary representation of (to be read as a finite sum when is a dyadic rational). Then .

This leads to the following observation:

###### Fact.

For any we have

(2)

###### Example.

For we have

hence . By duality we have as well.
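The value $4/3$ can be confirmed by brute force on truncations. Truncating the binary expansion $0.101010\ldots$ of $2/3$ to $n$ bits gives the parameter $k_n = (2^{n+1} + (-1)^n)/3$, and the total influence of the corresponding finite lexicographic functions increases to $4/3$. The sketch below (our threshold encoding; it computes influence via the standard equivalent formula $I(f) = \sum_i \Pr[f(x) \neq f(x^{\oplus i})]$) illustrates this:

```python
from itertools import product

def lex(n, k):
    def f(x):
        val = sum(1 << (n - 1 - i) for i in range(n) if x[i] == 1)
        return 1 if val >= 2 ** n - k else -1
    return f

def total_influence(f, n):
    """Sum over i of Pr[f(x) != f(x with bit i flipped)] -- equals the spectral formula."""
    pts = list(product([-1, 1], repeat=n))
    return sum(f(x) != f(x[:i] + (-x[i],) + x[i + 1:]) for x in pts for i in range(n)) / 2 ** n

for n in range(2, 11):
    k = (2 ** (n + 1) + (-1) ** n) // 3        # n-bit truncation of 2/3 = 0.101010...
    print(n, total_influence(lex(n, k), n))    # 1, 1.25, 1.25, 1.3125, ... -> 4/3
```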

###### Remark.

Four-thirds is actually the maximum total influence attainable by any lexicographic function, as the following claim shows:

###### Claim 2.7.

For all we have .

###### Proof.

. Writing , we have for all . Moreover, we cannot have for all since . Denote by the minimal for which . Now, by Corollary 2.6,

where the full calculation is in Appendix A.

. Then

. Since is a continuous function of , it has a maximum in the closed interval , obtained at (if the maximum is attained multiple times, pick one arbitrarily). If for some then for we have

contradicting either the choice of or one of the two previous cases. ∎

###### Remark.

We have for other values of besides and , e.g.,

### 2.3 Disjoint composition

We now present the main tool we use to compute total influence and spectral entropy for our construction.

###### Definition.

For two Boolean functions and on and variables, respectively, define the Boolean functions on variables and as

and denote by the one variable identity function.

###### Remark.

The class of functions built using , , and is the class of *read-once monotone formulas*. By (1), every lexicographic function is a read-once monotone formula.

###### Definition.

Let be the binary entropy function, defined by

for and . We also make extensive use of its variant

###### Fact.

Both and are symmetric about .

The following proposition is an easy corollary of [1, Lemmata 5.7 and 5.8]. Alternatively, it is a special case of Lemma 3.1 in Section 3, which is an adaptation of [9, Proposition 3.2].

###### Proposition 2.8.

Let and be Boolean functions and let for . Then

where

###### Remark.

Via the De Morgan equality , this also yields

Proposition 2.8 simplifies significantly when one of the functions is balanced, using the following observation (see Appendix A for the calculation):

###### Claim 2.9.

Let . Then .

###### Corollary 2.10.

Let be a Boolean function and let . Then

and

### 2.4 A first lower bound

We could use Claim 2.5 to compute the total influence of , but we also need its spectral entropy, so we use its recursive definition and Corollary 2.10. Since we are interested in asymptotics, we prefer working directly with , which satisfies the “equation” .

We already know , whereas for the entropy we have

and we can solve for

Note that it is possible to fully compute the total influence of :

and to write an expression for its spectral entropy:

but it is far easier to use the exponentially fast convergence , rather than find an exact closed expression for .
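The fast convergence is visible numerically. The sketch below (our own illustration) evaluates the truncations of $0.1010\ldots$ as threshold functions, obtains their spectra with a fast Walsh–Hadamard transform, and prints the spectral entropies; the successive differences shrink rapidly:

```python
from math import log2

def wht(v):
    """In-place Walsh-Hadamard transform of a list of length 2^n."""
    h = 1
    while h < len(v):
        for i in range(0, len(v), 2 * h):
            for j in range(i, i + h):
                v[j], v[j + h] = v[j] + v[j + h], v[j] - v[j + h]
        h *= 2
    return v

def spectral_entropy_lex(n, k):
    t = 2 ** n - k
    vals = [1 if idx >= t else -1 for idx in range(2 ** n)]   # truth table
    coeffs = [c / 2 ** n for c in wht(vals)]                  # Fourier coefficients
    return sum(c * c * log2(1 / (c * c)) for c in coeffs if c != 0)

prev = None
for n in range(2, 13, 2):
    k = (2 ** (n + 1) + 1) // 3             # even-length truncations of 0.101010...
    H = spectral_entropy_lex(n, k)
    print(n, H, None if prev is None else H - prev)
    prev = H
```

Consecutive truncations differ on an exponentially small fraction of inputs, so by the Lipschitz-type results of Section 5 the entropies form a rapidly converging sequence.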

###### Remark.

Similarly, it is possible to exactly compute the total influence and spectral entropy of for any rational . Indeed, every rational number has a recurrent binary representation, yielding linear equations in and .

Approximating and for an irrational can be done, with exponentially decreasing error, via writing as a limit of a sequence of dyadic rationals (e.g., truncated binary representations of ).

###### Remark.

In a certain sense, is the *simplest* infinite lexicographic function. Indeed, denote by the length of the recurring part in the binary expansion of a rational . We have if and only if is a dyadic rational. If is a dyadic multiple of for a positive odd integer (that is, we can write for co-prime positive integers and ), then , where is the multiplicative order of modulo . In particular, if and only if is a dyadic multiple of .

Recall that is the conjunction of two functions: and . By Fact 2.1, these are almost independent since the shared variable has exponentially small influence on .
