Improved Lower Bounds for the Fourier Entropy/Influence Conjecture via Lexicographic Functions

by Rani Hod et al.
Bar-Ilan University

Every Boolean function can be uniquely represented as a multilinear polynomial. The entropy and the total influence are two ways to measure the concentration of its Fourier coefficients, namely the monomial coefficients in this representation: the entropy roughly measures their spread, while the total influence measures their average level. The Fourier Entropy/Influence conjecture of Friedgut and Kalai from 1996 states that the entropy-to-influence ratio is bounded by a universal constant C. Using lexicographic Boolean functions, we present three explicit asymptotic constructions that improve upon the previously best known lower bound C > 6.278944 of O'Donnell and Tan, obtained via recursive composition. The first uses their construction with the lexicographic function ℓ⟨2/3⟩ of measure 2/3 to demonstrate that C ≥ 4 + (3/2)log₂3 > 6.377444. The second generalizes their construction to biased functions and obtains C > 6.413846 using ℓ⟨Φ⟩, where Φ is the inverse golden ratio. The third, independent, construction gives C > 6.454784, even for monotone functions. Beyond modest improvements to the value of C, our constructions shed some new light on the properties sought in potential counterexamples to the conjecture. Additionally, we prove a Lipschitz-type condition on the total influence and spectral entropy, which may be of independent interest.




1 Introduction

Let $n$ be a positive integer. Throughout this paper we write $[n] = \{1, 2, \ldots, n\}$ for an integer $n$. It is well known that any function $f\colon \{-1,1\}^n \to \mathbb{R}$ can be expressed as

$$f(x) = \sum_{S \subseteq [n]} \hat{f}(S)\,\chi_S(x),$$

where $\chi_S(x) = \prod_{i \in S} x_i$ for $S \subseteq [n]$ are the Fourier basis functions and

$$\hat{f}(S) = \mathbb{E}_x\left[f(x)\,\chi_S(x)\right]$$

for $S \subseteq [n]$ are called the Fourier coefficients of $f$. When $f$ is a Boolean function, i.e., $f\colon \{-1,1\}^n \to \{-1,1\}$, we have

$$\sum_{S \subseteq [n]} \hat{f}(S)^2 = 1$$

by Parseval, so we can treat the squared Fourier coefficients as a probability distribution

on the $2^n$ subsets of $[n]$, which we call the spectral distribution of $f$.

The following two parameters of a Boolean function can be defined in terms of its spectral distribution.

The total influence (also called average sensitivity) of a Boolean function $f$ is

$$I[f] = \sum_{S \subseteq [n]} |S|\,\hat{f}(S)^2.$$

The spectral entropy of a Boolean function $f$ is the (Shannon) entropy of its spectral distribution,

$$H[f] = \sum_{S \subseteq [n]} \hat{f}(S)^2 \log_2 \frac{1}{\hat{f}(S)^2}.$$
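Both parameters are easy to check by brute force on small functions. The sketch below (helper names are ours, not from the paper) computes the Fourier coefficients of a Boolean function given as a ±1-valued Python function and evaluates both parameters. For majority on three bits, the spectral weight is 1/4 on each of the four odd-size sets, giving $I = 3/2$ and $H = 2$.

```python
import itertools
import math

def fourier_coefficients(f, n):
    """Compute hat{f}(S) = E_x[f(x) chi_S(x)] for every S subseteq [n]."""
    coeffs = {}
    points = list(itertools.product((-1, 1), repeat=n))
    for bits in itertools.product((0, 1), repeat=n):
        S = frozenset(i for i in range(n) if bits[i])
        # chi_S(x) is the product of the coordinates indexed by S.
        coeffs[S] = sum(f(x) * math.prod(x[i] for i in S) for x in points) / 2 ** n
    return coeffs

def total_influence(coeffs):
    """I[f] = sum_S |S| * hat{f}(S)^2."""
    return sum(len(S) * c * c for S, c in coeffs.items())

def spectral_entropy(coeffs):
    """H[f] = sum_S hat{f}(S)^2 * log2(1 / hat{f}(S)^2)."""
    return sum(-c * c * math.log2(c * c) for c in coeffs.values() if c != 0)

# Example: majority on 3 bits.
maj3 = lambda x: 1 if sum(x) > 0 else -1
c = fourier_coefficients(maj3, 3)
print(total_influence(c), spectral_entropy(c))  # 1.5 2.0
```

Note that the entropy/influence ratio of majority on three bits is thus $2/(3/2) = 4/3$, far below the conjectured constant.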

In 1996 Friedgut and Kalai raised the following conjecture, known as the Fourier Entropy/Influence (FEI) conjecture:

Conjecture 1.1 ([4]).

There exists a universal constant $C$ such that for every Boolean function $f$ with total influence $I[f]$ and spectral entropy $H[f]$ we have $H[f] \le C \cdot I[f]$.

Conjecture 1.1 was verified for various families of Boolean functions (e.g., symmetric functions [10], random functions [3], read-once formulas [1, 9], decision trees of constant average depth [11], and read-$k$ decision trees for constant $k$ [11]), but it is still open for the class of general Boolean functions.

The rest of this paper is organized as follows. In the remainder of Section 1 we describe past results and some rudimentary improvements. In Section 2 we introduce lexicographic functions and provide a formal proof of the approach described in Section 1.3. In Section 3 we generalize Proposition 1.2 to biased functions and get an improved lower bound. In Section 4 we build a limit-of-limits function that achieves an even better bound. In Section 5 we prove a Lipschitz-type condition used throughout the paper, namely that a small change in a Boolean function cannot result in a substantial change to its total influence and spectral entropy.

1.1 A baby example and two definitions

Here is a simple example of deriving a lower bound on $C$. For every $n$, consider the function

It satisfies and (more precisely, for all ), so any constant in Conjecture 1.1 must satisfy

This is true for every $n$, so by taking $n \to \infty$ we establish that .


A Boolean function is called monotone if changing an input bit from $-1$ to $1$ cannot change the output from $1$ to $-1$.


A Boolean function is monotone if and only if it can be expressed as a formula combining variables using only conjunctions ($\wedge$) and disjunctions ($\vee$), with no negations.


Let $f$ be a Boolean function on $n$ variables. The dual function of $f$, denoted $f^{\dagger}$, is defined as $f^{\dagger}(x) = -f(-x)$.


For all $f$ we have $\left(f^{\dagger}\right)^{\dagger} = f$.


The spectral distributions of $f$ and $f^{\dagger}$ are identical; in particular, $I[f^{\dagger}] = I[f]$ and $H[f^{\dagger}] = H[f]$. But $\widehat{f^{\dagger}}(S) = (-1)^{|S|+1}\hat{f}(S)$ and $\Pr[f^{\dagger} = 1] = 1 - \Pr[f = 1]$.


If $f$ is monotone then $f^{\dagger}$ is monotone too. Furthermore, given a monotone formula computing $f$, the formula obtained by swapping conjunctions and disjunctions computes $f^{\dagger}$.


For example, the dual of $x_1 \wedge x_2$ is $x_1 \vee x_2$.
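Taking the definition $f^{\dagger}(x) = -f(-x)$ (our reading of the garbled definition above, consistent with the duality facts listed), these properties are one-liners to verify; `dual` is our helper name:

```python
import itertools

def dual(f):
    """Dual of a Boolean function in the +/-1 convention: f†(x) = -f(-x)."""
    return lambda x: -f(tuple(-xi for xi in x))

AND2 = lambda x: 1 if x[0] == 1 and x[1] == 1 else -1
OR2 = lambda x: 1 if x[0] == 1 or x[1] == 1 else -1

# The dual swaps conjunctions and disjunctions: (AND2)† = OR2.
dAND2 = dual(AND2)
assert all(dAND2(x) == OR2(x) for x in itertools.product((-1, 1), repeat=2))

# Duality is an involution: (f†)† = f.
ddAND2 = dual(dual(AND2))
assert all(ddAND2(x) == AND2(x) for x in itertools.product((-1, 1), repeat=2))
```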

1.2 Past results and preliminary improvements

The current best lower bound on $C$ was achieved by O'Donnell and Tan [9]. Using recursive composition, they showed the following bound:

Proposition 1.2.

Let $f$ be a balanced Boolean function such that $I[f] > 1$. Then any constant $C$ in Conjecture 1.1 satisfies


Any balanced Boolean function has $I[f] \ge 1$, since $I[f] = \sum_{S} |S|\,\hat{f}(S)^2 \ge \sum_{S \neq \emptyset} \hat{f}(S)^2 = 1$; in case of equality we must have $f = \pm x_i$ for some $i$, and thus the spectral distribution is supported on a single set and the spectral entropy is zero.

By presenting a function on variables with total influence and entropy , they established that $C > 6.278944$. Although the specific function presented in [9] happens to be biased, their result stands, as there exists a balanced Boolean function on 6 variables with the same total influence and entropy:

A slight improvement can be achieved by modifying the last clause of . Indeed,

is balanced too, with the same total influence and a slightly higher entropy , so we have .

Moving to balanced functions on 8 variables, we find a monotone function that provides a better lower bound:

with and yields .

A further search discovers a slightly superior function:

with and achieves .

1.3 Sequences of balanced monotone functions

Staring at , and for a moment (but not ), we may see a common property: appears in all clauses except the first. Let us rewrite and in a slightly different form:

This generalizes easily to a sequence of balanced (to be shown below) monotone Boolean functions:

whose first two members are

Denote by the lower bound on $C$ implied by . The first fifteen members of the sequence are explored in Table 1. Note how even the second member yields a much better bound than that of Subsection 1.1.

Table 1: Parameters of the sequence for $n = 1, \ldots, 15$; the $n$-th member is a function on $2n$ variables. (For $n = 1$ the entropy is 0 and the bound is not defined; the remaining numerical entries were not recovered from the source.)

The three sequences seem to be increasing and bounded, so let us denote their respective hypothetical limits by , and . If indeed for all then . A prescient guess for the value of could be

for which we would get

as a lower bound for . We will verify this guess in Section 2.

Recall that and gave rise to better lower bounds, respectively, than and . It is tempting perhaps to consider a generalization , define accordingly and examine the hypothetical limits , and . It is indeed possible to do so, and we get while , making . Nevertheless, and seem to converge towards the same and , respectively, so there is no real benefit in pursuing this further.

It remains to verify that is indeed balanced for all . Let us write it as

where is defined recursively via


The function belongs to a class of monotone Boolean functions called lexicographic functions, as we will see in Section 2.1.

For simplicity of notation, we abbreviate and write or even to denote . Since

to prove , it suffices to verify the following (see Appendix A for the calculation):

Claim 1.3.

For all we have .

2 A Tale of Two Thirds

Although each of the bounds from Table 1 is a valid, explicit lower bound on $C$, the asymptotic discussion in Subsection 1.3 was more wishful thinking than a mathematically sound statement.

In this section we explore the class of lexicographic functions, develop tools to compute total influence and spectral entropy, and then rigorously calculate , and .

2.1 Lexicographic functions


Fix integers $n \ge 1$ and $0 \le k \le 2^n$. Denote by $L^n_k$ the initial segment of cardinality $k$ (with respect to the lexicographic order on $\{-1,1\}^n$), and denote by $\ell^n_k$

its characteristic function


We have and .


The function $\ell^n_k$ is monotone and its dual is $\ell^n_{2^n - k}$.


and .


If $k$ is even then $\ell^n_k$ is isomorphic to $\ell^{n-1}_{k/2}$ (when the latter is extended from $n-1$ to $n$ variables by adding an influenceless variable).


Let $k$ be an odd integer, and let

be its binary representation, where is the most significant bit and is the least significant bit. Denote the corresponding representation of by .

By definition, to determine the value of  for an input , we need to compare  with  element by element. This gives a neat formula for :




The formula (1) shows that every monotone decision list, i.e., a monotone decision tree consisting of a single path, is isomorphic to a lexicographic function.

From (1) we derive an important property of lexicographic functions.

Fact 2.1.

For $1 \le j \le n$, the value of $\ell^n_k$ depends on more than its first $j$ input bits only with probability $2^{-j}$; that is, only when $x_i = b_i$ for all $i \le j$.


This can be interpreted as saying that the average decision tree complexity of $\ell^n_k$ is at most 2.
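Concretely, a lexicographic function can be evaluated by treating the input as an $n$-bit number. A minimal sketch (the encoding, with 1 preceding $-1$ in the lexicographic order, is our assumption, chosen so the function comes out monotone; `lex` is our name):

```python
import itertools

def lex(n, k):
    """Characteristic function of the initial segment of cardinality k in the
    lexicographic order on {-1,1}^n: +1 on the k lexicographically largest
    inputs, -1 elsewhere."""
    def f(x):
        # Read x as an n-bit number, with +1 as the digit 1.
        val = sum(2 ** (n - 1 - i) for i in range(n) if x[i] == 1)
        return 1 if val >= 2 ** n - k else -1
    return f

n, k = 4, 11
f = lex(n, k)

# Measure check: Pr[f = 1] = k / 2^n.
ones = sum(f(x) == 1 for x in itertools.product((-1, 1), repeat=n))
assert ones == k

# Monotonicity check: raising a bit to +1 never flips the output to -1.
for x in itertools.product((-1, 1), repeat=n):
    for i in range(n):
        y = x[:i] + (1,) + x[i + 1:]
        assert not (f(x) == 1 and f(y) == -1)
```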

We extend the definition of lexicographic functions by writing $\ell\langle\mu\rangle$ for $\mu = k/2^n$. Note that $k$ is not necessarily odd, so the effective number of variables can be smaller than $n$.


For any we have ; that is, .


For , we have . Observe that the binary representation of the odd integer has for and thus

that is, .

Fix $\mu$ and consider the sequence . Whenever $\mu$ is a dyadic rational (that is, a rational number of the form $m/2^t$), the sequence converges to a fixed function (e.g., in the example above). We would like to consider the limit object for other values of $\mu$ as well.

It may sound intimidating; after all, is a Boolean function on variables, which is quite a lot. Nevertheless, by Fact 2.1, only reads two input bits on average.

Moreover, we care about the total influence and spectral entropy of functions. By Lemmata 5.1 and 5.2 from Section 5, and . Indeed, differs from (when considering the latter as a function on variables by adding an influenceless variable) in at most one place, and thus and are Cauchy sequences.

Needless to say, .

An even stronger statement holds (but will not be used or proved here): the spectral distributions of converge in distribution to a limit distribution , which we call the spectral distribution of . Note that is supported on finite subsets of . The expected cardinality and the entropy of are and respectively.

2.2 Total influence and lexicographic functions

The edge isoperimetric inequality in the discrete cube (by Harper [5], with an addendum by Bernstein [2], and independently Lindsey [7]) gives a lower bound on the total influence of Boolean functions.

Theorem 2.2.

Let $f$ be a Boolean function with $\min\left(\Pr[f=1], \Pr[f=-1]\right) = \mu$. Then $I[f] \ge 2\mu \log_2(1/\mu)$.

In fact, they proved that lexicographic functions are the minimizers of total influence.

Theorem 2.3.

Fix integers $n \ge 1$ and $0 \le k \le 2^n$, and let $f$ be a Boolean function on $n$ variables with $\left|f^{-1}(1)\right| = k$. Then $I[f] \ge I[\ell^n_k]$.


Theorem 2.3 explains our interest in lexicographic functions: when seeking a function with a large entropy/influence ratio $H[f]/I[f]$, it makes sense to minimize $I[f]$.

In [6], Hart exactly computed the total influence of lexicographic functions:

Proposition 2.4 ([6, Theorem 1.5]).

Fix integers and . Then

where is the Hamming weight of .

Let us rephrase Proposition 2.4 a bit.

Claim 2.5.

Let $k/2^n = \sum_{i=1}^{m} 2^{-a_i}$, where $a_1 < a_2 < \cdots < a_m$ are the locations of the 1s in the binary representation of $k/2^n$. Then $I[\ell^n_k] = \sum_{i=1}^{m} \left(a_i - 2i + 2\right) 2^{1-a_i}$.


By induction on $m$. For details see Appendix A. ∎


For $\mu = 2^{-t}$ we get $I[\ell\langle 2^{-t}\rangle] = 2t \cdot 2^{-t}$, demonstrating the tightness of Theorem 2.2.

Corollary 2.6.

Let $\mu = \sum_{i \ge 1} 2^{-a_i}$, where $a_1 < a_2 < \cdots$ are the locations of the 1s in the binary representation of $\mu$ (to be read as a finite sum when $\mu$ is a dyadic rational). Then $I[\ell\langle\mu\rangle] = \sum_{i \ge 1} \left(a_i - 2i + 2\right) 2^{1-a_i}$.
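This closed form, as we read the partly garbled statement, can be cross-checked against a brute-force influence computation; all helper names below are ours:

```python
import itertools
from fractions import Fraction

def lex(n, k):
    """lex^n_k(x) = 1 iff x is among the k lexicographically largest inputs."""
    def f(x):
        val = sum(2 ** (n - 1 - i) for i in range(n) if x[i] == 1)
        return 1 if val >= 2 ** n - k else -1
    return f

def influence_brute(f, n):
    """I[f] = sum_i Pr_x[f(x) != f(x with bit i flipped)]."""
    pts = list(itertools.product((-1, 1), repeat=n))
    return sum(Fraction(sum(f(x) != f(x[:i] + (-x[i],) + x[i + 1:]) for x in pts),
                        2 ** n)
               for i in range(n))

def influence_closed_form(n, k):
    """sum_i (a_i - 2i + 2) 2^(1 - a_i), where a_1 < a_2 < ... are the
    positions of the 1s in the binary expansion of k / 2^n (position 1 is the
    most significant bit)."""
    ones = [a for a in range(1, n + 1) if (k >> (n - a)) & 1]
    return sum((a - 2 * i + 2) * Fraction(1, 2 ** (a - 1))
               for i, a in enumerate(ones, start=1))

# The two computations agree on a few sample points.
for n, k in [(2, 3), (3, 5), (4, 11), (4, 10)]:
    assert influence_brute(lex(n, k), n) == influence_closed_form(n, k)
assert influence_closed_form(4, 10) == Fraction(5, 4)  # mu = 5/8
```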

This leads to the following observation:


For any we have


For $\mu = 2/3 = \sum_{i \ge 1} 2^{-(2i-1)}$ we have $a_i = 2i - 1$, so the $i$-th term equals $2^{2-2i}$ and

hence $I[\ell\langle 2/3\rangle] = \sum_{i \ge 1} 2^{2-2i} = 4/3$. By duality we have $I[\ell\langle 1/3\rangle] = 4/3$ as well.


Compare the bound $\frac{2}{3}\log_2 3 \approx 1.057$ obtained for $\mu = 1/3$ from Theorem 2.2 to $I[\ell\langle 2/3\rangle] = 4/3$ computed above. In fact, Theorem 2.2 is only tight when $\mu$ is a power of two.

Four-thirds is actually the maximum influence attainable by any lexicographic function, as the following claim shows:

Claim 2.7.

For all $\mu \in [0,1]$ we have $I[\ell\langle\mu\rangle] \le 4/3$.


. Writing , we have for all . Moreover, we cannot have for all since . Denote by the minimal for which . Now, by Corollary 2.6,

where the full calculation is in Appendix A.

. Then

. Since is a continuous function of , it has a maximum in the closed interval , obtained at (if the maximum is attained multiple times, pick one arbitrarily). If for some then for we have

contradicting either the choice of or one of the two previous cases. ∎


We have $I[\ell\langle\mu\rangle] = 4/3$ for other values of $\mu$ besides $2/3$ and $1/3$, e.g.,

2.3 Disjoint composition

We now present the main tool we use to compute total influence and spectral entropy for our construction.


For two Boolean functions $f$ and $g$ on $n_f$ and $n_g$ variables, respectively, define the Boolean functions $f \wedge g$ and $f \vee g$ on $n_f + n_g$ variables (acting on disjoint sets of variables) as

and denote by $\mathrm{id}$ the one-variable identity function.


The class of functions built using $\wedge$, $\vee$, and the identity is called read-once monotone formulas. By (1), every lexicographic function is a read-once monotone formula.

As mentioned in the introduction, it was shown by [1, 9] that read-once formulas satisfy Conjecture 1.1 with the constant .


Let $h$ be the binary entropy function, defined by

$$h(p) = p \log_2 \frac{1}{p} + (1-p) \log_2 \frac{1}{1-p}$$

for $p \in (0,1)$ and $h(0) = h(1) = 0$. We also make extensive use of its variant


Both $h$ and its variant are symmetric about $p = 1/2$.

The following proposition is an easy corollary of [1, Lemmata 5.7 and 5.8]. Alternatively, it is a special case of Lemma 3.1 in Section 3, which is an adaptation of [9, Proposition 3.2].

Proposition 2.8.

Let and be Boolean functions and let for . Then



Via the De Morgan equality $f \vee g = \left(f^{\dagger} \wedge g^{\dagger}\right)^{\dagger}$, this also yields

Proposition 2.8 gets simplified significantly when one of the functions is balanced, using the following observation (see Appendix A for the calculation):

Claim 2.9.

Let . Then .

Corollary 2.10.

Let be a Boolean function and let . Then
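Although the statements of Proposition 2.8 and Corollary 2.10 are garbled above, the influence part of disjoint composition takes the standard product-measure form $I[f \wedge g] = \Pr[g{=}1]\,I[f] + \Pr[f{=}1]\,I[g]$ (our reconstruction: a variable of $f$ is pivotal for $f \wedge g$ exactly when it is pivotal for $f$ and $g = 1$). A quick numerical check, with all names ours:

```python
import itertools
from fractions import Fraction

def influence(f, n):
    """Total influence via I[f] = sum_i Pr_x[f(x) != f(x with bit i flipped)]."""
    pts = list(itertools.product((-1, 1), repeat=n))
    return sum(Fraction(sum(f(x) != f(x[:i] + (-x[i],) + x[i + 1:]) for x in pts),
                        2 ** n)
               for i in range(n))

def measure(f, n):
    """Pr[f = 1] under the uniform distribution."""
    pts = list(itertools.product((-1, 1), repeat=n))
    return Fraction(sum(f(x) == 1 for x in pts), 2 ** n)

f = lambda x: x[0]                                   # dictator on 1 variable
g = lambda y: 1 if y[0] == 1 and y[1] == 1 else -1   # AND on 2 variables

# Disjoint conjunction h(x, y) = f(x) AND g(y); here h is AND on 3 variables.
h = lambda z: 1 if f(z[:1]) == 1 and g(z[1:]) == 1 else -1

lhs = influence(h, 3)
rhs = influence(f, 1) * measure(g, 2) + influence(g, 2) * measure(f, 1)
assert lhs == rhs == Fraction(3, 4)
```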


2.4 A first lower bound

We could use Claim 2.5 to compute the total influence of , but we also need its spectral entropy, so we use its recursive definition and Corollary 2.10. Since we are interested in asymptotics, we prefer working directly with $\ell\langle 2/3\rangle$, which satisfies the “equation” .

We already know $I[\ell\langle 2/3\rangle] = 4/3$, whereas for the entropy we have

and we can solve for

Note that it is possible to fully compute the total influence of :

and to write an expression for its spectral entropy:

but it is far easier to use the exponentially fast convergence than to find an exact closed-form expression for .


Similarly, it is possible to exactly compute the total influence and spectral entropy of $\ell\langle\mu\rangle$ for any rational $\mu$. Indeed, every rational number has a recurring binary representation, yielding linear equations in $I[\ell\langle\mu\rangle]$ and $H[\ell\langle\mu\rangle]$.

Approximating $I[\ell\langle\mu\rangle]$ and $H[\ell\langle\mu\rangle]$ for an irrational $\mu$ can be done, with exponentially decreasing error, by writing $\mu$ as the limit of a sequence of dyadic rationals (e.g., truncated binary representations of $\mu$).
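The dyadic-truncation scheme can be simulated directly for the total influence. Below (helper names ours), the truncations $\lfloor 2^n \cdot \tfrac{2}{3} \rfloor / 2^n$ of $\mu = 2/3$ yield total influences converging exponentially fast to $4/3$:

```python
import itertools
from fractions import Fraction

def lex(n, k):
    """lex^n_k(x) = 1 iff x is among the k lexicographically largest inputs."""
    def f(x):
        val = sum(2 ** (n - 1 - i) for i in range(n) if x[i] == 1)
        return 1 if val >= 2 ** n - k else -1
    return f

def influence(f, n):
    """Total influence via I[f] = sum_i Pr_x[f(x) != f(x with bit i flipped)]."""
    pts = list(itertools.product((-1, 1), repeat=n))
    return sum(Fraction(sum(f(x) != f(x[:i] + (-x[i],) + x[i + 1:]) for x in pts),
                        2 ** n)
               for i in range(n))

# Dyadic truncations of mu = 2/3 = 0.101010..._2; the error decays like 4^(-n/2).
for n in (4, 6, 8, 10):
    k = (2 ** (n + 1)) // 3  # floor(2^n * 2/3)
    gap = abs(influence(lex(n, k), n) - Fraction(4, 3))
    assert gap < Fraction(1, 4 ** (n // 2 - 1))
```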


In a certain sense, is the simplest infinite lexicographic function. Indeed, denote by the length of the recurring part in the binary expansion of a rational . We have if and only if is a dyadic rational. If is a dyadic multiple of for a positive odd integer (that is, we can write for co-prime positive integers and ), then , where is the multiplicative order of modulo . In particular, if and only if is a dyadic multiple of .

Recall that is the conjunction of two functions: and . By Fact 2.1, these are almost independent since the shared variable has exponentially small influence on .

When considering the limit object , the dependence disappears and we have , so we can calculate its total influence and entropy using Proposition 2.8 (full details in Appendix A):