Limits on representing Boolean functions by linear combinations of simple functions: thresholds, ReLUs, and low-degree polynomials

02/26/2018
by R. Ryan Williams

We consider the problem of representing Boolean functions exactly by "sparse" linear combinations (over R) of functions from some "simple" class C. In particular, given C we are interested in finding low-complexity functions lacking sparse representations. When C is the set of PARITY functions or the set of conjunctions, this sort of problem has a well-understood answer; the problem becomes interesting when C is "overcomplete" and the set of functions is not linearly independent. We focus on the cases where C is the set of linear threshold functions, the set of rectified linear units (ReLUs), and the set of low-degree polynomials over a finite field, all of which are well-studied in different contexts. We provide generic tools for proving lower bounds on representations of this kind. Applying these, we give several new lower bounds for "semi-explicit" Boolean functions. For example, we show there are functions in nondeterministic quasi-polynomial time that require super-polynomial size:

∙ Depth-two neural networks with sign activation function (a special case of depth-two threshold circuit lower bounds).

∙ Depth-two neural networks with ReLU activation function.

∙ R-linear combinations of O(1)-degree F_p-polynomials, for every prime p (related to problems regarding Higher-Order "Uncertainty Principles"). We also obtain a function in E^NP requiring 2^Ω(n) linear combinations.

∙ R-linear combinations of ACC ∘ THR circuits of polynomial size (further generalizing the recent lower bounds of Murray and the author).

(The above is a shortened abstract. For the full abstract, see the paper.)
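To make the setup concrete, here is a minimal sketch (not from the paper) of the "well-understood" case where C is the set of PARITY functions: every Boolean function f : {0,1}^n → {-1,1} equals a unique R-linear combination of parities, its Fourier expansion, and the relevant "sparsity" is the number of nonzero coefficients. The function names `parity` and `fourier_expansion` below are illustrative, not taken from the source.

```python
# Illustrative sketch: exact representation of a Boolean function as an
# R-linear combination of PARITY functions (its Fourier expansion).
from itertools import product

def parity(S, x):
    """chi_S(x) = (-1)^{sum_{i in S} x_i}, the parity function on coordinates S."""
    return (-1) ** sum(x[i] for i in S)

def fourier_expansion(f, n):
    """Return {S: coefficient} such that f(x) = sum_S coef[S] * chi_S(x) exactly."""
    points = list(product((0, 1), repeat=n))
    subsets = [tuple(i for i in range(n) if (mask >> i) & 1) for mask in range(1 << n)]
    return {S: sum(f(x) * parity(S, x) for x in points) / 2 ** n for S in subsets}

# Example: AND on 3 bits, viewed as a {-1, +1}-valued function.
n = 3
f = lambda x: -1 if all(x) else 1          # +1 = "false", -1 = "true"
coeffs = fourier_expansion(f, n)
sparsity = sum(1 for c in coeffs.values() if abs(c) > 1e-9)
print(sparsity)                             # 8: AND has all 2^n parity coefficients nonzero

# Sanity check: the linear combination reproduces f exactly on every input.
for x in product((0, 1), repeat=n):
    value = sum(c * parity(S, x) for S, c in coeffs.items())
    assert abs(value - f(x)) < 1e-9
```

Because the 2^n parities are linearly independent, the coefficients (and hence the sparsity) are uniquely determined. For overcomplete classes such as linear threshold functions or ReLUs, no such canonical expansion exists, which is what makes the lower-bound problem studied in the paper harder.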
