Strong Memory Lower Bounds for Learning Natural Models

06/09/2022
by Gavin Brown et al.

We give lower bounds on the amount of memory required by one-pass streaming algorithms for solving several natural learning problems. In a setting where examples lie in {0,1}^d and the optimal classifier can be encoded using κ bits, we show that algorithms which learn using a near-minimal number of examples, Õ(κ), must use Ω̃(dκ) bits of space. Our space bounds match the dimension of the ambient space of the problem's natural parametrization, even when it is quadratic in the size of examples and of the final classifier. For instance, in the setting of d-sparse linear classifiers over degree-2 polynomial features, for which κ=Θ(d log d), our space lower bound is Ω̃(d^2). Our bounds degrade gracefully with the stream length N, generally taking the form Ω̃(dκ·κ/N). Bounds of the form Ω(dκ) were previously known for learning parity and other problems defined over finite fields, and bounds that apply only in a narrow range of sample sizes are known for linear regression. Ours are the first such bounds that hold across a large range of input sizes for problems of the type commonly seen in recent learning applications.
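To make the shape of these bounds concrete, the following sketch evaluates the Ω̃(dκ·κ/N) form (ignoring constants and logarithmic factors, which the Ω̃ notation hides) for the degree-2 polynomial example from the abstract. The encoding-size estimate κ ≈ d·log2(d^2) for a d-sparse classifier over roughly d^2 features is an illustrative assumption, not a formula taken from the paper.

```python
import math

def space_lower_bound_bits(d, kappa, N):
    """Illustrative evaluation of the d*kappa*kappa/N form of the
    space lower bound; constants and log factors are omitted."""
    return d * kappa * kappa / N

# Example: d-sparse linear classifiers over degree-2 polynomial features.
# There are roughly d^2 monomial features, so naming d of them takes about
# d * log2(d^2) bits, giving kappa = Theta(d log d) (assumed estimate).
d = 1024
kappa = d * math.log2(d ** 2)   # ~ 2 d log d bits

# With a near-minimal sample size N on the order of kappa, the bound
# simplifies to ~ d * kappa, i.e. Omega-tilde(d^2) for this problem.
N = kappa
bound = space_lower_bound_bits(d, kappa, N)
print(bound / d ** 2)   # a modest constant/log-factor multiple of d^2
```

Note how setting N = κ collapses the bound to d·κ, matching the Ω̃(d^2) claim, while larger streams N weaken it proportionally, which is the "graceful degradation" the abstract describes.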


Related research:
- Near-Quadratic Lower Bounds for Two-Pass Graph Streaming Algorithms (09/02/2020)
- Space lower bounds for linear prediction (02/09/2019)
- Lower bounds for the number of random bits in Monte Carlo algorithms (12/23/2020)
- Polynomial Pass Lower Bounds for Graph Streaming Algorithms (04/09/2019)
- When is Memorization of Irrelevant Training Data Necessary for High-Accuracy Learning? (12/11/2020)
- Memory-Sample Tradeoffs for Linear Regression with Small Error (04/18/2019)
- Memory-Sample Lower Bounds for Learning Parity with Noise (07/05/2021)
