Space lower bounds for linear prediction

02/09/2019
by Yuval Dagan, et al.

We show that fundamental learning tasks, such as finding an approximate linear separator or performing linear regression, require memory at least quadratic in the dimension when solved in a natural streaming setting. This implies that such problems cannot be solved (at least in this setting) by scalable, memory-efficient streaming algorithms. Our results build on a memory lower bound for a simple linear-algebraic problem -- finding orthogonal vectors -- and utilize estimates on the packing numbers of the Grassmannian, the manifold of all linear subspaces of a fixed dimension.
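To make the quadratic bound concrete, here is a hedged reading (a sketch consistent with the abstract, not the paper's exact statement; the notation mem(A), G_{d,k}, P, and c below is introduced here for illustration). Write $\mathrm{mem}(\mathcal{A})$ for the number of bits of state a one-pass streaming algorithm $\mathcal{A}$ maintains over a stream of examples in $\mathbb{R}^d$; the claim is that any $\mathcal{A}$ solving these tasks satisfies

\[
\mathrm{mem}(\mathcal{A}) \;=\; \Omega(d^{2}).
\]

The Grassmannian packing estimates mentioned are presumably of the standard form: writing $G_{d,k}$ for the manifold of $k$-dimensional subspaces of $\mathbb{R}^{d}$ and $P(G_{d,k}, \varepsilon)$ for its $\varepsilon$-packing number (with respect to a natural metric, e.g. the projection distance),

\[
P(G_{d,k}, \varepsilon) \;\ge\; \Bigl(\frac{c}{\varepsilon}\Bigr)^{k(d-k)}
\]

for an absolute constant $c > 0$ and $\varepsilon \in (0,1)$. Roughly, an algorithm whose output must distinguish the subspaces in such a packing needs $\Omega(\log P) = \Omega\bigl(k(d-k)\log(1/\varepsilon)\bigr)$ bits of memory, which for $k = \Theta(d)$ is quadratic in $d$.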

