Training Neural Networks is NP-Hard in Fixed Dimension

03/29/2023
by Vincent Froese et al.

We study the parameterized complexity of training two-layer neural networks with respect to the dimension of the input data and the number of hidden neurons, considering ReLU and linear threshold activation functions. Although the computational complexity of these problems has been studied numerous times in recent years, several questions remain open. We answer questions by Arora et al. [ICLR '18] and Khalife and Basu [IPCO '22] by showing that both problems are NP-hard for two dimensions, which excludes any polynomial-time algorithm for constant dimension (unless P = NP). We also answer a question by Froese et al. [JAIR '22] by proving W[1]-hardness for four ReLUs (or two linear threshold neurons) with zero training error. Finally, in the ReLU case, we show fixed-parameter tractability for the combined parameter number of dimensions and number of ReLUs if the network is assumed to compute a convex map. Our results settle the complexity status regarding these parameters almost completely.
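For context, the decision problem behind these hardness results can be stated roughly as follows; this is a minimal sketch assuming squared loss and output weights restricted to plus or minus one, conventions that vary across the cited papers and are not fixed by this abstract.

% Sketch of the two-layer ReLU training decision problem.
% Assumed conventions (not taken verbatim from this abstract):
% squared loss, output weights $a_j \in \{-1,+1\}$.
Given $(x_1, y_1), \dots, (x_n, y_n) \in \mathbb{R}^d \times \mathbb{R}$,
$k \in \mathbb{N}$, and $\gamma \geq 0$, decide whether there exist
$w_j \in \mathbb{R}^d$, $b_j \in \mathbb{R}$, and $a_j \in \{-1, +1\}$ such that
\[
  \sum_{i=1}^{n} \biggl( \sum_{j=1}^{k} a_j \max\{0,\, \langle w_j, x_i \rangle + b_j\} - y_i \biggr)^{2} \leq \gamma .
\]

In this notation, zero training error corresponds to $\gamma = 0$ (the exact-fit setting of the W[1]-hardness result), and the convex-map restriction in the fixed-parameter tractability result is satisfied, for instance, when all output weights $a_j$ equal $+1$, since a nonnegative combination of ReLUs of affine functions is convex.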

Related research

- Complexity of Training ReLU Neural Network (09/27/2018): In this paper, we explore some basic questions on the complexity of trai...
- Travelling on Graphs with Small Highway Dimension (02/19/2019): We study the Travelling Salesperson (TSP) and the Steiner Tree problem (...
- The Computational Complexity of Training ReLU(s) (10/09/2018): We consider the computational complexity of training depth-2 neural netw...
- Polynomial Time Approximation Schemes for Clustering in Low Highway Dimension Graphs (06/23/2020): We study clustering problems such as k-Median, k-Means, and Facility Loc...
- On the complexity of switching linear regression (10/23/2015): This technical note extends recent results on the computational complexi...
- Computational Separations between Sampling and Optimization (11/05/2019): Two commonly arising computational tasks in Bayesian learning are Optimi...
- The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality (05/18/2021): Understanding the computational complexity of training simple neural net...
