On the Bounds of Function Approximations

08/26/2019
by Adrian de Wynter, et al.

Within machine learning, the subfield of Neural Architecture Search (NAS) has recently garnered research attention due to its ability to improve upon human-designed models. However, the computational requirements for finding an exact solution to this problem are often intractable, and the design of the search space still requires manual intervention. In this paper we attempt to establish a formalized framework from which we can better understand the computational bounds of NAS in relation to its search space. To this end, we first reformulate the function approximation problem in terms of sequences of functions, calling it the Function Approximation (FA) problem; we then show that it is computationally infeasible to devise a procedure that solves FA for all functions to zero error, regardless of the search space. We also show that this error is minimized when a specific class of functions is present in the search space. Subsequently, we show that machine learning as a mathematical problem is a solution strategy for FA, albeit not an effective one, and describe a stronger version of this approach: the Approximate Architectural Search Problem (a-ASP), which is the mathematical equivalent of NAS. We leverage the framework of this paper and results from the literature to describe the conditions under which a-ASP can potentially solve FA as well as an exhaustive search, but in polynomial time.
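As a rough, non-authoritative sketch of the setup the abstract describes (the notation below is illustrative and not taken from the paper itself), the reformulation in terms of sequences of functions can be read as:

```latex
% Illustrative formalization (assumed notation, not the paper's own):
% given a target function f and a search space \mathcal{S} of candidate
% functions, FA asks for a sequence drawn from \mathcal{S} whose
% approximation error with respect to f vanishes in the limit.
\[
  \mathrm{FA}(f, \mathcal{S}) :\quad
  \text{find } \{g_i\}_{i \in \mathbb{N}} \subseteq \mathcal{S}
  \quad \text{such that} \quad
  \lim_{i \to \infty} \lVert f - g_i \rVert = 0 .
\]
```

Under this reading, the abstract's first result says that no single procedure can achieve zero error for every target $f$, regardless of how $\mathcal{S}$ is chosen, while the error can still be made minimal when a specific class of functions belongs to $\mathcal{S}$.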


