Adaptive Neural Networks Using Residual Fitting

01/13/2023
by Noah Ford, et al.

Current approaches to estimating the required neural-network size for a given problem class rely on techniques that can be computationally intensive, such as neural-architecture search and pruning. In contrast, methods that add capacity to a neural network as needed may match the results of architecture search and pruning while requiring far less computation to find an appropriate network size. Here, we present a network-growth method that searches for explainable error in the network's residuals and grows the network if sufficient error is detected. We demonstrate this method on examples from classification, imitation learning, and reinforcement learning. On these tasks, the growing network can often achieve better performance than small networks that do not grow, and performance similar to networks that begin much larger.
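A minimal sketch of this growth rule, under assumptions of our own, might look like the following (Python/PyTorch). The block width, the explained-variance threshold, and the choice to grow by keeping the fitted probe as a new additive block are illustrative assumptions, not the paper's exact procedure.

import torch
import torch.nn as nn

class GrowingRegressor(nn.Module):
    """Network that grows by fitting small probes to its own residuals."""

    def __init__(self, in_dim, out_dim, width=16):
        super().__init__()
        self.in_dim, self.out_dim, self.width = in_dim, out_dim, width
        self.blocks = nn.ModuleList([self._new_block()])

    def _new_block(self):
        return nn.Sequential(
            nn.Linear(self.in_dim, self.width), nn.ReLU(),
            nn.Linear(self.width, self.out_dim))

    def forward(self, x):
        # Outputs are summed, so each added block models what the
        # earlier blocks left unexplained.
        return sum(block(x) for block in self.blocks)

    def maybe_grow(self, x, y, threshold=0.1, steps=200, lr=1e-2):
        """Fit a probe to the current residuals; keep it only if it
        explains more than `threshold` of the residual variance.
        (Threshold and training budget are assumed values.)"""
        with torch.no_grad():
            residual = y - self(x)
        probe = self._new_block()
        opt = torch.optim.Adam(probe.parameters(), lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            loss = nn.functional.mse_loss(probe(x), residual)
            loss.backward()
            opt.step()
        # Fraction of residual variance the probe explains, estimated
        # from the last training loss.
        explained = 1.0 - loss.item() / residual.var().item()
        if explained > threshold:
            self.blocks.append(probe)  # grow: keep the probe as a new block
            return True
        return False

A training loop would presumably alternate between fitting the current blocks and calling maybe_grow on held-out data; once maybe_grow stops accepting probes, no further explainable error is detected and the network size stabilizes.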
