Overcoming The Limitations of Neural Networks in Composite-Pattern Learning with Architopes

10/29/2020
by Anastasis Kratsios, et al.

The effectiveness of neural networks in solving complex problems is well recognized; however, little is known about their limitations. We demonstrate that the feed-forward architecture, for most commonly used activation functions, is incapable of approximating functions composed of multiple sub-patterns while simultaneously respecting their composite-pattern structure. We overcome this bottleneck with a simple architecture modification that reallocates the neurons of any single feed-forward network across several smaller sub-networks, each specialized on a distinct part of the input space. The modified architecture, called an Architope, is more expressive on two fronts. First, it is dense in an associated space of piecewise-continuous functions in which the feed-forward architecture is not dense. Second, it achieves the same approximation rate as the feed-forward architecture while requiring 𝒪(N^-1) fewer parameters in its hidden layers. Moreover, the architecture achieves these approximation improvements while preserving the target's composite-pattern structure.
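The reallocation idea described above can be sketched as follows: partition the input space, assign each part its own small sub-network, and route each input to the sub-network owning its part. This is a minimal illustrative sketch, not the paper's implementation; the partition of [0, 1) into three intervals, the sub-network sizes, and the random (untrained) weights are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(in_dim, hidden, out_dim):
    """Build a one-hidden-layer ReLU network with random (untrained) weights."""
    W1 = rng.standard_normal((hidden, in_dim))
    b1 = rng.standard_normal(hidden)
    W2 = rng.standard_normal((out_dim, hidden))
    b2 = rng.standard_normal(out_dim)

    def forward(x):
        h = np.maximum(W1 @ x + b1, 0.0)  # ReLU hidden layer
        return W2 @ h + b2

    return forward

# Hypothetical partition of [0, 1) into three sub-intervals,
# with one small sub-network specialized on each part.
boundaries = np.array([1 / 3, 2 / 3])
subnets = [make_mlp(in_dim=1, hidden=4, out_dim=1) for _ in range(3)]

def architope(x):
    """Route the input to the sub-network owning its partition cell."""
    part = int(np.searchsorted(boundaries, x[0], side="right"))
    return subnets[part](x)

y = architope(np.array([0.5]))  # handled by the middle sub-network
```

Because the routing is a hard assignment on a fixed partition, the output on each part comes from exactly one sub-network, which is how the piecewise (composite-pattern) structure of the target is preserved by construction.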


