Neural Network Processing Neural Networks: An efficient way to learn higher order functions

11/06/2019
by Firat Tuna, et al.

Functions are rich in meaning and can be interpreted in a variety of ways. Neural networks have been proven capable of approximating a large class of functions [1]. In this paper, we propose a new class of neural networks, called "Neural Network Processing Neural Networks" (NNPNNs), which take neural networks as inputs in addition to numerical values, rather than numerical values alone. This enables neural networks to represent and process rich structures.
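The paper's architecture is not given in this abstract, but the core idea of a network that inputs another network can be sketched minimally: serialise an "inner" network's parameters into a flat vector and feed that vector, concatenated with an ordinary numeric input, into a "processor" network. All names and sizes below are hypothetical assumptions for illustration, not the authors' design.

```python
import numpy as np

# Hypothetical sketch (not the paper's actual method): a "processor" network
# whose input is another network's parameters plus a numeric value.

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    """Random weights and zero biases for a small MLP."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Plain tanh MLP forward pass with a linear final layer."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

def flatten(params):
    """Serialise a network's parameters into one flat vector."""
    return np.concatenate([a.ravel() for W, b in params for a in (W, b)])

# An "inner" network that will itself serve as an input.
inner = init_mlp([2, 4, 1], rng)
theta = flatten(inner)          # 2*4 + 4 + 4*1 + 1 = 17 parameters

# The processor inputs [theta, x]: a neural network plus a numerical value.
x = np.array([0.5])
processor = init_mlp([theta.size + 1, 8, 1], rng)
y = forward(processor, np.concatenate([theta, x]))
print(y.shape)  # (1,)
```

Flattening is only one possible encoding of the inner network; the paper may well use a richer representation, which this sketch does not attempt to reproduce.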


Related research

- Some negative results for Neural Networks (10/23/2018): We demonstrate some negative results for approximation of functions with...
- Equinox: neural networks in JAX via callable PyTrees and filtered transformations (10/30/2021): JAX and PyTorch are two popular Python autodifferentiation frameworks. J...
- On the approximation of rough functions with deep neural networks (12/13/2019): Deep neural networks and the ENO procedure are both efficient frameworks...
- Investigating the Optimal Neural Network Parameters for Decoding (04/20/2022): Neural Networks have been proved to work as decoders in telecommunicatio...
- Data-driven emergence of convolutional structure in neural networks (02/01/2022): Exploiting data invariances is crucial for efficient learning in both ar...
- Higher-Order Attention Networks (06/01/2022): This paper introduces higher-order attention networks (HOANs), a novel c...
- Optimal Approximation with Sparse Neural Networks and Applications (08/14/2021): We use deep sparsely connected neural networks to measure the complexity...
