A Geometric Framework for Convolutional Neural Networks

08/15/2016
by Anthony L. Caterini, et al.

This paper proposes a geometric framework for neural networks. The framework uses the inner product space structure underlying the parameter set to perform gradient descent not component by component, but in a coordinate-free manner. Convolutional neural networks are described compactly in this framework, and the gradients of standard and higher-order loss functions are calculated for each layer of the network. The approach can be applied to other network structures and provides a basis on which to create new networks.
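As a sketch of the coordinate-free gradient notion the abstract refers to (the notation here is assumed for illustration, not taken from the paper): if the parameter set carries an inner product space structure $(\mathcal{F}, \langle \cdot, \cdot \rangle)$, the gradient of a loss $L$ at a parameter point $\theta$ can be defined as the unique element representing the directional derivative, with no reference to components:

```latex
% Coordinate-free gradient: \nabla L(\theta) is the Riesz representative
% of the directional derivative DL(\theta)[\cdot] in the parameter space.
\[
  DL(\theta)[v] \;=\; \langle \nabla L(\theta),\, v \rangle
  \qquad \text{for all } v \in \mathcal{F},
\]
% so a gradient-descent step with learning rate \eta reads
\[
  \theta \;\leftarrow\; \theta \;-\; \eta\, \nabla L(\theta).
\]
```

Under this definition the update is independent of any particular coordinatization of the parameters, which is the sense in which the descent is "coordinate-free."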


Related research

A Novel Representation of Neural Networks (10/05/2016)
Deep Neural Networks (DNNs) have become very popular for prediction in m...

The Outer Product Structure of Neural Network Derivatives (10/09/2018)
In this paper, we show that feedforward and recurrent neural networks ex...

A Geometric Approach of Gradient Descent Algorithms in Neural Networks (11/08/2018)
In this article we present a geometric framework to analyze convergence ...

Convolutional neural networks with fractional order gradient method (05/14/2019)
This paper proposes a fractional order gradient method for the backward ...

Nonparametric Topological Layers in Neural Networks (11/27/2021)
Various topological techniques and tools have been applied to neural net...

PR Product: A Substitute for Inner Product in Neural Networks (04/30/2019)
In this paper, we analyze the inner product of weight vector and input v...

Learning in Gated Neural Networks (06/06/2019)
Gating is a key feature in modern neural networks including LSTMs, GRUs ...
