PR Product: A Substitute for Inner Product in Neural Networks

04/30/2019
by Zhennan Wang, et al.

In this paper, we analyze the inner product of the weight vector and the input vector in neural networks from the perspective of vector orthogonal decomposition, and prove that the local direction gradient of the weight vector decreases as the angle between the two vectors approaches 0 or π. We propose the PR Product, a substitute for the inner product, which makes the local direction gradient of the weight vector independent of the angle and consistently larger than that of the conventional inner product, while keeping the forward propagation identical. As a basic operation in neural networks, the PR Product can be applied to many existing deep learning modules, so we develop PR Product versions of the fully connected layer, the convolutional layer, and the LSTM layer. In static image classification, experiments on the CIFAR10 and CIFAR100 datasets demonstrate that the PR Product robustly enhances the ability of various state-of-the-art classification networks. On the task of image captioning, even without any bells and whistles, our PR Product version of the captioning model competes with or outperforms the state-of-the-art models on the MS COCO dataset.
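The abstract does not give the PR Product formula itself, but the two stated properties (forward pass identical to the inner product; local direction gradient of constant magnitude, independent of the angle) can be realized with stop-gradient coefficients. The sketch below is an illustrative construction consistent with those claims, not necessarily the paper's exact definition: writing the inner product as ||w||·||x||·cos θ, the term cos θ is replaced by s·cos θ − c·|sin θ| + c, where s = |sin θ| and c = cos θ are treated as detached constants during differentiation.

```python
import math

def pr_local(theta):
    """Forward value and local angle gradient of a PR-style product term.

    s and c play the role of stop-gradient (detached) coefficients: they
    equal |sin(theta)| and cos(theta) numerically, but are held constant
    when differentiating with respect to theta.
    """
    s, c = abs(math.sin(theta)), math.cos(theta)
    # Forward: s*cos(theta) - c*|sin(theta)| + c collapses to cos(theta),
    # so the forward propagation is identical to the inner product.
    forward = s * math.cos(theta) - c * abs(math.sin(theta)) + c
    # Local gradient, holding s and c fixed:
    #   d/dtheta [s*cos(theta) - c*|sin(theta)| + c]
    # = -s*sin(theta) - c*sign(sin(theta))*cos(theta)
    # = -sign(sin(theta)) * (sin^2 + cos^2) = -sign(sin(theta)),
    # i.e. magnitude 1 regardless of theta, versus |sin(theta)| <= 1
    # for the conventional d/dtheta cos(theta) = -sin(theta).
    sgn = math.copysign(1.0, math.sin(theta))
    grad = -s * math.sin(theta) - c * sgn * math.cos(theta)
    return forward, grad

for theta in (0.1, 1.0, 3.0):
    f, g = pr_local(theta)
    print(f - math.cos(theta), abs(g), abs(-math.sin(theta)))
```

Running the loop shows the forward value matching cos θ to machine precision while the local gradient magnitude stays at 1 for every angle, which is exactly the behavior the abstract attributes to the PR Product.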


