Backpropagation with N-D Vector-Valued Neurons Using Arbitrary Bilinear Products

05/24/2018
by Zhe-Cheng Fan, et al.

Vector-valued neural learning has recently emerged as a promising direction in deep learning. Traditionally, the inputs to neural networks (NNs) are formulated as vectors of scalars; however, this representation may be suboptimal, since associations among adjacent scalars are not modeled. In this paper, we propose a new vector neural architecture called the Arbitrary BIlinear Product Neural Network (ABIPNN), which processes information as vectors in each neuron and defines the feedforward projections using arbitrary bilinear products. Such bilinear products include circular convolution, the seven-dimensional vector product, skew circular convolution, reversed-time circular convolution, and other new products not seen in previous work. As a proof of concept, we apply the proposed network to multispectral image denoising and singing voice separation. Experimental results show that ABIPNN achieves substantial improvements over conventional NNs, suggesting that such associations are learned during training.
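
To make the feedforward rule concrete, below is a minimal sketch (not the authors' released code) of one such vector-valued layer, using circular convolution as the bilinear product. All names, shapes, and the choice of a componentwise tanh nonlinearity are illustrative assumptions; swapping in a different bilinear product changes only the product function.

import numpy as np

def circular_convolution(w, x):
    # Bilinear product w (*) x via circular convolution, computed with FFTs.
    return np.real(np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)))

def bilinear_layer(X, W, b):
    # Forward pass of one vector-valued layer (shapes are assumptions):
    #   X: (num_in, n)          -- num_in input neurons, each an n-D vector
    #   W: (num_out, num_in, n) -- one vector weight per connection
    #   b: (num_out, n)         -- one vector bias per output neuron
    # Returns (num_out, n) activations.
    num_out, n = W.shape[0], X.shape[1]
    Y = np.empty((num_out, n))
    for j in range(num_out):
        # Sum the bilinear products over all incoming connections.
        Y[j] = sum(circular_convolution(W[j, i], X[i])
                   for i in range(X.shape[0])) + b[j]
    return np.tanh(Y)  # nonlinearity applied per vector component

# Example: 4 input neurons, 3 output neurons, 8-D vector signals.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
W = rng.standard_normal((3, 4, 8)) * 0.1
b = np.zeros((3, 8))
print(bilinear_layer(X, W, b).shape)  # (3, 8)

Because the product is bilinear in both the weight vector and the input vector, the usual backpropagation machinery still applies; only circular_convolution would need to be replaced (e.g., by skew circular convolution) to realize a different instance of the architecture.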
