Energy Saving Additive Neural Network

02/09/2017
by Arman Afrasiyabi, et al.

In recent years, machine learning techniques based on neural networks have become increasingly popular in mobile computing. Classical multi-layer neural networks require matrix multiplications at each stage. Multiplication is not an energy-efficient operation and consequently drains the battery of the mobile device. In this paper, we propose a new energy-efficient neural network with the universal approximation property over the space of Lebesgue integrable functions. This network, called the additive neural network, is well suited for mobile computing. The neural structure is based on a novel vector product definition, called the ef-operator, that permits a multiplier-free implementation. In the ef-operation, the "product" of two real numbers is defined as the sum of their absolute values, with the sign determined by the sign of the product of the numbers. This "product" is used to construct a vector product in R^N. The vector product induces the l_1 norm. The proposed additive neural network successfully solves the XOR problem. Experiments on the MNIST dataset show that the classification performance of the proposed additive neural networks is very similar to that of the corresponding multi-layer perceptron and convolutional neural networks (LeNet).
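As a rough illustration (not code from the paper), the sketch below implements the scalar ef-operation and the induced vector "product" in NumPy, following the description in the abstract. The names ef_product and ef_dot are hypothetical, and the sign is written here as an ordinary sign comparison rather than the bit-level, multiplier-free form a hardware implementation would use. Applying the vector product of x with itself gives 2 * ||x||_1, which is the sense in which the operator induces the l_1 norm.

```python
import numpy as np

def ef_product(a, b):
    # Scalar ef-operation: the "product" of a and b is the sum of their
    # absolute values, carrying the sign of the ordinary product a*b.
    # (In hardware the sign bit can be obtained without a multiplier.)
    return np.sign(a) * np.sign(b) * (np.abs(a) + np.abs(b))

def ef_dot(x, y):
    # Vector "product" in R^N: accumulate the element-wise ef-operations.
    return np.sum(ef_product(np.asarray(x), np.asarray(y)))

x = np.array([1.0, -2.0, 3.0])
print(ef_dot(x, x))              # 12.0, i.e. 2 * ||x||_1
print(2 * np.linalg.norm(x, 1))  # 12.0, so the operator induces the l_1 norm
```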
