Rethinking Arithmetic for Deep Neural Networks

05/07/2019
by George A. Constantinides, et al.

We consider efficiency in deep neural networks. Hardware accelerators are gaining interest as machine learning becomes one of the drivers of high-performance computing. In these accelerators, the directed graph describing a neural network can be implemented as a directed graph describing a Boolean circuit. We make this observation precise, leading naturally to an understanding of practical neural networks as discrete functions, and show that so-called binarised neural networks are functionally complete. In general, our results suggest that it is valuable to consider Boolean circuits as neural networks, leading to the question of which circuit topologies are promising. We argue that continuity is central to generalisation in learning, explore the interaction between data coding, network topology, and node functionality for continuity, and pose some open questions for future research. As a first step to bridging the gap between continuous and Boolean views of neural network accelerators, we present some recent results from our work on LUTNet, a novel Field-Programmable Gate Array inference approach. Finally, we conclude with additional possible fruitful avenues for research bridging the continuous and discrete views of neural networks.
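To make the functional-completeness claim concrete, the sketch below (an illustration written for this summary, not code from the paper) shows a single binarised neuron with weights and activations in {-1, +1} computing NAND. Since NAND is itself functionally complete, any Boolean circuit can in principle be assembled from such neurons. The function names `bnn_neuron` and `nand` are our own, chosen for illustration.

```python
# A binarised neuron: inputs and weights take values in {-1, +1};
# the activation is the sign of the weighted sum plus bias.
def bnn_neuron(x, w, b):
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if s >= 0 else -1

# Encode Boolean values as False -> -1, True -> +1, then read the
# neuron's +/-1 output back as a Boolean.
def nand(a, b):
    x = (1 if a else -1, 1 if b else -1)
    return bnn_neuron(x, w=(-1, -1), b=1) == 1

# The neuron reproduces the NAND truth table: only (True, True) -> False.
assert [nand(a, b) for a in (False, True) for b in (False, True)] \
       == [True, True, True, False]
```

With weights (-1, -1) and bias +1, the weighted sum is negative only when both inputs are +1, which is exactly the NAND behaviour; this is the sense in which a directed graph of binarised neurons can be read directly as a Boolean circuit.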


Related research

- TRIM: A Design Space Exploration Model for Deep Neural Networks Inference and Training Accelerators (05/18/2021)
- Deep Neural Network Approximation for Custom Hardware: Where We've Been, Where We're Going (01/21/2019)
- Design Considerations for Efficient Deep Neural Networks on Processing-in-Memory Accelerators (12/18/2019)
- Learning Boolean Circuits with Neural Networks (10/25/2019)
- Descriptive complexity for neural networks via Boolean networks (08/01/2023)
- A Gradient Estimator for Time-Varying Electrical Networks with Non-Linear Dissipation (03/09/2021)
- Boolean learning under noise-perturbations in hardware neural networks (03/27/2020)
