EuclidNets: An Alternative Operation for Efficient Inference of Deep Learning Models

12/22/2022
by   Xinlin Li, et al.

With the advent of deep learning applications on edge devices, researchers actively strive to optimize deployment on low-power, memory-constrained hardware. Established compression methods such as quantization, pruning, and architecture search leverage commodity hardware. Apart from these conventional compression algorithms, one may redesign the operations of a deep learning model so that it admits a more efficient implementation. To this end, we propose EuclidNet, a compression method designed for hardware implementation, which replaces the multiplication xw with the squared Euclidean distance (x-w)^2. We show that EuclidNet is aligned with matrix multiplication and can serve as a measure of similarity in convolutional layers. Furthermore, we show that under various transformations and noise scenarios, EuclidNet exhibits performance comparable to that of deep learning models built on multiplication operations.
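As a rough illustration of the idea described in the abstract, the sketch below contrasts a standard multiplication-based fully connected layer with one whose multiply-accumulate is replaced by a squared-Euclidean-distance accumulation: each product x*w becomes (x-w)^2, so the dot product x.w becomes the distance ||x-w||^2. This is a minimal NumPy sketch under that reading of the abstract; the function names are hypothetical and the actual EuclidNet formulation may apply a sign flip or scaling so the distance acts as a similarity score, along with hardware-specific details not shown here.

import numpy as np

def euclid_linear(x, W):
    """Fully connected layer with multiplication replaced by squared
    Euclidean distance (hypothetical sketch, not the paper's code).

    x : (batch, in_features)
    W : (out_features, in_features)
    returns (batch, out_features) with
        out[b, j] = sum_i (x[b, i] - W[j, i])**2
    """
    # Broadcast to (batch, out_features, in_features), then accumulate
    # squared differences instead of products.
    diff = x[:, None, :] - W[None, :, :]
    return np.sum(diff * diff, axis=-1)

def matmul_linear(x, W):
    """Standard multiplication-based layer for comparison: out = x @ W.T"""
    return x @ W.T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((4, 8))
    W = rng.standard_normal((16, 8))
    print(euclid_linear(x, W).shape)   # (4, 16)
    print(matmul_linear(x, W).shape)   # (4, 16)

Note that ||x-w||^2 = ||x||^2 - 2 x.w + ||w||^2, so the distance-based output differs from the negated dot product only by norm terms; this identity is one way to read the abstract's claim that EuclidNet is "aligned with matrix multiplication."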

