
EuclidNets: An Alternative Operation for Efficient Inference of Deep Learning Models

12/22/2022
by Xinlin Li, et al.
HUAWEI Technologies Co., Ltd., Microsoft, McGill University

With the advent of deep learning applications on edge devices, researchers are actively working to optimize deployment on low-power, memory-constrained hardware. Established compression methods such as quantization, pruning, and architecture search leverage commodity hardware. Beyond these conventional compression algorithms, one may also redesign the operations of a deep learning model to obtain a more efficient implementation. To this end, we propose EuclidNet, a compression method designed for hardware implementation that replaces the multiplication xw with the squared Euclidean distance (x-w)^2. We show that EuclidNet is aligned with matrix multiplication and can serve as a measure of similarity in convolutional layers. Furthermore, we show that under various transformations and noise scenarios, EuclidNet matches the performance of deep learning models built on multiplication operations.

