Lazy Evaluation of Convolutional Filters

05/27/2016
by Sam Leroux, et al.

In this paper we propose a technique that avoids the evaluation of certain convolutional filters in a deep neural network. This makes it possible to trade off the accuracy of a deep neural network against its computational and memory requirements, which is especially important on constrained devices that cannot hold all of the network's weights in memory.
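
The abstract does not spell out how filters are chosen for skipping, so the sketch below only illustrates the general idea in PyTorch: evaluate a subset of a layer's filters and leave the remaining feature maps at zero, so compute and weight reads shrink while downstream layers still see the expected output shape. The class name LazyConv2d, the keep_ratio parameter, and the L1-norm importance score are illustrative stand-ins, not details taken from the paper.

```python
import torch
import torch.nn.functional as F


class LazyConv2d(torch.nn.Conv2d):
    """Conv2d that evaluates only its `keep_ratio` most important filters.

    Hypothetical sketch (assumes groups=1): skipped filters yield
    zero-filled feature maps, so the output shape is unchanged.
    """

    def __init__(self, *args, keep_ratio=0.5, **kwargs):
        super().__init__(*args, **kwargs)
        self.keep_ratio = keep_ratio

    def forward(self, x):
        n_keep = max(1, int(self.out_channels * self.keep_ratio))
        # Rank filters by the L1 norm of their weights (a common
        # pruning heuristic, used here as a stand-in importance score).
        scores = self.weight.detach().abs().sum(dim=(1, 2, 3))
        keep = scores.topk(n_keep).indices
        bias = self.bias[keep] if self.bias is not None else None
        # Evaluate only the selected filters.
        out = F.conv2d(x, self.weight[keep], bias, self.stride,
                       self.padding, self.dilation, self.groups)
        # Scatter the results into a full-sized, zero-filled output.
        full = out.new_zeros(x.size(0), self.out_channels,
                             out.size(2), out.size(3))
        full[:, keep] = out
        return full
```

For example, LazyConv2d(64, 128, kernel_size=3, padding=1, keep_ratio=0.25) would evaluate only 32 of the 128 filters, roughly quartering that layer's multiply-accumulates and weight reads in exchange for some loss in accuracy.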


Related research

04/06/2021 · Binary Neural Network for Speaker Verification
Although deep neural networks are successful for many tasks in the speec...

09/06/2023 · An Evaluation of Software Sketches
This work presents a detailed evaluation of Rust (software) implementati...

11/18/2018 · RePr: Improved Training of Convolutional Filters
A well-trained Convolutional Neural Network can easily be pruned without...

10/18/2017 · Enhancing the Performance of Convolutional Neural Networks on Quality Degraded Datasets
Despite the appeal of deep neural networks that largely replace the trad...

09/08/2020 · Low-Rank Training of Deep Neural Networks for Emerging Memory Technology
The recent success of neural networks for solving difficult decision tas...

08/24/2018 · An Enhanced SCMA Detector Enabled by Deep Neural Network
In this paper, we propose a learning approach for sparse code multiple a...

07/02/2019 · MimosaNet: An Unrobust Neural Network Preventing Model Stealing
Deep Neural Networks are robust to minor perturbations of the learned ne...
