Neural Networks at a Fraction with Pruned Quaternions

08/13/2023
by Sahel Mohammad Iqbal et al.

Contemporary state-of-the-art neural networks have increasingly large numbers of parameters, which prevents their deployment on devices with limited computational power. Pruning is one technique for removing unnecessary weights and reducing the resource requirements of training and inference. In addition, for ML tasks where the input data is multi-dimensional, higher-dimensional data embeddings such as complex numbers or quaternions have been shown to reduce the parameter count while maintaining accuracy. In this work, we prune real- and quaternion-valued implementations of different architectures on classification tasks. We find that for some architectures, at very high sparsity levels, quaternion models achieve higher accuracies than their real counterparts. For example, on image classification on CIFAR-10 with Conv-4, at 3% of the original model's parameter count, the pruned quaternion version outperforms its pruned real counterpart by more than 10%. Experiments across network architectures and datasets show that for deployment in extremely resource-constrained environments, a sparse quaternion network may be a better candidate than a real sparse model of similar architecture.
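The abstract does not specify the pruning criterion; a common baseline is unstructured magnitude pruning, which zeroes the smallest-magnitude weights until a target sparsity is reached. The sketch below (NumPy, with a hypothetical `magnitude_prune` helper; the 97% sparsity mirrors the "3% of parameters" setting quoted above) illustrates the idea, not the paper's exact method:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a copy of `weights` with the smallest-magnitude
    `sparsity` fraction of entries set to zero."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))          # stand-in for one layer's weights
pruned = magnitude_prune(w, 0.97)      # keep roughly 3% of the weights
```

For a quaternion layer the same mask-based procedure applies, but each "weight" is a 4-component quaternion, so pruning one parameter removes all four real components at once.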


Related research

- Pruning Convolutional Filters using Batch Bridgeout (09/23/2020)
- AlgebraNets (06/12/2020)
- Finding the Optimal Network Depth in Classification Tasks (04/17/2020)
- Induced Feature Selection by Structured Pruning (03/20/2023)
- Elastic-Link for Binarized Neural Network (12/19/2021)
- Quantifying lottery tickets under label noise: accuracy, calibration, and complexity (06/21/2023)
- COPS: Controlled Pruning Before Training Starts (07/27/2021)
