
Exploiting Elasticity in Tensor Ranks for Compressing Neural Networks

05/10/2021
by Jie Ran, et al.

Elasticities in depth, width, kernel size and resolution have been explored in compressing deep neural networks (DNNs). Recognizing that the kernels in a convolutional neural network (CNN) are 4-way tensors, we further exploit a new elasticity dimension along the input-output channels. Specifically, a novel nuclear-norm rank minimization factorization (NRMF) approach is proposed to dynamically and globally search for the reduced tensor ranks during training. Correlation between tensor ranks across multiple layers is revealed, and a graceful tradeoff between model size and accuracy is obtained. Experiments then show the superiority of NRMF over the previous non-elastic variational Bayesian matrix factorization (VBMF) scheme.
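To make the idea concrete, below is a minimal sketch, assuming a PyTorch setup; it is not the authors' implementation. It illustrates the core mechanism the abstract describes: matricizing the 4-way conv kernel along the input-output channel modes and penalizing its nuclear norm (the standard convex surrogate for matrix rank) during training, so reduced ranks emerge dynamically. The names nuclear_norm_penalty, effective_rank, lam, and tol are illustrative assumptions, not from the paper.

```python
# A minimal sketch (not the authors' code), assuming PyTorch:
# nuclear-norm rank minimization on a 4-way conv kernel.
import torch
import torch.nn as nn


def nuclear_norm_penalty(conv: nn.Conv2d) -> torch.Tensor:
    """Nuclear norm of the kernel unfolded along the channel modes."""
    w = conv.weight  # shape: (out_ch, in_ch, kH, kW)
    mat = w.reshape(w.shape[0], -1)  # (out_ch) x (in_ch * kH * kW)
    # Sum of singular values: a convex surrogate for matrix rank.
    return torch.linalg.matrix_norm(mat, ord="nuc")


def effective_rank(conv: nn.Conv2d, tol: float = 1e-2) -> int:
    """Reduced rank read off from the singular-value spectrum."""
    w = conv.weight
    s = torch.linalg.svdvals(w.reshape(w.shape[0], -1))
    return int((s > tol * s[0]).sum())
```

In this sketch, the penalty would be added to the task loss with a hypothetical weight lam, e.g. loss = criterion(model(x), y) + lam * sum(nuclear_norm_penalty(m) for m in model.modules() if isinstance(m, nn.Conv2d)); after training, effective_rank would give the per-layer rank at which each kernel can be factorized.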
