Manifold Regularization for Memory-Efficient Training of Deep Neural Networks

05/26/2023
by Shadi Sartipi, et al.

One of the prevailing trends in the machine- and deep-learning community is the use of increasingly large models to keep pushing the state-of-the-art performance envelope. This tendency makes the associated technologies harder for the average practitioner to access and runs contrary to the desire to democratize knowledge production in the field. In this paper, we propose a framework for achieving improved memory efficiency when training traditional neural networks by leveraging inductive-bias-driven network design principles and layer-wise manifold-oriented regularization objectives. The framework yields improved absolute performance and lower empirical generalization error relative to traditional learning techniques. We provide empirical validation, including qualitative and quantitative evidence of the framework's effectiveness on two standard image datasets, CIFAR-10 and CIFAR-100. The proposed framework can be seamlessly combined with existing network compression methods for further memory savings.
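The abstract does not spell out the regularization objective, but a common instantiation of layer-wise manifold regularization is a graph-Laplacian smoothness penalty on each layer's activations. The sketch below illustrates that idea in PyTorch under stated assumptions; the function name, the kNN affinity graph, and the bandwidth heuristic are illustrative choices, not the authors' actual method.

```python
# Minimal sketch of a layer-wise graph-Laplacian manifold regularizer,
# assuming a PyTorch setting. The kNN graph, Gaussian affinities, and
# bandwidth heuristic are illustrative, not the paper's formulation.
import torch


def manifold_regularizer(features: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Penalize a layer's activations for varying sharply across nearby
    inputs in the batch, encouraging them to respect the data manifold.
    Assumes k is smaller than the batch size."""
    x = features.flatten(1)                          # (batch, dim)
    dists = torch.cdist(x, x)                        # pairwise Euclidean distances
    sigma = dists.mean().clamp_min(1e-8)             # heuristic Gaussian bandwidth
    w = torch.exp(-dists.pow(2) / (2 * sigma ** 2))  # dense affinity matrix
    # Keep only each sample's k nearest neighbors (k + 1 includes self).
    idx = dists.topk(k + 1, largest=False).indices
    mask = torch.zeros_like(w).scatter_(1, idx, 1.0)
    w = 0.5 * (w * mask + (w * mask).T)              # symmetrize the graph
    laplacian = torch.diag(w.sum(dim=1)) - w
    # tr(X^T L X) = 1/2 * sum_ij w_ij ||x_i - x_j||^2, normalized by batch size.
    return torch.trace(x.T @ laplacian @ x) / x.shape[0]
```

In training, such a term would typically be weighted by a coefficient, summed over the layers being regularized, and added to the usual task loss, e.g. `loss = F.cross_entropy(logits, y) + lam * sum(manifold_regularizer(h) for h in hidden_states)`, where `hidden_states` is a hypothetical list of intermediate activations.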

