SVMax: A Feature Embedding Regularizer

03/04/2021
by Ahmed Taha, et al.

A neural network regularizer (e.g., weight decay) boosts performance by explicitly penalizing the complexity of a network. In this paper, we penalize inferior network activations, i.e., feature embeddings, which in turn regularize the network's weights implicitly. We propose singular value maximization (SVMax) to learn a more uniform feature embedding. The SVMax regularizer supports both supervised and unsupervised learning. Our formulation mitigates model collapse and enables larger learning rates. We evaluate the SVMax regularizer on both retrieval and generative adversarial networks. We leverage a synthetic mixture-of-Gaussians dataset to evaluate SVMax in an unsupervised setting. For retrieval networks, SVMax achieves significant improvement margins across various ranking losses. Code is available at https://bit.ly/3jNkgDt
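As the name suggests, SVMax rewards large singular values of the mini-batch embedding matrix, which pushes the embeddings toward a more uniform spread in feature space. The snippet below is a minimal PyTorch sketch of that idea, not the authors' exact implementation; the function name `svmax_regularizer`, the weighting term `lam`, and the assumption of a (batch_size, dim) matrix of L2-normalized embeddings are illustrative choices.

```python
import torch

def svmax_regularizer(embeddings: torch.Tensor) -> torch.Tensor:
    """Mean singular value of a mini-batch embedding matrix.

    embeddings: (batch_size, dim) feature embeddings, assumed L2-normalized.
    A larger mean singular value corresponds to a more uniform spread of
    the embeddings across the feature space.
    """
    # Singular values of the (batch_size x dim) matrix.
    singular_values = torch.linalg.svdvals(embeddings)
    return singular_values.mean()

# Hypothetical use inside a training step (ranking_loss and lam are placeholders):
#   total_loss = ranking_loss - lam * svmax_regularizer(embeddings)
```

Subtracting the regularizer from the task loss means the optimizer maximizes the mean singular value while minimizing the ranking loss, which is one way to discourage the collapsed, low-rank embeddings the abstract refers to.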
