Scale-Regularized Filter Learning

07/10/2017
by Marco Loog, et al.

We start out by demonstrating that an elementary learning task, corresponding to the training of a single linear neuron in a convolutional neural network, can be solved for feature spaces of very high dimensionality. In a second step, acknowledging that such high-dimensional learning tasks typically benefit from some form of regularization, and arguing that the problem of scale has so far not been handled in a satisfactory manner, we resolve both of these shortcomings at once by proposing a form of scale regularization. Moreover, using variational methods, this regularization problem can be solved rather efficiently, and we demonstrate the capabilities of our basic linear neuron on an artificial filter-learning problem. From a more general standpoint, we see this work as a prime example of how learning and variational methods could, or even should, work to their mutual benefit.
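To make the basic learning task concrete, the following is a minimal sketch, in Python, of fitting a single linear filter (one linear neuron) by regularized least squares. The paper's actual scale-regularization term and variational solver are not reproduced here; a squared finite-difference (Tikhonov-type smoothness) penalty on the filter taps is used as a hypothetical stand-in, and the function learn_filter and the synthetic data below are purely illustrative.

```python
import numpy as np

def learn_filter(image, target, ksize=7, lam=1.0):
    """Fit a ksize x ksize linear filter w by minimizing
        || sum_o w[o] * shift_o(image) - target ||^2 + lam * ||D w||^2,
    where D stacks first-order finite differences over the filter taps
    (a smoothness penalty standing in for the paper's scale regularization)."""
    k = ksize
    # Design matrix: one column per filter tap, holding the (circularly) shifted image.
    offsets = [(i - k // 2, j - k // 2) for i in range(k) for j in range(k)]
    cols = [np.roll(image, (-di, -dj), axis=(0, 1)).ravel() for di, dj in offsets]
    X = np.stack(cols, axis=1)            # shape: (n_pixels, k*k)
    y = target.ravel()

    # First-order finite-difference operator on the k x k filter grid.
    rows = []
    for i in range(k):
        for j in range(k):
            if i + 1 < k:                 # vertical difference
                r = np.zeros(k * k)
                r[i * k + j], r[(i + 1) * k + j] = -1.0, 1.0
                rows.append(r)
            if j + 1 < k:                 # horizontal difference
                r = np.zeros(k * k)
                r[i * k + j], r[i * k + j + 1] = -1.0, 1.0
                rows.append(r)
    D = np.array(rows)

    # Normal equations of the regularized least-squares problem.
    A = X.T @ X + lam * (D.T @ D)
    b = X.T @ y
    w = np.linalg.solve(A, b)
    return w.reshape(k, k)

# Tiny usage example on synthetic data: recover a smooth 7x7 filter from one image.
rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))
true_w = np.outer(np.hanning(7), np.hanning(7))
tgt = sum(true_w[i, j] * np.roll(img, (-(i - 3), -(j - 3)), axis=(0, 1))
          for i in range(7) for j in range(7))
w_hat = learn_filter(img, tgt, ksize=7, lam=0.1)
print("max abs error:", np.abs(w_hat - true_w).max())
```

In this sketch the weight lam only trades data fit against filter smoothness at a single, fixed scale; a genuinely scale-aware penalty of the kind the paper argues for is not captured by this stand-in.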


Related research

04/09/2020
Hierarchical Group Sparse Regularization for Deep Convolutional Neural Networks
In a deep neural network (DNN), the number of the parameters is usually ...

07/24/2023
A Connection between One-Step Regularization and Critic Regularization in Reinforcement Learning
As with any machine learning problem with limited data, effective offlin...

02/20/2023
Bilevel learning of regularization models and their discretization for image deblurring and super-resolution
Bilevel learning is a powerful optimization technique that has extensive...

07/25/2019
Framelet Pooling Aided Deep Learning Network: The Method to Process High Dimensional Medical Data
Machine learning-based analysis of medical images often faces several hu...

11/30/2021
Neuron with Steady Response Leads to Better Generalization
Regularization can mitigate the generalization gap between training and ...

06/02/2020
An Informal Introduction to Multiplet Neural Networks
In the artificial neuron, I replace the dot product with the weighted Le...

03/09/2018
Local Kernels that Approximate Bayesian Regularization and Proximal Operators
In this work, we broadly connect kernel-based filtering (e.g. approaches...
