Boundary Optimizing Network (BON)

01/08/2018
by Marco Singh, et al.

Despite the success of deep neural networks at classifying many datasets, finding solutions that generalize well remains a challenge. In this paper, we propose the Boundary Optimizing Network (BON), a new approach to generalization for deep neural networks used in supervised learning. Given a classification network, we train a collaborative generative network that produces new synthetic data points as perturbations of the original data points. This creates a region of data support around each original data point, which prevents decision boundaries from passing too close to the originals, i.e., reduces overfitting. To prevent catastrophic forgetting during training, we optimize the generative networks with a variation of Memory Aware Synapses. On the Iris dataset, we show that the BON algorithm produces better decision boundaries than a network regularized with the popular dropout scheme.
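The core idea of the data support can be sketched in a few lines. The paper uses a learned generative network to produce the perturbations; the sketch below substitutes simple Gaussian noise for that generator (an assumption for illustration only, not the paper's method), just to show how synthetic neighbours sharing the original label surround each training point and push decision boundaries away from it. The function name `make_support` and all parameters are hypothetical.

```python
import numpy as np

def make_support(X, y, n_perturb=10, sigma=0.1, rng=None):
    """Create synthetic 'support' points around each original sample.

    In the BON setup a collaborative generative network would learn
    these perturbations; here we approximate the idea with isotropic
    Gaussian noise for illustration. Each synthetic point inherits the
    label of the original point it was generated from.
    """
    rng = np.random.default_rng(rng)
    # Repeat each original point n_perturb times, then jitter the copies.
    X_syn = np.repeat(X, n_perturb, axis=0)
    X_syn = X_syn + rng.normal(scale=sigma, size=X_syn.shape)
    y_syn = np.repeat(y, n_perturb)
    return X_syn, y_syn

# Toy 2-D data: one point from each of two classes.
X = np.array([[0.0, 0.0], [1.0, 1.0]])
y = np.array([0, 1])
X_syn, y_syn = make_support(X, y, n_perturb=5, sigma=0.05, rng=0)
# Each original point now has 5 nearby synthetic neighbours carrying
# its label; training on (X_syn, y_syn) alongside (X, y) discourages
# a decision boundary from passing close to the original points.
```

A classifier trained on the combined original and synthetic sets then sees a small labelled neighbourhood around every training point, which is the regularizing effect the abstract describes.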

Related research

09/16/2020
Analysis of Generalizability of Deep Neural Networks Based on the Complexity of Decision Boundary
For supervised learning models, the analysis of generalization ability (...

12/24/2019
Characterizing the Decision Boundary of Deep Neural Networks
Deep neural networks and in particular, deep neural classifiers have bec...

03/07/2020
Synaptic Metaplasticity in Binarized Neural Networks
While deep neural networks have surpassed human performance in multiple ...

05/02/2023
Generalizing Dataset Distillation via Deep Generative Prior
Dataset Distillation aims to distill an entire dataset's knowledge into ...

02/05/2020
Understanding the Decision Boundary of Deep Neural Networks: An Empirical Study
Despite achieving remarkable performance on many image classification ta...

08/14/2021
Investigating the Relationship Between Dropout Regularization and Model Complexity in Neural Networks
Dropout Regularization, serving to reduce variance, is nearly ubiquitous...

06/05/2023
Learning Causal Mechanisms through Orthogonal Neural Networks
A fundamental feature of human intelligence is the ability to infer high...
