DNN or k-NN: That is the Generalize vs. Memorize Question

05/17/2018

by Gilad Cohen, et al.

This paper studies the relationship between the classification performed by deep neural networks and the k-NN decision rule applied in the embedding space of these networks. This simple yet important connection provides a better understanding of the relationship between a neural network's ability to generalize and its tendency to memorize the training data, which are traditionally considered contradictory but are shown here to be compatible and complementary. Our results support the conjecture that deep neural networks approach Bayes-optimal error rates.
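The comparison the abstract describes can be illustrated with a minimal sketch: train a small network, take its last hidden layer as the embedding, fit a k-NN classifier on the embedded training set, and measure how often the two classifiers agree. This is an assumed protocol for illustration (synthetic data, scikit-learn's `MLPClassifier`, a manual forward pass through the hidden layers), not the paper's exact experimental setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for a real dataset (assumption for illustration).
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A small fully connected network playing the role of the DNN.
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)

def embed(net, X):
    # Forward pass through the hidden layers (ReLU activations),
    # stopping before the output layer: this is the embedding space.
    h = X
    for W, b in zip(net.coefs_[:-1], net.intercepts_[:-1]):
        h = np.maximum(h @ W + b, 0.0)
    return h

# k-NN decision in the network's embedding space.
knn = KNeighborsClassifier(n_neighbors=5).fit(embed(net, X_tr), y_tr)

# Fraction of test points where the DNN and the embedded k-NN agree.
agreement = np.mean(net.predict(X_te) == knn.predict(embed(net, X_te)))
print(f"DNN vs. k-NN agreement on the test set: {agreement:.2%}")
```

High agreement on held-out data is the kind of evidence the paper uses to connect the network's decisions to a memorization-like nearest-neighbor rule over the training set.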


Related research:

- Learning Ability of Interpolating Convolutional Neural Networks (10/25/2022): It is frequently observed that overparameterized neural networks general...
- Bucketed PCA Neural Networks with Neurons Mirroring Signals (08/02/2021): The bucketed PCA neural network (PCA-NN) with transforms is developed he...
- Over-parametrized deep neural networks do not generalize well (12/09/2019): Recently it was shown in several papers that backpropagation is able to ...
- DropNeuron: Simplifying the Structure of Deep Neural Networks (06/23/2016): Deep learning using multi-layer neural networks (NNs) architecture manif...
- Premium Access to Convolutional Neural Networks (05/22/2020): Neural Networks (NNs) are today used for all our daily tasks; for instan...
- Towards a Phenomenological Understanding of Neural Networks: Data (05/01/2023): A theory of neural networks (NNs) built upon collective variables would ...
- nn-dependability-kit: Engineering Neural Networks for Safety-Critical Systems (11/16/2018): nn-dependability-kit is an open-source toolbox to support safety enginee...
