Neural Kernels Without Tangents

03/04/2020
by Vaishaal Shankar, et al.

We investigate the connections between neural networks and simple building blocks in kernel space. In particular, using well-established feature space tools such as direct sum, averaging, and moment lifting, we present an algebra for creating "compositional" kernels from bags of features. We show that these operations correspond to many of the building blocks of "neural tangent kernels (NTK)". Experimentally, we show that there is a correlation in test error between neural network architectures and the associated kernels. We construct a simple neural network architecture using only 3x3 convolutions, 2x2 average pooling, ReLU, and optimized with SGD and MSE loss that achieves 96% accuracy on CIFAR10, and whose corresponding compositional kernel achieves 90% accuracy. We also use our constructions to investigate the relative performance of neural networks, NTKs, and compositional kernels in the small dataset regime. In particular, we find that compositional kernels outperform NTKs, and neural networks outperform both kernel methods.
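Two pieces of the abstract can be made concrete in code. Both snippets below are illustrative sketches written for this summary, not the authors' released code.

First, a minimal sketch of the kernel algebra: in feature space, a direct sum of feature maps corresponds to adding kernels, and averaging a bag of features corresponds to averaging the pairwise kernel entries over the two bags. The `linear` base kernel is an illustrative choice, not one mandated by the paper, and moment lifting is omitted for brevity.

```python
import numpy as np

def direct_sum_kernel(k1, k2):
    # Concatenating feature maps phi1 and phi2 yields the kernel k1 + k2.
    return lambda x, y: k1(x, y) + k2(x, y)

def averaged_kernel(k):
    # Averaging a bag of features {x_1, ..., x_m} yields a kernel equal
    # to the mean of k(x_i, y_j) over all pairs drawn from the two bags.
    def k_avg(bag_x, bag_y):
        return float(np.mean([[k(x, y) for y in bag_y] for x in bag_x]))
    return k_avg

linear = lambda x, y: float(np.dot(x, y))  # illustrative base kernel

bag_a = [np.random.randn(4) for _ in range(3)]
bag_b = [np.random.randn(4) for _ in range(5)]
k_bag = averaged_kernel(direct_sum_kernel(linear, linear))
print(k_bag(bag_a, bag_b))
```

Second, a minimal PyTorch sketch of a network in the spirit described above: only 3x3 convolutions, 2x2 average pooling, and ReLU, trained with SGD on an MSE loss against one-hot labels. The widths and depth here are placeholders; this does not reproduce the authors' exact configuration.

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.AvgPool2d(2),                    # 32x32 -> 16x16
    nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
    nn.AvgPool2d(2),                    # 16x16 -> 8x8
    nn.Flatten(),
    nn.Linear(128 * 8 * 8, 10),
)

opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
loss_fn = nn.MSELoss()

x = torch.randn(8, 3, 32, 32)           # a CIFAR-10-shaped batch
y = nn.functional.one_hot(torch.randint(0, 10, (8,)), 10).float()
opt.zero_grad()
loss_fn(model(x), y).backward()         # MSE loss, as in the abstract
opt.step()
```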

