Semi-Parametric Deep Neural Networks in Linear Time and Memory

05/24/2022
by Richa Rastogi, et al.

Recent advances in deep learning have been driven by large-scale parametric models, which can be computationally expensive and lack interpretability. Semi-parametric methods query the training set at inference time and can be more compact, although they typically have computational complexity that is quadratic in the training set size. Here, we introduce SPIN, a general-purpose semi-parametric neural architecture whose computational cost is linear in the size and dimensionality of the data. Our architecture is inspired by inducing point methods and relies on a novel application of cross-attention between datapoints. At inference time, its computational cost is constant in the training set size, as the data gets distilled into a fixed number of inducing points. We find that our method reduces the computational requirements of existing semi-parametric models by up to an order of magnitude across a range of datasets and improves state-of-the-art performance on an important practical problem, genotype imputation.
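The key efficiency claim rests on cross-attention between a small, fixed set of inducing points and the full dataset: for m inducing points and n datapoints, the attention cost is O(n·m·d) rather than the O(n²·d) of attention between all pairs of datapoints. The following is a minimal NumPy sketch of that idea, not the authors' SPIN implementation; the array names, shapes, and the use of raw (unprojected) keys and values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_attention(queries, keys_values, d_k):
    """Scaled dot-product cross-attention: each query attends over keys_values."""
    scores = queries @ keys_values.T / np.sqrt(d_k)        # (m, n)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)          # row-wise softmax
    return weights @ keys_values                           # (m, d) summary

n, m, d = 1000, 16, 8           # n datapoints, m << n inducing points
X = rng.normal(size=(n, d))     # toy "training set" (hypothetical data)
H = rng.normal(size=(m, d))     # inducing points (learned in SPIN; random here)

# m inducing points attend over all n datapoints: O(n * m * d), i.e. linear
# in dataset size for fixed m, and the result no longer depends on n in shape.
summary = cross_attention(H, X, d)
print(summary.shape)
```

At inference time, only the fixed-size summary (here, an m-by-d array) would need to be queried, which is why the cost becomes constant in the training set size.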

