
Probabilistic Numeric Convolutional Neural Networks

by Marc Finzi, et al.

Continuous input signals like images and time series that are irregularly sampled or have missing values are challenging for existing deep learning methods. Coherently defined feature representations must depend on the values in unobserved regions of the input. Drawing on work in probabilistic numerics, we propose Probabilistic Numeric Convolutional Neural Networks, which represent features as Gaussian processes (GPs), providing a probabilistic description of discretization error. We then define a convolutional layer as the evolution of a PDE defined on this GP, followed by a nonlinearity. This approach also naturally admits steerable equivariant convolutions under, e.g., the rotation group. In experiments we show that our approach yields a 3× reduction in error over the previous state of the art on the SuperPixel-MNIST dataset and competitive performance on the medical time series dataset PhysioNet2012.
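The two ingredients the abstract describes, representing a feature map as a GP posterior over the continuous domain and applying a convolutional layer as linear PDE evolution, can be illustrated in a few lines of NumPy. The sketch below is not the paper's implementation; the 1-D signal, RBF kernel length-scale, grid resolution, and diffusion time are all illustrative choices. It uses the fact that a linear operator A maps a GP with mean m and covariance K to a GP with mean Am and covariance AKAᵀ, and that the heat equation evolved for time t acts as convolution with a Gaussian of width √(2t).

```python
import numpy as np

def rbf(x1, x2, ls=0.3):
    # Squared-exponential (RBF) kernel between two sets of 1-D points.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

# Irregularly sampled 1-D signal (synthetic, for illustration only).
rng = np.random.default_rng(0)
x_obs = np.sort(rng.uniform(0.0, 1.0, 12))
y_obs = np.sin(2 * np.pi * x_obs) + 0.05 * rng.standard_normal(12)

# GP posterior on a dense grid: the feature carries both a mean and an
# uncertainty that quantifies discretization error away from the samples.
grid = np.linspace(0.0, 1.0, 100)
K = rbf(x_obs, x_obs) + 1e-4 * np.eye(12)          # jitter for stability
Ks = rbf(grid, x_obs)
mean = Ks @ np.linalg.solve(K, y_obs)
cov = rbf(grid, grid) - Ks @ np.linalg.solve(K, rbf(x_obs, grid))

# "Convolutional layer" as PDE evolution: the heat equation u_t = u_xx
# run for time t convolves with a Gaussian of std sqrt(2 t).  Because
# this is linear, the evolved feature is again a GP: (A m, A K A^T).
t = 1e-3
std = np.sqrt(2 * t)
A = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / std) ** 2)
A /= A.sum(axis=1, keepdims=True)                  # discrete, row-normalized
mean_out = A @ mean                                # evolved posterior mean
cov_out = A @ cov @ A.T                            # evolved posterior covariance

print(mean_out.shape, cov_out.shape)
```

A nonlinearity applied to `mean_out` (with a moment-matched or sampled treatment of `cov_out`) would complete one layer; the paper's construction generalizes this picture to images and to equivariant PDE operators.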

