Auto-Differentiating Linear Algebra

by Matthias Seeger, et al.

Development systems for deep learning, such as Theano, Torch, TensorFlow, or MXNet, are easy-to-use tools for creating complex neural network models. Since gradient computations are automatically baked in, and execution is mapped to high-performance hardware, these models can be trained end-to-end on large amounts of data. However, it is currently not easy to implement many basic machine learning primitives in these systems (such as Gaussian processes, least squares estimation, principal components analysis, and Kalman smoothing), mainly because they lack efficient support for linear algebra primitives as differentiable operators. We detail how a number of matrix decompositions (Cholesky, LQ, symmetric eigendecomposition) can be implemented as differentiable operators. We have implemented these primitives in MXNet, running on CPU and GPU in single and double precision. We sketch use cases of these new operators: learning Gaussian process and Bayesian linear regression models. Our implementation is based on BLAS/LAPACK APIs, for which highly tuned implementations are available on all major CPUs and GPUs.
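A minimal sketch (not the paper's code) of why a differentiable Cholesky factorization matters for the Gaussian process use case: the GP marginal likelihood involves log|A| for a symmetric positive definite matrix A, which is cheaply computed from the Cholesky factor A = L Lᵀ as 2·Σ log L_ii. Its gradient with respect to A is A⁻¹ (for symmetric A), and this is exactly what an autodiff system recovers through the Cholesky backward pass. The check below uses plain NumPy and finite differences; variable names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4.0 * np.eye(4)  # symmetric positive definite

def logdet_via_cholesky(A):
    # A = L L^T  =>  log|A| = 2 * sum(log(diag(L)))
    L = np.linalg.cholesky(A)
    return 2.0 * np.sum(np.log(np.diag(L)))

# Analytic gradient of log|A| w.r.t. symmetric A is A^{-1}; a differentiable
# Cholesky operator lets an autodiff framework produce this automatically.
grad_analytic = np.linalg.inv(A)

# Finite-difference check of one entry, perturbing (i, j) and (j, i)
# together to stay on the symmetric manifold.
eps = 1e-6
i, j = 1, 2
E = np.zeros_like(A)
E[i, j] = E[j, i] = eps
fd = (logdet_via_cholesky(A + E) - logdet_via_cholesky(A - E)) / (4.0 * eps)
print(abs(fd - grad_analytic[i, j]) < 1e-6)  # True
```

The same pattern (factorize once, reuse the triangular factor in both the loss and its gradient) underlies the BLAS/LAPACK-backed operators described in the abstract.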






