Computing Sparse Jacobians and Hessians Using Algorithmic Differentiation

11/09/2021
by Bradley M. Bell, et al.

Stochastic scientific models and machine learning optimization estimators have a large number of variables; hence computing large sparse Jacobians and Hessians is important. Algorithmic differentiation (AD) greatly reduces the programming effort required to obtain the sparsity patterns and values for these matrices. We present forward, reverse, and subgraph methods for computing sparse Jacobians and Hessians. Special attention is given to the subgraph method because it is new. The coloring and compression steps are not necessary when computing sparse Jacobians and Hessians using subgraphs. Complexity analysis shows that for some problems the subgraph method is expected to be much faster. We compare C++ operator overloading implementations of the methods in the ADOL-C and CppAD software packages using some of the MINPACK-2 test problems. The experiments are set up in a way that makes them easy to run on different hardware, different systems, different compilers, other test problems, and other AD packages. The setup time is the time required to record the computational graph and to compute the sparsity pattern, coloring, compression, and graph optimization. When this setup must be repeated for each evaluation, the subgraph implementation has similar run times for sparse Jacobians and faster run times for sparse Hessians.
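As a concrete illustration of the operator-overloading approach both packages use, the following minimal C++ sketch records a small function with CppAD and evaluates its Jacobian. The example function (a banded residual) is hypothetical, and only CppAD's basic recording interface and dense Jacobian driver are assumed here; the sparse forward, reverse, and subgraph methods compared in the paper operate on the same kind of recorded graph.

#include <cppad/cppad.hpp>
#include <cstdio>
#include <vector>

int main(void)
{
    using CppAD::AD;

    // f : R^n -> R^(n-1), a hypothetical banded residual whose Jacobian
    // has at most two nonzeros per row
    size_t n = 5;
    size_t m = n - 1;

    // declare independent variables and start recording the graph (tape)
    std::vector< AD<double> > ax(n);
    for (size_t j = 0; j < n; ++j)
        ax[j] = double(j + 1);
    CppAD::Independent(ax);

    // dependent variables: y[i] depends only on x[i] and x[i+1]
    std::vector< AD<double> > ay(m);
    for (size_t i = 0; i < m; ++i)
        ay[i] = ax[i] * ax[i] + ax[i + 1];

    // stop recording; f now holds the computational graph
    CppAD::ADFun<double> f(ax, ay);

    // evaluate the Jacobian at a point (dense driver, for illustration)
    std::vector<double> x(n);
    for (size_t j = 0; j < n; ++j)
        x[j] = double(j + 1);
    std::vector<double> jac = f.Jacobian(x);  // row-major, size m * n

    // print the nonzero entries: J[i][i] = 2 * x[i], J[i][i+1] = 1
    for (size_t i = 0; i < m; ++i)
        for (size_t j = 0; j < n; ++j)
            if (jac[i * n + j] != 0.0)
                std::printf("J[%zu,%zu] = %g\n", i, j, jac[i * n + j]);

    return 0;
}

For a function like this one, only 2(n-1) of the m*n Jacobian entries are nonzero; the sparse methods avoid computing the zero entries, and the subgraph method does so without the coloring and compression setup. In CppAD the subgraph driver is exposed under the name ADFun::subgraph_jac_rev.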
