On the implementation of checkpointing with high-level algorithmic differentiation

05/16/2023
by James R. Maddison, et al.

Automated code generation allows for a separation between the development of a model, expressed via a domain specific language, and lower-level implementation details. Algorithmic differentiation can be applied symbolically at the level of the domain specific language, and the code generator reused to implement the code required for an adjoint calculation. However, adjoint calculations are complicated by the well-known problem of storing or recomputing the forward model data required by the adjoint, and different checkpointing strategies have been developed to tackle this problem. This article describes the application of checkpointing strategies to high-level algorithmic differentiation, applied to codes developed using automated code generation. Since the high-level approach provides a simplified view of the model itself, the data required to restart the forward calculation and the data required to advance the adjoint can be identified, and the difference between them leveraged to implement checkpointing strategies with improved performance.
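The following is a minimal, illustrative sketch, not the paper's implementation, of the checkpointing idea the abstract describes: the forward sweep stores only restart data (the state at checkpoint steps), and the reverse sweep recomputes the per-step forward data needed by the adjoint within each window before advancing the adjoint through it. The scalar model, step size, and all function names here are hypothetical. In this toy problem the restart data and the adjoint dependencies coincide; the abstract's point is that in larger models they can differ, which is what allows more efficient schedules.

```python
# Hypothetical scalar time-stepping model used only to illustrate
# windowed checkpointing for an adjoint calculation.

def forward_step(u, dt=0.1):
    """One non-linear forward step: u_{n+1} = u_n - dt * u_n**2."""
    return u - dt * u ** 2

def adjoint_step(lam, u, dt=0.1):
    """Adjoint of forward_step, evaluated at the forward state u_n."""
    return (1.0 - 2.0 * dt * u) * lam

def forward(u0, n_steps, checkpoint_every):
    """Forward sweep, storing only restart data: the state at checkpoint steps."""
    checkpoints = {0: u0}
    u = u0
    for n in range(n_steps):
        u = forward_step(u)
        if (n + 1) % checkpoint_every == 0 and n + 1 < n_steps:
            checkpoints[n + 1] = u
    return u, checkpoints

def adjoint(n_steps, checkpoints):
    """Reverse sweep for J = u_N: restart the forward from each checkpoint,
    recompute the states the adjoint depends on within that window, then
    advance the adjoint backwards through the window."""
    lam = 1.0  # dJ/du_N
    end = n_steps
    for start in sorted(checkpoints, reverse=True):
        # Recompute forward states u_start, ..., u_{end-1} needed by the adjoint.
        u = checkpoints[start]
        states = [u]
        for _ in range(start, end - 1):
            u = forward_step(u)
            states.append(u)
        # Advance the adjoint backwards through the window.
        for u_n in reversed(states):
            lam = adjoint_step(lam, u_n)
        end = start
    return lam  # dJ/du_0

if __name__ == "__main__":
    n_steps, checkpoint_every = 12, 4
    u_final, checkpoints = forward(1.0, n_steps, checkpoint_every)
    dJ_du0 = adjoint(n_steps, checkpoints)
    # Finite-difference check of the recomputed-adjoint gradient.
    eps = 1e-6
    fd = (forward(1.0 + eps, n_steps, checkpoint_every)[0] - u_final) / eps
    print(dJ_du0, fd)
```

The memory cost is one stored state per window plus the recomputed states of a single window at a time, at the price of one extra forward pass; more sophisticated schedules (e.g. binomial checkpointing) trade storage against recomputation differently.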
