Anderson Acceleration Using the H^-s Norm

02/10/2020 ∙ by Yunan Yang, et al.

Anderson acceleration (AA) is a technique for accelerating the convergence of fixed-point iterations. In this paper, we apply AA to a sequence of functions and modify the norm in its internal optimization problem to the H^-s norm, for some integer s, to bias it towards low-frequency spectral content in the residual. We analyze the convergence of AA by quantifying its improvement over Picard iteration. We find that AA based on the H^-2 norm is well suited to fixed-point operators derived from second-order elliptic differential operators, as well as to a Helmholtz recovery problem.
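As background for the modification described above, the following is a minimal sketch of standard Anderson acceleration (Type II) for a fixed point x = g(x), using the ordinary Euclidean norm in the internal least-squares problem. The function name, window parameter m, and test problem are illustrative; the H^-s weighting introduced in the paper is not reproduced here.

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, tol=1e-10, max_iter=100):
    """Standard Anderson acceleration (Type II) for the fixed point x = g(x).

    The internal least-squares problem is solved in the Euclidean norm;
    the paper replaces this norm with an H^-s norm to damp high-frequency
    components of the residual (that weighting is omitted in this sketch).
    """
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    xs = [x]        # recent iterates x_k
    gs = [g(x)]     # their images g(x_k)
    for _ in range(max_iter):
        # residuals f_k = g(x_k) - x_k, one column per stored iterate
        F = np.column_stack([gk - xk for gk, xk in zip(gs, xs)])
        if F.shape[1] > 1:
            # Minimize || sum_k alpha_k f_k || with sum_k alpha_k = 1,
            # written in difference form as an unconstrained least squares.
            dF = F[:, 1:] - F[:, :-1]
            gamma, *_ = np.linalg.lstsq(dF, F[:, -1], rcond=None)
            G = np.column_stack(gs)
            dG = G[:, 1:] - G[:, :-1]
            x_new = G[:, -1] - dG @ gamma
        else:
            # First step: no history yet, fall back to a Picard step.
            x_new = gs[-1]
        if np.linalg.norm(x_new - xs[-1]) < tol:
            return x_new
        xs.append(x_new)
        gs.append(g(x_new))
        # Keep only the last m+1 iterates (finite memory window).
        xs, gs = xs[-(m + 1):], gs[-(m + 1):]
    return xs[-1]

# Illustrative usage: the classic fixed point x = cos(x).
root = anderson_accelerate(np.cos, 0.0)
```

With the Picard iteration x_{k+1} = cos(x_k) as a baseline, this accelerated variant reaches the fixed point near 0.739 in far fewer iterations; the paper's H^-s version would additionally reweight the residual columns before the least-squares solve.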
