70 years of Krylov subspace methods: The journey continues

11/02/2022
by Erin Carson, et al.

Using computed examples for the Conjugate Gradient method and GMRES, we recall important building blocks in the understanding of Krylov subspace methods over the last 70 years. Each example consists of a description of the setup and the numerical observations, followed by an explanation of the observed phenomena, keeping technical details to a minimum. Our goal is to show the mathematical beauty and hidden intricacies of these methods, and to point out persistent misunderstandings as well as important open problems. We hope that this work initiates further investigation of Krylov subspace methods, which are efficient computational tools and exciting mathematical objects that are far from fully understood.
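The abstract refers to the Conjugate Gradient method, one of the two Krylov subspace methods the paper examines. As a purely illustrative sketch (not the authors' code), a textbook CG iteration for a symmetric positive definite system Ax = b might look like the following; the function name and tolerance parameter are assumptions for this example:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, maxiter=None):
    """Textbook CG for a symmetric positive definite matrix A."""
    n = len(b)
    maxiter = maxiter if maxiter is not None else n
    x = np.zeros(n)
    r = b - A @ x            # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # stop on small residual norm
            break
        p = r + (rs_new / rs_old) * p  # update search direction
        rs_old = rs_new
    return x

# Small SPD example: in exact arithmetic CG converges in at most n steps.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n iterations; the paper's computed examples concern, among other things, how finite-precision behavior departs from this idealized picture.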
