
Improved Convergence Rates for the Orthogonal Greedy Algorithm

06/28/2021
by Jonathan W. Siegel, et al.

We analyze the orthogonal greedy algorithm when applied to dictionaries 𝔻 whose convex hull has small entropy. We show that if the metric entropy of the convex hull of 𝔻 decays at a rate of O(n^-1/2-α) for α > 0, then the orthogonal greedy algorithm converges at the same rate. This improves upon the well-known O(n^-1/2) convergence rate of the orthogonal greedy algorithm in many cases, most notably for dictionaries corresponding to shallow neural networks. Finally, we show that these improved rates are sharp under the given entropy decay assumptions.
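The abstract concerns the orthogonal greedy algorithm (also known as orthogonal matching pursuit). As a minimal illustration of the algorithm itself, not of the paper's analysis, here is a sketch for a finite dictionary in R^d: at each step, select the atom most correlated with the current residual, then re-fit by orthogonally projecting the target onto the span of all atoms selected so far. The function name and interface are illustrative, not from the paper.

```python
import numpy as np

def orthogonal_greedy(D, f, n_steps):
    """Orthogonal greedy algorithm over a finite dictionary.

    D       : (num_atoms, dim) array whose rows are dictionary elements.
    f       : target vector of length dim.
    n_steps : number of greedy iterations.
    Returns the selected atom indices and the best approximation of f
    from the span of the selected atoms.
    """
    residual = f.copy()
    selected = []
    approx = np.zeros_like(f)
    for _ in range(n_steps):
        # Greedy step: pick the atom most correlated with the residual.
        scores = np.abs(D @ residual)
        scores[selected] = -np.inf  # never reselect an atom
        selected.append(int(np.argmax(scores)))
        # Orthogonal step: project f onto the span of all selected atoms
        # (this projection is what distinguishes the orthogonal greedy
        # algorithm from the pure greedy algorithm).
        A = D[selected].T  # (dim, number of selected atoms)
        coeffs, *_ = np.linalg.lstsq(A, f, rcond=None)
        approx = A @ coeffs
        residual = f - approx
    return selected, approx
```

Because each iteration re-solves a least-squares problem over all selected atoms, the residual is always orthogonal to the span built so far, and the approximation error is non-increasing in the number of steps.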


Related research

Chebyshev Greedy Algorithm in convex optimization (12/04/2013)
Chebyshev Greedy Algorithm is a generalization of the well known Orthogo...

Convergence and rate of convergence of some greedy algorithms in convex optimization (12/10/2014)
The paper gives a systematic study of the approximate versions of three ...

Dictionary descent in optimization (11/04/2015)
The problem of convex optimization is studied. Usually in convex optimiz...

A remark on entropy numbers (08/29/2020)
Talagrand's fundamental result on the entropy numbers is slightly improv...

Sparse Optimization on General Atomic Sets: Greedy and Forward-Backward Algorithms (12/26/2019)
We consider the problem of sparse atomic optimization, where the notion ...