# Improved Convergence Rates for the Orthogonal Greedy Algorithm

We analyze the orthogonal greedy algorithm when applied to dictionaries 𝔻 whose convex hull has small entropy. We show that if the metric entropy of the convex hull of 𝔻 decays at a rate of O(n^-1/2-α) for α > 0, then the orthogonal greedy algorithm converges at the same rate. This improves upon the well-known O(n^-1/2) convergence rate of the orthogonal greedy algorithm in many cases, most notably for dictionaries corresponding to shallow neural networks. Finally, we show that these improved rates are sharp under the given entropy decay assumptions.
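To make the object of study concrete, here is a minimal sketch of the orthogonal greedy algorithm (often called orthogonal matching pursuit) for a finite dictionary, assuming the dictionary elements are stored as columns of a matrix; the matrix `D`, the target `f`, and the step count are illustrative parameters, not from the paper.

```python
import numpy as np

def orthogonal_greedy(f, D, n_steps):
    """Orthogonal greedy algorithm over a finite dictionary.

    At each step, select the dictionary element most correlated with the
    current residual, then replace the approximation by the orthogonal
    projection of f onto the span of all elements selected so far.
    Columns of D are assumed to be unit-norm dictionary elements.
    """
    residual = f.copy()
    selected = []
    for _ in range(n_steps):
        # Greedy step: maximize |<residual, g>| over dictionary elements g.
        k = int(np.argmax(np.abs(D.T @ residual)))
        if k not in selected:
            selected.append(k)
        # Orthogonal projection of f onto the span of the selected elements.
        A = D[:, selected]
        coef, *_ = np.linalg.lstsq(A, f, rcond=None)
        residual = f - A @ coef
    return selected, residual
```

The classical theory bounds the residual norm after n steps by O(n^-1/2) for targets in the convex hull of 𝔻; the abstract above concerns when this rate can be improved under a metric-entropy decay assumption on that convex hull.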

12/04/2013

### Chebyshev Greedy Algorithm in convex optimization

Chebyshev Greedy Algorithm is a generalization of the well known Orthogo...
06/02/2020

### On optimal convergence rates of spectral orthogonal projection approximation for functions of algebraic and logarithmic regularities

Based on the Hilb type formula between Jacobi polynomials and Bessel fun...
06/02/2020

### Convergence rates of spectral orthogonal projection approximation for functions of algebraic and logarithmic regularities

Based on the Hilb type formula between Jacobi polynomials and Bessel fun...
12/10/2014

### Convergence and rate of convergence of some greedy algorithms in convex optimization

The paper gives a systematic study of the approximate versions of three ...
11/04/2015

### Dictionary descent in optimization

The problem of convex optimization is studied. Usually in convex optimiz...
08/29/2020

### A remark on entropy numbers

Talagrand's fundamental result on the entropy numbers is slightly improv...
12/26/2019

### Sparse Optimization on General Atomic Sets: Greedy and Forward-Backward Algorithms

We consider the problem of sparse atomic optimization, where the notion ...