Learning Schatten--Von Neumann Operators

01/29/2019
by Puoya Tabaghi et al.

We study the learnability of a class of compact operators known as Schatten--von Neumann operators. These operators between infinite-dimensional function spaces play a central role in a variety of applications in learning theory and inverse problems. We address the sample complexity of learning Schatten--von Neumann operators and provide an upper bound on the number of measurements required for the empirical risk minimizer to generalize with arbitrary precision and probability, as a function of the class parameter p. Our results give generalization guarantees for regression of infinite-dimensional signals from infinite-dimensional data. We then adapt the representer theorem of Abernethy et al. to show that empirical risk minimization over an a priori infinite-dimensional, non-compact set can be converted to a convex, finite-dimensional optimization problem over a compact set. In summary, the class of p-Schatten--von Neumann operators is probably approximately correct (PAC)-learnable via a practical convex program for any p < ∞.
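For readers unfamiliar with the operator class in question, the finite-dimensional analogue is easy to state: the Schatten p-norm of a matrix is the ℓ_p norm of its singular values, with p = 1 giving the nuclear (trace) norm and p = 2 the Frobenius norm. The following minimal sketch (not from the paper itself; the function name and example matrix are illustrative) computes it with NumPy:

```python
import numpy as np

def schatten_norm(A, p):
    """Schatten p-norm: the l_p norm of the singular values of A."""
    s = np.linalg.svd(A, compute_uv=False)
    return float(np.sum(s ** p) ** (1.0 / p))

A = np.array([[3.0, 0.0],
              [0.0, 4.0]])          # singular values: 4 and 3

print(schatten_norm(A, 1))          # nuclear norm: 4 + 3 = 7.0
print(schatten_norm(A, 2))          # Frobenius norm: sqrt(16 + 9) = 5.0
```

Membership in a p-Schatten ball (a constraint of the form `schatten_norm(A, p) <= tau`) is a convex constraint, which is what makes the finite-dimensional ERM problem described in the abstract a tractable convex program.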


