
Non-Euclidean Universal Approximation

06/03/2020
by Anastasis Kratsios et al.
ETH Zurich

Modifications to a neural network's input and output layers are often required to accommodate the specificities of most practical learning tasks. However, the impact of such changes on an architecture's approximation capabilities is largely not understood. We present general conditions describing feature and readout maps that preserve an architecture's ability to approximate any continuous function uniformly on compacts. As an application, we show that if an architecture is capable of universal approximation, then modifying its final layer to produce binary values creates a new architecture capable of deterministically approximating any classifier. In particular, we obtain guarantees for deep CNNs, deep feed-forward networks (ffNNs), and universal Gaussian processes. Our results also have consequences within the scope of geometric deep learning. Specifically, when the input and output spaces are Hadamard manifolds, we obtain geometrically meaningful feature and readout maps satisfying our criteria. Consequently, commonly used non-Euclidean regression models between spaces of symmetric positive definite matrices are extended to universal DNNs. The same result allows us to show that hyperbolic feed-forward networks, used for hierarchical learning, are universal. Our result is also used to show that the common practice of randomizing all but the last two layers of a DNN produces a universal family of functions with probability one.
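
For intuition, here is a minimal, hypothetical sketch of the feature/readout template the abstract describes, instantiated for SPD-valued regression with the log-Euclidean maps (matrix logarithm as feature map, matrix exponential as readout). This is an illustration under those assumptions, not the paper's exact construction; all names (sym_to_vec, spd_network, etc.) are made up for the example.

```python
import numpy as np
from scipy.linalg import expm, logm

def sym_to_vec(S):
    """Flatten a symmetric d x d matrix to its upper-triangular entries."""
    return S[np.triu_indices(S.shape[0])]

def vec_to_sym(v, d):
    """Inverse of sym_to_vec: rebuild the symmetric matrix."""
    S = np.zeros((d, d))
    S[np.triu_indices(d)] = v
    return S + S.T - np.diag(np.diag(S))

def feature_map(P):
    """SPD -> Euclidean: matrix logarithm, then vectorize."""
    return sym_to_vec(logm(P).real)

def readout_map(v, d):
    """Euclidean -> SPD: un-vectorize, then matrix exponential."""
    return expm(vec_to_sym(v, d))

def mlp(x, weights, biases):
    """Plain ReLU MLP on Euclidean vectors: the universal core."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(W @ x + b, 0.0)
    return weights[-1] @ x + biases[-1]

def spd_network(P, weights, biases):
    """Composed model: readout o MLP o feature map, SPD in and SPD out."""
    d = P.shape[0]
    return readout_map(mlp(feature_map(P), weights, biases), d)

# Example with random (untrained) weights for 3x3 SPD inputs.
d = 3
m = d * (d + 1) // 2
rng = np.random.default_rng(0)
sizes = [m, 16, 16, m]
weights = [0.1 * rng.normal(size=(a, b)) for a, b in zip(sizes[1:], sizes[:-1])]
biases = [0.1 * rng.normal(size=a) for a in sizes[1:]]
P = np.eye(d) + 0.1 * np.ones((d, d))   # an SPD matrix
Q = spd_network(P, weights, biases)      # output is again SPD
```

Because matrix log and exp are continuous, mutually inverse bijections between the symmetric matrices and the SPD cone, uniform approximation achieved by the Euclidean core transfers to the composed model; conditions of this flavor are what the paper makes precise. The classifier result in the abstract follows the same pattern, with the readout replaced by a map producing binary values.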

Related research

04/24/2023 · A Transfer Principle: Universal Approximators Between Metric Spaces From Euclidean Universal Approximators
We build universal approximators of continuous maps between arbitrary Po...

06/05/2023 · Global universal approximation of functional input maps on weighted spaces
We introduce so-called functional input neural networks defined on a pos...

05/17/2021 · Universal Regular Conditional Distributions
We introduce a general framework for approximating regular conditional d...

07/30/2020 · Random Vector Functional Link Networks for Function Approximation on Manifolds
The learning speed of feed-forward neural networks is notoriously slow a...

03/28/2018 · Intertwiners between Induced Representations (with Applications to the Theory of Equivariant Neural Networks)
Group equivariant and steerable convolutional neural networks (regular a...

10/22/2020 · Fading memory echo state networks are universal
Echo state networks (ESNs) have been recently proved to be universal app...