Expression of Fractals Through Neural Network Functions

05/27/2019
by Nadav Dym, et al.

To help understand the underlying mechanisms of neural networks (NNs), several groups have, in recent years, studied the number of linear regions ℓ of piecewise linear functions generated by deep neural networks (DNN). In particular, they showed that ℓ can grow exponentially with the number of network parameters p, a property often used to explain the advantages of DNNs over shallow NNs in approximating complicated functions. Nonetheless, a simple dimension argument shows that DNNs cannot generate all piecewise linear functions with ℓ linear regions as soon as ℓ > p. It is thus natural to seek to characterize specific families of functions with ℓ linear regions that can be constructed by DNNs. Iterated Function Systems (IFS) generate sequences of piecewise linear functions F_k with a number of linear regions exponential in k. We show that, under mild assumptions, F_k can be generated by a NN using only O(k) parameters. IFS are used extensively to generate, at low computational cost, natural-looking landscape textures in artificial images. They have also been proposed for compression of natural images, albeit with less commercial success. The surprisingly good performance of this fractal-based compression suggests that our visual system may lock in, to some extent, on self-similarities in images. The combination of this phenomenon with the capacity, demonstrated here, of DNNs to efficiently approximate IFS may contribute to the success of DNNs, particularly striking for image processing tasks, as well as suggest new algorithms for representing self-similarities in images based on the DNN mechanism.
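The core phenomenon can be illustrated with a toy IFS-style example (not the paper's construction, just a standard one): the tent map, iterated k times, is a piecewise linear function F_k with 2^k linear regions, yet each iteration can be written as a single one-neuron ReLU layer, so F_k needs only O(k) network parameters. A minimal sketch in NumPy, with the region count estimated by detecting slope changes on a fine grid:

```python
import numpy as np

def tent(x):
    # Tent map T(x) = 1 - 2|x - 1/2| on [0, 1]: a simple piecewise linear map.
    return 1.0 - 2.0 * np.abs(x - 0.5)

def tent_relu_layer(x):
    # The same map expressed with a ReLU: T(x) = 2x - 4*relu(x - 1/2).
    # One such layer per iteration gives a network with O(k) parameters.
    return 2.0 * x - 4.0 * np.maximum(x - 0.5, 0.0)

def iterate(f, x, k):
    # F_k = f composed with itself k times.
    for _ in range(k):
        x = f(x)
    return x

def count_linear_regions(y, x):
    # Estimate the number of linear pieces by counting slope changes on the grid.
    slopes = np.diff(y) / np.diff(x)
    changes = np.sum(~np.isclose(slopes[1:], slopes[:-1]))
    return int(changes) + 1

k = 5
x = np.linspace(0.0, 1.0, 2**14 + 1)
y = iterate(tent_relu_layer, x, k)
print(count_linear_regions(y, x))  # prints 32, i.e. 2^k linear regions
```

The number of linear regions doubles with each composed layer while the parameter count grows only linearly, which is the exponential-regions-per-parameter behavior the abstract describes.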

