Deep Neural Networks Are Congestion Games: From Loss Landscape to Wardrop Equilibrium and Beyond

by Nina Vesseron, et al.

The theoretical analysis of deep neural networks (DNNs) is arguably among the most challenging research directions in machine learning (ML) today, as it requires scientists to lay novel statistical learning foundations that explain their behaviour in practice. While some success has been achieved recently in this endeavour, the question of whether DNNs can be analyzed using tools from scientific fields outside the ML community has not received the attention it arguably deserves. In this paper, we explore the interplay between DNNs and game theory (GT), and show how the classic, readily available results from the latter can benefit the analysis of the former. In particular, we consider the widely studied class of congestion games and illustrate their intrinsic relatedness to both linear and non-linear DNNs and to the properties of their loss surface. Beyond retrieving state-of-the-art results from the literature, we argue that our work provides a promising novel tool for analyzing DNNs, and we support this claim by proposing concrete open problems whose resolution would significantly advance our understanding of DNNs.
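For readers unfamiliar with the game-theoretic side, the following is a minimal illustrative sketch of an atomic congestion game, not an implementation of the paper's construction: the road names, latency values, and helper functions are hypothetical. It shows the two classic properties the abstract leans on, namely that congestion games admit Rosenthal's potential function and therefore always possess a pure Nash equilibrium, which any minimizer of the potential attains.

```python
from itertools import product

# Two players each choose one of two roads; a player's cost depends only
# on the load (number of users) of the road they picked.
# The latency values below are illustrative, not taken from the paper.
cost = {
    "A": {1: 1, 2: 3},  # road A: cost 1 alone, 3 if shared
    "B": {1: 2, 2: 4},  # road B: cost 2 alone, 4 if shared
}

def loads(profile):
    """Number of players on each road under a strategy profile."""
    return {r: sum(1 for p in profile if p == r) for r in cost}

def player_cost(profile, i):
    """Cost incurred by player i: latency of their road at its load."""
    return cost[profile[i]][loads(profile)[profile[i]]]

def rosenthal_potential(profile):
    """Phi = sum over roads of sum_{k=1}^{load} cost(road, k) (Rosenthal, 1973)."""
    return sum(sum(cost[r][k] for k in range(1, n + 1))
               for r, n in loads(profile).items() if n > 0)

def is_pure_nash(profile):
    """No player can lower their own cost by unilaterally switching roads."""
    for i in range(len(profile)):
        for r in cost:
            deviation = list(profile)
            deviation[i] = r
            if player_cost(tuple(deviation), i) < player_cost(profile, i):
                return False
    return True

profiles = list(product(cost, repeat=2))
equilibria = [p for p in profiles if is_pure_nash(p)]
# The players splitting across roads, ("A", "B") and ("B", "A"), are the
# pure Nash equilibria, and they are exactly the minimizers of Phi.
```

Rosenthal's potential decreases under every improving unilateral deviation, so best-response dynamics converge to a pure equilibrium; this potential-function viewpoint is what makes the connection to loss landscapes plausible.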


