Machine learning at the mesoscale: a computation-dissipation bottleneck

07/05/2023
by Alessandro Ingrosso, et al.

The cost of information processing in physical systems calls for a trade-off between performance and energetic expenditure. Here we formulate and study a computation-dissipation bottleneck in mesoscopic systems used as input-output devices. Using both real datasets and synthetic tasks, we show how non-equilibrium dynamics lead to enhanced performance. Our framework sheds light on a crucial compromise between information compression, input-output computation, and the dynamic irreversibility induced by non-reciprocal interactions.
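By analogy with the standard information bottleneck, the trade-off described above can be pictured as a Lagrangian that balances input compression, task-relevant computation, and dissipation. The specific form below is an illustrative sketch under that analogy, not the paper's exact objective; the symbols $M$ (internal mesoscopic state), $\beta$, and $\gamma$ (trade-off weights) are assumptions introduced here:

```latex
\mathcal{L} \;=\; I(X; M)\;-\;\beta\, I(M; Y)\;+\;\gamma\,\Sigma
```

Here $I(X;M)$ measures how much of the input $X$ the system's state retains (compression), $I(M;Y)$ measures the information the state carries about the output $Y$ (computation), and $\Sigma$ denotes entropy production, quantifying the dynamic irreversibility that non-reciprocal interactions induce.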

