Thermodynamic AI and the fluctuation frontier

02/09/2023
by Patrick J. Coles, et al.

Many Artificial Intelligence (AI) algorithms are inspired by physics and employ stochastic fluctuations. We connect these physics-inspired AI algorithms by unifying them under a single mathematical framework that we call Thermodynamic AI. Seemingly disparate algorithmic classes can be described by this framework, for example: (1) generative diffusion models, (2) Bayesian neural networks, (3) Monte Carlo sampling, and (4) simulated annealing. Such Thermodynamic AI algorithms are currently run on digital hardware, ultimately limiting their scalability and overall potential. Stochastic fluctuations naturally occur in physical thermodynamic systems, and such fluctuations can be viewed as a computational resource. Hence, we propose a novel computing paradigm in which software and hardware become inseparable. Our algorithmic unification allows us to identify a single full-stack paradigm, involving Thermodynamic AI hardware, that could accelerate such algorithms. We contrast Thermodynamic AI hardware with quantum computing, where noise is a roadblock rather than a resource. Thermodynamic AI hardware can be viewed as a novel form of computing, since it uses a novel fundamental building block. We identify stochastic bits (s-bits) and stochastic modes (s-modes) as the respective building blocks for discrete and continuous Thermodynamic AI hardware. In addition to these stochastic units, Thermodynamic AI hardware employs a Maxwell's demon device that guides the system to produce non-trivial states. We provide a few simple physical architectures for building these devices, and we develop a formalism for programming the hardware via gate sequences. We hope to stimulate discussion around this new computing paradigm. Beyond acceleration, we believe it will impact the design of both hardware and algorithms, while also deepening our understanding of the connection between physics and intelligence.
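To make concrete the idea of stochastic fluctuations as a computational resource, here is a minimal sketch of simulated annealing, one of the four algorithm classes the abstract names. This is an illustrative example, not code from the paper: the energy function, cooling schedule, and parameter values are all assumptions chosen for clarity. The thermal noise injected by the random proposals and the Metropolis acceptance rule is precisely what lets the walker escape local minima.

```python
import math
import random

def simulated_annealing(energy, x0, n_steps=20000, t0=1.0, t_min=1e-3,
                        step=0.5, seed=0):
    """Minimize `energy` using Metropolis moves under a cooling schedule.

    The random fluctuations are the computational resource: at high
    temperature they let the state hop over energy barriers; as the
    temperature anneals, the state settles into a low-energy minimum.
    """
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    for k in range(n_steps):
        t = max(t_min, t0 * (1.0 - k / n_steps))  # linear cooling schedule
        x_new = x + rng.gauss(0.0, step)          # stochastic proposal
        e_new = energy(x_new)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
    return x, e

# Hypothetical test problem: a double-well potential whose two minima
# sit near x = -1 and x = +1.
E = lambda x: (x**2 - 1.0)**2 + 0.2 * x

x_best, e_best = simulated_annealing(E, x0=3.0)
```

On digital hardware every fluctuation here must be emulated by a pseudorandom number generator; the paper's point is that a physical thermodynamic system supplies such fluctuations natively.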
