
Linking Microscopic and Macroscopic Models for Evolution: Markov Chain Network Training and Conservation Law Approximations

by Roderick V. N. Melnik et al.

This paper proposes a general framework for analyzing the connection between the training of artificial neural networks, viewed through the dynamics of Markov chains, and the approximation of conservation law equations. The framework reveals an intrinsic link between microscopic and macroscopic models of evolution via the concept of perturbed generalized dynamic systems. The main result is illustrated with a number of examples in which efficient numerical approximations follow directly from network-based computational models, treated here as Markov chain approximations. Finally, stability and consistency conditions for such computational models are discussed.
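The micro/macro link described in the abstract can be made concrete with a standard textbook construction (a hypothetical sketch, not the paper's own derivation): a symmetric random walk on a 1-D grid whose one-step transition operator coincides, term by term, with an explicit finite-difference step for the heat equation u_t = D u_xx. Evolving the walker's probability distribution is then literally running the macroscopic numerical scheme, and conservation of total probability is the discrete conservation law.

```python
import numpy as np

def markov_transition_matrix(n, p):
    """Tridiagonal transition matrix on a periodic 1-D grid:
    stay with probability 1 - 2p, hop left or right with probability p each."""
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 1.0 - 2.0 * p
        P[i, (i - 1) % n] = p  # periodic boundary for simplicity
        P[i, (i + 1) % n] = p
    return P

# p plays the role of D*dt/dx**2; the familiar stability condition
# p <= 1/2 is exactly what keeps P a valid stochastic matrix.
n, p, steps = 64, 0.25, 200
P = markov_transition_matrix(n, p)

# Initial probability mass concentrated at the centre (a discrete delta).
u = np.zeros(n)
u[n // 2] = 1.0

# Each update u <- u P is the explicit finite-difference step
#   u_i^{k+1} = u_i^k + p * (u_{i-1}^k - 2*u_i^k + u_{i+1}^k).
for _ in range(steps):
    u = u @ P

print(u.sum())  # total probability is conserved: the discrete conservation law
```

Note how the stability restriction of the macroscopic scheme (p <= 1/2) reappears microscopically as the requirement that every row of P be a genuine probability distribution; this is the kind of stability/consistency correspondence the abstract alludes to.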

Related research:

Functional equivariance and conservation laws in numerical integration
Preservation of linear and quadratic invariants by numerical integrators...

Toward a Theory of Markov Influence Systems and their Renormalization
Nonlinear Markov chains are probabilistic models commonly used in physic...

Perturbation Bounds for Monte Carlo within Metropolis via Restricted Approximations
The Monte Carlo within Metropolis (MCwM) algorithm, interpreted as a per...

Stick-breaking processes, clumping, and Markov chain occupation laws
We consider the connections among `clumped' residual allocation models (...

45-year CPU evolution: one law and two equations
Moore's law and two equations allow to explain the main trends of CPU ev...

Rethinking skip connection model as a learnable Markov chain
Over past few years afterward the birth of ResNet, skip connection has b...