Stochasticity from function - why the Bayesian brain may need no noise

09/21/2018
by Dominik Dold, et al.

An increasing body of evidence suggests that the trial-to-trial variability of spiking activity in the brain is not mere noise, but rather the reflection of a sampling-based encoding scheme for probabilistic computing. Since the precise statistical properties of neural activity are important in this context, many models assume an ad hoc source of well-behaved, explicit noise, either on the input or on the output side of single-neuron dynamics, most often assuming an independent Poisson process in either case. However, these assumptions are somewhat problematic: neighboring neurons tend to share receptive fields, rendering both their input and their output correlated; at the same time, neurons are known to behave largely deterministically, as a function of their membrane potential and conductance. We suggest that spiking neural networks may, in fact, have no need for noise to perform sampling-based Bayesian inference. We study analytically the effect of auto- and cross-correlations in functionally Bayesian spiking networks and demonstrate how this effect translates to synaptic interaction strengths, rendering them controllable through synaptic plasticity. This allows even small ensembles of interconnected deterministic spiking networks to simultaneously and co-dependently shape their output activity through learning, enabling them to perform complex Bayesian computation without any need for noise, which we demonstrate in silico, both in classical simulation and in neuromorphic emulation. These results close a gap between the abstract models and the biology of functionally Bayesian spiking networks, effectively reducing the architectural constraints imposed on physical neural substrates required to perform probabilistic computing, be they biological or artificial.
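To make the sampling-based encoding scheme concrete, the sketch below performs Gibbs sampling from a small Boltzmann distribution in the spirit of the neural-sampling framework (Buesing et al., 2011) that this line of work builds on. Echoing the paper's thesis, the uniform variates that gate the spikes are produced by a deterministic chaotic map rather than an explicit noise generator. This is an illustrative toy, not the paper's LIF-based model; the weights W, biases b, seed, and sample count are all assumed values.

```python
# Minimal sketch (illustrative, not the paper's model): Gibbs sampling from a
# Boltzmann distribution with logistic "neurons", where the driving uniform
# variates come from a deterministic chaotic map instead of an explicit RNG.
import numpy as np

rng_state = 0.123456  # seed of the deterministic "noise" source

def deterministic_uniform():
    """Logistic map x -> 4x(1-x); u = (2/pi)*arcsin(sqrt(x)) maps its
    invariant (arcsine) density to a uniform density on (0, 1)."""
    global rng_state
    rng_state = 4.0 * rng_state * (1.0 - rng_state)
    if rng_state <= 0.0 or rng_state >= 1.0:  # guard against precision collapse
        rng_state = 0.123456
    return (2.0 / np.pi) * np.arcsin(np.sqrt(rng_state))

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Symmetric weights and biases define the target distribution
# p(z) ~ exp(z^T W z / 2 + b^T z).
W = np.array([[ 0.0,  1.2, -0.8],
              [ 1.2,  0.0,  0.4],
              [-0.8,  0.4,  0.0]])
b = np.array([-0.3, 0.1, 0.2])

z = np.zeros(3)  # binary network state: z_k = 1 means neuron k is "on"
samples = np.zeros((50000, 3))
for t in range(len(samples)):
    k = t % 3                    # sweep over neurons in turn
    u_k = b[k] + W[k] @ z        # abstract membrane potential of neuron k
    z[k] = 1.0 if deterministic_uniform() < sigmoid(u_k) else 0.0
    samples[t] = z

# Exact marginals by enumerating all 2^3 states, for comparison.
states = np.array([[int(c) for c in np.binary_repr(i, 3)] for i in range(8)])
logp = 0.5 * np.einsum('si,ij,sj->s', states, W, states) + states @ b
p = np.exp(logp - logp.max())
p /= p.sum()
print("sampled p(z_k = 1):", samples.mean(axis=0))
print("exact   p(z_k = 1):", states.T @ p)
```

With enough sweeps the sampled marginals approach the exact ones; any residual bias stems from the temporal correlations of the deterministic map, the same kind of auto-correlation effect that the paper shows can be absorbed into effective synaptic interaction strengths.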

Related research

09/30/2019 · Normalisation of Weights and Firing Rates in Spiking Neural Networks with Spike-Timing-Dependent Plasticity
Maintaining the ability to fire sparsely is crucial for information enco...

06/19/2020 · Oscillatory background activity implements a backbone for sampling-based computations in spiking neural networks
Various data suggest that the brain carries out probabilistic inference....

01/05/2016 · The high-conductance state enables neural sampling in networks of LIF neurons
The apparent stochasticity of in-vivo neural circuits has long been hypo...

07/23/2020 · Multi-Compartment Variational Online Learning for Spiking Neural Networks
Spiking Neural Networks (SNNs) offer a novel computational paradigm that...

07/18/2023 · Approximating nonlinear functions with latent boundaries in low-rank excitatory-inhibitory spiking networks
Deep feedforward and recurrent rate-based neural networks have become su...

06/23/2020 · The principles of adaptation in organisms and machines II: Thermodynamics of the Bayesian brain
This article reviews how organisms learn and recognize the world through...

03/15/2021 · Constrained plasticity reserve as a natural way to control frequency and weights in spiking neural networks
Biological neurons have adaptive nature and perform complex computations...
