
Neural Sampling Machine with Stochastic Synapse allows Brain-like Learning and Inference

by Sourav Dutta, et al.

Many real-world mission-critical applications require continual online learning from noisy data and real-time decision making with a defined confidence level. Probabilistic models and stochastic neural networks can explicitly handle uncertainty in data and allow adaptive learning on the fly, but their implementation in a low-power substrate remains a challenge. Here, we introduce a novel hardware fabric that implements a new class of stochastic neural network called the Neural Sampling Machine (NSM), which exploits stochasticity in synaptic connections for approximate Bayesian inference. Harnessing the inherent non-linearities and stochasticity occurring at the atomic level in emerging materials and devices allows us to mirror the synaptic stochasticity found at the molecular level in biological synapses. We experimentally demonstrate such a hybrid stochastic synapse by pairing a ferroelectric field-effect transistor (FeFET)-based analog weight cell with a two-terminal stochastic selector element. This stochastic synapse can be integrated within the well-established crossbar array architecture for compute-in-memory. We experimentally show that the inherent stochastic switching of the selector element between the insulating and metallic states introduces a multiplicative stochastic noise within the synapses of the NSM that samples the conductance states of the FeFET, both during learning and inference. We perform network-level simulations to highlight the salient automatic weight normalization feature introduced by the stochastic synapses of the NSM, which paves the way for continual online learning without any offline Batch Normalization. We also showcase the Bayesian inferencing capability introduced by the stochastic synapse during inference mode, thus accounting for uncertainty in data. We report 98.25% accuracy on a standard image classification task as well as estimation of data uncertainty in rotated samples.
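The core mechanism described above, multiplicative stochastic noise on synaptic weights from a selector that randomly switches between insulating and metallic states, can be sketched in software. The following is a minimal illustrative model, not the authors' implementation: each weight is gated by an independent Bernoulli "selector" sample, and repeated stochastic forward passes at inference time yield a distribution over outputs whose spread serves as an uncertainty estimate (the `p_on` probability and layer sizes are arbitrary assumptions for the sketch).

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_synapse_forward(W, x, p_on=0.5, rng=rng):
    """One forward pass through a layer with multiplicative Bernoulli
    synaptic noise: each weight is randomly blanked, mimicking a
    stochastic selector toggling between insulating ('off') and
    metallic ('on') states."""
    mask = rng.random(W.shape) < p_on   # selector conducts with prob. p_on
    return (W * mask) @ x               # noisy weights applied to the input

# Monte Carlo inference: repeated stochastic passes give a distribution
# over outputs; its standard deviation is a simple uncertainty estimate.
W = rng.normal(size=(3, 5))             # toy 5-input, 3-output weight matrix
x = rng.normal(size=5)
samples = np.stack([stochastic_synapse_forward(W, x) for _ in range(200)])
mean, std = samples.mean(axis=0), samples.std(axis=0)
```

Averaging over many samples recovers an effective weight scaling of `p_on * W`, which hints at the automatic weight normalization effect the abstract attributes to the stochastic synapses.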



Inherent Weight Normalization in Stochastic Neural Networks

Multiplicative stochasticity such as Dropout improves the robustness and...

SPINBIS: Spintronics based Bayesian Inference System with Stochastic Computing

Bayesian inference is an effective approach for solving statistical lear...

Adaptive Synaptic Failure Enables Sampling from Posterior Predictive Distributions in the Brain

Bayesian interpretations of neural processing require that biological me...

Network Plasticity as Bayesian Inference

General results from statistical learning theory suggest to understand n...

Memristor-based Synaptic Sampling Machines

Synaptic Sampling Machine (SSM) is a type of neural network model that c...

Exploiting Oxide Based Resistive RAM Variability for Bayesian Neural Network Hardware Design

Uncertainty plays a key role in real-time machine learning. As a signifi...