Bayesian Inference with Nonlinear Generative Models: Comments on Secure Learning

01/19/2022
by Ali Bereyhi, et al.

Unlike the classical linear model, nonlinear generative models have received comparatively little attention in the literature. This work aims to draw attention to these models and their secrecy potential. To this end, we invoke the replica method to derive the asymptotic normalized cross entropy in an inverse probability problem whose generative model is described by a Gaussian random field with a generic covariance function. Our derivations further demonstrate the asymptotic statistical decoupling of Bayesian inference algorithms and specify the decoupled setting for a given nonlinear model. The replica solution shows that strictly nonlinear models exhibit an all-or-nothing phase transition: there exists a critical load at which optimal Bayesian inference changes abruptly from perfect learning to uncorrelated learning. This finding leads to the design of a new secure coding scheme that achieves the secrecy capacity of the wiretap channel. The proposed scheme requires a significantly smaller codebook than Wyner's random coding scheme. This result further implies that strictly nonlinear generative models are perfectly secure without any secure coding. We justify this latter statement through the analysis of an illustrative model for perfectly secure and reliable inference.
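
To give a concrete, if highly simplified, sense of the kind of inverse probability problem described above, the following Python sketch builds a toy nonlinear generative model: a discrete signal is observed through a nonlinearity sampled from a Gaussian random field (here with an assumed squared-exponential covariance) plus Gaussian noise, and the exact Bayesian posterior over the signal candidates is evaluated by brute force. The ±1 signal prior, the covariance choice, the independence across observation indices, and all dimensions are illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch (not the paper's exact setup) of Bayesian inference in an
# inverse problem whose generative model is a Gaussian random field applied
# to linear projections of the signal, observed in Gaussian noise.

from itertools import product
import numpy as np

rng = np.random.default_rng(0)

n, k = 8, 4          # signal dimension and number of observations (assumed)
snr = 10.0           # signal-to-noise ratio (assumed)
sigma2 = 1.0 / snr   # noise variance

# Discrete signal prior: i.i.d. +/-1 entries (an illustrative assumption).
signals = np.array(list(product([-1.0, 1.0], repeat=n)))

# Random projections feeding the nonlinearity.
A = rng.normal(size=(k, n)) / np.sqrt(n)
Z = signals @ A.T                      # projected inputs, shape (2^n, k)

def grf_covariance(u, v, length_scale=1.0):
    """Assumed squared-exponential covariance of the Gaussian random field."""
    d2 = (u[:, None] - v[None, :]) ** 2
    return np.exp(-d2 / (2.0 * length_scale ** 2))

# Sample one realization of the field at every projected point, independently
# per observation index (a simplifying assumption for this sketch).
num = len(signals)
y_clean = np.empty((num, k))
for j in range(k):
    C = grf_covariance(Z[:, j], Z[:, j]) + 1e-6 * np.eye(num)
    y_clean[:, j] = rng.multivariate_normal(np.zeros(num), C)

# Pick a true signal and generate noisy observations through the field.
true_idx = rng.integers(num)
y = y_clean[true_idx] + np.sqrt(sigma2) * rng.normal(size=k)

# Exact Bayesian posterior over the candidates (uniform prior, known field).
log_lik = -((y_clean - y) ** 2).sum(axis=1) / (2.0 * sigma2)
post = np.exp(log_lik - log_lik.max())
post /= post.sum()

map_idx = int(post.argmax())
overlap = float(signals[map_idx] @ signals[true_idx]) / n
print(f"posterior mass on true signal: {post[true_idx]:.3f}")
print(f"normalized overlap of MAP estimate with truth: {overlap:+.2f}")
```

In this toy setting the decoder knows the realized nonlinearity, so inference reduces to comparing the observations against the field values at all candidate signals; the paper's analysis concerns the asymptotic regime where the number of observations and the signal dimension grow large at a fixed load.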
