Inference from Stationary Time Sequences via Learned Factor Graphs

06/05/2020
by Nir Shlezinger, et al.

The design of methods for inference from time sequences has traditionally relied on statistical models that describe the relation between a latent desired sequence and the observed one. A broad family of model-based algorithms has been derived to carry out inference at controllable complexity using recursive computations over the factor graph representing the underlying distribution. An alternative model-agnostic approach utilizes machine learning (ML) methods. Here we propose a framework that combines model-based inference algorithms and data-driven ML tools for stationary time sequences. In the proposed approach, neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence, rather than the complete inference task. By exploiting stationary properties of this distribution, the resulting approach can be applied to sequences of varying temporal duration. Additionally, this approach facilitates the use of compact neural networks that can be trained with small training sets, or alternatively, can be used to improve upon existing deep inference systems. We present an inference algorithm based on learned stationary factor graphs, referred to as StaSPNet, which learns to implement the sum-product scheme from labeled data and can be applied to sequences of different lengths. Our experimental results demonstrate the ability of the proposed StaSPNet to learn to carry out accurate inference from small training sets for sleep stage detection using the Sleep-EDF dataset, as well as for symbol detection in digital communications with unknown channels.
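As a rough illustration of the idea, the sketch below (not the authors' implementation) runs sum-product message passing over a chain-structured factor graph whose per-time-step factor is assumed to be supplied by a trained neural network evaluated on each observed sample; the `factors` array, the state count `S`, and the uniform prior are hypothetical placeholders. Because the same learned factor is reused at every time step, the same routine applies to sequences of any length, which is the stationarity property emphasized in the abstract.

```python
# Minimal sketch, assuming a finite-state, stationary sequence model where a
# neural network has already produced the per-time-step factor values
# factors[t, i, j] ~ f(y_t, s_t = j, s_{t-1} = i). Hypothetical example only.
import numpy as np

def sum_product_marginals(factors):
    """Forward-backward (sum-product) message passing on a chain factor graph.

    factors: array of shape (T, S, S), the learned factor evaluated at each
             observed sample y_t for all state pairs (s_{t-1}, s_t).
    Returns: posterior marginals of shape (T, S) over the state s_t.
    """
    T, S, _ = factors.shape
    fwd = np.zeros((T, S))
    bwd = np.zeros((T, S))

    # Forward recursion: alpha_t(j) ∝ sum_i alpha_{t-1}(i) * f_t(i, j)
    alpha = np.full(S, 1.0 / S)          # uniform initial prior (assumption)
    for t in range(T):
        alpha = alpha @ factors[t]
        alpha /= alpha.sum()             # normalize for numerical stability
        fwd[t] = alpha

    # Backward recursion: beta_{t-1}(i) ∝ sum_j f_t(i, j) * beta_t(j)
    beta = np.ones(S)
    for t in reversed(range(T)):
        bwd[t] = beta
        beta = factors[t] @ beta
        beta /= beta.sum()

    marginals = fwd * bwd
    return marginals / marginals.sum(axis=1, keepdims=True)
```

Under these assumptions, the learned component only replaces the factor evaluation; the message-passing recursions themselves remain the standard model-based computations, which is what keeps the complexity controllable and the network compact.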

