FOSA: Full Information Maximum Likelihood (FIML) Optimized Self-Attention Imputation for Missing Data

08/23/2023
by Ou Deng, et al.

In data imputation, effectively handling missing values is pivotal, especially in intricate datasets. This paper presents the FIML Optimized Self-Attention (FOSA) framework, an approach that combines the strengths of Full Information Maximum Likelihood (FIML) estimation with the capabilities of self-attention neural networks. Our method begins with an initial estimation of missing values via FIML and then refines these estimates with a self-attention mechanism. Comprehensive experiments on both simulated and real-world datasets demonstrate FOSA's pronounced advantages over traditional FIML techniques in accuracy, computational efficiency, and adaptability to diverse data structures. Notably, even when the Structural Equation Model (SEM) is mis-specified and the FIML estimates are therefore suboptimal, the self-attention component of FOSA rectifies and improves the imputation outcomes. Our empirical tests show that FOSA consistently delivers accurate predictions even with up to 40% missingness, highlighting its robustness and potential for wide-scale application in data imputation.
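To make the two-stage idea concrete, below is a minimal sketch in Python (NumPy/PyTorch), not the authors' implementation. Stage 1 stands in for FIML with an EM loop for a saturated multivariate-normal model (under that model, EM and FIML yield the same parameter estimates), filling missing cells with conditional expectations. Stage 2 refines those estimates with a small self-attention network trained by re-masking observed cells, a self-supervised objective the abstract does not specify and which is assumed here. All names (em_gaussian_impute, SelfAttentionRefiner, fosa_impute) and hyperparameters (d_model=32, n_heads=4, re-mask rate 0.2, epoch and iteration counts) are illustrative.

```python
import numpy as np
import torch
import torch.nn as nn


def em_gaussian_impute(X, n_iter=50):
    # Stage 1: FIML-style warm start. For a saturated multivariate-normal
    # model, FIML parameter estimates coincide with EM estimates; missing
    # cells are then filled with their conditional expectations given the
    # observed cells in the same row.
    X = X.copy()
    mask = np.isnan(X)
    X[mask] = np.take(np.nanmean(X, axis=0), np.where(mask)[1])  # mean-fill
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        sigma = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        for i in range(X.shape[0]):
            m = mask[i]
            if not m.any():
                continue
            o = ~m
            # E[x_m | x_o] = mu_m + S_mo S_oo^{-1} (x_o - mu_o)
            s_oo = sigma[np.ix_(o, o)]
            s_mo = sigma[np.ix_(m, o)]
            X[i, m] = mu[m] + s_mo @ np.linalg.solve(s_oo, X[i, o] - mu[o])
    return X


class SelfAttentionRefiner(nn.Module):
    # Stage 2: treat each feature value as a token and let self-attention
    # propose a residual correction to the stage-1 estimates.
    def __init__(self, n_features, d_model=32, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.out = nn.Linear(d_model, 1)

    def forward(self, x):                      # x: (batch, n_features)
        tokens = self.embed(x.unsqueeze(-1))   # (batch, n_features, d_model)
        attended, _ = self.attn(tokens, tokens, tokens)
        return x + self.out(attended).squeeze(-1)


def fosa_impute(X_missing, epochs=200, lr=1e-3, remask=0.2):
    mask = np.isnan(X_missing)
    X0 = em_gaussian_impute(X_missing)          # FIML-style initial estimates
    x = torch.tensor(X0, dtype=torch.float32)
    observed = torch.tensor(~mask)
    model = SelfAttentionRefiner(X_missing.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        # Self-supervision (assumed objective): hide a random subset of
        # observed cells and reconstruct them from the remaining cells.
        drop = observed & (torch.rand_like(x) < remask)
        if not drop.any():
            continue
        pred = model(torch.where(drop, torch.zeros_like(x), x))
        loss = ((pred - x)[drop] ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        refined = model(x).numpy()
    X0[mask] = refined[mask]                    # observed cells stay untouched
    return X0
```

A quick smoke test might draw a correlated Gaussian matrix, knock out, say, 30% of its entries at random, and compare held-out-cell RMSE after stage 1 versus after stage 2; per the abstract, the refinement stage is expected to help most when the FIML model is mis-specified.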

