Influence-guided Data Augmentation for Neural Tensor Completion

08/23/2021
by Sejoon Oh, et al.

How can we predict missing values in multi-dimensional data (tensors) more accurately? Tensor completion is crucial in many applications such as personalized recommendation, image and video restoration, and link prediction in social networks. Many tensor factorization and neural network-based completion algorithms have been developed to predict missing entries in partially observed tensors. However, they can produce inaccurate estimates because real-world tensors are very sparse, and these methods tend to overfit to the small amount of observed data. We overcome these shortcomings with a data augmentation technique for tensors. In this paper, we propose DAIN, a general data augmentation framework that enhances the prediction accuracy of neural tensor completion methods. Specifically, DAIN first trains a neural model and estimates the importance of each tensor cell using influence functions. It then aggregates these cell importances to compute the importance of each entity (i.e., an index of a dimension). Finally, DAIN augments the tensor by sampling entities in proportion to their importance and imputing values for the new cells with a value predictor. Extensive experiments on four diverse real-world tensors show that DAIN outperforms all data augmentation baselines in improving the imputation accuracy of neural tensor completion. Ablation studies substantiate the effectiveness of each component of DAIN, and we show that DAIN scales near-linearly to large datasets.
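The three-step pipeline the abstract describes (cell importance, entity-level aggregation, importance-weighted augmentation) can be sketched in a few lines of NumPy. This is a hypothetical illustration only: random positive scores stand in for the influence-function importances, and a global-mean imputation stands in for DAIN's trained value predictor; the data, shapes, and helper `sample_cell` are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse 3-mode tensor: observed cells as (i, j, k) index
# triples with associated values (hypothetical stand-in data).
obs = np.array([[0, 1, 0], [1, 0, 1], [2, 2, 0], [1, 1, 1]])
vals = np.array([4.0, 3.5, 5.0, 2.0])
shape = (3, 3, 2)

# Step 1 (proxy): per-cell importance scores. DAIN derives these
# from influence functions over a trained neural model; random
# positive scores are used here purely to drive the next steps.
cell_importance = rng.random(len(obs)) + 0.1

# Step 2: aggregate cell importances into per-entity importance
# for each mode (an "entity" is one index of one dimension).
entity_importance = [np.zeros(d) for d in shape]
for cell, s in zip(obs, cell_importance):
    for mode, idx in enumerate(cell):
        entity_importance[mode][idx] += s

# Step 3: augment by sampling one entity per mode in proportion
# to its importance, then imputing a value for the sampled cell
# (global mean as a trivial stand-in value predictor).
def sample_cell():
    return tuple(
        rng.choice(d, p=w / w.sum())
        for d, w in zip(shape, entity_importance)
    )

augmented = [(sample_cell(), vals.mean()) for _ in range(5)]
print(augmented)
```

In the actual method, the augmented cells and their predicted values are added to the training set before the neural completion model is retrained; the sketch stops at producing the augmented cells.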

Related research:

- Concatenated image completion via tensor augmentation and completion (07/14/2016): This paper proposes a novel framework called concatenated image completi...
- JULIA: Joint Multi-linear and Nonlinear Identification for Tensor Completion (01/31/2022): Tensor completion aims at imputing missing entries from a partially obse...
- A Biased Deep Tensor Factorization Network For Tensor Completion (05/20/2021): Tensor decomposition is a popular technique for tensor completion, Howev...
- Scalable Tucker Factorization for Sparse Tensors - Algorithms and Discoveries (10/06/2017): Given sparse multi-dimensional data (e.g., (user, movie, time; rating) f...
- GOCPT: Generalized Online Canonical Polyadic Tensor Factorization and Completion (05/08/2022): Low-rank tensor factorization or completion is well-studied and applied ...
- Applying Differential Privacy to Tensor Completion (10/01/2021): Tensor completion aims at filling the missing or unobserved entries base...
- Baseline Estimation of Commercial Building HVAC Fan Power Using Tensor Completion (04/24/2020): Commercial building heating, ventilation, and air conditioning (HVAC) sy...
