Online and Differentially-Private Tensor Decomposition

06/20/2016
by Yining Wang, et al.

In this paper, we resolve many of the key algorithmic questions regarding robustness, memory efficiency, and differential privacy of tensor decomposition. We propose simple variants of the tensor power method that enjoy these strong properties. We present the first guarantees for an online tensor power method, which has a linear memory requirement. Moreover, we present a noise-calibrated tensor power method with efficient differential-privacy guarantees. At the heart of all these guarantees lies a careful perturbation analysis derived in this paper, which improves significantly upon existing results.
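To make the abstract concrete, here is a minimal sketch of a single-component tensor power iteration, with two of the paper's themes folded in as labeled assumptions: a `sigma` parameter that injects Gaussian noise into each step (standing in for the noise-calibrated, differentially private variant), and a `streaming_mlp` helper that estimates the multilinear product from a sample stream in O(d) memory (standing in for the online variant). The function names and noise scale are hypothetical illustrations, not the paper's actual algorithm or calibration.

```python
import numpy as np

def streaming_mlp(samples, u):
    """Estimate the multilinear product T(I, u, u) for the moment tensor
    T = E[x (x) x (x) x] using O(d) memory: average (x . u)^2 * x over a
    stream of samples, without ever materializing the d x d x d tensor."""
    acc = np.zeros_like(u)
    n = 0
    for x in samples:
        acc += (x @ u) ** 2 * x
        n += 1
    return acc / max(n, 1)

def tensor_power_method(T, n_iters=50, sigma=0.0, rng=None):
    """Single-component power iteration on a symmetric order-3 tensor T.

    sigma > 0 adds Gaussian noise to each iterate (a DP-style perturbation,
    scale chosen here for illustration only); sigma = 0 recovers the plain
    tensor power method."""
    rng = np.random.default_rng() if rng is None else rng
    d = T.shape[0]
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    for _ in range(n_iters):
        # Multilinear product: T(I, u, u)_i = sum_{j,k} T[i,j,k] u_j u_k
        v = np.einsum('ijk,j,k->i', T, u, u)
        if sigma > 0:
            v = v + sigma * rng.standard_normal(d)  # illustrative noise injection
        u = v / np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # recovered eigenvalue
    return lam, u
```

Further components would be recovered by deflation: subtract the rank-one term `lam * (u outer u outer u)` from `T` and rerun the iteration.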


Related research

04/26/2018 · Distributed Differentially-Private Algorithms for Matrix and Tensor Factorization
In many signal processing and machine learning applications, datasets co...

01/27/2021 · Randori: Local Differential Privacy for All
Polls are a common way of collecting data, including product reviews and...

12/23/2020 · Hiding Among the Clones: A Simple and Nearly Optimal Analysis of Privacy Amplification by Shuffling
Recent work of Erlingsson, Feldman, Mironov, Raghunathan, Talwar, and Th...

02/14/2022 · Fast algorithm for overcomplete order-3 tensor decomposition
We develop the first fast spectral algorithm to decompose a random third...

10/09/2017 · CTD: Fast, Accurate, and Interpretable Method for Static and Dynamic Tensor Decompositions
How can we find patterns and anomalies in a tensor, or multi-dimensional...

10/01/2021 · Applying Differential Privacy to Tensor Completion
Tensor completion aims at filling the missing or unobserved entries base...
