Online Forgetting Process for Linear Regression Models

12/03/2020 · by Yuantong Li, et al.

Motivated by the EU's "Right To Be Forgotten" regulation, we initiate a study of statistical data-deletion problems in which users' data are accessible only for a limited period of time. We formulate this setting as an online supervised learning task with a constant memory limit. We propose a deletion-aware algorithm, FIFD-OLS, for the low-dimensional case, and observe a catastrophic rank-swinging phenomenon caused by the data-deletion operation, which leads to statistical inefficiency. As a remedy, we propose the FIFD-Adaptive Ridge algorithm with a novel online regularization scheme that effectively offsets the uncertainty introduced by deletion. In theory, we provide cumulative regret upper bounds for both online forgetting algorithms. In experiments, we show that FIFD-Adaptive Ridge outperforms ridge regression with a fixed regularization level, and we hope this work sheds light on deletion in more complex statistical models.
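To make the first-in-first-out (FIFD) setting concrete, here is a minimal sketch of a sliding-window ridge regressor that deletes the oldest sample once the memory limit is reached. The class name `FIFORidge`, the window size `s`, and the fixed regularization level `lam` are illustrative assumptions; in particular, a fixed `lam` corresponds to the baseline the paper compares against, not to the FIFD-Adaptive Ridge scheme itself.

```python
import numpy as np
from collections import deque

class FIFORidge:
    """Illustrative sliding-window ridge regression (not the paper's exact
    algorithm): each incoming sample is stored, and the oldest sample is
    deleted once the memory limit s is reached (first in, first out)."""

    def __init__(self, d, s, lam=1.0):
        self.d = d              # feature dimension
        self.s = s              # constant memory limit (window size)
        self.lam = lam          # fixed ridge level (the paper adapts this online)
        self.window = deque()   # retained (x, y) pairs, oldest first

    def update(self, x, y):
        """Add a new sample; forget the oldest one if over the memory limit."""
        self.window.append((x, y))
        if len(self.window) > self.s:
            self.window.popleft()   # data-deletion operation

    def fit(self):
        """Ridge estimate from the currently retained samples only."""
        X = np.array([x for x, _ in self.window])
        Y = np.array([y for _, y in self.window])
        A = X.T @ X + self.lam * np.eye(self.d)
        return np.linalg.solve(A, X.T @ Y)
```

After the window fills, every update both adds and deletes a sample, so the design matrix `X.T @ X` can lose and regain rank as samples churn; the ridge term `lam * np.eye(d)` keeps the system solvable, which is the role the paper's adaptive regularization plays in a more principled, data-dependent way.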
