Truthful Generalized Linear Models

09/16/2022
by Yuan Qiu, et al.

In this paper we study the estimation of Generalized Linear Models (GLMs) in the case where the agents (individuals) are strategic or self-interested and are concerned about their privacy when reporting their data. Compared with the classical setting, here we aim to design mechanisms that can both incentivize most agents to truthfully report their data and preserve the privacy of individuals' reports, while the output should also be close to the underlying parameter. In the first part of the paper, we consider the case where the covariates are sub-Gaussian and the responses are heavy-tailed, i.e., they only have finite fourth moments. First, motivated by the stationary condition of the maximizer of the likelihood function, we derive a novel private and closed-form estimator. Based on this estimator, and via an appropriate design of the computation and payment schemes, we propose a mechanism for several canonical models, such as linear regression, logistic regression, and Poisson regression, with the following properties: (1) the mechanism is o(1)-jointly differentially private (with probability at least 1-o(1)); (2) it is an o(1/n)-approximate Bayes Nash equilibrium for a (1-o(1))-fraction of agents to truthfully report their data, where n is the number of agents; (3) the output achieves an error of o(1) with respect to the underlying parameter; (4) the mechanism is individually rational for a (1-o(1))-fraction of agents; (5) the payment budget required from the analyst to run the mechanism is o(1). In the second part, we consider the linear regression model under a more general setting where both covariates and responses are heavy-tailed and only have finite fourth moments. By using an ℓ_4-norm shrinkage operator, we propose a private estimator and payment scheme with properties similar to those in the sub-Gaussian case.
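As a rough illustration of the heavy-tailed linear regression setting in the second part, the sketch below combines an element-wise shrinkage (truncation) step, which tames heavy-tailed data, with a noisy closed-form least-squares solve for privacy. This is a minimal sketch under stated assumptions: the function names, the truncation level `tau`, and the noise scale `sigma` are hypothetical placeholders, and the paper's exact ℓ_4-norm shrinkage operator, noise calibration, and payment scheme are not reproduced here.

```python
import numpy as np

def shrink(v, tau):
    """Element-wise shrinkage: clip each coordinate to magnitude tau.
    Bounding each entry controls the norms (and moments) of the
    shrunken data, which is the purpose of a shrinkage operator
    for heavy-tailed covariates/responses."""
    return np.sign(v) * np.minimum(np.abs(v), tau)

def private_linear_estimator(X, y, tau, sigma, rng=None):
    """Illustrative noisy closed-form estimator for linear regression:
    shrink covariates and responses, solve the (regularized) normal
    equations, and add Gaussian noise to the output.
    NOTE: tau and sigma are hypothetical tuning parameters; this is
    not the paper's calibrated private mechanism."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    Xs = shrink(X, tau)
    ys = shrink(y, tau)
    # Closed-form solution of the shrunken normal equations.
    gram = Xs.T @ Xs / n
    cross = Xs.T @ ys / n
    beta_hat = np.linalg.solve(gram + 1e-6 * np.eye(d), cross)
    # Gaussian noise added to the released estimate.
    return beta_hat + sigma * rng.standard_normal(d)
```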

