Learning Fair Canonical Polyadic Decompositions using a Kernel Independence Criterion

04/27/2021
by   Kevin Kim, et al.

This work proposes to learn fair low-rank tensor decompositions by regularizing the Canonical Polyadic (CP) decomposition with the kernel Hilbert-Schmidt independence criterion (KHSIC). It is shown, theoretically and empirically, that a small KHSIC between a latent factor and the sensitive features guarantees approximate statistical parity. The proposed algorithm surpasses the state-of-the-art FATR algorithm (Zhu et al., 2018) in controlling the trade-off between fairness and residual fit on synthetic and real data sets.
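The idea can be sketched concretely: an empirical HSIC term between a CP factor matrix and the sensitive features is added as a penalty to the usual least-squares reconstruction objective. Below is a minimal illustration of that objective for a third-order tensor, using NumPy and the standard biased HSIC estimator trace(KHLH)/(n-1)^2 with Gaussian kernels. The function names (`hsic`, `fair_cp_loss`), the kernel bandwidth, and the choice of which factor to penalize are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, S, sigma=1.0):
    # Biased empirical HSIC estimator: trace(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the Gram matrices.
    # Small values indicate X and S are close to independent.
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(S, sigma)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def cp_reconstruct(A, B, C):
    # Rank-R CP model of a 3-way tensor: sum_r a_r (outer) b_r (outer) c_r.
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def fair_cp_loss(T, A, B, C, S, lam=1.0):
    # Residual fit plus a fairness penalty on the mode-1 factor A
    # (the mode whose rows correspond to individuals with sensitive
    # features S); lam trades fit against approximate statistical parity.
    resid = T - cp_reconstruct(A, B, C)
    return np.sum(resid ** 2) + lam * hsic(A, S)
```

In this sketch the penalized factor is the one indexed by individuals, since the paper's guarantee ties a small KHSIC between that latent factor and the sensitive features to approximate statistical parity; minimizing `fair_cp_loss` (e.g. by gradient descent on the factors) then navigates the fairness-versus-fit trade-off via `lam`.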


