On the Fine-Grained Hardness of Inverting Generative Models

09/11/2023
by   Feyza Duman Keles, et al.

The objective of generative model inversion is to identify an n-dimensional latent vector whose image under a generative model closely matches a given target. This operation is a core computational primitive in numerous modern applications in computer vision and NLP. However, the problem is known to be computationally challenging and NP-hard in the worst case. This paper provides a fine-grained view of the landscape of computational hardness for this problem. We establish several new hardness lower bounds for both exact and approximate model inversion. In exact inversion, the goal is to determine whether a target is contained in the range of a given generative model. Under the strong exponential time hypothesis (SETH), we show that the complexity of exact inversion is lower bounded by Ω(2^n) via a reduction from k-SAT; this strengthens known results. For the more practically relevant problem of approximate inversion, the goal is to determine whether a point in the model's range is close to a given target with respect to the ℓ_p-norm. When p is a positive odd integer, under SETH, we prove an Ω(2^n) complexity lower bound via a reduction from the closest vector problem (CVP). Finally, when p is even, under the exponential time hypothesis (ETH), we prove a 2^Ω(n) lower bound via reductions from Half-Clique and Vertex-Cover.
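As a concrete, if heuristic, illustration of the approximate-inversion problem described above, the following minimal Python sketch inverts a toy one-hidden-layer ReLU generator under the ℓ_2 objective by plain gradient descent. The generator G, its dimensions, the step size, and the function names are illustrative assumptions rather than the construction analyzed in the paper; the paper's lower bounds say precisely that no heuristic of this kind comes with sub-exponential worst-case guarantees.

import numpy as np

rng = np.random.default_rng(0)
n, k, d = 8, 32, 16                        # latent, hidden, output dims (assumed)
W1 = rng.standard_normal((k, n)) / np.sqrt(n)   # scaled Gaussian weights (assumed)
W2 = rng.standard_normal((d, k)) / np.sqrt(k)

def G(z):
    # Toy one-hidden-layer ReLU generator (illustrative, not the paper's model).
    return W2 @ np.maximum(W1 @ z, 0.0)

def invert_l2(y, steps=5000, lr=1e-3):
    # Heuristic approximate inversion: minimize ||G(z) - y||_2^2 by gradient descent.
    z = np.zeros(n)
    for _ in range(steps):
        h = W1 @ z
        r = W2 @ np.maximum(h, 0.0) - y               # residual G(z) - y
        grad = W1.T @ ((h > 0) * (W2.T @ (2.0 * r)))  # backprop through the ReLU
        z -= lr * grad
    return z

# Sanity check on a target that lies in the range of G by construction.
z_true = rng.standard_normal(n)
y = G(z_true)
z_hat = invert_l2(y)
print("ell_2 residual:", np.linalg.norm(G(z_hat) - y))

For exact inversion, a natural brute-force approach instead fixes one of the exponentially many ReLU activation patterns and solves a linear feasibility problem for each; the Ω(2^n) SETH bound above suggests that this exponential dependence on the latent dimension cannot be substantially improved.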


