Analyzing the Latent Space of GAN through Local Dimension Estimation

05/26/2022
by Jaewoong Choi, et al.

The impressive success of style-based GANs (StyleGANs) in high-fidelity image synthesis has motivated research into the semantic properties of their latent spaces. Recently, a close relationship was observed between semantically disentangled local perturbations and the local PCA components in the learned latent space 𝒲. However, the number of such disentangled perturbations remains difficult to determine. Building upon this observation, we propose a local dimension estimation algorithm for an arbitrary intermediate layer in a pre-trained GAN model. The estimated intrinsic dimension corresponds to the number of disentangled local perturbations. From this perspective, we analyze the intermediate layers of the mapping network in StyleGANs. Our analysis clarifies the success of the 𝒲-space in StyleGAN and suggests an alternative. Moreover, intrinsic dimension estimation opens the possibility of unsupervised evaluation of global-basis compatibility and disentanglement for a latent space. Our proposed metric, called Distortion, measures the inconsistency of the intrinsic tangent space across the learned latent space. The metric is purely geometric and does not require any additional attribute information. Nevertheless, it shows a high correlation with global-basis compatibility and with supervised disentanglement scores. Our findings pave the way toward unsupervised selection of a globally disentangled latent space among the intermediate latent spaces of a GAN.
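The abstract does not spell out the estimation procedure, but the underlying idea — counting how many local PCA components carry the variance of a layer's output around a given latent code — can be sketched as follows. The function `estimate_local_dimension`, the toy generator `f`, and all parameter values below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def estimate_local_dimension(f, w, n_samples=500, eps=1e-2,
                             var_threshold=0.95, seed=0):
    """Estimate the local intrinsic dimension of f's output manifold at w.

    Perturbs w along random unit directions, maps the perturbed codes
    through f, and counts how many local PCA components are needed to
    explain `var_threshold` of the output variance. (A hypothetical
    simplification of the paper's estimator.)
    """
    rng = np.random.default_rng(seed)
    dirs = rng.standard_normal((n_samples, w.shape[0]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    outputs = np.stack([f(w + eps * d) for d in dirs])
    centered = outputs - outputs.mean(axis=0)
    # Singular values of the centered local samples give the PCA spectrum.
    s = np.linalg.svd(centered, compute_uv=False)
    var_ratio = s**2 / np.sum(s**2)
    return int(np.searchsorted(np.cumsum(var_ratio), var_threshold) + 1)

# Toy "layer": maps a 10-D latent onto a 3-D subspace of a 10-D space,
# so the local intrinsic dimension at any point should be 3.
A = np.zeros((10, 10))
A[:3, :3] = np.eye(3)
f = lambda w: A @ w
print(estimate_local_dimension(f, np.ones(10)))  # → 3
```

In this toy setting the spectrum has exactly three non-negligible singular values, so the explained-variance threshold recovers the subspace dimension; for a real StyleGAN layer the spectrum decays gradually and the threshold choice matters.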

Related research

10/11/2022 - Finding the global semantic representation in GAN through Frechet Mean
  The ideally disentangled latent space in GAN involves the global represe...

06/13/2021 - Do Not Escape From the Manifold: Discovering the Local Coordinates on the Latent Space of GANs
  In this paper, we propose a method to find local-geometry-aware traversa...

07/24/2023 - Understanding the Latent Space of Diffusion Models through the Lens of Riemannian Geometry
  Despite the success of diffusion models (DMs), we still lack a thorough ...

11/02/2020 - Learning a Deep Reinforcement Learning Policy Over the Latent Space of a Pre-trained GAN for Semantic Age Manipulation
  Learning a disentangled representation of the latent space has become on...

12/04/2018 - A Spectral Regularizer for Unsupervised Disentanglement
  Generative models that learn to associate variations in the output along...

03/16/2022 - Fantastic Style Channels and Where to Find Them: A Submodular Framework for Discovering Diverse Directions in GANs
  The discovery of interpretable directions in the latent spaces of pre-tr...

03/27/2020 - LIMP: Learning Latent Shape Representations with Metric Preservation Priors
  In this paper, we advocate the adoption of metric preservation as a powe...