Searching for an (un)stable equilibrium: experiments in training generative models without data

10/06/2019
by Terence Broad, et al.

This paper details a developing artistic practice around an ongoing series of works called (un)stable equilibrium. These works are the product of using modern machine learning toolkits to train generative models without data, an approach akin to traditional generative art, where dynamical systems are explored intuitively for their latent generative possibilities. We discuss some of the guiding principles learnt in the process of experimentation, present details of the implementation of the first series of works, and outline possibilities for future experimentation.
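The abstract does not specify an implementation, but the following minimal sketch illustrates one way generative networks can be trained entirely without data: a discriminator tries to tell apart the outputs of two randomly initialised generators, and the generators are pushed in opposing directions, so the system chases an adversarial equilibrium with no dataset involved. All architecture and hyperparameter choices below (layer sizes, learning rates, output dimensions) are illustrative assumptions, not details taken from the paper.

    # Hypothetical sketch, not the authors' implementation: an adversarial
    # setup with no training data. A discriminator labels generator A's
    # outputs "real" and generator B's "fake"; the generators are trained
    # toward the opposite labels, so training proceeds with no dataset.
    import torch
    import torch.nn as nn

    LATENT, IMG = 64, 32 * 32 * 3  # assumed latent and output sizes

    def make_generator():
        return nn.Sequential(
            nn.Linear(LATENT, 256), nn.ReLU(),
            nn.Linear(256, IMG), nn.Tanh(),
        )

    gen_a, gen_b = make_generator(), make_generator()
    disc = nn.Sequential(
        nn.Linear(IMG, 256), nn.LeakyReLU(0.2),
        nn.Linear(256, 1),
    )

    opt_g = torch.optim.Adam(
        list(gen_a.parameters()) + list(gen_b.parameters()), lr=2e-4)
    opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    for step in range(1000):
        z = torch.randn(16, LATENT)
        fake_a, fake_b = gen_a(z), gen_b(z)

        # Discriminator step: treat A's output as "real", B's as "fake".
        d_loss = bce(disc(fake_a.detach()), torch.ones(16, 1)) + \
                 bce(disc(fake_b.detach()), torch.zeros(16, 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Generator step: B tries to be classified as "real", A as "fake",
        # driving the three networks toward an equilibrium of their own.
        g_loss = bce(disc(fake_b), torch.ones(16, 1)) + \
                 bce(disc(fake_a), torch.zeros(16, 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

Running variants of such training loops and observing the images the networks settle into is the kind of intuitive exploration of a dynamical system's latent generative possibilities that the abstract alludes to.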

