DiffStyler: Controllable Dual Diffusion for Text-Driven Image Stylization

11/19/2022
by   Nisha Huang, et al.

Despite the impressive results of arbitrary image-guided style transfer methods, text-driven image stylization has recently been proposed for transferring a natural image into a stylized one according to textual descriptions of the target style provided by the user. Unlike previous image-to-image transfer approaches, the text-guided stylization process gives users a more precise and intuitive way to express the desired style. However, the huge discrepancy between cross-modal inputs and outputs makes it challenging to conduct text-driven image stylization in a typical feed-forward CNN pipeline. In this paper, we present DiffStyler, built on diffusion models, in which cross-modal style information can be easily integrated as step-by-step guidance during the diffusion process. In particular, we use a dual diffusion processing architecture to control the balance between the content and style of the diffused results. Furthermore, we propose a content image-based learnable noise on which the reverse denoising process is based, enabling the stylization results to better preserve the structural information of the content image. We validate DiffStyler against baseline methods through extensive qualitative and quantitative experiments.
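The dual-path idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `eps_content` and `eps_style` are hypothetical stand-ins for the noise predictions of a content-preserving branch and a text-guided style branch, and `w` is an assumed scalar controlling the content/style balance; real diffusion samplers also involve scheduler terms omitted here.

```python
import numpy as np

def dual_diffusion_step(x_t, eps_content, eps_style, w):
    """Blend noise predictions from two diffusion branches.

    w in [0, 1] controls the content/style trade-off: w = 0 keeps only
    the content branch, w = 1 keeps only the style branch. The update
    below is a simplified denoising step (no variance schedule).
    """
    eps = (1.0 - w) * eps_content + w * eps_style
    return x_t - eps

# Toy usage with dummy branch predictions.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 4))
e_content = np.zeros((4, 4))  # pretend content branch predicts no noise
e_style = np.ones((4, 4))     # pretend style branch predicts unit noise
out = dual_diffusion_step(x, e_content, e_style, w=0.5)
```

With `w=0.5` the blended prediction is the average of the two branches, so the output here is simply `x - 0.5` elementwise.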


Related research

06/15/2023 — ArtFusion: Controllable Arbitrary Style Transfer using Dual Conditional Latent Diffusion Models
Arbitrary Style Transfer (AST) aims to transform images by adopting the ...

09/30/2022 — Diffusion-based Image Translation using Disentangled Style and Content Representation
Diffusion-based image translation guided by semantic texts or a single t...

03/15/2023 — Class-Guided Image-to-Image Diffusion: Cell Painting from Brightfield Images with Class Labels
Image-to-image reconstruction problems with free or inexpensive metadata...

06/04/2019 — Selective Style Transfer for Text
This paper explores the possibilities of image style transfer applied to...

11/23/2022 — Inversion-Based Style Transfer with Diffusion Models
The artistic style within a painting is the means of expression, which i...

06/22/2022 — A Fast Text-Driven Approach for Generating Artistic Content
In this work, we propose a complete framework that generates visual art....

04/13/2023 — Expressive Text-to-Image Generation with Rich Text
Plain text has become a prevalent interface for text-to-image synthesis....
