ArtFusion: Controllable Arbitrary Style Transfer using Dual Conditional Latent Diffusion Models

06/15/2023
by Dar-Yen Chen, et al.

Arbitrary Style Transfer (AST) aims to transform images by adopting the style of any selected artwork. Accommodating diverse and subjective user preferences, however, poses a significant challenge: some users wish to preserve distinct content structures, while others favor more pronounced stylization. Despite advances in feed-forward AST methods, their limited customizability hinders practical application. We propose ArtFusion, a new approach that offers a flexible balance between content and style. In contrast to traditional methods that rely on biased similarity losses, ArtFusion employs our Dual Conditional Latent Diffusion Probabilistic Models (Dual-cLDM), which mitigate repetitive patterns and enhance subtle artistic aspects such as brush strokes and genre-specific features. Although conditional diffusion probabilistic models (cDM) have shown promising results across generative tasks, applying them to style transfer is challenging because they require paired training data. ArtFusion circumvents this issue, offering more practical and controllable stylization: a key element of our approach is using a single image as both the content and the style condition during training, while still producing effective stylization at inference. ArtFusion outperforms existing approaches in controllability and in the faithful rendering of artistic details, evidencing its superior style transfer capability. Moreover, the Dual-cLDM underlying ArtFusion extends to a variety of complex multi-condition generative tasks, greatly broadening the impact of our research.
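The single-image trick described above can be sketched in a few lines. This is a hedged illustration, not the paper's implementation: `encode_content` and `encode_style` are hypothetical stand-ins for ArtFusion's actual condition encoders, reduced here to toy NumPy operations. The point is only the data flow: at training time one image supplies both conditions (so no paired data is needed), while at inference the two conditions come from different images.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_content(img):
    # Hypothetical stand-in for a content-condition encoder
    # (here: a crude 8x8 spatial latent via reshape + channel mean).
    return img.reshape(8, 8, -1).mean(axis=-1)

def encode_style(img):
    # Hypothetical stand-in for a style-condition encoder
    # (here: global feature statistics, mean and std).
    return np.array([img.mean(), img.std()])

def training_conditions(img):
    """Dual-conditional training: a SINGLE image supplies both the
    content condition and the style condition."""
    return encode_content(img), encode_style(img)

def inference_conditions(content_img, style_img):
    """At inference, the two conditions come from DIFFERENT images:
    a content photograph and a style artwork."""
    return encode_content(content_img), encode_style(style_img)

# Train-time conditions share one source image ...
img = rng.standard_normal((64, 4))
c_content, c_style = training_conditions(img)
# ... while inference mixes a content photo with a style artwork.
photo = rng.standard_normal((64, 4))
artwork = rng.standard_normal((64, 4))
c_content_inf, c_style_inf = inference_conditions(photo, artwork)
print(c_content.shape, c_style.shape)  # (8, 8) (2,)
```

In the actual Dual-cLDM these conditions would steer a latent denoising network; the sketch only shows why self-conditioned training sidesteps the paired-data requirement.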


Related research

StyleDiffusion: Controllable Disentangled Style Transfer via Diffusion Models (08/15/2023)
Content and style (C-S) disentanglement is a fundamental problem and cri...

DiffStyler: Controllable Dual Diffusion for Text-Driven Image Stylization (11/19/2022)
Despite the impressive results of arbitrary image-guided style transfer ...

Parameter-Free Style Projection for Arbitrary Style Transfer (03/17/2020)
Arbitrary image style transfer is a challenging task which aims to styli...

Training-free Style Transfer Emerges from h-space in Diffusion models (03/27/2023)
Diffusion models (DMs) synthesize high-quality images in various domains...

ICDaeLST: Intensity-Controllable Detail Attention-enhanced for Lightweight Fast Style Transfer (06/29/2023)
The mainstream style transfer methods usually use pre-trained deep convo...
