One-Shot Mutual Affine-Transfer for Photorealistic Stylization

07/24/2019
by   Ying Qu, et al.

Photorealistic style transfer aims to transfer the style of a reference photo onto a content photo naturally, so that the stylized image looks like a real photograph taken by a camera. Existing state-of-the-art methods are prone to spatial structure distortion of the content image and global color inconsistency across different semantic objects, making the results less photorealistic. In this paper, we propose a one-shot mutual Dirichlet network to address these challenges. The essential contribution of the work is a representation scheme that decouples the spatial structure and color information of an image, so that the spatial structure can be well preserved during stylization. This representation, extracted with a shared sparse Dirichlet encoder, is discriminative and context-sensitive with respect to semantic objects. Moreover, the representations of the content and style images are encouraged to match each other for faithful color transfer. An affine-transfer model is embedded in the decoder of the network to facilitate the color transfer. The strong representative and discriminative power of the proposed network enables one-shot learning given only one content-style image pair. Experimental results demonstrate that the proposed method generates photorealistic images without spatial distortion or abrupt color changes.
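To make the affine-transfer idea concrete, the sketch below is a hypothetical, much-simplified illustration (not the paper's learned per-representation model): it applies a single global affine map A·x + b to the content image's colors, with A and b chosen so the result matches the style image's per-channel color statistics.

```python
import numpy as np

def affine_color_transfer(content, style):
    """Transfer color by a global affine map A * x + b, chosen so the
    output matches the style image's per-channel mean and standard
    deviation. content/style: float arrays of shape (H, W, 3) in [0, 1].
    This is an illustrative baseline, not the paper's learned decoder."""
    c = content.reshape(-1, 3)
    s = style.reshape(-1, 3)
    c_mean, c_std = c.mean(axis=0), c.std(axis=0) + 1e-8
    s_mean, s_std = s.mean(axis=0), s.std(axis=0)
    # Diagonal affine transform: per-channel scale A and offset b.
    A = s_std / c_std
    b = s_mean - A * c_mean
    out = c * A + b
    return out.reshape(content.shape).clip(0.0, 1.0)
```

Because the map is affine in color space, it changes only the color distribution, leaving the spatial arrangement of pixels untouched; the paper's contribution is to learn such transforms per decoupled representation rather than globally.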


