APRNet: Attention-based Pixel-wise Rendering Network for Photo-Realistic Text Image Generation

03/15/2022
by Yangming Shi, et al.

Style-guided text image generation aims to synthesize a text image that imitates a reference image's appearance while keeping the text content unaltered. Text image appearance covers many aspects; in this paper, we focus on transferring the style image's background and foreground color patterns to the content image to generate a photo-realistic text image. To achieve this goal, we propose 1) a content-style cross-attention-based pixel sampling approach to roughly mimic the style text image's background; 2) a pixel-wise style modulation technique to transfer the varying color patterns of the style image to the content image in a spatially adaptive manner; 3) a cross-attention-based multi-scale style fusion approach to resolve the text foreground misalignment between style and content images; and 4) an image patch shuffling strategy to create style, content, and ground-truth image tuples for training. Experimental results on Chinese handwriting text image synthesis with the SCUT-HCCDoc and CASIA-OLHWDB datasets demonstrate that the proposed method improves the quality of synthetic text images and makes them more photo-realistic.
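
The abstract does not give the exact formulations, but the first two ideas can be illustrated together: a cross attention in which every content-pixel query attends over style pixels to sample style features, followed by a pixel-wise (spatially adaptive) modulation of the content features with a scale and shift predicted from the attended style. The PyTorch sketch below is a minimal illustration under that assumption; the module, layer, and tensor names are ours, not the paper's.

import torch
import torch.nn as nn

class CrossAttentionStyleModulation(nn.Module):
    def __init__(self, channels: int, attn_dim: int = 64):
        super().__init__()
        self.to_q = nn.Conv2d(channels, attn_dim, kernel_size=1)   # queries from content features
        self.to_k = nn.Conv2d(channels, attn_dim, kernel_size=1)   # keys from style features
        self.to_v = nn.Conv2d(channels, channels, kernel_size=1)   # values from style features
        # per-pixel scale (gamma) and shift (beta) predicted from the attended style features
        self.to_gamma = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.to_beta = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.norm = nn.InstanceNorm2d(channels, affine=False)

    def forward(self, content_feat: torch.Tensor, style_feat: torch.Tensor) -> torch.Tensor:
        b, c, h, w = content_feat.shape
        q = self.to_q(content_feat).flatten(2).transpose(1, 2)     # (B, Hc*Wc, D)
        k = self.to_k(style_feat).flatten(2)                       # (B, D, Hs*Ws)
        v = self.to_v(style_feat).flatten(2).transpose(1, 2)       # (B, Hs*Ws, C)

        # each content pixel samples a soft combination of style pixels
        attn = torch.softmax(q @ k / q.shape[-1] ** 0.5, dim=-1)   # (B, Hc*Wc, Hs*Ws)
        sampled = (attn @ v).transpose(1, 2).reshape(b, c, h, w)

        # pixel-wise modulation: every spatial location gets its own scale and shift
        gamma = self.to_gamma(sampled)
        beta = self.to_beta(sampled)
        return self.norm(content_feat) * (1.0 + gamma) + beta

Given two aligned feature maps of shape (B, 256, H, W), CrossAttentionStyleModulation(channels=256)(content_feat, style_feat) returns content features recolored by the style. Because gamma and beta vary per pixel, a background whose color drifts across the page can be transferred locally rather than with a single global statistic, which matches the spatially adaptive color transfer described in the abstract.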


