Facial Attribute Transformers for Precise and Robust Makeup Transfer

by Zhaoyi Wan, et al.

In this paper, we address the problem of makeup transfer, which aims to transplant the makeup from a reference face to a source face while preserving the identity of the source. Existing makeup transfer methods have made notable progress in generating realistic made-up faces, but they fall short in color fidelity and spatial transformation. To tackle these issues, we propose a novel Facial Attribute Transformer (FAT) and its variant, Spatial FAT, for high-quality makeup transfer. Drawing inspiration from the Transformer in NLP, FAT models the semantic correspondences and interactions between the source face and the reference face, and then precisely estimates and transfers facial attributes. To further facilitate the deformation and transformation of facial parts, we also integrate thin-plate splines (TPS) into FAT, creating Spatial FAT, the first method that can transfer geometric attributes in addition to color and texture. Extensive qualitative and quantitative experiments demonstrate the effectiveness and superiority of our proposed FATs in the following aspects: (1) ensuring high-fidelity color transfer; (2) allowing geometric transformation of facial parts; (3) handling facial variations such as poses and shadows; and (4) supporting high-resolution face generation.
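The core idea of FAT, modeling correspondences between the source and reference faces and using them to pull attributes across, can be illustrated with a cross-attention sketch. The function below is a minimal, hypothetical illustration (not the authors' implementation): source features act as queries over reference features, and the resulting attention map transfers per-location makeup attributes from the reference into the source's layout.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention_transfer(src_feats, ref_feats, ref_attrs):
    """Hypothetical FAT-style transfer step (illustrative only).

    src_feats: (N_src, D) source-face feature vectors (queries)
    ref_feats: (N_ref, D) reference-face feature vectors (keys)
    ref_attrs: (N_ref, C) per-location makeup attributes (values)
    Returns (N_src, C): reference attributes re-arranged to the
    source layout via the learned semantic correspondence.
    """
    d_k = src_feats.shape[-1]
    scores = src_feats @ ref_feats.T / np.sqrt(d_k)  # (N_src, N_ref)
    attn = softmax(scores, axis=-1)                  # rows sum to 1
    return attn @ ref_attrs
```

In a full model the features would come from learned encoders and the output would feed a decoder; here random arrays suffice to show the data flow.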
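The thin-plate spline (TPS) component that Spatial FAT uses for geometric attribute transfer can likewise be sketched in isolation. Below is a minimal, self-contained TPS warp, a standard formulation assumed here, not the paper's exact module: given matched control points on two faces, it solves the interpolation system and maps arbitrary query points accordingly.

```python
import numpy as np

def tps_warp(src_pts, dst_pts, query_pts):
    """Fit a 2-D thin-plate spline mapping src control points to dst
    control points, then apply it to query points. Standard TPS
    formulation; assumed for illustration of Spatial FAT's warping.
    """
    def U(r2):
        # TPS radial basis U(r) = r^2 log(r^2); zero at r = 0.
        return np.where(r2 == 0, 0.0, r2 * np.log(np.maximum(r2, 1e-12)))

    n = src_pts.shape[0]
    d2 = ((src_pts[:, None, :] - src_pts[None, :, :]) ** 2).sum(-1)
    K = U(d2)
    P = np.hstack([np.ones((n, 1)), src_pts])
    # Assemble the standard TPS linear system [[K, P], [P^T, 0]].
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst_pts
    params = np.linalg.solve(A, b)
    w, a = params[:n], params[n:]          # nonlinear and affine parts
    q2 = ((query_pts[:, None, :] - src_pts[None, :, :]) ** 2).sum(-1)
    Pq = np.hstack([np.ones((len(query_pts), 1)), query_pts])
    return U(q2) @ w + Pq @ a
```

Because TPS interpolates exactly, warping the source control points themselves reproduces the destination points, while other points deform smoothly in between, which is what makes it suitable for reshaping facial parts such as eyebrows or lips.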


