Exemplar-Based Portrait Style Transfer


Bibliographic Details
Main Authors: Ming Lu, Feng Xu, Hao Zhao, Anbang Yao, Yurong Chen, Li Zhang
Format: Article
Language: English
Published: IEEE 2018-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8485287/
Description
Summary: Transferring the style of an example image to a content image opens the door to artistic creation for end users. However, this is especially challenging for portrait photos, since the human visual system is sensitive to even slight artifacts on portraits. Previous methods use facial landmarks to densely align the content face with the style face to reduce such artifacts. However, they can only handle the facial region; for the whole image, building a dense correspondence is difficult and easily introduces errors. In this paper, we propose a robust approach to portrait style transfer that dispenses with dense correspondence. Our approach is based on the guided image synthesis framework, for which we propose three novel guidance maps. Unlike former methods, these maps do not require a dense correspondence between the content image and the style image, which allows our method to handle the whole portrait photo rather than the facial region only. Compared with recent neural style transfer methods, our method achieves more pleasing results and preserves more texture details. Extensive experiments demonstrate our advantage over former methods on portrait style transfer.
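The abstract does not give implementation details of the guidance maps. As a rough illustration of what guided image synthesis can look like in general, the toy sketch below performs nearest-neighbour patch matching in which guidance maps are appended to the images as extra weighted channels, so the guidance influences which style patch each content patch is matched to. All function names, the `weight` parameter, and the use of brute-force matching are our own assumptions for illustration, not the authors' actual method.

```python
import numpy as np

def extract_patches(img, size=3):
    """Collect all size x size patches of a multi-channel image, flattened."""
    h, w, _ = img.shape
    patches = []
    for y in range(h - size + 1):
        for x in range(w - size + 1):
            patches.append(img[y:y + size, x:x + size].reshape(-1))
    return np.array(patches)

def guided_patch_match(content, style, content_guide, style_guide,
                       weight=2.0, size=3):
    """For each content patch, return the index of the nearest style patch,
    where the patch distance mixes colour channels with guidance channels
    (guidance scaled by `weight`, a hypothetical knob)."""
    # Stack the guidance maps as extra channels so that they take part
    # in the patch distance alongside the colour values.
    c_aug = np.concatenate([content, weight * content_guide], axis=-1)
    s_aug = np.concatenate([style, weight * style_guide], axis=-1)
    cp = extract_patches(c_aug, size)
    sp = extract_patches(s_aug, size)
    # Brute-force squared-distance nearest neighbour (fine for a toy example).
    d = ((cp[:, None, :] - sp[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

# Tiny synthetic demo: 8x8 RGB images with one-channel guidance maps.
rng = np.random.default_rng(0)
content = rng.random((8, 8, 3))
style = rng.random((8, 8, 3))
content_guide = rng.random((8, 8, 1))
style_guide = rng.random((8, 8, 1))
matches = guided_patch_match(content, style, content_guide, style_guide)
print(matches.shape)  # one match per 3x3 content patch: (36,)
```

Raising `weight` makes the match follow the guidance maps more closely at the expense of colour fidelity; a real system would also blend the matched patches back into an output image rather than just returning indices.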
ISSN: 2169-3536