Image Neural Style Transfer With Global and Local Optimization Fusion


Bibliographic Details
Main Authors: Hui-Huang Zhao, Paul L. Rosin, Yu-Kun Lai, Mu-Gang Lin, Qin-Yun Liu
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8736245/
Description
Summary: This paper presents a new image synthesis method for image style transfer. In many existing methods, textures and colors from the style image are sometimes applied to inappropriate regions of the content image, which produces artifacts. To improve the results, we propose a novel method based on a new strategy that combines both local and global style losses. On the one hand, a style loss function based on a local approach is used to keep the style details. On the other hand, another style loss function based on global measures is used to capture more global structural information. Results on various images show that the proposed method reduces artifacts while faithfully transferring the style image's characteristics and preserving the structure and color of the content image.
ISSN:2169-3536
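
The summary above describes combining a global style loss with a local one. As a rough illustration of that idea (not the authors' implementation), the sketch below pairs a global Gram-matrix style loss with a local patch-matching (MRF-style) loss on a single feature map. The feature extractor, layer choices, content loss, and the 0.5 weight are all assumptions for demonstration purposes.

```python
# Illustrative sketch only: combines a global (Gram-matrix) style loss with a
# local (patch-matching, MRF-style) style loss on one pair of feature maps.
# Real feature maps would come from a pretrained CNN (e.g. VGG); here they are
# random placeholders, and the 0.5 weight is arbitrary.
import torch
import torch.nn.functional as F


def gram_matrix(feat):
    # (B, C, H, W) -> (B, C, C) normalized Gram matrix of channel correlations
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)


def global_style_loss(gen_feat, style_feat):
    # Global statistics: match Gram matrices of generated and style features
    return F.mse_loss(gram_matrix(gen_feat), gram_matrix(style_feat))


def local_style_loss(gen_feat, style_feat, patch=3):
    # Local detail: match each generated patch to its most similar style patch
    gp = F.unfold(gen_feat, patch)            # (B, C*patch*patch, N_gen)
    sp = F.unfold(style_feat, patch)          # (B, C*patch*patch, N_style)
    sim = F.normalize(gp, dim=1).transpose(1, 2) @ F.normalize(sp, dim=1)
    idx = sim.argmax(dim=2)                   # nearest style patch per location
    matched = torch.gather(sp, 2, idx.unsqueeze(1).expand(-1, sp.size(1), -1))
    return F.mse_loss(gp, matched)


# Placeholder features; in practice these come from fixed CNN activations.
gen = torch.rand(1, 64, 32, 32, requires_grad=True)
style = torch.rand(1, 64, 32, 32)

total = global_style_loss(gen, style) + 0.5 * local_style_loss(gen, style)
total.backward()   # gradients flow back toward the generated features / image
print(float(total))
```

The intuition behind the split is that the Gram-matrix term matches overall feature statistics (global structure), while the patch term penalizes mismatches against the nearest style patch, preserving fine texture detail locally.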