Improved Texture Networks: Maximizing Quality and Diversity in Feed-Forward Stylization and Texture Synthesis

Dmitry Ulyanov, Andrea Vedaldi, Victor Lempitsky; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017, pp. 6924-6932

Abstract

The recent work of Gatys et al., who characterized the style of an image by the statistics of convolutional neural network filters, ignited a renewed interest in the texture generation and image stylization problems. While their image generation technique uses a slow optimization process, several authors have recently proposed to learn generator neural networks that can produce similar outputs in one quick forward pass. While generator networks are promising, they are still inferior in visual quality and diversity compared to generation-by-optimization. In this work, we advance them in two significant ways. First, we introduce an instance normalization module to replace batch normalization, with significant improvements to the quality of image stylization. Second, we improve diversity by introducing a new learning formulation that encourages generators to sample unbiasedly from the Julesz texture ensemble, which is the equivalence class of all images characterized by certain filter responses. Together, these two improvements take feed-forward texture synthesis and image stylization much closer to the quality of generation-by-optimization, while retaining the speed advantage.
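
As a concrete illustration of the second contribution, the Julesz texture ensemble can be written down informally. The notation below (J, phi_l, x_0) is our own shorthand, not taken from this page: for a reference texture x_0 and filter statistics phi_l (e.g., Gram matrices of CNN feature maps, following Gatys et al.), the ensemble is

    \mathcal{J}(x_0) = \{ x : \phi_l(x) = \phi_l(x_0) \ \text{for all layers } l \},

and the proposed learning formulation encourages the generator's samples to spread over this set rather than collapse onto a few of its members.

The first contribution, instance normalization, admits an equally short sketch. The following is a minimal NumPy illustration of the computation the abstract describes, assuming activations laid out as (N, C, H, W); the function name and the eps constant are illustrative, and the authors' released implementation may differ:

import numpy as np

def instance_norm(x, eps=1e-5):
    # x: activations of shape (N, C, H, W).
    # Batch normalization averages over axes (0, 2, 3), so one image's
    # statistics depend on the other images in the batch; instance
    # normalization averages over the spatial axes (2, 3) only, making
    # each (image, channel) pair self-normalizing.
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

In a stylization generator this module would stand in for each batch-normalization layer; as with batch normalization, a learnable per-channel scale and shift can be applied to the normalized output.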

Related Material

[pdf] [supp] [arXiv]
[bibtex]
@InProceedings{Ulyanov_2017_CVPR,
author = {Ulyanov, Dmitry and Vedaldi, Andrea and Lempitsky, Victor},
title = {Improved Texture Networks: Maximizing Quality and Diversity in Feed-Forward Stylization and Texture Synthesis},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {July},
year = {2017},
pages = {6924-6932}
}