Learning To Warp for Style Transfer
Abstract
Since its inception in 2015, Style Transfer has focused on texturing a content image using an art exemplar. Recently, the geometric changes that artists make have been acknowledged as an important component of style. Our contribution is a neural network that, uniquely, learns a mapping from a 4D array of inter-feature distances to a non-parametric 2D warp field. The system is generic in that it is not limited by semantic class: a single learned model suffices, and all examples in this paper are output from one model. Our approach combines the high speed of Liu et al. with the non-parametric warping of Kim et al. Furthermore, our system extends the usual NST paradigm: although it can be used with a single exemplar, it also accepts two style exemplars, one for texture and another for geometry. This supports far greater flexibility in use cases than a single exemplar can provide.
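The abstract's core mechanism, mapping a 4D volume of inter-feature distances to a dense 2D warp field that is applied non-parametrically, can be illustrated with a short sketch. The following is not the authors' code: it is a minimal PyTorch illustration under stated assumptions, and every module, parameter, and tensor-shape choice here (the encoder features, the small CNN, the offset parameterization) is hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class WarpFromDistances(nn.Module):
    # Hypothetical module: regresses a dense (dx, dy) offset field from a
    # 4D volume of pairwise distances between content and style features.
    def __init__(self, h=16, w=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(h * w, 128, 3, padding=1), nn.ReLU(),
            nn.Conv2d(128, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 2, 3, padding=1),  # per-pixel offsets in [-1, 1] coords
        )

    def forward(self, feat_c, feat_s):
        # feat_c, feat_s: (B, C, h, w) feature maps from some fixed encoder.
        b, c, h, w = feat_c.shape
        fc = feat_c.flatten(2).transpose(1, 2)   # (B, h*w, C) content features
        fs = feat_s.flatten(2).transpose(1, 2)   # (B, h*w, C) style features
        dist = torch.cdist(fc, fs)               # (B, h*w, h*w): the 4D volume
        # At each content location, the distances to all style locations
        # become the channel dimension.
        dist = dist.transpose(1, 2).reshape(b, h * w, h, w)
        return self.net(dist)                    # (B, 2, h, w) warp field

def apply_warp(image, offsets):
    # Non-parametric warp: upsample the coarse offset field to image
    # resolution, add it to an identity sampling grid, and resample.
    b, _, hi, wi = image.shape
    flow = F.interpolate(offsets, size=(hi, wi), mode='bilinear', align_corners=False)
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, hi),
                            torch.linspace(-1, 1, wi), indexing='ij')
    grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(b, -1, -1, -1)
    return F.grid_sample(image, grid + flow.permute(0, 2, 3, 1), align_corners=False)

# Usage: warp a content image toward the geometry of a style exemplar.
warper = WarpFromDistances()
feat_c, feat_s = torch.randn(1, 64, 16, 16), torch.randn(1, 64, 16, 16)
content = torch.randn(1, 3, 256, 256)
warped = apply_warp(content, warper(feat_c, feat_s))  # (1, 3, 256, 256)

This sketch only captures the inference-time shape of the idea; the paper's actual architecture, losses, and training procedure are described in the full text.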
Related Material

[pdf] [supp]

BibTeX:
@InProceedings{Liu_2021_CVPR,
    author    = {Liu, Xiao-Chang and Yang, Yong-Liang and Hall, Peter},
    title     = {Learning To Warp for Style Transfer},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2021},
    pages     = {3702-3711}
}