How Much Deep Learning does Neural Style Transfer Really Need? An Ablation Study

Len Du; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2020, pp. 3150-3159

Abstract


Neural style transfer has been a "killer app" for deep learning, drawing attention to deep learning and advertising its effectiveness to both the academic community and the general public. However, we have found through ablative experiments that optimizing an image the way neural style transfer does, while constructing the objective functions (or more precisely, the functions that transform raw images into the feature maps being compared) without pretrained weights or biases, works almost as well. We can even factor out the deepness (multiple layers of alternating linear and nonlinear transformations) altogether and have neural style transfer work to a certain extent. This raises the question of how much of the current success of deep learning in computer vision should be attributed to training, to structure, or simply to spatially aggregating the image.
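
To make the ablated setup concrete, below is a minimal sketch (in PyTorch; not the authors' code) of Gatys-style image optimization in which the feature extractor is a small, randomly initialized, frozen CNN rather than a pretrained one. The architecture, layer widths, and hyperparameters here are illustrative assumptions, not the paper's exact configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

def random_extractor():
    # Untrained conv stack standing in for pretrained VGG features;
    # the (64, 128, 256) widths are assumed, not taken from the paper.
    layers, in_ch = [], 3
    for out_ch in (64, 128, 256):
        layers += [nn.Conv2d(in_ch, out_ch, 3, padding=1),
                   nn.ReLU(),
                   nn.AvgPool2d(2)]
        in_ch = out_ch
    net = nn.Sequential(*layers)
    for p in net.parameters():
        p.requires_grad_(False)  # frozen random weights: no training at all
    return net

def features(net, x):
    # Collect the activation after every ReLU.
    feats = []
    for layer in net:
        x = layer(x)
        if isinstance(layer, nn.ReLU):
            feats.append(x)
    return feats

def gram(f):
    # Spatially aggregated second-order feature statistics (Gram matrix).
    b, c, h, w = f.shape
    f = f.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_transfer(content, style, steps=300, style_weight=1e4):
    net = random_extractor()
    with torch.no_grad():
        content_target = features(net, content)[-1]
        style_targets = [gram(f) for f in features(net, style)]
    img = content.detach().clone().requires_grad_(True)
    opt = torch.optim.Adam([img], lr=0.02)
    for _ in range(steps):
        opt.zero_grad()
        feats = features(net, img)
        c_loss = F.mse_loss(feats[-1], content_target)
        s_loss = sum(F.mse_loss(gram(f), g)
                     for f, g in zip(feats, style_targets))
        (c_loss + style_weight * s_loss).backward()
        opt.step()
    return img.detach()

# Example: content and style as (1, 3, H, W) tensors with values in [0, 1].
# result = style_transfer(content_img, style_img)

The paper's further ablation, removing deepness altogether, would correspond to replacing the stacked extractor with a single linear-plus-nonlinear transform; the sketch above only illustrates the no-pretraining case.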

Related Material


BibTeX
@InProceedings{Du_2020_WACV,
    author    = {Du, Len},
    title     = {How Much Deep Learning does Neural Style Transfer Really Need? An Ablation Study},
    booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
    month     = {March},
    year      = {2020},
    pages     = {3150-3159}
}