Tensor Train Decomposition for Efficient Memory Saving in Perceptual Feature-Maps

Taehyeon Kim, Jieun Lee, Yoonsik Choe; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2019

Abstract


Perceptual loss functions have been used successfully in image transformation tasks to capture high-level features from images with pre-trained convolutional neural networks (CNNs). Standard perceptual losses require numerous parameters to compare differences between the feature-maps of an input image and a target image; storing and processing these feature-maps is therefore unaffordable for resource-constrained devices. Hence, we propose a compressed perceptual loss based on Tensor Train (TT) decomposition of the feature-maps. Additionally, to decide the optimal TT-ranks, the proposed algorithm uses the global analytic solution of Variational Bayesian Matrix Factorization (VBMF). In the proposed method, the low-rank approximated feature-maps thus retain only the salient features by virtue of these two techniques. To the best of our knowledge, we are the first to consider curtailing redundancies in feature-maps via low-rank TT-decomposition. Experimental results on style transfer tasks demonstrate that our method not only yields qualitative and quantitative results similar to those of the original version, but also reduces the memory requirement by approximately 77%.
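To make the mechanics concrete, the sketch below shows a generic TT-SVD decomposition and reconstruction of a small 3-way "feature-map" tensor (channels x height x width) in NumPy. This is only an illustration of the underlying technique, not the authors' implementation: the toy tensor, fixed rank list, and function names are assumptions, and in the paper the TT-ranks would instead come from the global analytic VBMF solution rather than being supplied by hand.

```python
import numpy as np

def tt_svd(tensor, ranks):
    """TT-SVD sketch: sequentially SVD each unfolding, keeping at most
    ranks[k] singular values per step (the paper chooses ranks via VBMF).
    Returns 3-way cores G_k of shape (r_{k-1}, n_k, r_k)."""
    shape = tensor.shape
    cores = []
    r_prev = 1
    C = np.asarray(tensor, dtype=float)
    for k in range(len(shape) - 1):
        # Unfold: rows combine the previous rank with the current mode.
        C = C.reshape(r_prev * shape[k], -1)
        U, S, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(ranks[k], S.size)           # truncation point (hypothetical fixed rank here)
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        C = S[:r, None] * Vt[:r]            # carry the remainder to the next unfolding
        r_prev = r
    cores.append(C.reshape(r_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    out = cores[0]
    for G in cores[1:]:
        out = np.tensordot(out, G, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

# Toy stand-in for a CNN feature-map: channels x height x width.
fmap = np.random.rand(8, 6, 6)
cores = tt_svd(fmap, ranks=[8, 6])          # full ranks: reconstruction is exact
full_params = fmap.size
tt_params = sum(G.size for G in tt_svd(fmap, ranks=[4, 4]))  # truncated ranks shrink storage
```

With truncated ranks the total number of core entries is smaller than the original tensor, which is the source of the memory saving the paper reports; the trade-off is that aggressive truncation discards non-salient components of the feature-map.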

Related Material


[pdf]
[bibtex]
@InProceedings{Kim_2019_ICCV,
author = {Kim, Taehyeon and Lee, Jieun and Choe, Yoonsik},
title = {Tensor Train Decomposition for Efficient Memory Saving in Perceptual Feature-Maps},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
month = {Oct},
year = {2019}
}