Learning Deep Representations with Probabilistic Knowledge Transfer

Nikolaos Passalis, Anastasios Tefas; Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 268-284

Abstract


Knowledge Transfer (KT) techniques tackle the problem of transferring the knowledge from a large and complex neural network into a smaller and faster one. However, existing KT methods are tailored to classification tasks and cannot be efficiently applied to other representation learning tasks. In this paper we propose a novel probabilistic knowledge transfer method that works by matching the probability distribution of the data in the feature space instead of their actual representations. Apart from outperforming existing KT techniques, the proposed method overcomes several of their limitations, providing new insight into KT as well as enabling novel KT applications, ranging from KT from handcrafted feature extractors to cross-modal KT from the textual modality into the representation extracted from the visual modality of the data.
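The distribution-matching idea described above can be sketched in a few lines. The snippet below is an illustrative NumPy sketch, not the paper's implementation: it assumes a cosine-similarity kernel over each batch to form conditional probabilities of pairwise sample affinities, and then measures the KL divergence between the teacher's and the student's distributions (the paper's exact kernel and loss details may differ). Note that the teacher and student features can have different dimensionalities, since only their pairwise similarity distributions are compared.

```python
import numpy as np

def similarity_probs(feats, eps=1e-8):
    # Normalize features and compute pairwise cosine similarities.
    feats = feats / (np.linalg.norm(feats, axis=1, keepdims=True) + eps)
    sim = feats @ feats.T
    sim = (sim + 1.0) / 2.0           # map cosine similarity from [-1, 1] to [0, 1]
    np.fill_diagonal(sim, 0.0)        # ignore self-similarity
    # Row-normalize into conditional probabilities p_{j|i}.
    return sim / (sim.sum(axis=1, keepdims=True) + eps)

def pkt_loss(teacher_feats, student_feats, eps=1e-8):
    # KL divergence between the teacher's and student's affinity distributions;
    # it is zero when the two feature spaces induce identical distributions.
    p = similarity_probs(teacher_feats, eps)
    q = similarity_probs(student_feats, eps)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(0)
t = rng.normal(size=(8, 64))   # teacher representations for a batch of 8 samples
s = rng.normal(size=(8, 16))   # student representations, lower-dimensional
print(pkt_loss(t, s))          # positive divergence to be minimized during training
```

In an actual training loop this loss would be computed with an autodiff framework so its gradient flows into the student network only, while the teacher's features stay fixed.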

Related Material


[pdf]
[bibtex]
@InProceedings{Passalis_2018_ECCV,
author = {Passalis, Nikolaos and Tefas, Anastasios},
title = {Learning Deep Representations with Probabilistic Knowledge Transfer},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
month = {September},
year = {2018},
pages = {268-284}
}