Siamese Networks: The Tale of Two Manifolds

Soumava Kumar Roy, Mehrtash Harandi, Richard Nock, Richard Hartley; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2019, pp. 3046-3055

Abstract


Siamese networks are non-linear deep models that have found their way into a broad set of problems in learning theory, thanks to their embedding capabilities. In this paper, we study Siamese networks from a new perspective and question the validity of their training procedure. We show that in the majority of cases, the objective of a Siamese network is endowed with an invariance property. Neglecting this invariance property hinders the training of Siamese networks. To alleviate this issue, we propose two Riemannian structures and generalize a well-established accelerated stochastic gradient descent method to take the proposed Riemannian structures into account. Our empirical evaluations suggest that by making use of Riemannian geometry, we achieve state-of-the-art results against several algorithms on the challenging problem of fine-grained image classification.
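The invariance the abstract refers to can be illustrated numerically: a distance-based Siamese objective depends on the embedding layer only up to an orthogonal transformation, since rotating all embeddings leaves every pairwise distance unchanged. A minimal sketch (the linear embedding layer `W` and the data here are hypothetical placeholders, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 8, 4, 10
X = rng.normal(size=(n, d))          # a batch of hypothetical input features
W = rng.normal(size=(d, k))          # stand-in for the final embedding layer
Q, _ = np.linalg.qr(rng.normal(size=(k, k)))  # random orthogonal matrix (Q.T @ Q = I)

def pairwise_dists(W):
    """Pairwise Euclidean distances between embedded points X @ W."""
    Z = X @ W
    diff = Z[:, None, :] - Z[None, :, :]
    return np.linalg.norm(diff, axis=-1)

D_orig = pairwise_dists(W)
D_rot = pairwise_dists(W @ Q)        # rotate the embedding layer

# Distances (and hence any distance-based loss) are unchanged:
print(np.allclose(D_orig, D_rot))    # True
```

Because infinitely many weight matrices `W @ Q` realize the same objective value, plain Euclidean gradient descent wanders along these equivalent solutions; this is the motivation for optimizing over a Riemannian (quotient) structure instead.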

Related Material


@InProceedings{Roy_2019_ICCV,
author = {Roy, Soumava Kumar and Harandi, Mehrtash and Nock, Richard and Hartley, Richard},
title = {Siamese Networks: The Tale of Two Manifolds},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
month = {October},
year = {2019}
}