Hilbert Sinkhorn Divergence for Optimal Transport

Qian Li, Zhichao Wang, Gang Li, Jun Pang, Guandong Xu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2021, pp. 3835-3844

Abstract


Sinkhorn divergence has become a popular metric for comparing probability distributions in optimal transport. However, most works apply Sinkhorn divergence in Euclidean space, which greatly limits its application to complex data with nonlinear structure. There is therefore a theoretical need to equip Sinkhorn divergence with the ability to capture nonlinear structure. We propose a theoretical and computational framework to bridge this gap. In this paper, we extend Sinkhorn divergence from Euclidean space to a reproducing kernel Hilbert space, yielding what we term the "Hilbert Sinkhorn divergence" (HSD). In particular, we use kernel matrices to derive a closed-form expression of the HSD, which we prove to be a tractable convex optimization problem. We also prove several attractive statistical properties of the proposed HSD, namely strong consistency, asymptotic behavior, and sample complexity. Empirically, our method yields state-of-the-art performance on image classification and topological data analysis.
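For orientation, the following is a minimal illustrative sketch, not the authors' method or released code. It combines two standard ingredients that the abstract refers to: a ground cost given by the squared RKHS distance, expanded with the kernel trick as ||phi(x) - phi(y)||^2 = k(x,x) - 2k(x,y) + k(y,y), and the common debiased Sinkhorn divergence S_eps(mu, nu) = OT_eps(mu, nu) - (1/2)OT_eps(mu, mu) - (1/2)OT_eps(nu, nu), where OT_eps is entropic-regularized optimal transport solved by Sinkhorn iterations. All function names (rbf_kernel, rkhs_cost, sinkhorn, hilbert_sinkhorn_divergence), the Gaussian kernel choice, and the parameter values (gamma, eps, n_iter) are our assumptions; the paper's closed-form derivation may differ in detail.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix k(a_i, b_j) = exp(-gamma * ||a_i - b_j||^2)."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def rkhs_cost(A, B, gamma=1.0):
    """Squared RKHS distance via the kernel trick.
    For the Gaussian kernel k(x, x) = 1, so
    ||phi(x) - phi(y)||^2 = k(x,x) - 2 k(x,y) + k(y,y) = 2 - 2 k(x,y)."""
    return 2.0 - 2.0 * rbf_kernel(A, B, gamma)

def sinkhorn(C, a, b, eps=0.1, n_iter=200):
    """Entropic-regularized OT cost <P, C> computed by Sinkhorn iterations."""
    K = np.exp(-C / eps)            # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):         # alternating marginal projections
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :] # transport plan
    return np.sum(P * C)

def hilbert_sinkhorn_divergence(X, Y, gamma=1.0, eps=0.1):
    """Debiased Sinkhorn divergence with an RKHS (kernel) ground cost,
    assuming uniform weights on the two samples."""
    a = np.full(len(X), 1.0 / len(X))
    b = np.full(len(Y), 1.0 / len(Y))
    return (sinkhorn(rkhs_cost(X, Y, gamma), a, b, eps)
            - 0.5 * sinkhorn(rkhs_cost(X, X, gamma), a, a, eps)
            - 0.5 * sinkhorn(rkhs_cost(Y, Y, gamma), b, b, eps))

# Usage on synthetic data: two Gaussian samples with shifted means.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(50, 3))
Y = rng.normal(1.0, 1.0, size=(60, 3))
print(hilbert_sinkhorn_divergence(X, Y))

The debiased form above is used because plain entropic OT does not vanish when the two distributions coincide; the kernel-trick cost is what lets the computation run entirely on kernel matrices, as the abstract's closed-form claim suggests.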

Related Material


BibTeX:
@InProceedings{Li_2021_CVPR,
    author    = {Li, Qian and Wang, Zhichao and Li, Gang and Pang, Jun and Xu, Guandong},
    title     = {Hilbert Sinkhorn Divergence for Optimal Transport},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2021},
    pages     = {3835-3844}
}