Secrets of Matrix Factorization: Approximations, Numerics, Manifold Optimization and Random Restarts

Je Hyeong Hong, Andrew Fitzgibbon; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2015, pp. 4130-4138

Abstract


Matrix factorization (or low-rank matrix completion) with missing data is a key computation in many computer vision and machine learning tasks, and is also related to a broader class of nonlinear optimization problems such as bundle adjustment. The problem has received much attention recently, with renewed interest in variable-projection approaches, yielding dramatic improvements in reliability and speed. However, on a wide class of problems, no one approach dominates, and because the various approaches have been derived in a multitude of different ways, it has been difficult to unify them. This paper provides a unified derivation of a number of recent approaches, so that similarities and differences are easily observed. We also present a simple meta-algorithm which wraps any existing algorithm, yielding a 100% success rate on many standard datasets. Given 100% success, the focus of evaluation must turn to speed, as 100% success is trivially achieved if we do not care about speed. Again, our unification allows a number of generic improvements applicable to all members of the family to be isolated, yielding a unified algorithm that outperforms our re-implementations of existing algorithms, which in some cases already outperform the original authors' publicly available codes.
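
To make the problem setting and the meta-algorithm concrete, the following is a minimal NumPy sketch, not the authors' method: it evaluates the missing-data objective ||W .* (M - U V^T)||_F, uses a simple alternating least-squares inner solver as a stand-in for the variable-projection and manifold-based solvers studied in the paper, and wraps that solver in random restarts, keeping the best local minimum found. All function names and parameters here (factorize_once, random_restart_wrapper, the restart count, the iteration budget) are illustrative assumptions rather than the paper's implementation.

import numpy as np

def masked_residual(M, W, U, V):
    # Cost on observed entries only: || W .* (M - U V^T) ||_F,
    # where W is a 0/1 mask of the same shape as M.
    return np.linalg.norm(W * (M - U @ V.T))

def factorize_once(M, W, r, iters=200, seed=0):
    # Illustrative inner solver: alternating least squares over the
    # observed entries (a stand-in for the variable-projection and
    # manifold-based solvers analysed in the paper).
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    for _ in range(iters):
        # Update each row of U from the entries it observes, then V likewise.
        for i in range(m):
            idx = W[i] > 0
            if idx.any():
                U[i] = np.linalg.lstsq(V[idx], M[i, idx], rcond=None)[0]
        for j in range(n):
            idx = W[:, j] > 0
            if idx.any():
                V[j] = np.linalg.lstsq(U[idx], M[idx, j], rcond=None)[0]
    return U, V, masked_residual(M, W, U, V)

def random_restart_wrapper(M, W, r, restarts=20):
    # Meta-algorithm in the spirit of the paper: wrap any existing solver
    # in random restarts and return the best local minimum encountered.
    best = None
    for s in range(restarts):
        U, V, cost = factorize_once(M, W, r, seed=s)
        if best is None or cost < best[2]:
            best = (U, V, cost)
    return best

The wrapper treats the inner solver as a black box, which mirrors the abstract's claim that the meta-algorithm can wrap any existing factorization algorithm; only the restart budget and the choice of inner solver change.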

Related Material


[pdf]
[bibtex]
@InProceedings{Hong_2015_ICCV,
author = {Hong, Je Hyeong and Fitzgibbon, Andrew},
title = {Secrets of Matrix Factorization: Approximations, Numerics, Manifold Optimization and Random Restarts},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
month = {December},
year = {2015}
}