GILDA++: Grassmann Incremental Linear Discriminant Analysis

Navya Nagananda, Andreas Savakis; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2021, pp. 4453-4461


Linear Discriminant Analysis (LDA) is an important supervised dimensionality reduction method. Traditional LDA relies on the eigenvalue decomposition of scatter matrices computed over the entire dataset. However, in some settings, the whole dataset is not available at once. Our approach considers an incremental LDA framework in which the model receives training data in chunks for subsequent analysis. We propose Grassmann Incremental Linear Discriminant Analysis (GILDA++) based on the proxy matrix optimization (PMO) method. Rather than optimizing a matrix directly on the manifold, PMO operates on an auxiliary, or proxy, matrix in the ambient space, which is retracted to the closest point on the manifold along the loss-minimizing geodesic. PMO optimizes an LDA objective whose scatter matrices are updated incrementally as each chunk of data arrives, and it uses automatic differentiation with stochastic gradient descent to find the lower-dimensional LDA projection matrix. GILDA++ handles chunked data in which each chunk may contain new samples from existing classes or samples from novel classes. Our experiments demonstrate that GILDA++ outperforms prevailing incremental LDA methods on various datasets.
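The chunk-wise scatter-matrix updates described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the `IncrementalScatter` class and the QR-based retraction are our own assumptions about one plausible way to realize the incremental bookkeeping and the manifold projection step (the paper's PMO retracts along a loss-minimizing geodesic via automatic differentiation, which this sketch does not reproduce).

```python
import numpy as np


class IncrementalScatter:
    """Hypothetical sketch: maintain per-class sufficient statistics so the
    LDA scatter matrices can be recomputed after each chunk, including
    chunks that introduce novel classes."""

    def __init__(self, dim):
        self.dim = dim
        self.counts = {}                 # class label -> sample count
        self.sums = {}                   # class label -> running feature sum
        self.sq = np.zeros((dim, dim))   # running sum of outer products x x^T
        self.total = np.zeros(dim)       # running sum of all samples
        self.n = 0                       # total sample count

    def update(self, X, y):
        # X: (n_samples, dim) chunk; y: labels, possibly with novel classes.
        for label in np.unique(y):
            Xc = X[y == label]
            self.counts[label] = self.counts.get(label, 0) + len(Xc)
            self.sums[label] = (
                self.sums.get(label, np.zeros(self.dim)) + Xc.sum(axis=0)
            )
        self.sq += X.T @ X
        self.total += X.sum(axis=0)
        self.n += len(X)

    def scatter(self):
        # Within-class scatter:  Sw = sum_x x x^T - sum_c n_c m_c m_c^T
        # Between-class scatter: Sb = sum_c n_c (m_c - mu)(m_c - mu)^T
        mu = self.total / self.n
        Sw = self.sq.copy()
        Sb = np.zeros((self.dim, self.dim))
        for label, nc in self.counts.items():
            mc = self.sums[label] / nc
            Sw -= nc * np.outer(mc, mc)
            d = mc - mu
            Sb += nc * np.outer(d, d)
        return Sw, Sb


def retract(P):
    # Project a free proxy matrix back onto the set of orthonormal-column
    # matrices via QR decomposition, a common retraction choice; the sign
    # fix makes the factorization unique.
    Q, R = np.linalg.qr(P)
    return Q * np.sign(np.diag(R))
```

After each chunk, one would recompute `Sw`/`Sb` from the running statistics, take a gradient step on the proxy matrix against the LDA objective, and retract it back to the manifold; the statistics themselves never require revisiting earlier chunks.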

Related Material

@InProceedings{Nagananda_2021_CVPR,
  author    = {Nagananda, Navya and Savakis, Andreas},
  title     = {GILDA++: Grassmann Incremental Linear Discriminant Analysis},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2021},
  pages     = {4453-4461}
}