Low-Rank-Sparse Subspace Representation for Robust Regression

Yongqiang Zhang, Daming Shi, Junbin Gao, Dansong Cheng; Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017, pp. 7445-7454

Abstract


Learning a robust regression model from high-dimensional corrupted data is an essential and difficult problem in many practical applications. State-of-the-art methods have studied low-rank regression models that are robust against typical noise (such as Gaussian noise and out-sample sparse noise) or outliers, so that a regression model can be learned from clean data lying on underlying subspaces. However, few of the existing low-rank regression methods can handle outliers/noise lying on sparsely corrupted disjoint subspaces. To address this issue, we propose a low-rank-sparse subspace representation for robust regression, hereafter referred to as LRS-RR. The main contributions include the following: (1) unlike most existing regression methods, we propose an approach in which the two phases of low-rank-sparse subspace recovery and regression optimization are carried out simultaneously; (2) we apply the linearized alternating direction method with adaptive penalty to solve the formulated LRS-RR problem, prove the convergence of the algorithm, and analyze its complexity; (3) we demonstrate the efficiency of our method on high-dimensional corrupted data, using both synthetic data and two benchmark datasets, against several state-of-the-art robust methods.
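The abstract names the linearized alternating direction method with adaptive penalty as the solver. As a rough illustration only, and not the authors' implementation, solvers of this family are typically built around two standard proximal operators: soft thresholding for the l1 (sparse) term and singular value thresholding for the nuclear-norm (low-rank) term. The NumPy sketch below shows these two building blocks with hypothetical variable names; the exact LRS-RR update rules, adaptive penalty schedule, and convergence proof are given in the paper itself.

import numpy as np

def soft_threshold(M, tau):
    # Proximal operator of the l1 norm: elementwise shrinkage toward zero.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svd_threshold(M, tau):
    # Proximal operator of the nuclear norm: shrink the singular values.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# Toy check on a synthetic low-rank matrix with sparse corruptions.
rng = np.random.default_rng(0)
L = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))          # low-rank part
S = (rng.random((50, 40)) < 0.05) * 5.0 * rng.standard_normal((50, 40))  # sparse part
X = L + S
L_hat = svd_threshold(X, tau=5.0)   # shrinking small singular values recovers a low-rank estimate

In an alternating direction scheme these operators are applied in turn to the low-rank and sparse variables while a Lagrange multiplier and penalty parameter are updated between iterations.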

Related Material


BibTeX:
@InProceedings{Zhang_2017_CVPR,
author = {Zhang, Yongqiang and Shi, Daming and Gao, Junbin and Cheng, Dansong},
title = {Low-Rank-Sparse Subspace Representation for Robust Regression},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {July},
year = {2017}
}