Robust regression using sparse learning for high dimensional parameter estimation problems

Title: Robust regression using sparse learning for high dimensional parameter estimation problems
Publication Type: Conference Papers
Year of Publication: 2010
Authors: Mitra K, Veeraraghavan A, Chellappa R
Conference Name: Acoustics Speech and Signal Processing (ICASSP), 2010 IEEE International Conference on
Date Published: 2010/03
Keywords: combinatorial problem; computational complexity; cubic complexity; Estimation; least median of squares; parameter estimation problem; polynomial time algorithm; polynomials; random sample consensus; regression analysis; robust regression; sparse learning; sparse representation

Algorithms such as Least Median of Squares (LMedS) and Random Sample Consensus (RANSAC) have been very successful for low-dimensional robust regression problems. However, the combinatorial nature of these algorithms makes them practically unusable for high-dimensional applications. In this paper, we introduce algorithms that have cubic time complexity in the dimension of the problem, making them computationally efficient for high-dimensional problems. We formulate the robust regression problem by projecting the dependent variable onto the null space of the independent variables, which receives significant contributions only from the outliers. We then identify the outliers using sparse representation/learning based algorithms. Under certain conditions that follow from the theory of sparse representation, these polynomial-time algorithms can accurately solve the robust regression problem, which is, in general, a combinatorial problem. We present experimental results that demonstrate the efficacy of the proposed algorithms. We also analyze the intrinsic parameter space of robust regression and identify an efficient and accurate class of algorithms for different operating conditions. An application to facial age estimation is presented.
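The idea described in the abstract can be sketched in code. The following is a minimal illustration (not the authors' implementation) of the null-space formulation: for a linear model y = Xb + e with sparse gross outliers e, projecting y onto the orthogonal complement of the column space of X eliminates the contribution of b, leaving a sparse-recovery problem in e alone. Here the sparse step is solved with plain ISTA (iterative soft-thresholding) for an L1-penalized least squares objective; the regularization weight `lam`, iteration count, and the inlier threshold are illustrative choices, not values from the paper.

```python
import numpy as np

def robust_regress(X, y, lam=0.1, iters=500, inlier_thresh=1.0):
    """Robust linear regression via null-space projection + sparse recovery.

    Model: y = X b + e, where e is a sparse vector of gross outliers.
    """
    n = len(y)
    # Projector onto the orthogonal complement of col(X); since P X = 0,
    # P y = P e depends only on the outlier vector e.
    P = np.eye(n) - X @ np.linalg.pinv(X)
    z = P @ y

    # ISTA for min_e 0.5*||P e - z||^2 + lam*||e||_1.
    # P is an orthogonal projector, so its spectral norm is 1 and a unit
    # gradient step is valid.
    e = np.zeros(n)
    for _ in range(iters):
        u = e - (P @ e - z)                      # gradient step (uses P^2 = P)
        e = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)  # soft threshold

    # Points with a large estimated outlier component are discarded,
    # then the coefficients are refit by ordinary least squares.
    inliers = np.abs(e) < inlier_thresh
    b, *_ = np.linalg.lstsq(X[inliers], y[inliers], rcond=None)
    return b, e
```

A small usage example: with 10 gross outliers among 100 points, the projected sparse-recovery step flags the corrupted rows and the refit recovers the coefficients, whereas ordinary least squares on all rows would be badly biased.

```python
rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.standard_normal((n, d))
beta = np.array([1.0, -2.0, 3.0, 0.5, -1.0])
e_true = np.zeros(n)
idx = rng.choice(n, 10, replace=False)
e_true[idx] = 10.0 * rng.choice([-1.0, 1.0], 10)
y = X @ beta + e_true + 0.01 * rng.standard_normal(n)

beta_hat, e_hat = robust_regress(X, y)
```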