Classification probability analysis of principal component null space analysis

Title: Classification probability analysis of principal component null space analysis
Publication Type: Conference Paper
Year of Publication: 2004
Authors: Vaswani N, Chellappa R
Conference Name: Pattern Recognition, 2004. ICPR 2004. Proceedings of the 17th International Conference on
Date Published: 2004/08
Keywords: approximate null space; classification error probability; covariance matrices; error statistics; intraclass variance; linear classification algorithm; nonwhite noise covariance matrix; object recognition; pattern classification; PCA; principal component null space analysis; subspace linear discriminant analysis

In a previous paper, we presented a new linear classification algorithm, principal component null space analysis (PCNSA), which is designed for problems such as object recognition in which different classes have unequal and non-white noise covariance matrices. PCNSA first obtains a principal component analysis (PCA) space for the entire data set; in this PCA space, it finds for each class "i" an Mi-dimensional subspace along which that class's intra-class variance is smallest. We call this subspace an approximate null space (ANS), since the lowest variance is usually "much smaller" than the highest. A query is classified into class "i" if its distance from that class's mean, measured in the class's ANS, is smallest. In this paper, we state the PCNSA algorithm more precisely and derive tight upper bounds on its classification error probability. We use these expressions to compare the classification performance of PCNSA with that of subspace linear discriminant analysis (SLDA).
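The two-stage procedure sketched in the abstract (a global PCA projection, then a per-class minimum-variance subspace used for nearest-mean classification) can be illustrated with a short NumPy sketch. The function names and the use of SVD/eigendecomposition here are illustrative assumptions, not the authors' implementation; details such as how Mi is chosen per class are left as parameters.

```python
import numpy as np

def fit_pcnsa(class_data, n_pca, n_ans):
    """Sketch of PCNSA training: a global PCA space, then for each class
    an approximate null space (ANS) of lowest-variance directions."""
    X = np.vstack(class_data)
    mu = X.mean(axis=0)
    # Global PCA space: top n_pca right singular vectors of centered data.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    W_pca = Vt[:n_pca].T                      # shape (d, n_pca)

    models = []
    for Xi in class_data:
        Zi = (Xi - mu) @ W_pca                # project class into PCA space
        mi = Zi.mean(axis=0)                  # class mean in PCA space
        cov = np.cov(Zi, rowvar=False)
        evals, evecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
        # ANS: the n_ans directions of smallest intra-class variance.
        W_ans = evecs[:, :n_ans]
        models.append((mi, W_ans))
    return mu, W_pca, models

def classify_pcnsa(x, mu, W_pca, models):
    """Assign x to the class whose ANS-projected distance to its mean
    is smallest, as described in the abstract."""
    z = (x - mu) @ W_pca
    dists = [np.linalg.norm((z - mi) @ W_ans) for mi, W_ans in models]
    return int(np.argmin(dists))
```

For example, fitting two synthetic Gaussian classes with different (non-white) covariances and classifying each class's own mean returns that class's index, since the class mean has zero distance to itself in its own ANS.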