2 December 2011

A weak component approach of subspace analysis
In linear discriminant analysis (LDA), especially in the high-dimensional case, projecting the data onto a one-dimensional subspace is insufficient for the two-category classification problem. Therefore, a weak component approach (WCA) was proposed to project patterns onto a low-dimensional subspace containing a rich set of discriminative features. The role of weak components in pattern classification was discussed, and the abundance of discriminative information they contain was explored. First, a definition of the weak component was given. Second, an improved regularization method was proposed; the regularization is a biased estimate of the variance in the corresponding dimension of the training data and the population data. Then a method for constructing the feature subspace based on weak components was given, which selects the eigenvectors of the scatter matrices according to their discriminative information. Finally, the proposed approach was validated experimentally by comparison with LDA, and it achieved better classification accuracy. Because WCA extracts the dimensions along which the data are distributed more compactly, it is applicable to high-dimensional data that are elliptically distributed.
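The idea described above can be sketched in code. The following is a minimal, hypothetical reconstruction, not the paper's exact algorithm: it regularizes the within-class scatter matrix by shrinking it toward a scaled identity (a simple stand-in for the paper's biased variance estimate), then ranks its eigenvectors by a Fisher-style ratio of between-class to within-class variance, so that "weak" components (directions of small within-class spread) can be retained when they carry discriminative information. The function name, the shrinkage parameter `alpha`, and the scoring rule are all illustrative assumptions.

```python
import numpy as np

def weak_component_subspace(X, y, k=2, alpha=0.1):
    """Hypothetical sketch of a weak-component style projection.

    X: (n, d) data matrix; y: (n,) labels.
    Returns a (d, k) projection matrix whose columns are eigenvectors of
    the regularized within-class scatter, ranked by a Fisher-style score.
    """
    d = X.shape[1]
    mu = X.mean(axis=0)
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Regularize: shrink Sw toward a scaled identity (stand-in for the
    # paper's biased variance estimate; alpha is an assumed parameter).
    Sw_reg = (1 - alpha) * Sw + alpha * (np.trace(Sw) / d) * np.eye(d)
    # Eigendecompose the regularized within-class scatter.
    evals, evecs = np.linalg.eigh(Sw_reg)
    # Score each eigenvector w_i by (w_i^T Sb w_i) / (w_i^T Sw_reg w_i):
    # weak components (small within-class spread) can score highly.
    scores = np.einsum('di,dj,ji->i', evecs, Sb, evecs) / evals
    order = np.argsort(scores)[::-1]
    return evecs[:, order[:k]]
```

A usage note: because the eigenvectors come from `np.linalg.eigh`, the returned projection matrix has orthonormal columns, and projecting with `X @ W` yields the low-dimensional features used for classification.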