PlumX Metrics

Locally minimizing embedding and globally maximizing variance: Unsupervised linear difference projection for dimensionality reduction

Neural Processing Letters, ISSN: 1370-4621, Vol. 33, Issue 3, pp. 267-282, 2011
  • Citations: 6
  • Usage: 0
  • Captures: 9
  • Mentions: 0
  • Social Media: 0

Article Description

Recently, many dimensionality reduction algorithms, both local and global, have been proposed. The representative local linear methods are locally linear embedding (LLE) and locality preserving projections (LPP), which seek an embedding space that preserves local information in order to capture the intrinsic structure of high-dimensional data. However, both still fail to handle sparsely sampled or noise-contaminated datasets, in which the local neighborhood structure is severely distorted. In contrast, principal component analysis (PCA), the most frequently used global method, preserves the total variance by maximizing the trace of the feature covariance matrix. But because PCA pursues maximal variance, it cannot preserve local information. To integrate locality and globality and avoid the drawbacks of LLE and PCA, in this paper, inspired by these two methods, we propose a new dimensionality reduction method for face recognition, namely unsupervised linear difference projection (ULDP). This approach can be regarded as the integration of a local approach (LLE) and a global approach (PCA), giving it better performance and robustness in applications. Experimental results on the ORL, YALE, and AR face databases show the effectiveness of the proposed method for face recognition. © 2011 Springer Science+Business Media, LLC.
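
The abstract does not state the ULDP objective in closed form, but the described combination of LLE's local embedding minimization with PCA's global variance maximization suggests a trace-difference criterion over a linear projection W. The sketch below is a hypothetical NumPy illustration of one such formulation: maximize tr(Wᵀ S_t W) − tr(Wᵀ Xᵀ M X W), where S_t is the total scatter matrix and M is the LLE embedding matrix built from neighborhood reconstruction weights. The function name `uldp_sketch`, the neighborhood size, and the regularization constant are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def uldp_sketch(X, n_neighbors=5, n_components=2):
    """Hypothetical sketch of an unsupervised linear difference projection.

    X: (n_samples, n_features) data matrix. Combines a PCA-style global
    variance term with an LLE-style local reconstruction term as a trace
    difference; the exact objective of the ULDP paper may differ.
    """
    n, d = X.shape
    Xc = X - X.mean(axis=0)                   # center the data

    # Global term: total scatter (covariance) matrix, as in PCA.
    St = Xc.T @ Xc / n                        # (d, d)

    # Local term: LLE reconstruction weights for each sample.
    R = np.zeros((n, n))
    for i in range(n):
        dists = np.linalg.norm(Xc - Xc[i], axis=1)
        idx = np.argsort(dists)[1:n_neighbors + 1]      # k nearest neighbors
        Z = Xc[idx] - Xc[i]                   # neighborhood shifted to origin
        C = Z @ Z.T                           # local Gram matrix (k, k)
        C += np.eye(n_neighbors) * 1e-3 * np.trace(C)   # regularize
        w = np.linalg.solve(C, np.ones(n_neighbors))
        R[i, idx] = w / w.sum()               # reconstruction weights sum to 1
    I = np.eye(n)
    M = (I - R).T @ (I - R)                   # LLE embedding matrix (n, n)

    # Difference objective: maximize global variance minus local error.
    D = St - Xc.T @ M @ Xc / n                # symmetric (d, d)
    vals, vecs = np.linalg.eigh(D)
    W = vecs[:, np.argsort(vals)[::-1][:n_components]]  # top eigenvectors
    return Xc @ W, W
```

On an (n_samples × n_features) matrix X, `Y, W = uldp_sketch(X)` would return a two-dimensional projection and the projection matrix; an eigendecomposition of the symmetric difference matrix plays the role that the covariance eigendecomposition plays in plain PCA. Reproducing the paper's results would require its exact objective, neighborhood size, and any balancing parameter between the two terms.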
