A linear subspace learning approach via sparse coding


 

Description

As a popular dimensionality reduction and feature extraction technique, linear subspace learning (LSL) has been successfully used in various computer vision and pattern recognition applications, for example, appearance-based face recognition (FR). Representative LSL methods include principal component analysis (PCA), e.g., Eigenface, Fisher linear discriminant analysis (FLDA), the manifold learning based locality preserving projection (LPP), local discriminant embedding (LDE), graph embedding, etc. Depending on whether the class label information of the training samples is exploited, LSL methods can be categorized into unsupervised methods (e.g., PCA and LPP) and supervised methods (e.g., FLDA, regularized LDA (RLDA), and LDE).

 

In this project, we proposed a novel linear subspace learning (LSL) method via sparse coding and feature grouping. A patch-based dictionary with k atoms was first learned from the training set; each training image could then be decomposed as a linear combination of k components, one per atom.
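The per-atom decomposition can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dictionary here is random rather than learned, and the coding step uses a ridge (ℓ2) penalty in place of the ℓ1 sparse coding the paper employs, so that it stays a one-line solve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a learned patch dictionary D with k unit-norm atoms
d, k = 64, 8                              # patch dimension, number of atoms
D = rng.standard_normal((d, k))
D /= np.linalg.norm(D, axis=0)

x = rng.standard_normal(d)                # one vectorized image patch

# Ridge-regularized coding (simplification of sparse coding):
# alpha = argmin_a ||x - D a||^2 + lam ||a||^2
lam = 0.1
alpha = np.linalg.solve(D.T @ D + lam * np.eye(k), D.T @ x)

# Decompose x into k components, one per atom: column j is alpha_j * d_j,
# and the columns sum back to the (approximate) reconstruction of x.
components = D * alpha                    # shape (d, k)
reconstruction = components.sum(axis=1)
```

Summing the k columns of `components` recovers `D @ alpha`, so the patch is exactly split into per-atom contributions that can later be grouped by discriminativeness.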

These components were grouped into two parts: a more discriminative part (MDP) and a less discriminative part (LDP). Finally, the desired linear subspace was sought by preserving the MDP while weakening the LDP. Experimental results on benchmark face databases showed that the proposed sparse coding induced LSL methods outperform many representative and state-of-the-art LSL methods.
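One common way to realize "preserve the MDP while weakening the LDP" is a generalized eigenproblem on the scatter matrices of the two parts; the sketch below follows that pattern under assumed inputs (random matrices standing in for the MDP/LDP component features) and is not the paper's exact formulation.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
d, n = 20, 100

# Hypothetical feature matrices (rows = samples) standing in for the
# MDP and LDP component groups produced by the sparse decomposition
X_mdp = rng.standard_normal((n, d))
X_ldp = rng.standard_normal((n, d))

S_mdp = X_mdp.T @ X_mdp / n                      # scatter of the discriminative part
S_ldp = X_ldp.T @ X_ldp / n + 1e-3 * np.eye(d)   # regularized scatter of the rest

# Generalized eigenproblem S_mdp w = lambda * S_ldp w: the top eigenvectors
# maximize MDP energy relative to LDP energy along each projection direction.
evals, evecs = eigh(S_mdp, S_ldp)                # eigenvalues in ascending order

m = 5
W = evecs[:, -m:]                                # d x m projection matrix
```

Projecting data through `W` keeps directions where the discriminative components dominate the less discriminative ones, which is the qualitative behavior the description asks of the learned subspace.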

 

 

Reference:

Zhang, Lei, Pengfei Zhu, Qinghua Hu, and David Zhang. "A linear subspace learning approach via sparse coding." In Proc. IEEE International Conference on Computer Vision (ICCV), pp. 755-761, 2011.


 

 
