
Introduction

This tutorial accompanies the computer code for classwise principal component analysis (CPCA). The method is inherently a feature extraction/dimensionality reduction technique, though it is often coupled with other feature extraction methods (e.g., those that cannot handle the small sample size problem) and with classification algorithms. The computer code is written in MATLAB, and the details of the method can be found in [1]. The method was first applied in [2] to classify high-dimensional brain data such as electrocorticograms (ECoG) and electroencephalograms (EEG). Comprehensive testing of the method on ECoG data is presented in [3]. The code consists of the following functions:

  1. dataproc_func_cpca.m
  2. dataproc_func_princomp.m
  3. choose_subspace.m

dataproc_func_cpca.m is one of the two main functions. Its goal is to use training samples to estimate a family of subspaces for dimensionality reduction; a detailed description of this function is given in Section 2. You can type help dataproc_func_cpca at the MATLAB command prompt to learn more about this function. dataproc_func_princomp.m is an auxiliary function that performs a basic PCA decomposition. choose_subspace.m chooses one out of the C subspaces (C being the number of classes) estimated by dataproc_func_cpca.m.
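The division of labor among these functions reflects the two steps of the classwise idea: first estimate one PCA subspace per class from training data, then select among the C subspaces for a given sample. The following is a minimal NumPy sketch of that concept only; it is not the MATLAB package's actual interface, the function names are made up for illustration, and the reconstruction-error selection rule shown here is just one simple choice of criterion:

```python
import numpy as np

def cpca_subspaces(X, y, m):
    """Estimate one m-dimensional PCA subspace per class (illustrative sketch).

    X is an n-by-d data matrix, y an n-vector of class labels.
    Returns, for each class, its mean and a d-by-m orthonormal basis.
    """
    subspaces = {}
    for c in np.unique(y):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        Xc = Xc - mu                             # center within the class
        # Right singular vectors are the class principal directions
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        subspaces[c] = (mu, Vt[:m].T)            # d-by-m basis for class c
    return subspaces

def choose_subspace(x, subspaces):
    """Pick the class subspace with the smallest reconstruction error for x
    (one simple selection rule, used here purely for illustration)."""
    best, best_err = None, np.inf
    for c, (mu, W) in subspaces.items():
        d = x - mu
        residual = d - W @ (W.T @ d)             # component outside the subspace
        err = np.linalg.norm(residual)
        if err < best_err:
            best, best_err = c, err
    return best
```

A sample is thus assigned to the class whose subspace (shifted to the class mean) reconstructs it best, after which class-specific dimensionality reduction can proceed with the selected basis.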


Zoran Nenadic 2010-06-26