Computing Publications


An eigenvalue-problem formulation for non-parametric mutual information maximisation for linear dimensionality reduction

Raymond Liu, Duncan Gillies

Conference or Workshop Paper
The 2012 International Conference on Image Processing, Computer Vision, and Pattern Recognition (IPCV 2012)
July, 2012

Well-known dimensionality reduction (feature extraction) techniques, such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), are formulated as eigenvalue problems, where the required features are the eigenvectors of some objective matrix. Eigenvalue problems are theoretically elegant and have practical advantages over iterative algorithms: they can discover globally optimal features in a single step, reducing computation time and avoiding local optima. Here we propose an eigenvalue-problem formulation for linear dimensionality reduction based on maximising the mutual information between the class variable and the extracted features. Mutual information takes into account all moments of the input data, whereas PCA and LDA account only for the first two. Our experiments show that the proposed method achieves more discriminative projections than PCA and LDA, and gives better classification results on datasets in which each class is well represented.
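To illustrate the eigenvalue-problem framing the abstract refers to, the sketch below uses PCA, the familiar example: the projection directions are the top eigenvectors of an objective matrix (here the sample covariance), obtained in a single eigendecomposition rather than by iteration. This is an assumption-laden illustration of the general framing only, not the authors' mutual-information objective.

```python
import numpy as np

# Illustrative sketch: linear dimensionality reduction as an
# eigenvalue problem, using PCA's covariance matrix as the
# objective matrix. (Not the paper's mutual-information objective.)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated toy data

Xc = X - X.mean(axis=0)                # centre the data
C = Xc.T @ Xc / (len(X) - 1)           # objective matrix: sample covariance
eigvals, eigvecs = np.linalg.eigh(C)   # one symmetric eigendecomposition

order = np.argsort(eigvals)[::-1]      # sort directions by eigenvalue
W = eigvecs[:, order[:2]]              # top-2 eigenvectors = projection matrix
Z = Xc @ W                             # globally optimal 2-D projection

print(Z.shape)  # (200, 2)
```

The one-shot character the abstract highlights is visible here: a single call to `eigh` yields all projection directions at once, with no iterative refinement and no risk of a local optimum.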

PDF of full publication (634 kilobytes)
BibTeX file for the publication