Fast Discriminative Component Analysis for Comparing Examples

Reference:

Jaakko Peltonen, Jacob Goldberger, and Samuel Kaski. Fast discriminative component analysis for comparing examples. In NIPS'06 Workshop on Learning to Compare Examples, December 8, 2006, Whistler, BC, Canada, 2006. http://www.idiap.ch/lce/.

Abstract:

Two recent methods, Neighborhood Components Analysis (NCA) and Informative Discriminant Analysis (IDA), search for a class-discriminative subspace or discriminative components of data, which is equivalent to learning distance metrics invariant to changes perpendicular to the subspace. Constraining metrics to a subspace is useful both for regularizing the metrics and for dimensionality reduction. We introduce a variant of NCA and IDA that reduces their computational complexity from quadratic to linear in the number of data samples, by replacing their purely non-parametric class density estimates with semiparametric mixtures of Gaussians. In terms of accuracy, the method is shown to perform as well as NCA on benchmark data sets, outperforming several popular linear dimensionality reduction methods.
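The complexity reduction described in the abstract can be illustrated with a minimal sketch: instead of comparing each projected sample to all N other samples (as in plain NCA, giving O(N^2) distance computations), each sample is compared only to K mixture-component centers per class, giving O(N*K). The sketch below is an illustrative reconstruction under stated assumptions, not the paper's actual algorithm; the use of per-class k-means to obtain component means, the softmax-over-distances class posterior, and all function names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_class_means(X, y, k):
    """Crude per-class k-means to obtain mixture component means.
    Stand-in for the paper's semiparametric Gaussian mixtures;
    the fitting procedure here is illustrative only."""
    means, labels = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        centers = Xc[rng.choice(len(Xc), size=k, replace=False)].copy()
        for _ in range(10):  # a few fixed Lloyd iterations
            d = ((Xc[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            assign = d.argmin(1)
            for j in range(k):
                if np.any(assign == j):
                    centers[j] = Xc[assign == j].mean(0)
        means.append(centers)
        labels.extend([c] * k)
    return np.vstack(means), np.array(labels)

def objective(A, X, y, comp_means, comp_labels):
    """Log-likelihood of the correct classes under a softmax over
    negative squared distances (in the projected space) to the
    mixture components. Cost is O(N * K) per evaluation, linear in
    the number of samples N, versus O(N^2) for pairwise NCA."""
    Z = X @ A.T            # (N, d) projected samples
    M = comp_means @ A.T   # (K, d) projected component means
    d2 = ((Z[:, None, :] - M[None, :, :]) ** 2).sum(-1)  # (N, K)
    logits = -d2
    logits -= logits.max(1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    p /= p.sum(1, keepdims=True)
    # probability mass on components of the correct class
    p_correct = np.array(
        [p[i, comp_labels == y[i]].sum() for i in range(len(y))]
    )
    return np.log(p_correct + 1e-12).sum()

# Toy usage: two Gaussian classes in 5-D, projected to 2-D.
X = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(3, 1, (50, 5))])
y = np.array([0] * 50 + [1] * 50)
comp_means, comp_labels = fit_class_means(X, y, k=3)
A = rng.normal(0, 0.1, (2, 5))  # projection matrix to be optimized
print(objective(A, X, y, comp_means, comp_labels))
```

In a full implementation one would maximize this objective over the projection matrix A (e.g. by gradient ascent), alternating or interleaving with mixture re-estimation; the point of the sketch is only the linear-in-N cost structure.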

Suggested BibTeX entry:

@inproceedings{Peltonen06lce,
    author = {Jaakko Peltonen and Jacob Goldberger and Samuel Kaski},
    booktitle = {NIPS'06 Workshop on Learning to Compare Examples, December 8, 2006, Whistler, BC, Canada},
    note = {http://www.idiap.ch/lce/},
    title = {Fast Discriminative Component Analysis for Comparing Examples},
    year = {2006},
}

The full text of this work is not available for download here.