Informative Discriminant Analysis

Reference:

Samuel Kaski and Jaakko Peltonen. Informative discriminant analysis. In Tom Fawcett and Nina Mishra, editors, Proceedings of ICML-2003, the Twentieth International Conference on Machine Learning, pages 329–336, Menlo Park, CA, 2003. AAAI Press.

Abstract:

We introduce a probabilistic model that generalizes classical linear discriminant analysis and gives an interpretation for the components as informative or relevant components of data. The components maximize the predictability of the class distribution, which is asymptotically equivalent to (i) maximizing mutual information with the classes, and (ii) finding principal components in the so-called learning or Fisher metrics. The Fisher metric measures only distances that are relevant to the classes, that is, distances that cause changes in the class distribution. The components have applications in data exploration, visualization, and dimensionality reduction. In empirical experiments the method outperformed a Rényi entropy-based alternative and linear discriminant analysis.
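To make the abstract's idea of "maximizing the predictability of the class distribution" concrete, below is a minimal Python sketch: it optimizes a linear projection W so that the leave-one-out conditional log-likelihood of the class labels, estimated with a Parzen-type kernel estimator in the projected space, is maximized. The Gaussian kernel, the fixed width sigma, and the use of SciPy's L-BFGS-B optimizer are illustrative assumptions for this sketch, not details taken from the paper.

    import numpy as np
    from scipy.optimize import minimize

    def neg_conditional_log_likelihood(W_flat, X, labels, d, sigma=1.0):
        """Negative sum_i log p(c_i | W x_i), with p(c | y) estimated by a
        leave-one-out Parzen (Gaussian kernel) estimator in the projected space."""
        n, D = X.shape
        W = W_flat.reshape(D, d)
        Y = X @ W                                          # projected data, (n, d)
        sq = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
        K = np.exp(-sq / (2.0 * sigma ** 2))               # pairwise kernel weights
        np.fill_diagonal(K, 0.0)                           # leave-one-out
        same = (labels[:, None] == labels[None, :]).astype(float)
        p = (K * same).sum(axis=1) / (K.sum(axis=1) + 1e-12)
        return -np.sum(np.log(p + 1e-12))

    def informative_components(X, labels, d=2, sigma=1.0, seed=0):
        """Find a D x d projection maximizing class predictability."""
        rng = np.random.default_rng(seed)
        D = X.shape[1]
        W0 = rng.standard_normal((D, d)).ravel()
        res = minimize(neg_conditional_log_likelihood, W0,
                       args=(X, labels, d, sigma), method="L-BFGS-B")
        return res.x.reshape(D, d)

    if __name__ == "__main__":
        # Toy data: one class-relevant direction plus pure noise dimensions.
        rng = np.random.default_rng(1)
        n = 150
        labels = rng.integers(0, 3, size=n)
        X = rng.standard_normal((n, 5))
        X[:, 0] += 3.0 * labels                            # informative direction
        W = informative_components(X, labels, d=2)
        print("Projection matrix:\n", W)

On data like this toy example, the optimized projection concentrates its weight on the class-relevant direction and largely ignores the noise dimensions, which is the behaviour the abstract describes for the informative components.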

Suggested BibTeX entry:

@inproceedings{Kaski03icml,
    address = {Menlo Park, CA},
    author = {Samuel Kaski and Jaakko Peltonen},
    booktitle = {Proceedings of ICML-2003, the Twentieth International Conference on Machine Learning},
    editor = {Tom Fawcett and Nina Mishra},
    pages = {329--336},
    publisher = {AAAI Press},
    title = {Informative Discriminant Analysis},
    year = {2003},
}

See www.cis.hut.fi ...