Supervised LogEuclidean Metric Learning for Symmetric Positive Definite Matrices

F. Yger, Masashi Sugiyama

Published 2015 in arXiv.org

ABSTRACT

Metric learning has been shown to be highly effective in improving the performance of nearest-neighbor classification. In this paper, we address the problem of metric learning for Symmetric Positive Definite (SPD) matrices, such as covariance matrices, which arise in many real-world applications. Naively applying standard Mahalanobis metric learning methods under Euclidean geometry is not appropriate for SPD matrices, because the difference of two SPD matrices can be a non-SPD matrix, and the obtained solution can therefore be uninterpretable. To cope with this problem, we propose to use a properly parameterized LogEuclidean distance and to optimize the metric with respect to kernel-target alignment, a supervised criterion for kernel learning. The resulting non-trivial optimization problem is then solved by exploiting Riemannian geometry. Finally, we experimentally demonstrate the usefulness of our LogEuclidean metric learning algorithm on real-world classification tasks involving EEG signals and texture patches.
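To make the ingredients named in the abstract concrete, here is a minimal Python/NumPy sketch of the two building blocks: the LogEuclidean distance between SPD matrices and the kernel-target alignment criterion. The projection-style parameterization W^T (log X - log Y) W, the Gaussian kernel built from the distances, and all function names are illustrative assumptions, not necessarily the authors' exact formulation.

    import numpy as np

    def spd_log(X):
        # Matrix logarithm of an SPD matrix via its eigendecomposition.
        w, V = np.linalg.eigh(X)
        return (V * np.log(w)) @ V.T

    def logeuclidean_distance(X, Y, W=None):
        # ||log(X) - log(Y)||_F; with a projection matrix W, measure
        # ||W.T (log(X) - log(Y)) W||_F instead (assumed parameterization).
        D = spd_log(X) - spd_log(Y)
        if W is not None:
            D = W.T @ D @ W
        return np.linalg.norm(D, 'fro')

    def kernel_target_alignment(K, y):
        # Alignment <K, y y^T>_F / (||K||_F ||y y^T||_F) between a kernel
        # matrix K and the ideal kernel y y^T induced by labels y in {-1, +1}.
        T = np.outer(y, y)
        return np.sum(K * T) / (np.linalg.norm(K) * np.linalg.norm(T))

    # Toy usage: Gaussian kernel from pairwise LogEuclidean distances,
    # scored against random binary labels.
    rng = np.random.default_rng(0)

    def random_spd(d):
        A = rng.standard_normal((d, d))
        return A @ A.T + d * np.eye(d)   # well-conditioned SPD matrix

    mats = [random_spd(4) for _ in range(10)]
    y = rng.choice([-1.0, 1.0], size=10)
    dist = np.array([[logeuclidean_distance(X, Z) for Z in mats] for X in mats])
    K = np.exp(-dist ** 2 / 2.0)         # bandwidth sigma = 1, an assumption
    print(kernel_target_alignment(K, y))

In the paper, the analogue of the parameter W is then optimized to maximize this alignment via Riemannian (manifold) optimization; the sketch above only evaluates the criterion for a fixed metric.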

PUBLICATION RECORD

  • Publication year

    2015

  • Venue

    arXiv.org

  • Publication date

    2015-02-11

  • Fields of study

    Mathematics, Computer Science

  • Source metadata

    Semantic Scholar

REFERENCES

    45 references

CITED BY

    28 citing papers