A Large Dimensional Analysis of Least Squares Support Vector Machines

Zhenyu Liao, Romain Couillet

Published 2017 in IEEE Transactions on Signal Processing

ABSTRACT

In this paper, a large dimensional performance analysis of kernel least squares support vector machines (LS-SVMs) is provided under the assumption of a two-class Gaussian mixture model for the input data. Building upon recent advances in random matrix theory, we show, when the dimension of data <inline-formula><tex-math notation="LaTeX">$p$</tex-math></inline-formula> and their number <inline-formula><tex-math notation="LaTeX">$n$</tex-math></inline-formula> are both large, that the LS-SVM decision function can be well approximated by a normally distributed random variable, the mean and variance of which depend explicitly on the local behavior of the kernel function. This theoretical result is then applied to the MNIST and Fashion-MNIST datasets which, despite their non-Gaussianity, exhibit a convincingly close behavior. Most importantly, our analysis provides a deeper understanding of the mechanisms at play in SVM-type methods, in particular of the impact of the choice of the kernel function, as well as of some of their theoretical limits in separating high-dimensional Gaussian vectors.
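The setting the abstract describes can be sketched numerically. Below is a minimal LS-SVM experiment, an illustrative assumption rather than the paper's code: a two-class Gaussian mixture is drawn, the standard LS-SVM linear system is solved, and the decision values g(x) on fresh test points are inspected (the quantity whose large-(n, p) Gaussian approximation the paper derives). The Gaussian (RBF) kernel, regularization parameter `lam`, and mixture means are arbitrary illustrative choices.

```python
# Minimal LS-SVM sketch on a two-class Gaussian mixture (illustrative
# parameters; not the paper's experimental setup).
import numpy as np

rng = np.random.default_rng(0)
p, n_half = 64, 128                       # data dimension, samples per class
mu = 1.0 / np.sqrt(p)                     # class means at +/- mu per coordinate
X = np.vstack([rng.normal(-mu, 1.0, (n_half, p)),
               rng.normal(+mu, 1.0, (n_half, p))])
y = np.r_[-np.ones(n_half), np.ones(n_half)]

def rbf_kernel(A, B, gamma=1.0):
    """Kernel f(||a - b||^2 / p) with f(t) = exp(-gamma * t)."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq / A.shape[1])

# LS-SVM training: solve the bordered (n+1) x (n+1) linear system
#   [ 0   1^T           ] [b]     [0]
#   [ 1   K + I/lam     ] [alpha] [y]
lam = 1.0
K = rbf_kernel(X, X)
n = len(y)
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / lam
sol = np.linalg.solve(A, np.r_[0.0, y])
b, alpha = sol[0], sol[1:]

# Decision function g(x) = sum_i alpha_i k(x_i, x) + b on fresh test data;
# the paper's result says each class's g(x) values concentrate around a
# Gaussian in the large-(n, p) regime.
X_test = np.vstack([rng.normal(-mu, 1.0, (200, p)),
                    rng.normal(+mu, 1.0, (200, p))])
g = rbf_kernel(X_test, X) @ alpha + b
print("per-class means of g(x):", g[:200].mean(), g[200:].mean())
```

Plotting a histogram of `g` for each class against a fitted normal density is a simple way to eyeball the Gaussian approximation the abstract refers to.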

