Online Learning, Stability, and Stochastic Gradient Descent

T. Poggio, S. Voinea, L. Rosasco

Published 2011 in arXiv.org

ABSTRACT

In batch learning, stability, together with existence and uniqueness of the solution, corresponds to well-posedness of Empirical Risk Minimization (ERM) methods; recently, it was proved that CVloo stability is necessary and sufficient for generalization and consistency of ERM ([2]). In this note, we introduce CVon stability, which plays a similar role in online learning. We show that stochastic gradient descent (SGD) under the usual hypotheses is CVon stable, and we then discuss the implications of CVon stability for the convergence of SGD.
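As background for the abstract, a minimal sketch of the SGD update it refers to, applied to least-squares regression with a step size decaying in t (one of the "usual hypotheses" for convergence). All function names, data, and parameter values below are illustrative assumptions, not taken from the paper:

```python
import random

def sgd(data, dim, epochs=200, eta0=0.1):
    """Plain SGD on the squared loss (w.x - y)^2 / 2, illustrative only."""
    w = [0.0] * dim
    t = 0
    for _ in range(epochs):
        random.shuffle(data)               # draw samples in random order
        for x, y in data:
            t += 1
            eta = eta0 / (1.0 + 0.01 * t)  # step size decaying in t
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            # gradient of the per-sample squared loss is err * x
            w = [wi - eta * err * xi for wi, xi in zip(w, x)]
    return w

# Toy data generated from the target function y = 2*x1 - x2 (an assumption).
random.seed(0)
data = [((x1, x2), 2.0 * x1 - x2)
        for x1 in (0.0, 0.5, 1.0) for x2 in (0.0, 0.5, 1.0)]
w = sgd(data, dim=2)  # w approaches (2, -1)
```

The decaying step size is what makes the iterates stable under small perturbations of the sample; with a constant step size the trajectory need not settle down, which is why step-size conditions appear among the hypotheses.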

PUBLICATION RECORD

  • Publication year

    2011

  • Venue

    arXiv.org

  • Publication date

    2011-05-24

  • Fields of study

    Mathematics, Computer Science

  • Source metadata

    Semantic Scholar

CITED BY

31 citing papers