Continual Learning in Linear Classification on Separable Data
Itay Evron, E. Moroshko, G. Buzaglo, Maroun Khriesh, B. Marjieh, N. Srebro, Daniel Soudry
Published 2023 in International Conference on Machine Learning
ABSTRACT
We analyze continual learning on a sequence of separable linear classification tasks with binary labels. We show theoretically that learning with weak regularization reduces to solving a sequential max-margin problem, corresponding to a special case of the Projection Onto Convex Sets (POCS) framework. We then develop upper bounds on the forgetting and other quantities of interest under various settings with recurring tasks, including cyclic and random orderings of tasks. We discuss several practical implications for popular training practices, such as regularization scheduling and weighting. We point out several theoretical differences between our continual classification setting and a recently studied continual regression setting.
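The abstract's reduction turns each task into a convex feasibility set (its margin constraints), and continual learning into sequentially projecting the weights onto each task's set. A minimal sketch of that idea, not the paper's actual construction: all names here are illustrative, and the inner solver uses cyclic projections onto individual half-spaces (itself a POCS instance), which reaches a feasible point rather than the exact Euclidean projection onto the intersection.

```python
import numpy as np

def project_halfspace(w, x, y):
    # Euclidean projection of w onto the half-space {v : y * <x, v> >= 1}.
    margin = y * (x @ w)
    if margin >= 1.0:
        return w
    return w + (1.0 - margin) / (x @ x) * y * x

def sequential_max_margin(tasks, dim, sweeps=200):
    # POCS-style sketch: for each task in sequence, repeatedly project the
    # current weights onto the half-spaces given by its margin constraints
    # y_i * <x_i, w> >= 1, starting from the previous task's solution.
    w = np.zeros(dim)
    for X, y in tasks:  # each task: (n, d) data matrix and labels in {-1, +1}
        for _ in range(sweeps):
            for xi, yi in zip(X, y):
                w = project_halfspace(w, xi, yi)
    return w

# Two toy separable tasks in 2D; fitting the second task moves the weights
# off the first task's feasible set, i.e. the "forgetting" the paper bounds.
tasks = [
    (np.array([[2.0, 0.0], [0.0, 2.0]]), np.array([1.0, 1.0])),
    (np.array([[1.0, 1.0]]), np.array([-1.0])),
]
w = sequential_max_margin(tasks, dim=2)
```

After the second task, `w` satisfies that task's margin constraint exactly, while the first task's margins become negative, illustrating why recurring (cyclic or random) task orderings are the interesting regime for the bounds above.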
PUBLICATION RECORD
- Publication date: 2023-06-06
- Fields of study: Mathematics, Computer Science
- Source metadata: Semantic Scholar