An asynchronous parallel stochastic coordinate descent algorithm

Ji Liu, Stephen J. Wright, C. Ré, Victor Bittorf, Srikrishna Sridhar

Published 2013 in the Journal of Machine Learning Research

ABSTRACT

We describe an asynchronous parallel stochastic coordinate descent algorithm for minimizing smooth unconstrained or separably constrained functions. The method achieves a linear convergence rate on functions that satisfy an essential strong convexity property and a sublinear rate (1/K) on general convex functions. Near-linear speedup on a multicore system can be expected if the number of processors is O(n^{1/2}) in unconstrained optimization and O(n^{1/4}) in the separable-constrained case, where n is the number of variables. We describe results from an implementation on 40-core processors.
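The abstract's core idea can be sketched in code: several worker threads share one iterate, each repeatedly picks a coordinate at random, computes that coordinate's gradient from a possibly stale read of the shared vector, and writes the update back without locking. The sketch below is a minimal illustration of this asynchronous model on a strongly convex quadratic, not the authors' implementation; the function name `asyscd`, the step-length rule, and the test problem are all assumptions chosen for the example.

```python
import random
import threading

def asyscd(A, b, n_threads=4, steps_per_thread=20000, gamma=None):
    """Asynchronous stochastic coordinate descent sketch (assumed API).

    Minimizes f(x) = 0.5 * x^T A x - b^T x, whose i-th gradient
    component is (A x)_i - b_i, by letting several threads update
    randomly chosen coordinates of a shared iterate without locks.
    """
    n = len(b)
    # Conservative step length tied to the coordinate Lipschitz
    # constants, which for this quadratic are the diagonal entries
    # A[i][i] (an assumption specific to this example problem).
    if gamma is None:
        gamma = 0.9 / max(A[i][i] for i in range(n))
    x = [0.0] * n  # shared iterate, read and written without locking

    def worker():
        rng = random.Random()
        for _ in range(steps_per_thread):
            i = rng.randrange(n)
            # Gradient component i from a (possibly stale) read of x --
            # the hallmark of the asynchronous model.
            g = sum(A[i][j] * x[j] for j in range(n)) - b[i]
            x[i] -= gamma * g  # single-coordinate update

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return x

if __name__ == "__main__":
    # Small, well-conditioned SPD test system: unit diagonal with
    # weak off-diagonal coupling, so the lock-free updates converge.
    n = 8
    A = [[1.0 if i == j else 0.05 for j in range(n)] for i in range(n)]
    b = [1.0] * n
    x = asyscd(A, b)
    residual = max(
        abs(sum(A[i][j] * x[j] for j in range(n)) - b[i]) for i in range(n)
    )
    print(f"max residual of A x = b: {residual:.2e}")
```

In this toy setting CPython's interpreter lock serializes the individual list writes, so the sketch shows the stale-read/lock-free-write structure rather than true hardware parallelism; the paper's analysis is what justifies the same pattern on a real multicore system.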

