GraphLab: A New Framework For Parallel Machine Learning

Yucheng Low, Joseph E. Gonzalez, Aapo Kyrola, Danny Bickson, Carlos Guestrin, Joseph M. Hellerstein

Published 2010 in Conference on Uncertainty in Artificial Intelligence

ABSTRACT

Designing and implementing efficient, provably correct parallel machine learning (ML) algorithms is challenging. Existing high-level parallel abstractions like MapReduce are insufficiently expressive, while low-level tools like MPI and Pthreads leave ML experts repeatedly solving the same design challenges. By targeting common patterns in ML, we developed GraphLab, which improves upon abstractions like MapReduce by compactly expressing asynchronous iterative algorithms with sparse computational dependencies while ensuring data consistency and achieving a high degree of parallel performance. We demonstrate the expressiveness of the GraphLab framework by designing and implementing parallel versions of belief propagation, Gibbs sampling, Co-EM, Lasso and compressed sensing. We show that using GraphLab we can achieve excellent parallel performance on large-scale real-world problems.
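The abstract's core idea — asynchronous iterative computation over a sparse dependency graph — can be illustrated with a small sketch. This is not the real GraphLab C++ API; it is a hypothetical Python rendering of the computation model, in which an update function reads a vertex and its neighborhood ("scope") and a scheduler re-queues only the vertices affected by a change:

```python
from collections import deque

def make_graph(edges, n):
    """Build an undirected adjacency list for n vertices."""
    nbrs = {v: [] for v in range(n)}
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    return nbrs

def update(v, values, nbrs, damping=0.5):
    """Illustrative update function: damped averaging over the
    vertex's scope (its own value plus its neighbors' values)."""
    if not nbrs[v]:
        return values[v]
    avg = sum(values[u] for u in nbrs[v]) / len(nbrs[v])
    return damping * values[v] + (1 - damping) * avg

def run(nbrs, values, tol=1e-6):
    """Asynchronous scheduler: apply updates from a task queue,
    rescheduling a vertex's neighbors only when its value changes."""
    queue = deque(nbrs)          # schedule every vertex once
    queued = set(queue)
    while queue:
        v = queue.popleft()
        queued.discard(v)
        new = update(v, values, nbrs)
        if abs(new - values[v]) > tol:
            values[v] = new
            for u in nbrs[v]:    # sparse dependencies: touch neighbors only
                if u not in queued:
                    queue.append(u)
                    queued.add(u)
    return values
```

The point of the sketch is the scheduling structure: unlike a MapReduce pass over all data, work is generated dynamically and only where the graph says a change can propagate, which is the pattern the paper's belief propagation and Gibbs sampling examples exploit. (Real GraphLab additionally provides consistency models so concurrent updates on overlapping scopes remain safe.)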

PUBLICATION RECORD

  • Publication year

    2010

  • Venue

    Conference on Uncertainty in Artificial Intelligence

  • Publication date

    2010-06-25

  • Fields of study

    Mathematics, Computer Science


  • Source metadata

    Semantic Scholar



REFERENCES

This paper cites 21 references.

CITED BY

924 citing papers.