Evaluating the Crowd with Confidence
Manas R. Joglekar, H. Garcia-Molina, Aditya G. Parameswaran
Published 2013 in Knowledge Discovery and Data Mining
ABSTRACT
Worker quality control is a crucial aspect of crowdsourcing systems, typically occupying a large fraction of the time and money invested in crowdsourcing. In this work, we devise techniques to generate confidence intervals for worker error rate estimates, thereby enabling a better evaluation of worker quality. We show that our techniques generate correct confidence intervals on a range of real-world datasets, and we demonstrate their wide applicability by using them to evict poorly performing workers and to provide confidence intervals on the accuracy of answers.
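To make the abstract's central idea concrete, the sketch below attaches a confidence interval to a worker's estimated error rate using the standard Wilson score interval for a binomial proportion. This is only an illustration of the general idea; the function name `error_rate_ci` and the choice of the Wilson interval are assumptions for this sketch, not the specific techniques developed in the paper.

```python
import math

def error_rate_ci(num_wrong, num_tasks, z=1.96):
    """Wilson score confidence interval for a worker's error rate.

    A standard binomial proportion interval, shown here only to
    illustrate attaching confidence bounds to an error-rate estimate;
    it is not the paper's technique. z=1.96 gives a ~95% interval.
    """
    if num_tasks == 0:
        return (0.0, 1.0)  # no evidence: the rate could be anything
    p = num_wrong / num_tasks            # point estimate of error rate
    denom = 1 + z ** 2 / num_tasks
    center = (p + z ** 2 / (2 * num_tasks)) / denom
    half = (z / denom) * math.sqrt(
        p * (1 - p) / num_tasks + z ** 2 / (4 * num_tasks ** 2)
    )
    return (max(0.0, center - half), min(1.0, center + half))

# A worker who answered 12 of 100 gold-standard tasks incorrectly:
lo, hi = error_rate_ci(12, 100)
```

An interval like `(lo, hi)` supports the applications the abstract mentions: a worker can be evicted when the lower bound of their error rate exceeds a threshold, rather than on a noisy point estimate alone.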
PUBLICATION RECORD
- Publication year
2013
- Venue
Knowledge Discovery and Data Mining
- Publication date
2013-08-11
- Fields of study
Computer Science
- Source metadata
Semantic Scholar