Active Dual Supervision: Reducing the Cost of Annotating Examples and Features
Published 2009 in HLT-NAACL 2009
ABSTRACT
When faced with the task of building machine learning or NLP models, it is often worthwhile to turn to active learning to obtain human annotations at minimal cost. Traditional active learning schemes query a human for labels of intelligently chosen examples. However, human effort can also be expended in collecting alternative forms of annotations. For example, one may attempt to learn a text classifier by labeling class-indicating words instead of, or in addition to, documents. Learning from two different kinds of supervision brings a new, unexplored dimension to the problem of active learning. In this paper, we demonstrate the value of such active dual supervision in the context of sentiment analysis. We show how interleaving queries for both documents and words significantly reduces human effort -- more than what is possible through traditional one-dimensional active learning, or by passive combinations of supervisory inputs.
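The dual-supervision setup described in the abstract can be sketched in code. The following is a hypothetical illustration, not the authors' exact algorithm: a multinomial Naive Bayes sentiment classifier that pools evidence from labeled documents and labeled class-indicating words (treated as pseudo-counts for their class), plus an uncertainty criterion for choosing the next document to query; the vocabulary, toy data, and `word_weight` parameter are all illustrative assumptions.

```python
# Illustrative sketch of dual supervision (assumed setup, not the paper's
# exact method): combine labeled documents and labeled words in one
# multinomial Naive Bayes model, then pick the most uncertain document
# as the next query for the human annotator.
import math
from collections import Counter

VOCAB = ["great", "awful", "plot", "acting", "boring", "loved"]
CLASSES = ["pos", "neg"]

def train(labeled_docs, labeled_words, smoothing=1.0, word_weight=5.0):
    """Pool per-class word counts from documents and from labeled features."""
    counts = {c: Counter() for c in CLASSES}
    for words, label in labeled_docs:
        counts[label].update(words)
    for word, label in labeled_words:        # a labeled word contributes
        counts[label][word] += word_weight   # pseudo-counts to its class
    logprobs = {}
    for c in CLASSES:
        total = sum(counts[c][w] for w in VOCAB) + smoothing * len(VOCAB)
        logprobs[c] = {w: math.log((counts[c][w] + smoothing) / total)
                       for w in VOCAB}
    return logprobs

def posterior(model, words):
    """Class posterior under a uniform class prior."""
    scores = {c: sum(model[c].get(w, 0.0) for w in words) for c in CLASSES}
    m = max(scores.values())
    exps = {c: math.exp(s - m) for c, s in scores.items()}
    z = sum(exps.values())
    return {c: e / z for c, e in exps.items()}

def most_uncertain_doc(model, unlabeled_docs):
    """Pick the document whose posterior is closest to uniform (0.5/0.5)."""
    return min(unlabeled_docs,
               key=lambda d: abs(posterior(model, d)["pos"] - 0.5))

# Toy run: one seed document per class plus two labeled words.
docs = [(["great", "acting"], "pos"), (["awful", "plot"], "neg")]
words = [("loved", "pos"), ("boring", "neg")]
model = train(docs, words)
pool = [["loved", "plot"], ["boring", "acting"], ["plot", "acting"]]
query = most_uncertain_doc(model, pool)
print(query)  # -> ['plot', 'acting'], the only document with no class cues
```

An interleaved scheme would alternate this document query with an analogous query over unlabeled words (e.g., the word whose per-class probabilities are most similar), which is the dimension of choice the paper studies.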
PUBLICATION RECORD
- Publication year: 2009
- Venue: HLT-NAACL 2009
- Publication date: 2009-06-05
- Fields of study: Computer Science
- Source metadata: Semantic Scholar
REFERENCES
- 26 references
CITED BY
- 33 citing papers