A truth serum for sharing rewards
Published 2011 in Adaptive Agents and Multi-Agent Systems
ABSTRACT
We study a problem where a group of agents must decide how a joint reward should be shared among them. We focus on settings where the share each agent receives depends on the subjective opinions of its peers concerning that agent's contribution to the group. To this end, we introduce a mechanism for eliciting and aggregating subjective opinions and for determining agents' shares. The intuition behind the proposed mechanism is that each agent who believes the others are telling the truth maximizes its expected share to the extent that it is well evaluated by its peers and truthfully reports its own opinions. Under the assumptions that agents are Bayesian decision-makers and that the underlying population is sufficiently large, we show that our mechanism is incentive-compatible, budget-balanced, and tractable. We also present strategies to make the mechanism individually rational and fair.
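The abstract does not spell out the mechanism itself, so the following is only an illustrative sketch of the budget-balanced, peer-evaluation-based sharing idea it describes. The function name, the [0, 1] scoring scheme, and the proportional split are our own assumptions for illustration, not the paper's actual mechanism (which additionally rewards truthful reporting of opinions):

```python
# Hypothetical illustration (not the paper's mechanism): split a joint
# reward in proportion to the average peer evaluation each agent receives.
# Each agent scores every peer in [0, 1]; self-scores are excluded.

def share_reward(reward, peer_scores):
    """peer_scores[i][j] is agent i's evaluation of agent j (i != j)."""
    n = len(peer_scores)
    # Average evaluation received by each agent from its n - 1 peers.
    avg = [
        sum(peer_scores[i][j] for i in range(n) if i != j) / (n - 1)
        for j in range(n)
    ]
    total = sum(avg)
    # A proportional split is budget-balanced by construction:
    # the shares always sum exactly to the reward.
    return [reward * a / total for a in avg]

# Three agents; the diagonal (self-evaluation) is never read.
scores = [
    [None, 0.8, 0.6],
    [0.9, None, 0.7],
    [0.5, 0.4, None],
]
shares = share_reward(100.0, scores)
```

A well-evaluated agent receives a larger share, mirroring the abstract's intuition; what this sketch omits is the incentive layer that ties each agent's share to the truthfulness of the opinions it reports.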
PUBLICATION RECORD
- Publication year: 2011
- Venue: Adaptive Agents and Multi-Agent Systems
- Publication date: 2011-05-02
- Fields of study: Computer Science, Economics, Psychology
- Source metadata: Semantic Scholar