Perceptions and Truth: A Mechanism Design Approach to Crowd-Sourcing Reputation
Parinaz Naghizadeh Ardabili, M. Liu
Published 2013 in IEEE/ACM Transactions on Networking
ABSTRACT
We consider a distributed multiuser system in which individual entities hold observations or perceptions of one another, while the truth about each user is known only to itself, and a user may have an interest in withholding or distorting that truth. We ask whether the system as a whole can arrive at correct assessments of all users, referred to as their reputations, by incentivizing the users to participate in a collective effort without violating their privacy or self-interest. In this paper, we investigate this problem using a mechanism design approach. We introduce a number of utility models representing users' strategic behavior, each consisting of a truth element, an image element, or both, reflecting a user's desire to obtain an accurate view of others and to project an inflated image of itself. For each model, we either design a mechanism that achieves the optimal performance (the solution to the corresponding centralized problem) or present individually rational suboptimal solutions. In the latter case, we show that even when the centralized solution is not achievable, a simple punish-reward mechanism not only gives a user the incentive to participate and provide information, but also ensures that this information improves system performance.
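The punish-reward idea described in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's actual mechanism: reputations are aggregated as the median of peer reports, and each rater is rewarded for reports near the consensus and penalized otherwise. All names, the aggregation rule, and the tolerance threshold are assumptions made here for concreteness.

```python
import statistics

def aggregate_reputations(reports):
    """Aggregate each user's reputation as the median of peer reports.

    reports: dict mapping (rater, ratee) -> score in [0, 1].
    The median aggregator is a hypothetical stand-in for the paper's mechanisms.
    """
    by_ratee = {}
    for (rater, ratee), score in reports.items():
        by_ratee.setdefault(ratee, []).append(score)
    return {ratee: statistics.median(scores) for ratee, scores in by_ratee.items()}

def punish_reward(reports, reputations, tolerance=0.1, reward=1.0, penalty=1.0):
    """Pay each rater per report: a reward when the report lies within
    `tolerance` of the aggregated reputation, a penalty otherwise."""
    payoffs = {}
    for (rater, ratee), score in reports.items():
        agrees = abs(score - reputations[ratee]) <= tolerance
        payoffs[rater] = payoffs.get(rater, 0.0) + (reward if agrees else -penalty)
    return payoffs

# Raters a, b, e report on ratees c and d; rater a distorts its report on d.
reports = {("a", "c"): 0.8, ("b", "c"): 0.75, ("e", "c"): 0.8,
           ("a", "d"): 0.2, ("b", "d"): 0.9, ("e", "d"): 0.85}
reputations = aggregate_reputations(reports)
payoffs = punish_reward(reports, reputations)
```

Under this toy scheme the distorting rater `a` earns strictly less than the truthful raters, which is the incentive effect the abstract attributes to the punish-reward mechanism.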
PUBLICATION RECORD
- Publication year: 2013
- Venue: IEEE/ACM Transactions on Networking
- Publication date: 2013-06-01
- Fields of study: Computer Science, Economics
- Source metadata: Semantic Scholar