Reputation
Primavera De Filippi, French National Centre for Scientific Research
Ori Shimony, dOrg
Antonio Tenorio-Fornés, Universidad Complutense de Madrid
Definition of the term
A. Origin
Technologies such as the internet or blockchain enable large-scale interactions among total strangers. Reputation systems (Resnick et al., 2000) emerged as a solution to facilitate these interactions whenever some level of trust was required, such as online shopping in peer-to-peer marketplaces like eBay, or participation in online production communities (Benkler, 2006). Yet, these systems generally relied on a centralised operator in charge of managing user reputation.
Many decentralised reputation systems were developed before the advent of blockchain technology (Hendrikx et al., 2015), most relying on maintaining a personal list of trusted and untrusted nodes; aggregating such reputation information from other trusted nodes (with a certain degree of transitivity, as in a web of trust); or using distributed hash tables to manage a global reputation system (Chawathe et al., 2003). These decentralised reputation systems have been used in particular for applications such as file-sharing (Napster) and bandwidth routing (Tor).
The advent of blockchain technology introduced new opportunities for the development of next-generation reputation systems that rely on persistent global state and immutable transaction histories. This allows for transparency and security guarantees that were unavailable in previous distributed systems. Furthermore, blockchain technology made it possible to achieve portability and interoperability between different reputation systems, serving as an open neutral shared data store that can be leveraged by multiple services.
B. Evolution
We delineate below the different types of reputation systems that emerged over time with the development of new blockchain-based networks and decentralised applications. In contrast to early blockchain networks like Bitcoin, whose governance is inherently plutocratic, these newer applications have developed decentralised reputation systems in order to implement more sophisticated governance systems that are not exclusively based on market dynamics.
Bitcoin (Nakamoto, 2009) first used blockchain technology to create a distributed payment system operating on top of a peer-to-peer network. The governance of Bitcoin did not rely on trust or reputation. Instead, the influence of every network node is determined by the amount of resources committed to the network: the greater the amount of resources, the more influence one has in the network. Many of the other blockchain-based networks that followed suit relied on similar protocols, also based on a plutocratic governance model (i.e., the amount of hashing power in the case of proof-of-work or the amount of tokens held in the case of proof-of-stake).
The introduction of reputation in the context of these early blockchain-based networks was seemingly driven by a desire to move away from a model of plutocratic governance, towards a more meritocratic governance system. Early reputation systems were implemented at the infrastructure layer, as trust-based alternatives to the proof-of-work or proof-of-stake consensus algorithms. For instance, delegated proof-of-stake (Larimer, 2014) allows for a more meritocratic system, in which influence is based on perceived merit or trustworthiness rather than on resources alone. As a result, anyone holding a particular amount of reputation within a blockchain community will have influence in proportion to that amount of reputation.
At the application layer, the introduction of reputation in the blockchain space was also an attempt to move away from the perception of blockchain technology as a purely trustless system, and to enable the establishment of more sophisticated systems in which some actors can be trusted. As argued by Hawlitschek et al. (2018), the introduction of reputation is necessary for the establishment of architecturally trustless systems (i.e., blockchains) that operationally rely on trust. In other words, while at the infrastructure level the operations of a blockchain-based network do not depend on any trusted third party, at the application level it might be beneficial to allow certain interactions to be strengthened or facilitated by a particular amount of trust. The need for a reputation system thus ultimately depends on the types of applications at hand. On the one hand, trustless systems such as Bitcoin are based on the assumption that no one can or shall be trusted. Hence, these systems are designed to entirely eliminate the need for trust, relying on cryptographic primitives and proofs in order to ensure that people behave according to the rules (Ali et al., 2016). On the other hand, there are many human-sensitive services (e.g., peer-to-peer marketplaces like Uber, Airbnb, or eBay) based on the assumption that some actors can be trusted to behave honestly. These systems rely on reputation in order to help users assess the trustworthiness of the other users interacting on these platforms. In order to provide these types of human-sensitive mediation services, blockchain-based applications also need to rely on some kind of reputation system.
C. Coexisting uses/meanings
Existing blockchain reputation systems vary widely in how reputation is earned and utilised.
In many blockchain-based marketplaces, reputation does not have an explicit or software-defined role, but acts as a signal of trustworthiness. For instance, in service marketplaces (e.g., Gitcoin, Bounties Network), users can decide who to hire or work for based on transaction histories and summary statistics. Similarly, in digital goods marketplaces (e.g., Rarible, OpenSea), a buyer can review the seller’s transaction history to evaluate the quality of goods for sale before making a purchase.
In blockchain-based social media (e.g., Steemit, Hive, Sapien, Relevant) and work networks (e.g., Colony, Sourcecred), reputation represents a user’s evaluation weight on other users’ contributions. Reputation can be global in scope or limited to a specific community or domain. Evaluation-weighting alters reputation dynamically, as users continuously influence each other’s reputation scores in proportion to their own reputation. Some systems also incorporate time-based mechanisms to decay reputation with inactivity.
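As a rough illustration of the mechanism described above, the following sketch shows how an evaluation's impact might scale with the evaluator's own reputation, and how reputation might decay with inactivity. The class, function names, and constants (e.g., the 90-day half-life) are hypothetical choices for illustration, not the parameters of any of the platforms mentioned.

```python
import time

# Illustrative constants -- not taken from any specific platform.
DECAY_HALF_LIFE = 90 * 24 * 3600   # reputation halves after ~90 days of inactivity
INITIAL_REPUTATION = 1.0

class Member:
    def __init__(self, name):
        self.name = name
        self.reputation = INITIAL_REPUTATION
        self.last_active = time.time()

    def decayed_reputation(self, now=None):
        """Reputation discounted by time elapsed since last activity (exponential decay)."""
        now = time.time() if now is None else now
        elapsed = max(0.0, now - self.last_active)
        return self.reputation * 0.5 ** (elapsed / DECAY_HALF_LIFE)

def evaluate(evaluator, target, score):
    """Apply an evaluation in [-1, 1]; its impact scales with the evaluator's own reputation."""
    assert -1.0 <= score <= 1.0
    target.reputation = max(0.0, target.reputation + score * evaluator.decayed_reputation())
    evaluator.last_active = time.time()

# Usage: Alice (high reputation) upvotes a contribution by Bob.
alice, bob = Member("alice"), Member("bob")
alice.reputation = 5.0
evaluate(alice, bob, +0.2)   # Bob gains 0.2 * 5.0 = 1.0 reputation
print(bob.reputation)        # 2.0
```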
In blockchain-based governance frameworks (e.g., Aragon, DAOstack, Moloch), reputation often determines a user’s voting weight on proposals in a given organisation. Reputation can also entitle the user to a proportional claim on the organisation’s assets or ongoing revenues. Reputation is often modified through community voting, where the votes of community members are weighted by their reputation (e.g., a community can vote on whether to give 50 reputation points to Alice or remove 100 reputation points from Bob). Just as in the social media case, reputation can also be modified according to dynamic criteria stipulated by the community, such as reputation rewards for voting with the majority or creating proposals that pass, and reputation penalties for the reverse.
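A minimal sketch of reputation-weighted governance voting follows, assuming a simple in-memory reputation ledger and a majority-of-weight rule. The `Proposal` structure and its `effect` field are illustrative simplifications, not the actual data model of Aragon, DAOstack, or Moloch.

```python
from dataclasses import dataclass, field

# Reputation ledger: member -> current reputation score (illustrative values).
reputation = {"alice": 120.0, "bob": 80.0, "carol": 40.0}

@dataclass
class Proposal:
    description: str
    effect: dict = field(default_factory=dict)  # reputation changes applied if the proposal passes
    votes: dict = field(default_factory=dict)   # member -> True (for) / False (against)

def cast_vote(proposal, member, support):
    proposal.votes[member] = support

def tally(proposal):
    """Each vote is weighted by the voter's reputation; a simple majority of weight wins."""
    weight_for = sum(reputation[m] for m, s in proposal.votes.items() if s)
    weight_against = sum(reputation[m] for m, s in proposal.votes.items() if not s)
    return weight_for > weight_against

def execute(proposal):
    """If the proposal passes, apply its reputation grants or penalties."""
    if tally(proposal):
        for member, delta in proposal.effect.items():
            reputation[member] = max(0.0, reputation.get(member, 0.0) + delta)

# Usage: the community votes on granting Carol 50 reputation points.
p = Proposal("Grant Carol 50 reputation for onboarding work", effect={"carol": +50.0})
cast_vote(p, "alice", True)    # 120 voting weight in favour
cast_vote(p, "bob", False)     # 80 voting weight against
execute(p)
print(reputation["carol"])     # 90.0
```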
Issues currently associated with the term
A. Different types of reputation
First of all, it is important to distinguish between two different types of reputation systems: “personal” and “global” reputation systems (Hendrikx et al., 2015).
- Personal reputation systems are specific to an individual. They represent the standard mechanism of peer-to-peer reputation assignment. These systems are designed to assign a personal reputation score to each member of a particular network or community, although such a score will ultimately be relevant only to one specific individual. Hence, these systems necessarily rely on direct user input: users are expected to score each of their interactions with other community members, in order to help the system compute the corresponding reputation scores. However, these systems often suffer from scalability issues. Indeed, the purpose of a reputation system is to provide information about the qualities of different users in a given domain, so that other users can make informed decisions about whom they wish to interact with. Yet, a personal reputation system has limited capacity to do so, because it is not possible (or too costly) for a single user to evaluate the qualities of all the users in the system. In order to overcome these limitations, many of these reputation systems implement a web of trust mechanism, leveraging the information submitted by other people (who are regarded as trustworthy by the user) in order to compute a personal reputation score for those with whom the user has not yet had a sufficient amount of interaction (see the sketch after this list).
- Global reputation systems are not specific to any community member, but rather to the community as a whole. These systems assign a single and unique reputation score to the different actors in a particular community or network, which will be regarded by all community members as the sole and legitimate score. While these reputation systems are rather easy to implement on a centralised platform, they are much more difficult to implement in a decentralised setting, since they require sophisticated mechanisms of reputation transfer that do not fall prey to Sybil attacks.
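The following is a minimal sketch of one way a personal, web-of-trust score might be computed: a user combines their own direct ratings with the ratings reported by peers they trust, discounting second-hand opinions by a transitivity factor. The rating data, discount factor, and averaging rule are illustrative assumptions rather than any particular system's design.

```python
# Direct ratings each user has given, in [0, 1]: rater -> {rated: score}.
direct_ratings = {
    "alice": {"bob": 0.9, "carol": 0.6},
    "bob":   {"dave": 0.8},
    "carol": {"dave": 0.4},
}

TRANSITIVITY_DISCOUNT = 0.5  # assumed: second-hand opinions count half as much as direct ones

def personal_score(me, target, ratings=direct_ratings, discount=TRANSITIVITY_DISCOUNT):
    """Reputation of `target` as seen by `me`, using one hop of web-of-trust transitivity."""
    # Prefer direct experience if there is any.
    if target in ratings.get(me, {}):
        return ratings[me][target]
    # Otherwise, ask the peers `me` trusts and weight their opinions by that trust.
    weighted, total_weight = 0.0, 0.0
    for peer, trust_in_peer in ratings.get(me, {}).items():
        if target in ratings.get(peer, {}):
            weighted += trust_in_peer * ratings[peer][target]
            total_weight += trust_in_peer
    if total_weight == 0.0:
        return None  # no information available about `target`
    return discount * weighted / total_weight

# Alice has never interacted with Dave, so she relies on Bob's and Carol's ratings.
print(personal_score("alice", "dave"))  # 0.5 * (0.9*0.8 + 0.6*0.4) / (0.9 + 0.6) ≈ 0.32
```

In practice, web-of-trust systems typically follow several hops and use more elaborate propagation rules, but the core idea of weighting peers' opinions by one's trust in those peers remains the same.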
B. Sybil attacks and identity
Unlike popular online services, most blockchain-based systems have no central party to verify user identities, remove fake accounts, or patrol spam. While beneficial for privacy, this opens the door to Sybil attacks, where anyone can create multiple pseudonymous accounts to gain disproportionate influence over the system. The ongoing threat of Sybil attacks is a central challenge for building trust and authentic engagement in blockchain-based systems.
Well-designed reputation systems can mitigate the threat of Sybil attacks. First, systems can limit Sybil attacks through a web of trust model, in which a small set of known users gradually invites new users, who are then peer-verified over time. Alternatively, blockchain reputation systems could rely on an external source of identity verification.
In cases where trust is needed but identity is not required, reputation scores themselves can be used to resist Sybil attacks. In these cases, the mechanism for earning reputation should measure difficult-to-forge, peer-verified added value. This way, an individual has to contribute just as much value regardless of how many accounts they spread the effort over, so there is no added incentive for Sybil attacks.
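This intuition can be made concrete with a small sketch: if the reputation earned is a linear function of peer-verified value contributed, splitting the same work across many accounts yields exactly the same total reputation, so Sybil accounts confer no advantage. The reward rule below is an assumption for illustration only.

```python
def reputation_for_contribution(verified_value):
    """Assumed rule: reputation earned is linear in the peer-verified value contributed."""
    return 1.0 * verified_value

def total_reputation(contributions):
    """Total reputation across all accounts controlled by the same person."""
    return sum(reputation_for_contribution(v) for v in contributions)

# Contributing 100 units of verified value from a single account...
single_account = total_reputation([100.0])
# ...earns exactly as much as spreading the same work over ten Sybil accounts.
ten_sybils = total_reputation([10.0] * 10)
assert single_account == ten_sybils == 100.0

# By contrast, a fixed per-account reward (e.g., a sign-up bonus) would make
# each additional Sybil identity profitable, inviting Sybil attacks.
```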
C. Privacy
In light of its attributes of transparency, censorship-resistance, and immutability, blockchain technology can be instrumental to the operations of both personal and global reputation systems, enabling anyone to access and retrieve the recorded evaluations in order to compute both personal and global reputation scores.
However, in order to protect the privacy of users, a reputation system should avoid permanently recording on a blockchain the association between real-world identities and the pseudonymous identities used in the reputation system. In addition, users should be aware of the risks of linking real-world identities to their blockchain accounts. Maintaining this separation makes it possible for users to protect their privacy, while still allowing anyone interacting with a particular blockchain-based identity to evaluate the risks of dealing with that user in that domain.
This is especially relevant in light of the European General Data Protection Regulation (GDPR), which provides users with the possibility to request the erasure of specific information deemed inaccurate, inappropriate, or obsolete. Given the immutability of a blockchain, the recording of any type of data that can affect the reputation of a particular persona could potentially violate the provisions of the law, insofar as the persona can be linked back to a real-world identity.
D. Oligarchies and power distribution
The use of reputation systems also raises concerns about power concentration. The creation and consolidation of oligarchies are common in online communities. Reputation systems might reinforce inequalities in such communities, as powerful actors are more likely to be trusted and to further increase their reputation, while those with low reputation will have fewer opportunities to do so. Moreover, blockchain systems often turn reputation into an explicit source of economic or political power, as in many of the governance frameworks described above. The accumulation of reputation in such blockchain systems might therefore result in even stronger power inequalities than in other online communities.
E. Amplification of social inequalities
It is worth considering the potential biases that reputation systems incorporate and reproduce. First, not all activities or contributions are a source of reputation in online communities (Rozas & Gilbert, 2015). Some activities, such as contributing source code to free software projects, are explicitly valued in these systems, while others, such as community organising or affective labour, typically carried out by women (Iosub et al., 2014), are often invisible to these reputation systems. These types of biases can introduce new forms of inequality directly into the algorithms managing a platform, such as longer working hours and lower average wages for women in the so-called gig economy (Barzilay & Ben-David, 2016). We have briefly considered the reproduction of gender inequalities by reputation systems; however, other dimensions of social injustice, such as race or class, and their intersections, should also be considered when studying how reputation systems reproduce them.
Conclusion
Coming up with a single definition of reputation is a difficult task, in light of the different ways in which the term has been used over time and across disciplines. From an empirical standpoint, there are many typologies of reputation systems (centralised vs. decentralised, personal vs. global), each with its own advantages and drawbacks. We provide here a definition of the term reputation that is specific to the blockchain space and hopefully generic enough to encompass the wide variety of decentralised reputation systems that have been developed so far.
In general terms, a decentralised reputation system is an online mechanism that enables participants to evaluate each other's trustworthiness. Reputation of a particular actor is usually calculated by aggregating the evaluations of multiple peers—each assigning a score that represents the standing, status, or reputation of such an actor within the system, based on past actions or behaviours. In a blockchain-based system, these evaluations are recorded in a blockchain and used by the system in order to calculate a final score. This score can be leveraged both explicitly through functions in the code (voting power, economic rights) or implicitly as a means of signalling an entity’s trustworthiness.
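To make this definition concrete, here is a minimal sketch in which peer evaluations appended to a shared log are aggregated into a single score per actor. The simple mean used here is a placeholder for whatever aggregation rule (reputation-weighted, time-decayed, etc.) a given system adopts, and the log is merely a stand-in for evaluations recorded on a blockchain.

```python
from collections import defaultdict

# A simplified stand-in for evaluations recorded on a blockchain:
# (evaluator, target, score) triples appended over time.
evaluation_log = [
    ("alice", "dave", 0.9),
    ("bob",   "dave", 0.7),
    ("carol", "dave", 0.8),
    ("alice", "erin", 0.4),
]

def reputation_scores(log):
    """Aggregate peer evaluations into a single score per actor (simple mean here)."""
    totals, counts = defaultdict(float), defaultdict(int)
    for _evaluator, target, score in log:
        totals[target] += score
        counts[target] += 1
    return {actor: totals[actor] / counts[actor] for actor in totals}

scores = reputation_scores(evaluation_log)
print(scores["dave"])  # ≈ 0.8 -- a score that could set voting power or simply signal trustworthiness
```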
References
Ali, M., Nelson, J., Shea, R., & Freedman, M. J. (2016). Bootstrapping trust in distributed systems with blockchains. ;login: The USENIX Magazine, 41(3).
Almasoud, A. S., Hussain, F. K., & Hussain, O. K. (2020). Smart contracts for blockchain-based reputation systems: A systematic literature review. Journal of Network and Computer Applications, 102814.
Barzilay, A. R., & Ben-David, A. (2016). Platform inequality: Gender in the gig-economy. Seton Hall Law Review, 47, 393.
Benkler, Y. (2006). The wealth of networks: How social production transforms markets and freedom. Yale University Press.
Chawathe, Y., Ratnasamy, S., Breslau, L., Lanham, N., & Shenker, S. (2003, August). Making gnutella-like p2p systems scalable. In Proceedings of the 2003 conference on Applications, technologies, architectures, and protocols for computer communications (pp. 407-418).
Gai, F., Wang, B., Deng, W., & Peng, W. (2018, May). Proof of reputation: a reputation-based consensus protocol for peer-to-peer network. In International Conference on Database Systems for Advanced Applications (pp. 666-681). Springer, Cham.
Hawlitschek, F., Notheisen, B., & Teubner, T. (2018). The limits of trust-free systems: A literature review on blockchain technology and trust in the sharing economy. Electronic commerce research and applications, 29, 50-63.
Hendrikx, F., Bubendorfer, K., & Chard, R. (2015). Reputation systems: A survey and taxonomy. Journal of Parallel and Distributed Computing, 75, 184-197.
Iosub, D., Laniado, D., Castillo, C., Morell, M. F., & Kaltenbrunner, A. (2014). Emotions under discussion: Gender, status and communication in online collaboration. PloS one, 9(8), e104880.
Larimer, D. (2014). Delegated proof-of-stake (DPoS). BitShares whitepaper.
Nakamoto, S. (2009). Bitcoin: A peer-to-peer electronic cash system. https://bitcoin.org/bitcoin.pdf
Resnick, P., Kuwabara, K., Zeckhauser, R., & Friedman, E. (2000). Reputation systems. Communications of the ACM, 43(12), 45-48.
Rozas, D., & Gilbert, N. (2015). Talk is silver, code is gold? Contribution beyond source code in Free/Libre Open Source Software communities. CRESS Working Papers.
Balázs Bodó, University of Amsterdam
PUBLISHED ON: 26 Nov, 2020 - 13:05
The article gives a good overview, but seems to omit the problems around reputation being reduced to a single measure or score. To give an example, Trump's reputation is very high in one group of voters, and very low in another group, even though they are literally neighbors. That example suggests at least three considerations that might be part of the definition. First, reputation cannot simply be averaged if valuations are very heterogeneous. Second, reputation valuations must be closely tied to very well-defined characteristics or properties (I trust someone to do some particular thing, but not something different). Third, reputation valuations can be based on objectively quantifiable facts as much as on subjective opinions. Mixing the two can lead to misleading aggregate reputation signals.
Martin Florian
PUBLISHED ON: 02 Dec, 2020 - 19:48
A very interesting article! The definition and summary at the end are on-point. I don't share the optimism w.r.t. decentralized reputation systems, however, and have a few smaller comments.
Why shouldn't we be optimistic? Because building well-working, and thus trustworthy, reputation systems is *extremely hard*. Not even Amazon seems to be able to police its reviewers well enough anymore [1]. Imagine how much more difficult things get if we want to go decentralized.
If we can't assume any centralized and universally trusted point from which to derive reputation scores, sybil-proof reputation systems are perhaps even a theoretical impossibility (see Cheng/Friedman 2005, [2]). I feel that such limiting results should be discussed in the scope of an article on reputation systems.
I would also be careful with statements such as "Well-designed reputation systems can mitigate the threat of Sybil attacks". The original Sybil paper (Douceur 2002 [3]) very impressively shows how difficult it is to mitigate Sybil attacks without (globally) trusted (i.e., centralized) parties. Under which circumstances can a web of trust-based approach succeed despite Douceur's proofs? Also note that with a web of trust / social graph-based Sybil-resistance approach, a Sybil attacker can at the very least maintain its percentage of Sybil nodes. This can lead to significant numbers of Sybils, depending on when the attacker joins.
Smaller issues:
- "bandwidth routing (Tor)" - I don't really see where reputation comes into play here? Bandwidth routing means "nodes with more bandwidth get more traffic" - this is more akin to PoW that to something that I would call reputation.
- "blockchain technology made it possible to achieve portability and interoperability" - I'm pretty sure that both were achievable before blockchain technology and are still very much possible without it. Unless you mean that thanks to the blockchain hype decision makers are now more supportive towards open interfaces and standardized data formats :)
- I'm not sure about conflating PoW with governance, at least in the context of Bitcoin. PoW is used for consensus (as in "consensus protocol", not humans!) and minting new bitcoins, whereas most of the governance (the little that there is in Bitcoin) seems to stem from a handful of core developers trying to accommodate stakeholder wishes voiced over mailing lists and Twitter... There were one or two occasions where they had miners vote on some protocol change (via PoW) - but is this already governance by PoW? Feels like a stretch to me.
[1] A nice experience report: https://thehustle.co/amazon-fake-reviews
[2] Cheng, Alice, and Eric Friedman. "Sybilproof reputation mechanisms." Proceedings of the 2005 ACM SIGCOMM workshop on Economics of peer-to-peer systems. ACM, 2005, [PDF](http://www.eecs.harvard.edu/cs286r/courses/fall08/files/paper-CheFri.pdf)
[3] Douceur, John R. "The sybil attack." International workshop on peer-to-peer systems. Springer, Berlin, Heidelberg, 2002, [PDF](https://www.microsoft.com/en-us/research/wp-content/uploads/2002/01/IPT…)