Social norms are the informal rules of society that largely govern human interactions. What a norm prescribes often hinges on what others do, or are expected to do. A fundamental example is a norm that directs one to engage in a behavior only if others do so too. This raises the question: who engages in the behavior first?
This “chicken and egg” problem can be remedied by shaping people’s expectations. If everyone believes that everyone else is doing it – whether they initially are or not – then they all will. Typically, though, people differ in the conditions under which they will go along: some will adopt the behavior if only a few others do, some only if many do, and some will never adopt it.
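This threshold logic can be sketched as a toy cascade (the numbers here are purely illustrative, not drawn from any study):

```python
# A minimal sketch with hypothetical numbers: each person's threshold is
# how many adopters there must be before they join in; 99 stands in for
# someone who will never adopt.
thresholds = [2, 2, 3, 3, 99]

def cascade(expected_adopters, thresholds):
    # Let everyone respond to the adopter count they currently perceive,
    # and repeat until the count stops changing.
    count = expected_adopters
    while True:
        new_count = sum(1 for t in thresholds if t <= count)
        if new_count == count:
            return new_count
        count = new_count

print(cascade(0, thresholds))               # truthful start → 0 adopters
print(cascade(len(thresholds), thresholds)) # "everyone's doing it" → 4 adopters
```

Seeding the expectation at “everyone is doing it” tips everyone but the never-adopter into the behavior, while a truthful start leaves adoption at zero.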
One example of the role of expectations in shaping behavior comes from peer-to-peer file-sharing software. Because sharing piggybacks on our innate sense of fairness, users can be encouraged to share files by convincing them that many others also share – even if they do not. Sharing can thereby be greater than it would be if everyone knew the degree to which everyone else was sharing. Conversely, this suggests that convincing users that sharing is rare can undermine file-sharing networks.
Normative expectations also play a role in college binge drinking. Partly because binge drinkers are conspicuous, students overestimate the prevalence of binge drinking, and they consequently drink more than they would if they knew the true prevalence.
Another example comes from the Minnesota Income Tax Compliance Experiment, a widely cited study where a group of taxpayers became more tax compliant when informed that compliance was the norm. In these examples, there is evidence that behavior can be altered by informing people of the true behavior of their peers.
Along with my colleague Erol Akçay, an Associate Professor of Biology at the University of Pennsylvania, I constructed a mathematical model to understand such social dynamics. Consider a group of people who may adopt a behavior conditional on others adopting it, and let us falsely inform them that everyone is engaging in this behavior. Next, assume that they may be able to learn the true behavior of their peers. This matters because experimental public goods games show that cooperation can erode when conditional cooperators learn that others are cheating. Finally, let people enter or leave the group. New entrants will be naive about the true behavior of their peers, while current members who glean the truth can become discouraged and leave. In such a setting, obscuring the truth can promote the adoption of the behavior.
Essentially, false beliefs act as a catalyst, pushing the population toward the behavior, after which it can be maintained. This can work even if people will only weakly adopt the behavior. The scenario does, however, require a steady stream of new members, or that savvy members rapidly become discouraged and leave.
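A highly simplified simulation in this spirit (illustrative parameters and threshold distribution only – this is not the model analyzed in the paper) shows the catalyst effect:

```python
import random

def simulate(start_naive=True, influx=50, steps=200, n0=1000,
             learn_p=0.05, leave_p=0.5, seed=1):
    # Toy illustration (not the paper's model): agents adopt when their
    # *perceived* adoption fraction meets their personal threshold.
    # Thresholds above 1.0 mark people who will never adopt.
    random.seed(seed)
    agents = [[random.uniform(0.0, 1.2), not start_naive] for _ in range(n0)]
    true_frac = 0.0
    for _ in range(steps):
        if not agents:
            break  # everyone has left: the group has crashed
        # Naive agents believe adoption is universal (perceived fraction 1.0);
        # informed agents see the true fraction.
        adopters = sum(1 for th, informed in agents
                       if (true_frac if informed else 1.0) >= th)
        true_frac = adopters / len(agents)
        # Some naive agents learn the truth...
        for agent in agents:
            if not agent[1] and random.random() < learn_p:
                agent[1] = True
        # ...and informed agents whose threshold is unmet may leave, discouraged.
        agents = [a for a in agents
                  if not (a[1] and true_frac < a[0] and random.random() < leave_p)]
        # A fresh cohort of naive newcomers arrives.
        agents += [[random.uniform(0.0, 1.2), False] for _ in range(influx)]
    return true_frac, len(agents)

# Misinformed start with turnover: adoption takes hold and persists
# even as some members learn the truth.
print(simulate())
# Fully informed start with no newcomers: adoption never begins and
# discouraged members drain away.
print(simulate(start_naive=False, influx=0))
```

The contrast between the two runs illustrates the catalyst role of the false belief: the same population that bootstraps itself into widespread adoption when seeded with misinformation and replenished by naive newcomers simply empties out when everyone knows the truth from the start.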
Our main findings can be summarized as follows:
Increasing the proportion of people who believe that the behavior is prevalent is essential to promoting it. This can be achieved by a high influx of new people, by making it difficult to learn the true behavior of one’s peers, and by making it easy for those who do learn the truth to leave.
These group dynamics can lead to a group “crash” where everyone leaves, or boom-bust cycles where the group repeatedly undergoes massive growth and decline.
There may be a trade-off between the effects of misinformation we studied and punishing those who don’t engage in the behavior. Punishment can promote norm following, since people naturally want to avoid being punished. However, the visibility of punishment can undermine its impact if it also reveals that not adopting the behavior is common.
Our work addresses how social norms and obscuring the truth can be used to propel a population to engage in a behavior. Though the beliefs are initially false, they can end up being completely or nearly true over time as expectations become reality. The initial (mis)information is thus self-fulfilling.