There’s a brilliant little web tool called The Evolution of Trust that demonstrates how networks of trust emerge and break down within a society. It takes about 15-30 minutes to go through it, but it’s definitely worth it if you have the time. The rest of this article will assume you’ve gone through the tool.

The TL;DR of the tool above is this: In most social circumstances, we all default to a “tit-for-tat” strategy of trust—that is, we trust people until they give us a reason to distrust them, and then we distrust them until they give us a reason to trust them again. The tit-for-tat strategy produces the best results for the most people under normal conditions and ultimately ends up generating societies full of trust.

But the tool also lets you experiment with some variables in the system. For example, what if lying or cheating gives you a disproportionately large reward compared to cooperating with others? What then?

Well, the result is predictable: bigger rewards for cheating produce more cheaters, more cheaters produce greater distrust, greater distrust produces even more cheating, and so on.

Conversely, if you arrange it so that cooperation gives a disproportionately large reward compared to cheating, people have more incentive to cooperate, which generates trust, which then generates more cooperative behavior, and so on.
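For the curious, here’s a rough sketch of the kind of simulation that underlies the game: an iterated prisoner’s dilemma tournament where, each generation, the lowest scorers are replaced with copies of the highest scorers. The payoff values, round counts, strategy names, and population size below are illustrative assumptions of mine, not the actual numbers the tool uses.

```python
# A rough sketch of the kind of simulation behind The Evolution of Trust.
# Payoffs, rounds, and population sizes are illustrative, not the tool's values.
from collections import Counter

def tit_for_tat(my_history, their_history):
    # Cooperate first, then copy whatever the other player did last round.
    return their_history[-1] if their_history else "C"

def always_cheat(my_history, their_history):
    return "D"  # "D" = defect/cheat, "C" = cooperate

STRATEGIES = {"tit-for-tat": tit_for_tat, "always-cheat": always_cheat}

def match(name_a, name_b, payoffs, rounds=5):
    """Play `rounds` rounds between two strategies; return their total scores."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = STRATEGIES[name_a](hist_a, hist_b)
        move_b = STRATEGIES[name_b](hist_b, hist_a)
        pa, pb = payoffs[(move_a, move_b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

def evolve(population, payoffs, generations=5, replace=2):
    """Round-robin tournament each generation; the worst players copy the best."""
    for _ in range(generations):
        scores = [0] * len(population)
        for i in range(len(population)):
            for j in range(i + 1, len(population)):
                si, sj = match(population[i], population[j], payoffs)
                scores[i] += si
                scores[j] += sj
        ranked = sorted(range(len(population)), key=lambda k: scores[k])
        for loser, winner in zip(ranked[:replace], ranked[::-1][:replace]):
            population[loser] = population[winner]
    return Counter(population)

# Payoffs as (my_points, their_points) for each pair of moves.
fair   = {("C", "C"): (2, 2), ("C", "D"): (-1, 3),
          ("D", "C"): (3, -1), ("D", "D"): (0, 0)}
skewed = {("C", "C"): (2, 2), ("C", "D"): (-1, 10),  # cheating a cooperator
          ("D", "C"): (10, -1), ("D", "D"): (0, 0)}  # pays disproportionately

start = ["tit-for-tat"] * 5 + ["always-cheat"] * 5
print("fair payoffs:  ", evolve(list(start), fair))
print("skewed payoffs:", evolve(list(start), skewed))
```

With the “fair” payoffs, the tit-for-tat players end up taking over the population; bump up the reward for cheating a cooperator and the cheaters take over instead. That flip is the downward spiral in miniature.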

In this way, trust within a society organizes into an upward or downward spiral. The more people trust each other, the more they tend to cooperate, thus generating more trust, and so on. The more people distrust each other, the more they behave in untrustworthy ways, generating even more distrust.

Anyone who has lived in some of the poorer and more corrupt parts of the world has certainly seen this play out. When I lived in South America, the corruption was so rampant that you felt obliged to behave in untrustworthy ways yourself, otherwise you would just get screwed over constantly. It sucked.

One could say that the role of “civilization” itself is to construct systems of incentives to get people to cooperate with each other, thus creating an upward spiral of trust. Civilization breaks down when those systems of incentives are dismantled, returning us to our more innate, tribal nature of screwing everyone over, except maybe a few friends and family.

The Evolution of Trust tool demonstrates this well, but what is eye-opening is that it also demonstrates how sensitive these systems of trust can be. Social trust is fragile. It doesn’t take much to kick the first domino in a wave of distrust and watch society spiral downward into selfishness and tribalism.

We know that incentives matter. As humans, we suck at thinking about the greater social good or how our actions subtly affect the systems we participate in. We figure that if we are speeding, it’s no big deal—but if everybody speeds, then clearly it’s a problem. As a result, we have to institute penalties that discourage us from breaking the rules. These incentives shape behavior, and that behavior then creates a cascade of trust that keeps reinforcing fair and just social systems.

But shift those incentives, remove some punishment or add some reward, and suddenly that spiral of trust breaks down. Remove the penalties for speeding and suddenly everyone thinks it’s okay if they drive faster on the highway. They see everyone else doing it, so why not them? Before you know it, you have cataclysmic car accidents happening on the regular.

These shifts in incentives have been well studied over the centuries. Corrupt leaders who install friends and family members into leadership positions and abuse their power can instigate this sort of downward spiral within a society. Poorly thought-out laws and policies that destroy good incentive structures can send a country into a tailspin of distrust and corruption.

Similarly, societies that form strong and fair legal institutions, write constitutions, and introduce representative democracy often instigate the upward spiral of trust that produces a safe, functional society.

The Dangers of Distrust in the Information Age

The frightening implication at the end of The Evolution of Trust is that the incentives created by the internet—the ease with which one can promote fraudulent products, profit from disinformation and get away with antisocial behavior—may skew the delicate equilibrium required to operate within a trustworthy society.

Think about it. Before the internet, there were real negative social repercussions for being a total jackass and a troll. You were shunned by society and hated by all. Today, not only are you protected from the social repercussions of your trolling, but you are rewarded with attention, and in some cases, even fans.

Or think about posting disinformation. Again, before the internet, generating fake news would lead to legal action and result in your broadcast license being revoked. Today, fake news spreads faster than real news and receives more clicks, thus generating more revenue. Meanwhile, the purveyors of disinformation are rarely punished for what they’re doing. It’s too easy to muddy the waters by spamming social media, attacking legitimate experts, and calling them liars.

Don’t worry, this is not (yet another) “the internet ruined everything” article. The internet has simply shifted the incentive structures—in terms of the Evolution of Trust game, it changed the point values of cooperation and cheating. And now we have to find new institutions, policies, and cultural norms to shift the equilibrium back, before we spiral into a society full of distrust.

Research shows that social trust is deteriorating worldwide, and has been for a long time now. People today have less trust in their governments, their media outlets, and random people on the street than ever before.

If our default strategy is “tit-for-tat,” and we are constantly exposed to the most untrustworthy actors in society via social media and the internet, then how is that going to affect our behavior going into the future?

Our social institutions rely on trust to function. A government can’t pass laws, much less enforce them, if the people don’t put their trust in the government as being legitimate. Media and educators cannot investigate and study issues if people don’t trust them to tell the truth. Businesses cannot sell products if people don’t trust them to behave ethically. Charities cannot provide much-needed aid if people don’t trust them to use their funds wisely.

When trust breaks down, everything breaks down. Research shows that low-trust societies struggle with economic growth, have public health and safety problems, and end up with less effective governments.1 Basically: things go to shit. And I fear our incentives have been nudged far enough in the wrong direction that we’re currently living through the resulting chain reaction of distrust.

So, what can we do?

Well, first, I can already see my inbox filling with angry readers who spend way too much time watching cable news, screaming, “Oh, so we should just blindly trust our leaders now? Is that what you’re saying?”

No, that’s not what I’m saying. There is a lot of corruption out there. And there are a lot of shady governments and businesses and media outlets.

I suppose what I’m asking for is a little bit less of a demand for perfection and a more robust understanding that, well, humans suck. Every organization fucks up. Every leader puts their foot in their mouth. I might loathe the president or my congress, but at the end of the day, I have to remind myself that I am cheering for their success, despite all of my antipathies. While I may not trust their actions, I still trust the institutions and, in the long run, those institutions’ ability to function. I’m still a member of this country and society and that counts for something. I think in the age of the clickbait article and the “gotcha” video clip, we’re all losing sight of that. We’re on the same team here.

We just seem to have forgotten.

Footnotes

  1. See this 2021 UN report, for example.