This excerpt is from 'Liars and Outliers: Enabling the Trust that Society Needs to Thrive,' by Bruce Schneier. For further publisher info, visit www.wiley.com/buy/9781118143308.
Chapter 4: A Social History of Trust
James Madison famously wrote: “If men were angels, no government would be necessary.” If men were angels, no security would be necessary. Door locks, razor wire, tall fences, and burglar alarms wouldn’t be necessary. Angels never go where they’re not supposed to go. Police forces wouldn’t be necessary. Armies? Countries of angels would be able to resolve their differences peacefully, and military expenses would be unnecessary.
Currency, that paper stuff that’s deliberately made hard to counterfeit, wouldn’t be necessary, as people could just write down how much money they had. Angels never cheat, so nothing more would be required. Every security measure that isn’t designed to be effective against accident, animals, forgetfulness, or legitimate differences between scrupulously honest angels could be dispensed with.
We wouldn’t need police, judges, courtrooms, jails, and probation officers. Disputes would still need resolving, but we could get rid of everything associated with investigating, prosecuting, and punishing crime. Fraud detection would be unnecessary: both the parts of our welfare and healthcare systems that make sure people benefit fairly from those services without abusing them, and all of the anti-shoplifting systems in retail stores.
Entire industries would be unnecessary, like private security guards, security cameras, locksmithing, burglar alarms, automobile anti-theft, computer security, corporate security, airport security, and so on. And those are just the obvious ones; financial auditing, document authentication, and many other things would also be unnecessary.
Not being angels is expensive.
We don’t pay a lot of these costs directly. The vast majority of them are hidden in the price of the things we buy. Groceries cost more because some people shoplift. Plane tickets cost more because some people try to blow planes up. Banks pay out lower interest rates because of fraud. Everything we do or buy costs more because some sort of security is required to deliver it.
Even greater are the non-monetary costs: less autonomy, reduced freedom, ceding of authority, lost privacy, and so on. These trade-offs are subjective, of course, and some people value them more than others. But it’s these costs that lead to social collapse if they get too high. Security isn’t just a tax on the honest, it’s a very expensive tax on the honest. If all men were angels, just think of the savings!
It wasn’t always like this. Security used to be cheap. Societal pressures used to be an incidental cost of society itself. Many of our societal pressures evolved far back in human prehistory, well before we had any societies larger than extended family groups. We touched on these mechanisms in the previous chapter: both the moral mechanisms in our brains that internally regulate our behavior, and the reputational mechanisms we all use to regulate each other’s behavior.
Morals and reputation comprise our prehistoric toolbox of societal pressures. They are informal, and operate at both conscious and subconscious levels in our brains: I refer to the pair of them, unenhanced by technology, as social pressures. They evolved together, and as such are closely related and intertwined in our brains and societies. From a biological or behaviorist perspective, there’s a reasonable argument that my distinction between moral and reputational systems is both arbitrary and illusory, and that differentiating the two doesn’t make much sense. But from our perspective of inducing trust, they are very different.
Despite the prevalence of war, violence, and general deceptiveness throughout human history -- and the enormous amount of damage wrought by defectors -- these ancient moral and reputational systems have worked amazingly well. Most of us try not to treat others unfairly, both because it makes us feel bad and because we know they’ll treat us badly in return. Most of us don’t steal, both because we feel guilty when we do and because there are consequences if we get caught. Most of us are trustworthy towards strangers -- within the realistic constraints of the society we live in -- because we recognize it’s in our long-term interest. And we trust strangers because we recognize it is in their interest to act trustworthily. We don’t want a reputation as an untrustworthy, or an untrusting, person....
In a primitive society, these social pressures are good enough. When you’re living in a small community, and objects are few and hard to make, it’s pretty easy to deal with the problem of theft. If Alice loses a bowl at the same time Bob shows up with an identical bowl, everyone in the community knows that Bob stole it from Alice and can then punish Bob. The problem is that these mechanisms don’t scale. As communities grow larger, as they get more complex, as social ties weaken and anonymity proliferates, this system of theft prevention -- morals keeping most people honest, and informal detection, followed by punishment, leading to deterrence to keep the rest honest -- starts to fail.