Smart people have identified many bugs in human software. The list of cognitive biases and fallacies is long and growing, much like my… aaanyway.
The classic example is the overestimation of the risk of shark attacks after watching Jaws. Every year, millions of people approach tepid coastal waters with apprehension. But the total annual number of shark attacks worldwide is around 80.
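To put that number in perspective, a back-of-the-envelope calculation helps. The annual count of ocean swimmers below is a loose assumption for illustration (the text only says "millions"), so read the result as an order of magnitude, not a statistic:

```python
# Back-of-the-envelope: how risky is a swim, really?
annual_shark_attacks = 80            # worldwide, per the figure above
annual_ocean_swimmers = 100_000_000  # assumption, not from the text

risk_per_swimmer = annual_shark_attacks / annual_ocean_swimmers
print(f"Rough odds of a shark attack: about 1 in {round(1 / risk_per_swimmer):,}")
# → Rough odds of a shark attack: about 1 in 1,250,000
```

Even if the swimmer estimate is off by an order of magnitude in either direction, the per-swim risk stays vanishingly small – which is exactly the kind of comparison the availability heuristic prevents people from making.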
There are many other examples of fiction oozing into perceptions of reality:
- The public debate about artificial intelligence is basically shaped by Terminator.
- Gene editing to prevent and cure horrible diseases is opposed on the grounds of the Nazi supersoldier programmes depicted in video games.
- Particle accelerators are opposed by lobotomized baboons who imagine they will create a black hole.
- GMOs have been proven safe, but are hysterically opposed by people who have mental images of being chased around by sabre-toothed tomato plants and growing extra limbs.
- Nuclear energy is the safest and cleanest energy source we have, but the incredibly infrequent incidents – and their cartoonish representations in the public imagination – are so graphic that people completely fail to consider statistics and tradeoffs.
- Neither last nor least, few people have an understanding of market economics (or “capitalism”) that extends beyond The Wolf of Wall Street / The Lego Movie / Michael Moore videos, which it would be too kind to treat even as works of fiction.
It might sound funny and cute, but the civilizational costs are immense. When people oppose things with real benefits because of imaginary or vastly overestimated risks, that’s a net ouch for humanity. And these fictions inform widespread moral intuitions about enormously important things like bioethics, robotics and physics research – basically anything people see in movies more often than in real life.
Which is almost everything. When evaluating new technologies, the only reference point people have is too often a work of fiction, or worse, a hostile, propagandistic misrepresentation meant to prejudice them against the technology.
Latent luddism, fear of the unknown and the naturalistic fallacy get amplified by works of fiction that play to them – and by activists who abuse them. Those activists often serve interests with a material stake in sabotaging promising new technologies, or at least in manufacturing a false equivalence where the facts are resoundingly and unequivocally clear. The anti-nuclear movement, for example, is and always has been the propaganda arm of fossil fuel producers (including nation states), and a case study in manipulating public opinion through scaremongering – Greenpeace in particular can go suck a fuel rod.
The problem is that our monkey brains do not easily draw the distinction between fact and fiction. Information is weighed not on quality, but on availability. The Hollywood fallacy is in fact a special case of the availability heuristic: whatever comes to mind more easily feels more true.
What comes to mind most easily? The more spectacular and memorable, the better! This is also why many people, influenced by negatively biased news reporting, believe that the world is going to hell in a handbasket. In fact, the world has not only never been better, it has never been improving at a faster rate than now. For more on this topic, read this and this, and of course Pinker’s fabulous Enlightenment Now.
Since repetition is persuasive, mulling and obsessing over a spectacular falsehood – such as the risk of a shark attack – is effectively self-inflicted brainwashing. Striking imagery encourages rumination, which feels like evidence. Whatever you may think of the relative artistic merits of The Quran, Terminator and Das Kapital, the mechanism is the same.
But it only works when people lack good information. The antidote, therefore, is good information – and, since you can’t be expected to know about everything, learning to notice the patterns and improve your heuristics.
This, unlike the accidental destruction of the universe by a metal tube under Switzerland, is possible.
If you want other people to be less wrong, you should buy me a coffee now, so I can enlighten them.