Corona Bias

John Lubbock
Oct 26, 2020 · 10 min read

Wikipedia’s complete (as of 2016) list of cognitive biases, arranged and designed by John Manoogian III (jm3). Categories and descriptions originally by Buster Benson. Wikimedia Commons CC BY-SA 4.0

It dawned on me some time ago that humans, far from being the rational actors predicted in some economic models, are remarkably stupid and inconsistent in their behaviour. I started to get interested in cognitive biases, and in the research into how our brains work, which can show when, and to some extent why, they behave so oddly. The most famous popular summary of this work is Daniel Kahneman’s Thinking, Fast and Slow, which is not a book I would necessarily tell you to read for a laugh. If you’re fascinated by humans and their stupid brains, however, it’s worth a read.

Finishing the book at the start of the COVID-19 pandemic, I started trying to summarise what I had learned from it and to apply it to the mistakes I was seeing governments make. I’ve reanimated this text, which I wrote at the beginning of the year, because I’m starting to teach a module in Digital Media at the University of Westminster. Part of the course covers how to use Medium, so I thought I would post something again.

Irrational actors

The rational actor model in economics is hopelessly outdated, but it is an enduring feature of economic and political discourse, perhaps because it’s so central to conservative beliefs about how markets lead to the best allocation of resources when everybody is competing against each other to maximise their own utility. Utility itself is a problematic concept, as Kahneman also explains, because what brings people satisfaction is not necessarily economically useful to them. ‘Faith in human rationality is closely linked to an ideology in which it is unnecessary and even immoral to protect people against their choices’, Kahneman says.

Faith in human rationality is itself a simplified model for how people behave, and when our brains construct simplified models of the world in order to understand it, we end up with cognitive biases produced by the assumptions of our mental models, or heuristics.

Kahneman posits a heuristic of his own, the division between System 1 and System 2, to describe the division of labour in the brain that produces cognitive biases. Your System 1 creates instant judgements, looks for associations, and performs many other automatic tasks you don’t think about. System 2 is associated with intentional focus. When a question is too complex for an automatic answer from System 1, System 2 swings into gear. Kahneman says of System 1 that,

‘System 1 is designed to jump to conclusions from little evidence — and it is not designed to know the size of its jumps’, and that, ‘The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable and coherent than it really is.’ This becomes a problem when crisis strikes, and the world gets more complex, and less coherent than we want it to be.

Kahneman again: ‘The attentive System 2 is who we think we are. System 2 articulates judgements and makes choices, but it often endorses or rationalizes ideas and feelings that were generated by System 1.’

When all you have is a hammer

Human brains are fundamentally lazy. Even System 2, which is supposed to check our impulses and automatic reactions, often simply endorses them. Our brains are associative, more easily recalling information we have been exposed to recently, and it is to that information that System 2 turns when trying to explain unusual things. The ‘God of the Gaps’, where deities are invoked to explain things humans cannot, is a monument to confirmation bias. Don’t know why rainbows happen? Must be God. Big flood? God. Plague? God. And so on.

This is a kind of availability heuristic, ‘a mental shortcut that relies on immediate examples that come to a given person’s mind when evaluating a specific topic’, according to Wikipedia.

The reaction of religious leaders to Coronavirus still treads this reliably anachronistic path. The head of Turkey’s Religious Affairs body, Diyanet, blamed Coronavirus on the LGBT community. This kind of confirmation bias has a lot to answer for in the misinformation currently swirling around the internet about Coronavirus. Human brains are biased towards looking for active agency as the cause of things, and they don’t deal well with catastrophes that have a natural cause rather than a human one. So it makes sense that there’s a lot of appetite for the belief that Coronavirus was created by humans.

People protesting against the government lockdown in London.

This is a kind of causality bias. Kahneman suggests that looking for causality could have had evolutionary advantages, but it tends to make us look for agency in random patterns. This also explains ascribing natural disasters to a deity, and the utterly mad set of conspiracy theories about the virus being a biological weapon, or cover for Donald Trump to liberate America by arresting all his enemies (see the crazy world of QAnon believers). A preference for causality means that many governments do not prepare as well for random natural disasters as for emergencies caused by people, like terrorism.

(Since writing this, the QAnon conspiracy theory has taken off in a huge way, and there is probably a whole book to be written about how cognitive biases and various social factors play into a propensity to believe such wild theories. I hope one of the guys from the great QAnonAnonymous podcast writes this book, or else it will be left to academic researchers writing paywalled studies that nobody will read. The recent episode is particularly good, but if you’re not up to date with the whole strange phenomenon of this collaborative fictional conspiracy world, you’ll want to start at the beginning.)

As this MIT management blog points out, there’s also political bias involved here. If you dislike Trump or Johnson already, you’re more likely to blame them personally for mishandling the crisis than if you support those leaders. In fact, according to Kahneman, “The operations of associative memory contribute to a general confirmation bias… A deliberate search for confirming evidence, known as positive test strategy, is also how System 2 tests a hypothesis.”

Too much information and wishful thinking

The human brain constructs narratives about the facts it encounters using associative memory, inferring causation, creating coherence, and simplifying the world using substitution and mental heuristics or models (even the System 1 and System 2 dichotomy is a heuristic, rather than some real structure of the brain). There is just too much information going on for the limited capacity of our brains to cope with, and so we create simplified models which we hope will enable us to understand what we see.

Humans are also insensitive to things with small probabilities. Because most people never experience rare events, those events often receive less attention than their probabilities deserve. The public’s lack of attention to climate change is a good example of this. Kahneman says there is “a basic limitation in the ability of our mind to deal with small risks: we either ignore them altogether or give them far too much weight”. In the case of a global pandemic, it seems the UK was woefully underprepared and ignored the possibility entirely.
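
To put a number on ‘far too much weight’, here is a minimal sketch of the probability weighting function from Tversky and Kahneman’s cumulative prospect theory. The function and the parameter γ ≈ 0.61 come from their 1992 paper; the worked example itself is my illustration, not something from this article or the book.

```python
# Probability weighting function from cumulative prospect theory
# (Tversky & Kahneman, 1992): w(p) = p^g / (p^g + (1-p)^g)^(1/g).
# gamma = 0.61 is their published estimate for gains.

def weight(p: float, gamma: float = 0.61) -> float:
    """Decision weight actually applied to a stated probability p."""
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.001, 0.01, 0.1, 0.5):
    print(f"stated p = {p:<5} -> decision weight ~ {weight(p):.3f}")

# A 1-in-1000 risk, once attended to, is weighted ~0.014 - more
# than ten times its true probability - while p = 0.5 is
# underweighted (~0.42). Risks that never come to mind at all get
# a weight of zero: "ignore them altogether or give them far too
# much weight".
```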

The models we make of the world in our heads often contain a lot of wishful thinking. We want to maintain the advantages we currently enjoy, and our brains are very loss averse. When presented with the choice of a gamble on a big win or a safe bet with a smaller reward, most humans choose the safe bet. However, when faced with the choice of a sure loss, or a gamble with a high probability of a bigger loss, most humans gamble on the small possibility of not losing anything.
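
To make that asymmetry concrete, here is a minimal sketch using the prospect-theory value function from Tversky and Kahneman (1992). The functional form and the parameters α ≈ 0.88 and λ ≈ 2.25 are theirs; the dollar figures are my hypothetical example, not taken from this article.

```python
# Prospect-theory value function (Tversky & Kahneman, 1992):
# concave for gains, convex and steeper (loss-averse) for losses.

ALPHA, LAM = 0.88, 2.25  # their published parameter estimates

def value(x: float) -> float:
    """Subjective value of an outcome x relative to the status quo."""
    return x**ALPHA if x >= 0 else -LAM * (-x) ** ALPHA

# Gains: a sure $900 vs a 90% chance of $1000.
print(value(900) > 0.9 * value(1000))    # True: take the safe bet

# Losses: a sure loss of $900 vs a 90% chance of losing $1000.
print(0.9 * value(-1000) > value(-900))  # True: take the gamble
```

The same curvature that makes a sure gain look better than a slightly larger gamble makes a sure loss look worse than gambling on avoiding it, which is the pattern the next section applies to the government’s choices.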

Loss Aversion

I think that this cognitive bias for loss aversion may explain some of the UK government’s poor response to the pandemic. Faced with a choice between losses (either shutting down the economy and creating a recession, or trying to keep the economy open and risking massive loss of life), humans tend to gamble. The government gambled on its herd immunity strategy for a while, until this was seen to be an obviously poor choice. For Kahneman, “Loss aversion is a powerful conservative force that favors minimal changes in the status quo in the lives of both institutions and individuals.” It seems clear to me that this contributed to the UK government’s initial reluctance to go into a full lockdown.

There may also be something of what Kahneman calls the Affect Heuristic in the wishful thinking of the government when facing losses. In this bias, “people make judgements and decisions by consulting their emotions: Do I like it? Do I hate it?” This gives a view of people as “guided by emotion rather than reason”. Did the government want to underwrite the losses of thousands of businesses across the UK? No, so they avoided it until it was necessary. “Reliance on the affect heuristic is common in politically charged arguments”, Kahneman says.

The UK government is presumably fully aware of how behavioural economics affects the population at large, given that it has its own Behavioural Insights Team, commonly known as the Nudge Unit, inspired by the book Nudge, co-written by Kahneman’s colleague Richard Thaler. Just the other week, government officials were worrying that their message to people to stay at home had been ‘too successful’, and government sources subsequently began briefing lobby journalists that the lockdown would be eased, a message the government now appears to have flip-flopped on again.

The illusion of skill

Having recently won a landslide General Election, some in government (*cough*, Dominic Cummings, *cough*) appear to believe that the public are an instrument on which they can play any tune they like. But they seem to have forgotten the underlying rule of behavioural economics, which is that people are not particularly rational or reliable, and Cummings is most certainly not the Beethoven of social engineering he might think he is. Political scientists, in any case, operate in what Kahneman calls a ‘zero-validity environment’, where there are so many variables and such irregular occurrences that experts are limited in their ability to forecast anything. Perhaps this is why we are so impressed by Dominic Cummings’ ability to win elections that we got Sherlock Holmes to play him in a Channel 4 drama.

Speaking of the illusion of skill, “Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it”, Kahneman says. It’s not hard to imagine that Boris Johnson’s lengthy holiday following his December election win, and his choice to skip five meetings of the government’s Cobra emergency committee in January/February could be put down to complacency, as well as Johnson’s long record of arrogantly bumbling through life rather than preparing for anything.

Kahneman says that “the most valuable contribution of the corrective procedures I propose is that they require you to think about how much you know.” The System 1 and System 2 processes in our brains attempt to construct narratives out of the information they can access, but that information is likely to be partial. For the brain, however, what you see is all there is (WYSIATI), which for Kahneman means that “paradoxically, it is easier to construct a story when you know little”. This can apply to conspiracist narratives, of course, but equally to the poor planning of some governments at the early stages of the Coronavirus pandemic.

Kahneman describes a planning fallacy, where forecasts are unrealistically close to best-case scenarios. Many organisations face the “challenge of controlling executives competing for resources to present overly optimistic plans.” Governments, as we have seen, are loss averse, and, wanting to believe that their skill and competence will lead to a good outcome, may fall into the trap of believing the most optimistic forecasts. Again, we can see this planning fallacy reflected in the reluctance of the US and UK governments to implement a full lockdown early in the crisis. The planning fallacy, Kahneman says, is “only one of the manifestations of a pervasive optimistic bias” which, “in terms of its consequences for decisions… may be the most significant of the cognitive biases.” Humans “focus on the causal role of skill and neglect the role of luck… We focus on what we know and neglect what we do not know”. For Kahneman, “Overconfidence is another manifestation of WYSIATI”, and in groups where only supporters of the group’s decisions have a voice, overconfidence leads to the suppression of doubt.

Look back in anger

On the other hand, we should be aware of a hindsight bias when criticising the early actions of these governments based on what we know now. “The worse the consequence, the greater the hindsight bias. In the case of a catastrophe, such as 9/11, we are especially ready to believe that the officials who failed to anticipate it were negligent or blind.”

If you’ve been appalled by the combination of stupidity, terrible planning, overconfidence and ignorance displayed not just by governments but by individuals of all kinds behaving irresponsibly in the face of the current pandemic, then you’re certainly not alone. Perhaps it shouldn’t come as a shock to anybody who’s been conscious for the past decade that governments are incompetent and callous, while many people are stupid and irresponsible, but I like to believe that my capacity to still even type *smdh* means that I retain more essential humanity than the average jaded cynic.

We are living in a world in which there are more people doing more things which produce more consequences and data about the world than ever before. I’ve written elsewhere about how humanity is not really well adapted to living in a world with billions of other human agents doing billions of unpredictable things daily. As well as being disappointed in ourselves and humanity generally, I hope that we can also be compassionate with each other and admit that we are all fallible, and that what we should be working on is trying to figure out how to be less stupid both as individuals and collectively. Understanding how our brains work, and how they are prone to errors, is an important part of this task.
