Methodologies & Techniques

Bias in the Spotlight: availability bias

By Crawford Hollingworth and Liz Barker

The availability bias refers to how we often judge the likelihood of an event, or the frequency of its occurrence, by the ease with which examples and instances come to mind.

But why do people do this?

More emotionally impactful, disastrous or memorable events are likely to be considered more probable. Things that are commonly or vividly shown in the media, such as violent murders or a disaster like 9/11, are much easier to remember than less noteworthy or salient news. A minor accident in the workplace, or a statistic, is not as memorable as a tornado.

Because newsworthy events can be readily recalled from memory, the availability bias leads people to rate them as more likely to happen than they are in reality. Vividly memorable (and widely reported) events like homicides, airplane disasters and shark attacks are therefore considered much more common than they really are, simply because they are reported far more often than less sensational causes of injury or death.

A classic example of this phenomenon played out before Hurricane Irma hit Florida in September 2017, when local mayor Rick Kriseman noted:

“Having seen what happened with [Hurricane] Harvey,” residents were “really ready to believe that this could happen here … I hate to say it worked to our benefit, but I think it did.”

Harvey had hit Texas hard just a few weeks earlier.

In contrast, there are many other things which appear harmless, yet on investigation turn out to pose a bigger threat than we might think.

  • For example, around 10,000 people a year in the UK are injured in their homes in incidents involving socks and tights – seemingly innocuous items. However, incidents of electrocution – something people view as a significant danger – number only around 3,000 each year.
  • It also turns out that cheerleading is a far more dangerous sport than the archetypal ‘danger’ sports of skiing and rugby: 66% of “catastrophic” sporting injuries suffered by women in the US (those resulting in permanent disability or chronic medical conditions) are caused by cheerleading, and in universities the figure is even higher, at 70%.

Daniel Kahneman neatly illustrates availability bias in his bestselling book on behavioural science, ‘Thinking, Fast and Slow’:

“Because of the coincidence of two planes crashing last month, she now prefers to take the train. That’s silly. The risk hasn’t really changed; it’s an availability bias.”

As Kahneman says, if we hope to avoid availability bias we must make decisions based on statistics or factual evidence. The extent to which estimates of causes of death can be warped by availability bias and media coverage can be seen in a study by Paul Slovic and his colleagues, which found that people judge death by car accident to be more than 300 times more likely than death by diabetes. In reality, at the time of the study, it was only around four times more likely.

So what does this all mean?

So, before we make a decision or judgement based on the horrors that come easily to mind, we could try googling the real facts first. It might help to make us a little more rational. In research, it can often be useful to understand how availability bias might be preventing uptake of a behaviour, or encouraging it, because of what people believe is most or least likely to happen.

Next in the series…

Every three weeks The Behavioural Architects will put another cognitive bias or behavioural economics concept under the spotlight. Our next article features inattentional blindness.

By Crawford Hollingworth and Liz Barker, The Behavioural Architects

Crawford Hollingworth is co-Founder of The Behavioural Architects – an award-winning global insight, research and consultancy business with behavioural science at its core, which he launched in 2011 with co-Founders Sian Davies and Sarah Davies.

Liz Barker is Global Head of BE Intelligence & Networks at The Behavioural Architects.

www.thebearchitects.com

@thebearchitects

PREVIOUS ARTICLES IN THE SERIES:

System 1 & 2

Heuristics

Optimism Bias
