Human Risk Newsletter

Notre-Dame Edition

Estimated Reading Time: 6 mins
Welcome back to the second Human Risk Newsletter featuring the finest in lovingly curated Behavioural Science (BeSci) inspired content. 

If you’re a recent subscriber or didn't get a chance to read the Launch Edition, then I recommend you catch up here before reading on.

Been forwarded this email and want to subscribe yourself? 
Just click the button below:
Subscribe
Coming up in this edition
 
1. What the Notre-Dame cathedral fire can teach us about "Human Risk in action";

2. In the Bi-weekly Cognitive Bias, I highlight something that makes us susceptible to believing conspiracy theories; 

3. A BeSci Intervention from Le Monde newspaper that is a prime example of sludge;

4. In Something that made me think, I explore when we’re likely to overestimate our abilities and when we’re likely to underestimate them; and

5. The Something for the weekend recommendation is a podcast that helps Californians prepare for an earthquake. It's more relevant to you than you might think.

Human Risk in action

The sad news of the fire at Notre-Dame cathedral in Paris provides some interesting insights into how we react to losses and the role that culture and heritage play in our understanding of the world.

It’s too early to determine what caused the Notre-Dame fire, though Human Risk is highly likely to have played a part. 

What we do know from the International Association of Fire and Rescue Services report is that early fire warnings were ignored, costing valuable time that might have saved more of the cathedral. "Total devastation", it stated, "was very close at hand". As research entitled "Why Building occupants ignore fire alarms" explores, not responding to alarms until we see other cues is more common than we might like to believe. 


Perhaps more surprising was the extent of the reaction to the incident, which one psychologist explained as follows:

"Confronted with material loss and the fragility of a building associated with transcendence and eternity, people have responded with tears, rage, attempts at analysis, confusion and an outpouring of support". 

All of which helps explain why a quest to find the two individuals in this photo, taken just before the fire broke out, went viral:

Source: @BrookeAWindsor on Twitter
 
As donations poured in for the restoration fund, particularly from France's wealthiest individuals, journalists asked questions about why people seemed to care more about a building than about local humanitarian causes. 

I'll let an art historian offer a potential explanation:


When a place is no more—or, as we've seen, chunks of that physical site have been destroyed—it becomes a kind of anthropomorphic feeling, as though we lost that part of ourselves: a loss of body parts, to put it bluntly... A loss of something physical relates to how we understand the world.

That connection with our understanding of the world may explain why many people found solace in literature: Livres Hebdo (literally "Books Weekly") reported that just 24 hours after the fire, the top 18 best-selling books on Amazon's French website were all related to the cathedral. Top of the list was Victor Hugo's Notre-Dame De Paris (in English, The Hunchback of Notre-Dame), featuring the world's most famous bellringer.

The concept of posthumous posterity (nicely explored in this article) is more commonly associated with artists and performers. Though that's arguably more logical: when artists die, the supply of their material runs out, and a human being is no longer with us. The cathedral will be rebuilt.   

But we know that no matter how faithfully the restorers do so, it won't be the same Notre-Dame it was before. The emotional response we're seeing illustrates something called loss aversion: the concept that losses impact us much more profoundly than gains do.

Readers of a certain age will probably recognise this sentiment in Joni Mitchell's song Big Yellow Taxi:

"Don't it always seem to go
That you don't know what you've got
'Till it's gone".
Back to the top

Bi-Weekly Cognitive Bias

The Proportionality Bias is the tendency to believe that causes are proportional in magnitude to their effects.  It not only makes us more susceptible to believing conspiracy theories but can also hinder us in managing risk.

To navigate the complex world in which we live, our brains like to create a sense of order.  When things happen, we want to understand what caused them.

As Prof Rob Brotherton, author of Suspicious Minds: Why We Believe Conspiracy Theories, explains: "Our desperate, deep-rooted desire to explain the inexplicable can lead us up garden paths, and down dark alleys."

Drawing on a study that showed that gamblers throw dice gently when aiming for low numbers, but forcefully when aiming for a double six, he highlights something called the Proportionality Bias: a desire that drives us to "want the magnitude of the event to match the magnitude of whatever caused it".

Source: Sketchplanations.com

Proportionality Bias helps to explain why we are all, to some extent, susceptible to conspiracy theories.  When things don't turn out the way we expect them to, it is more comforting and comprehensible if the explanation is proportionate to the outcome. 

The idea that President John F. Kennedy was the victim of a lone gunman is intuitively less plausible than the theory that his assassination was part of a much larger plan. 


Proportionality Bias also has implications for risk management, as it can lead us to miscalibrate the significance of particular elements. It is often only with hindsight that the importance of small details is fully appreciated. For example, the 1986 Challenger Space Shuttle disaster was later traced to the failure of an O-ring: a very minor, yet critical, part of the shuttle.

For a weekly dose of thought-provoking sketches like the one I've featured above, visit the excellent Sketchplanations website and subscribe to that newsletter.

Back to the top

A BeSci Intervention

A digital newspaper subscription that’s unnecessarily hard to cancel is an excellent example of “Sludge”: a behavioural intervention designed to work against the interests of its target audience.

Usually in this section, I feature BeSci interventions I like. This time, I'm showcasing one I really don’t.

Most readers will be familiar with subscriptions that are easy to sign up to and difficult to cancel, a tactic media companies, in particular, like to deploy.

Photo by Elijah O'Donnell on Unsplash
The most commonly deployed tactic is to force subscribers to make a phone call to cancel, thereby giving the publisher a final chance to dissuade them from doing so. However, the French newspaper Le Monde has adopted an even more bureaucratic approach. 

To cancel one of their digital subscriptions, you have to fill in the following form and submit it by registered post:
Source: LeMonde

You don’t have to read French to know that they’re asking for a lot of unnecessary, duplicative information: why is it relevant to cancellation which subscription plan you're on, or how much you've paid for it?  

I'm sure Le Monde would claim that this is all necessary to ensure no mistakes are made and to meet legal requirements. But you don't need to be a BeSci expert to know what they’re up to here: making it harder for people to cancel. 

As one commentator wryly pointed out, making it hard to cancel might explain the growth in their online subscriber base:

There’s no reasonable justification for this process. After all, if you can activate a subscription digitally, then it’s possible to deactivate it by the same mechanism. 

Proving precisely that point is the unsubscribe process for digital subscribers to Germany's Frankfurter Allgemeine Zeitung:
Here you can simply cancel your subscription online at any time.  Just the way it should be.

Next time, we'll return to more positive BeSci interventions, but in the meantime, do share examples of Behavioural Sludge you've come across.
Back to the top

Something that made me think

One thing BeSci teaches us is that we’re often not very good at assessing our own abilities.  Knowing when we're likely to get it wrong is valuable in mitigating Human Risk. 

Social scientists and viewers of TV “talent” shows have long known that we can be prone to something called “illusory superiority”. In other words, we overestimate our abilities at a particular task.

However, recent research has also discovered that there are situations in which the opposite happens and we underestimate our abilities.  

A fascinating New York Times article, You Are Not as Good at Kissing as You Think. But You Are Better at Dancing, summarised some of the things researchers discovered:

Source: New York Times

Fortunately, they didn't just stop there. After all, reciting the alphabet backwards or getting an animal to like us are probably not skills most of us will need to deploy that often.


The researchers were able to identify four dynamics that determine when we tend to over- or underestimate our abilities:

  1. We tend to be overconfident when the skill reflects on our personality or character.  So in matters of ethics or reliability, we’ll assume we’re better than others;
     
  2. If the skill in question is a matter of opinion rather than measurable fact (say how funny we are) then we’ll tend to rate ourselves more highly;
     
  3. We’ll tend to be overconfident on easy tasks, and underconfident on more challenging ones; and
     
  4. We’ll consider our own previous experience of undertaking a task.  Things we’ve done more often, we’ll assume we’re more skilled at than tasks we have rarely or never done.

All of which is potentially worrying from a Human Risk perspective as none of these factors is a consistently reliable predictor of actual skill.

Back to the top

Something for the weekend

Even readers who don't live in a region at risk of earthquakes will learn something from The Big One, a podcast aimed at helping those who do to prepare for the worst.

The Big One is an NPR podcast focussed on helping the people of Los Angeles prepare for an earthquake. The planning isn't for a merely potential risk; it's an absolute certainty that one will hit the area at some point.

To quote one leading earthquake scientist: Southern California's section of the San Andreas fault is "locked, loaded and ready to roll".

The Big One is a thought-provoking combination of dramatisation and investigative reporting.

As well as being a useful "how to" guide, it explores how earthquake-ready LA and its population are.  Spoiler alert: they are, and they aren't!

Listening to it made me think about which aspects of my own life I need to make more operationally resilient.

You can subscribe to The Big One wherever you get your podcasts. When you do, listen out for the messages from sponsors. There's one with a fascinating business model targeting "preppers": people who actively prepare for emergencies.
Back to the top

Until the next time...

That's it for another edition.  Thanks for subscribing and reading my BeSci labour of love.

If you enjoyed it, then do let me know using the links below. If you didn't then let me know why. Feedback of whatever kind is hugely appreciated.

Above all, please share it with friends and colleagues you think might be interested.

Christian

Subscribe
Back to the top
Feedback
www.human-risk.com
@humanriskblog
Facebook
Copyright © 2019 Human Risk, All rights reserved.




Want to change how you receive these emails?
You can
update your preferences or unsubscribe from this list.


 






This email was sent to <<Email Address>>
why did I get this?    unsubscribe from this list    update subscription preferences
Human Risk · Sutherland Street · London, SW1V 4LA · United Kingdom

Email Marketing Powered by Mailchimp