Resilience > defence

Emergency response by Jon Tandy

There is a great conversation between Sam Harris and Juliette Kayyem about what makes us safer.

Kayyem is a homeland security and counter-terrorism expert who suggests that one of the most important things for us to develop as societies is resilience. She says that defending against specific terrorist threats is a fool’s errand and that we, as a society, need to accept that living in an open, democratic and free way will carry a certain level of risk.

Shit happens; you can’t stop it from happening, but you can improve how you respond when it does. She says that if you spend money defending airports and the attack happens somewhere else, or the attack is a hurricane and not a bomb, then you’ve wasted your money. This is where resilience comes in.

Kayyem argues that in addition to accepting that risk is part of life, we should stop trying to defend against specific, unlikely threats and instead improve our response to more general ones. We can do this by funding things like emergency response; things which work well when shit happens. Things which help us respond quickly and bounce back no matter what flavour the shit is. This is something the excellent Bruce Schneier has also been saying for a long time.

This works at a societal level, but it’s also super-useful at a personal level. Shit happens, but how’s your emergency response? Can you stay calm, together, focussed and flexible when the shit hits the fan? What’s your strategy for dealing with different flavours of shit?

We can spend a lot of time defending against specific threats in our own lives (and sometimes that’s fully warranted), but it’s important to maintain a broad emergency response as well – anything which serves us well in the face of adversity no matter how it manifests.

We need to develop our own, personal resilience.

Thinking about belief

Our Milky Way above a dead forest, aka 'Woodhenge', on the banks of the river Maas near the Dutch-Belgian border.

I’ve been thinking about belief, about how it works (and doesn’t). I’ve also been thinking about why I don’t think about it more. Why we all, collectively, don’t think about it more.

One of the things that is unique to us as humans is our ability to engage in metacognition – to step outside ourselves and look at how we think. But thinking about thinking and understanding how our beliefs work seems almost as unpalatable as updating our beliefs when new information comes along.

Two things set me down this path:

– An episode of the You Are Not So Smart podcast on Bayes’ Theorem

– Sam Harris reading from his book, The End of Faith

Both are about the nature of belief but come at it from completely different sides.

You Are Not So Smart looks at belief as a greyscale where things are neither true nor untrue, but have a probability of being either. Once we accept this, we can use Bayes’ Theorem to quantify the probability around our beliefs and update them when new information comes to hand.
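
To make that concrete, here’s a minimal sketch of a single Bayesian update in Python. The scenario and all the numbers are invented purely for illustration: a prior belief, a piece of evidence, and a posterior that reflects both.

```python
# A single Bayesian update: P(H|E) = P(E|H) * P(H) / P(E)
# The scenario and numbers are made up for illustration only.

def bayes_update(prior, likelihood, false_positive_rate):
    """Return P(hypothesis | evidence).

    prior               -- P(H): how strongly we believed H beforehand
    likelihood          -- P(E|H): chance of seeing the evidence if H is true
    false_positive_rate -- P(E|not H): chance of seeing it anyway
    """
    # Total probability of the evidence, P(E), via the law of total probability
    p_evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / p_evidence

# Belief: "my train is delayed" (prior of 10%).
# Evidence: the platform is crowded, which happens 80% of the time
# when there's a delay, but 30% of the time regardless.
posterior = bayes_update(prior=0.10, likelihood=0.80, false_positive_rate=0.30)
print(f"Updated belief: {posterior:.0%}")  # ~23% – stronger, but far from certain
```

The point isn’t the arithmetic; it’s that the evidence shifts the belief by a measured amount rather than flipping it from “false” to “true”.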

Sam Harris on the other hand is taking a logical sledgehammer to some of the most cherished beliefs we have – those around religion. He shines a light on the dangers inherent in taking our beliefs from ancient texts which don’t stand up to any kind of modern scrutiny and the importance of reforming those beliefs as quickly as possible.

Even if you don’t agree with Harris on religion, there are interesting thought experiments and ideas to take away.

Both are broadly concerned with the mechanisms which can either reinforce or erode our beliefs and the value or otherwise in doing so.

I’m still doing the work required to have an opinion on all of this, but if you’re interested in understanding and challenging what you believe about what you believe – these are two good places to start.

Is that smoke really the cause of your problem?

5029 Nunney Castle in sunny Shropshire

I was listening to a great conversation between Sam Harris and David Chalmers on the nature of consciousness, when they raised the notion of the epiphenomenon.

An epiphenomenon is something which happens (a phenomenon) alongside, or at the same time as, something else you’re observing (the primary phenomenon).

They gave a great example of the smoke that rises out of the top of an old steam locomotive as it moves. You might notice that when it’s still there’s no smoke, but when it moves (especially quickly), a lot of smoke appears.

If you didn’t know how a train worked, you might then infer that the smoke coming out of the top of the locomotive is what makes it move.

But we know that’s not right.

The smoke is a byproduct of the fire which boils the water for steam which in turn produces the movement of the train. The smoke is an epiphenomenon.

It just happens that there’s smoke when the train is moving, but that doesn’t mean that the smoke causes the movement of the train.

It made me then wonder: in how many other areas of our life are we looking at the smoke and thinking it’s making the train move?

How often are we looking at the epiphenomenon and confusing it with the cause of the primary phenomenon?

This awareness won’t always stop us from making the mistake of confusing the two, but at least it gives us a framework for asking “Is this the cause of the problem, or is it just smoke?”

Changing your mind about changing your mind

I’m a big fan of changing my mind – of updating my opinion/position as new information becomes available.

It’s not an easy process, personally or socially, as changing your mind can be an incredibly humbling experience. But it’s something that we all have to get better at. Given the polarisation of opinion on hot-button issues, at least some people are going to have to change their minds if we hope to reach any kind of consensus.

In that respect, the ability to change our minds, especially about emotionally-charged situations is fundamental to resolving some of the world’s stickiest issues.

The problem we have is that changing our mind is often perceived, both by ourselves and by others, as a sign of weakness, when it should be our greatest strength. Politicians are lambasted for “flip-flopping” on issues, and even we mortals can get a roasting, especially from those whose position we abandon.

I (currently) believe that the ability and will to honestly and publicly change our position on key issues is a super-power which we should cultivate as broadly as possible.

There are few who willingly undertake this process publicly, Sam Harris being one of the notable exceptions. Regardless of what you think of Sam’s ideas or politics, you’ve got to admire the way he will wade into a discussion, prepared to update his position, and to openly acknowledge that he has done so. It’s an admirable approach, but you can see that it is costly for him (although it’s admittedly difficult to discern how much flak he gets from changing his position on inflammatory issues as opposed to being vocal on sensitive issues in the first place).

This is a rather long way of saying that I really enjoyed the latest episode of The Knowledge Project podcast which spends quite a bit of time focussing on the benefits and mechanics of changing our mind.

The Knowledge Project is made by Shane Parrish, who runs the excellent blog Farnam Street. Each episode is a deep dive into the world of an expert whose work touches one of Shane’s areas of interest. The guest for this episode was Julia Galef, who hosts her own podcast (Rationally Speaking) and is a co-founder of the Center for Applied Rationality.

This episode is full of cool insights about influence and persuasion and the ethics of both, but the best bit (at least from my point of view) was the brief section where Julia and Shane discuss actual tactics for being open to having your own mind changed.

It sounds like something that we should be able to do as humans without tips and tricks from experts, but we’re so hard-wired to hang on to our beliefs and reject conflicting information that sometimes we need a helping hand.

One of these tactics is to have a Trigger Action Plan. This kind of plan is a deliberate commitment to take a specific course of action in a situation where you would otherwise have a more automatic response.

An example of this might be changing your automatic response upon hearing some information which seriously conflicts with your current world view (e.g. the sky is made of marmalade). Instead of automatically dismissing the information out of hand or seeking out answers which support your view, you deliberately seek out information which supports the conflicting viewpoint. In this way, you get a better understanding of the arguments and evidence supporting the claim and can make a better assessment of whether it is valid.

Another useful tactic is specifically geared towards hearing information from people we don’t like. In this case, our feelings for the person will prejudice our perception of the information, usually for the worse. The intervention here is to imagine that we are receiving the same information from someone we like and/or respect, with a view to noting how much of our resistance is due to our personal feelings.

This second tactic seemed particularly useful in situations where you genuinely want to make progress with difficult people, a situation which I think we all find ourselves in.

There is a lot more in the episode which is worth checking out, but at the very least, I hope I’ve given you some food for thought on the value of having a more open mind about changing your mind.