The Scout Mindset
by Julia Galef
Holding your beliefs as part of your identity (i.e. the Soldier Mindset) can lead to poor thinking, but it is appealing because the rewards are more immediate. The Scout Mindset puts intellectual honesty at the core of identity, and then builds beliefs from that foundation. It is self-reinforcing. The aim is to incrementally improve thinking habits.
The Scout Mindset is not necessarily always the right tool, but we have a natural bias towards the Soldier Mindset. See also Box Error.
We talk about our beliefs as if they’re military positions, or even fortresses, built to resist attack. Beliefs can be deep-rooted, well-grounded, built on fact, and backed up by arguments. They rest on solid foundations. We might hold a firm conviction or a strong opinion, be secure in our beliefs or have unshakeable faith in something.
In scout mindset, there’s no such thing as a “threat” to your beliefs. If you find out you were wrong about something, great—you’ve improved your map, and that can only help you.
Instead of dismissing observations that contradict your theories, get curious about them. Instead of writing people off as irrational when they don’t behave the way you think they should, ask yourself why their behavior might be rational. Instead of trying to fit confusing observations into your preexisting theories, treat them as clues to a new theory.
The benefit is in the habits and skills you’re reinforcing. Even when you’re reasoning about something like foreign politics that doesn’t impact your life directly, the way you think still impacts you indirectly, because you’re reinforcing general habits of thought.
We overestimate social costs
People withhold key information from their doctor because avoiding judgement feels more important than receiving the correct treatment. This is part of why the Soldier Mindset is attractive: maintaining identity matters more than being correct. We choose beliefs because they are easy, safe, or defensible.
In reality, other people aren’t thinking about you nearly as much as you intuitively think they are, and their opinions of you don’t have nearly as much of an impact on your life as it feels like they do.
We choose beliefs in a similar way. Psychologists call it impression management, and evolutionary psychologists call it signalling: When considering a claim, we implicitly ask ourselves, “What kind of person would believe a claim like this, and is that how I want other people to see me?”
The principles you’re inclined to invoke and the objections that spring to mind depend on your motives: the motive to defend your image or your in-group’s status; the motive to advocate for a self-serving policy; fear of change or rejection.
Counterfactual tests
- Double standard test - “Am I judging other people’s behavior by a standard I wouldn’t apply to myself?”
- Outsider test - Imagine someone else stepped into your shoes—what do you expect they would do in your situation?
- Conformity test - Imagine someone whose opinion you respect told you that they no longer held this view. Would you still hold it?
- Selective skeptic test - Imagine this evidence supported the other side. How credible would you find it then?
- Status quo bias test - Imagine your current situation was no longer the status quo. Would you then actively choose it?
- Opposite bet test - What would I wager if I were betting on the opposite outcome?
Social vs epistemic confidence
See “We overestimate social costs” above: understanding this can build genuine social confidence, instead of faking or exaggerating epistemic confidence.
Expressing 100% belief in an outcome (extreme epistemic confidence) is not necessary to influence other people, and you probably don’t want to be influencing the people who respond only to that kind of certainty.
Yet [Benjamin] Franklin paired his abundance of social confidence with an intentional lack of epistemic confidence. It was a practice he had started when he was young, after noticing that people were more likely to reject his arguments when he used firm language like certainly and undoubtedly.
Showing that you’re well-informed and well-prepared on a given topic doesn’t require you to overstate how much certainty is possible on that topic.