As I see it, there are three main causes of our struggle to see the truth on any particular topic:
1. Mimicry: when our in-group promotes falsity that we copy
2. Incentives: when we predict that knowing the truth would feel bad or harm our objectives
3. Complexity: when the truth is hard to figure out
Examples:
1. Mimicry
• Some are Christians because all their friends and family are, too; some are atheists for the same reason.
• Some think that it makes sense to circumcise baby boys because the people they know think it’s healthy and normal; others think it’s bizarre because the people they know think foreskin is healthy and normal.
• Some believe it would harm Black Americans to defund police because their friends say so; others think it would help Black Americans because their friends say so.
2. Incentives
• If you make more money believing X, it’s going to be harder to stop believing it.
• If the idea of permanent death is terrifying to you, it’s going to be harder to stop believing in reincarnation.
• If it would make you feel really bad to find out you were wrong about something you posted online, your immediate reaction may be to deny being wrong (to others and to yourself) to shield yourself from the negative feelings.
Note that mimicry and incentives can blend together. Sometimes we mimic to fit in or to avoid being socially punished. But mimicry is even more basic than that: we seem to have a strong, built-in tendency to copy others. If everyone around us believes something, we will usually come to believe it too, without questioning whether it might be false or even being aware that we copied the belief from others.
If we see everyone else behaving in a certain way, we'll probably behave that way too, unless we have strong reasons not to. This appears to be an evolutionary survival mechanism: it's risky to (for example) eat plants that are different from the ones your tribe eats (they might be poisonous) or to avoid the behaviors everyone else engages in (those behaviors might be key to survival in some hidden way). In the wilderness, you can't figure out how to survive from first principles (chances are you'd die far too fast for that); you need to mimic what has worked for centuries. Some of that inherited behavior will be key to survival and some will be pointless, but evolutionary pressures will have weeded out most of the really harmful practices and hung on to the most useful ones.
3. Complexity
• It’s really not obvious how to prevent future risks from advanced artificial intelligence (though it often seems obvious to folks who’ve spent almost no time thinking about it).
• How best to prevent economic crashes is a fundamentally complicated question.
• Nobody actually seems to know how to cure cancers in general.
A weird thing about these three causes of believing falsehoods is that we are usually unaware of their effects.
1. We don’t usually realize it when we believe something just because we copied our social group.
2. We don’t usually realize when we believe something just because it would hurt us not to believe it.
3. We don’t usually realize when our analysis of a complex issue is oversimplified and misses important considerations.
Believing in falsehoods feels just like believing the truth – until the moment we genuinely face up to the possibility of being wrong.
So how can we be right more often?
1. To combat mimicry: we can keep our identities smaller (or have more of them), be more willing to be viewed as having “weird” beliefs, join social groups that value diversity of thought, and learn to do less social mimicry (e.g., by being more skeptical of in-group consensus). We can recognize that every in-group gets some things wrong (including ours) and that it’s worth investigating where ours is wrong. In summary, we can combat mimicry with social resilience and skepticism.
2. To combat harmful incentives: we can recognize that, while there can be short-term pain in accepting the truth, being truth-seeking is usually the better long-term strategy (especially because you can’t just suddenly decide to be truth-seeking when it’s convenient; it’s best to have a habit of being truth-seeking all the time). We can run thought experiments like “If X were true, would I rather believe it or be wrong about it?” We can also leave “lines of retreat” by deciding in advance what we’d do and how we’d move forward if we turn out to be wrong about something important. In summary, we can combat bad incentives with a Scout Mindset and a focus on seeking the truth.
3. To tackle complexity: we can use probabilistic thinking, consider multiple hypotheses, and weigh the evidence for and against each one. We can train ourselves in evaluating evidence and arguments, practice steel-manning opposing arguments, talk to smart and knowledgeable people who hold different views, and fight back against overconfidence (e.g., through calibration practice). We can also do large amounts of research when it’s important to be right. In summary, we can combat complexity with good epistemic hygiene, honed thinking skills, and self-skepticism.
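For readers who like a concrete picture, here is a minimal sketch (not from the original piece, with made-up hypotheses and numbers) of what “considering multiple hypotheses and the evidence for and against each one” can look like when made explicit: a simple Bayesian update in Python.

```python
# A minimal, illustrative sketch of probabilistic thinking across competing
# hypotheses: hold several explanations at once and let evidence shift the
# weights, rather than committing to a single answer up front.
# The hypotheses, priors, and likelihoods below are invented for illustration.

def bayes_update(priors, likelihoods):
    """Return posterior probabilities given priors and P(evidence | hypothesis)."""
    unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Rough prior beliefs in three competing explanations for some observation.
priors = {"hypothesis A": 0.5, "hypothesis B": 0.3, "hypothesis C": 0.2}

# How strongly a new piece of evidence is expected under each hypothesis.
likelihoods = {"hypothesis A": 0.1, "hypothesis B": 0.6, "hypothesis C": 0.3}

posteriors = bayes_update(priors, likelihoods)
for h, p in posteriors.items():
    print(f"{h}: {p:.2f}")
# The arithmetic isn't the point; the habit of updating weights on evidence is.
```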
This piece was first written on June 3, 2021, and first appeared on this site on November 11, 2022.