There is an important division between people who want to improve the world that few seem to be aware of. Inspired by Julia Galef’s new book (The Scout Mindset), I’ll call this division: Soldier Altruists vs. Scout Altruists.
1. Soldier Altruists think it’s obvious how to improve the world and that we just need to execute those obvious steps. They see the barriers to a better world as:
(i) not enough people taking action (e.g., due to ignorance, selfishness, or propaganda), and
(ii) bad groups blocking things (e.g., corrupt politicians or greedy corporations).
2. Scout Altruists think it’s hard to figure out how to improve the world – and most attempts either don’t work, only slightly help, or make things worse. They see the barriers to a better world as:
(i) not enough understanding of causal mechanisms (e.g., due to a lack of high-quality evidence, not enough attention to the evidence we do have, not enough careful reasoning, etc.), and
(ii) too much investment in bad solutions (e.g., due to people jumping to conclusions, doing what feels good emotionally rather than what is effective, ideological blind spots, inertia, etc.).
Soldier Altruists say we need to DO and FIGHT more. Scout Altruists say we need to THINK and TEST more. Soldier Altruists are more likely to think that if we could just get people to be less selfish and more motivated to act, we would make a lot of progress towards a better world. Scout Altruists are more likely to think that if we could just get people to pay more attention to evidence and to have more good-faith debates with strong norms around the quality of argumentation, we would make a lot more progress.
Soldier Altruists may think Scout Altruists are far too reluctant to act and are wasting their time on research and debate. Scout Altruists may think Soldier Altruists are far too confident in their conclusions and are wasting their effort pushing for changes that aren’t going to help much (and which, in some cases, might even make things worse). Of course, in reality, there is a continuum between these two positions. So, on a scale from 0 (Soldier Altruist) to 10 (Scout Altruist), where do you fall? I’m probably a 7 or 8.
As some commenters have pointed out, there is a relationship between this distinction and “Conflict Theory” vs. “Mistake Theory.” I think it is related – but also distinct in important ways. Conflict theory frames the world as a giant zero-sum struggle: groups fighting over fixed resources. Here, by contrast, we’re operating from a framework of altruism: “the world can be made a lot better – what’s the big barrier to that happening? Is it that we know what to do and we’re not doing it enough/with enough energy, or is it that we don’t really know what to do?”
Also, to clarify another important point brought up in the comments: I’m not asking, “would it be obvious how to improve the world if we had a magic wand that could change whatever we wanted?” – instead, the question is: “is it obvious what to do to improve the real world, given that we don’t have a magic wand?” Do we just need to put more money/time/effort/people into executing the current “obvious” strategies, because they will work well if we scale them up? Or is it pretty unclear what strategies we should even be putting more resources into (meaning that a lot of thinking, research, debate, and/or evidence evaluation will typically be necessary to even figure out what is worth scaling up)?
Julia’s book (which I highly recommend): https://www.amazon.com/Scout-Mindset-Perils-Defensive-Thinking/dp/0735217556/ref=nodl_
This piece was first written on April 23rd, 2021, and first appeared on this site on January 7th, 2022.