There is a particular failure pattern I’ve seen in many different areas. Society as a whole holds view A on subject X. A small sub-group holds opposing view B. Members of the sub-group have generally put more thought into subject X, and they have definitely spent more time arguing about it than the average person on the street. Many A-believers have never heard of view B or the arguments for it before.
A relative stranger shows up at a gathering of the sub-group and begins advocating view A, or just questioning view B. The sub-group assumes this is a standard person who has never heard their arguments and launches into the standard spiel. The B-believers don’t listen; the stranger gets frustrated and leaves, since no one is going to engage with their ideas.
One possibility is that the stranger is an average member of society who genuinely believes you’ve gone your entire life without hearing the common belief, and that if they just say it slowly and loudly enough you’ll come around.* Another possibility is that they understand view B very well and have some well-considered objections to it that happen to sound like view A (or don’t sound that similar, but the B-believer isn’t bothering to listen closely enough to find out). They feel blown off and disrespected and leave.
In the former scenario, the worst case is that you lose someone you could have recruited. Oh well. In the latter, you lose valuable information about where you might be wrong. If you always react to challenges this way, you become everything you hate.
For example: pop evolutionary psychology is awful and people are right to ignore it. I spent years studying animal behavior, and it gave me insights that fall under the broad category of evopsych, except that they are correct. It is extremely annoying to have those dismissed with “no, but see, society influences human behavior.”
Note that B doesn’t have to be right for this scenario to play out. Your average creationist or anti-vaxxer has thought more about the topic and spent more time arguing about it than almost anyone. If an ignorant observer watched a debate and chose a winner based on fluidity and citations, they would probably choose the anti-vaxxer. They are still wrong.
Or take effective altruism. I don’t mind losing people who think measuring human suffering with numbers is inherently wrong. But if we ignore that entire sphere we won’t hear the people who find the specific way we are talking dehumanizing, and who have suggestions on how to fix that while still using numbers. A recent Facebook post made me realize that the clinical tone of most EA discussions, plus a willingness to entertain all questions (even if the conclusion is abhorrent), is going to make it really, really hard for anyone with firsthand experience of problems to participate. Firsthand experience means Feelings, and Feelings mean the clinical tone requires a ton of emotional energy even if they’re 100% on board intellectually. This is going to cut us off from a lot of information.
There’s some low-hanging fruit to improve this (let people talk before telling them they are wrong), but the next level requires listening to a lot of people be un-insightfully wrong, which no one is good at and which EAs in particular have a low tolerance for.
Sydney and I are spitballing ideas to work on this locally. I think it’s an important problem at the movement level, but I do not have time to take it on as a project.** If you have thoughts, please share.
*Some examples: “If you ate less and exercised more you’d lose weight.”, “If open offices bother you, why don’t you use headphones?”, “But vaccines save lives.”, “God will save you…”/”God isn’t real”, depending on exactly where you are.
**Unexpected benefit of doing direct work: 0 pangs about turning down other projects. I can’t do everything and this is not my comparative advantage.
Strongly related is the thing where people who place high weight on logical argument as a means of updating beliefs aren’t aware that it is *totally reasonable* for someone who has never specialized in constructing, following, and refuting logical arguments not to update just because you are saying things they don’t have an answer to. If they had the heuristic of updating on everything they heard that sounded rightish, they would have fallen for a pyramid scheme or some such. This is both a bit Hansonish and related to the Valley of Bad Rationality, I guess.
One way to test for this in yourself is to notice when your viewpoint is getting jerked back and forth by big updates over and over again. Good heuristics make the big updates early, after which updates will, on average, grow smaller as the domain is mapped out. Two responses are to throw up your hands and leave it to the experts (if the question is unimportant) or to upgrade your epistemics until you can start identifying flaws on both sides (if it is important).
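To make the “big updates early, smaller updates later” intuition concrete, here is a minimal sketch, assuming a toy Bayesian coin-flip estimator in Python (the true rate, prior, and checkpoints are purely illustrative, not anything from the post):

```python
# Illustrative sketch: a well-behaved Bayesian updater makes its big
# belief swings early, then settles into smaller and smaller corrections
# as evidence accumulates. If your estimate is still lurching around
# late in the game, that's a sign your heuristics haven't mapped the domain.
import random

random.seed(0)
true_rate = 0.7          # hypothetical ground truth we're estimating
alpha, beta = 1.0, 1.0   # uniform Beta prior over the rate

prev_mean = alpha / (alpha + beta)
for n in range(1, 201):
    observation = random.random() < true_rate
    if observation:
        alpha += 1
    else:
        beta += 1
    mean = alpha / (alpha + beta)          # posterior mean of Beta(alpha, beta)
    if n in (1, 5, 20, 100, 200):
        print(f"after {n:>3} observations: estimate={mean:.3f}, "
              f"last update size={abs(mean - prev_mean):.4f}")
    prev_mean = mean
```

Run it and the printed “last update size” shrinks roughly like 1/n; the analogous self-check is whether your own updates on a topic follow that shape or keep swinging wildly.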