Wednesday, December 10, 2014

Epistemic Trust

My attraction to LessWrong is partially predicated on the feeling that a better culture can be created, one that raises the sanity waterline and improves society overall.

Recently, though, I've somewhat given up on that.

First, I was overestimating the degree to which LessWrong had created such a culture already. I'll explain why.

Talking to core LessWrong people is different. It feels highly reflective, with each person cognizant not only of the subject at hand but also of the potential flaws in their own reasoning process. Much less time is wasted trying to sort out such flaws, because an objection is not met with defensiveness. If you're already looking for the holes in your own arguments, you'll try to understand a counter-argument rather than trying to protect yourself from it. The attitude is contagious.

I call this intellectual honesty: being up-front not just about what you believe, but also why you believe it, what your motivations are in saying it, and the degree to which you have evidence for it. Feynman discussed the necessary attitude, although he didn't give it a name that I'm aware of.

There are many forces working against intellectual honesty in everyday life, but the most important one is face culture: status in the group is signaled by agreeing and disagreeing, by arguing for or against people rather than positions. This can be nasty, but it isn't always; in fact, the most common type of face culture I experience is cooperative: people in the group are trying to be friendly by accepting and encouraging each other.

For example, suppose a group of engineers is meeting to discuss a technical problem. We will focus on one of them, whom I will call X. X is eager to find acceptance in the group, and the other members are equally eager to make X feel accepted. During the discussion, X looks for opportunities to interject with something relevant and useful. At some point, X is reminded of a problem encountered in a similar project at a previous workplace. X interjects that there might be a problem, and proceeds to tell the story. As X recalls the details, however, there turns out to be a critical difference between the old situation and the new one, which makes the problem unlikely to arise in the current situation.

Intellectual-Honesty Culture: If everyone at the table is intellectually honest, someone points out the disanalogy. X likely concedes the point and the discussion moves on. (Often X will be the first to notice the disanalogy, and will point it out themselves.) If X thinks the objection is mistaken, a discussion ensues in which both participants try to understand what the other is saying.

Face Culture: In face culture, people will focus more on making X feel included. Although the story's conclusion is unlikely to apply in the current situation, it is worthwhile to comment on the story in an agreeable way. Because agreement here is social currency rather than a real commitment, it tends to be noncommittal; perhaps the best move is to agree that the problem can arise, and then quietly do nothing about it. Bold disagreement with the point is seen as (and often would be) an attempt to take X down a peg.

The critical point here is that when the two cultures mix, a face-culture person will see intellectual honesty as an attack.

It is worth emphasizing that face culture is not dishonest, not in the normal sense. Face culture is nice; face culture is friendly; face culture is welcoming. (Although it can be vicious when it gets competitive.) Face culture is filled with white lies, especially lies by omission (such as acting as if a comment were relevant and made a good point), but if you try to call out any of these lies you will utterly fail. They are not lies in the common conception of lying, and they are not dishonest in the common conception of honesty.

Moreover, face culture is important. If X is an old hand whose status in the group is secure, face-culture niceties would be babying. If X is a newcomer, however, they can establish a welcoming environment. I don't mean to suggest that there is an absolute opposition between being nice and being truthful -- often the two don't even come into conflict. There is a very real trade-off, though; at times you simply must choose one or the other.

Calling someone out for following face culture rather than being intellectually honest is, as far as I know, doomed to failure. Any such call-out will be perceived as a threat, and will only ramp up the defensive face-culture behavior.

OK, so: there are these two cultures, and LessWrong succeeds at intellectual honesty. I said at the beginning that I've (partially) given up on improving broader culture via LessWrong, though. Why?

Well, I talked with someone who worked at a Christian school (as I understand it, a very fundamentalist one). They described what sounded like the same thing I experience with LessWrong: a community very high in intellectual honesty.

Why would this be?

If LessWrong's high intellectual honesty is a result of being devoted to rationality and reflective thinking, shouldn't we expect the exact opposite in highly religious organizations?

I think what's happening here is that LessWrong is intellectually honest not because we explicitly think about rationality quite a bit, nor because LessWrong is in possession of improved ideas about what rationality is, but because there is a high degree of intellectual trust.

Intellectual trust occurs when a group has common goals, mutual respect, and a largely shared ideological framework.

When people have intellectual trust, they do not need to worry as much about why the other person is saying what they are saying. You know they are on your side, so you are free to worry about the topic at hand. You are free to point out flaws in your own reasoning because you are relatively secure in your social status and share the common goal of arriving at the correct conclusion. Likewise, you are free to find flaws in their reasoning without worrying that they will hate you for it.

This sort of intellectual trust cannot be created simply by "raising the sanity waterline".
