Comments

Comment by kliv on How Can Rationalists Join Other Communities Interested in Truth-Seeking? · 2019-07-17T04:52:03.064Z

I'm not sure there *are* other communities interested in truth-seeking, at least not in the generalized way that rationalists are. (Obviously there are lots of communities seeking the truth in some particular domain.) Do you have some in mind?

If I can reinterpret the question a bit, a related one is how to find common ground with people who are not part of the rationality community. In that case I think the relevant question is "to what *end* do you want to be rational?" When I think of a typical highly rational person who doesn't identify with the rationalist community, I picture someone who treats rationality largely as an instrument for achieving goals rather than as a pastime. If one can find other people with similar goals, and then select from them the ones pursuing those goals rationally, one might find some commonality of culture/values/interests.

Comment by kliv on Integrity and accountability are core parts of rationality · 2019-07-17T04:25:50.660Z

> I have come to believe that people's ability to come to correct opinions about important questions is in large part a result of whether their social and monetary incentives reward them when they have accurate models in a specific domain.

I think this can be read in several ways. Most obviously, if a person is subject to an incentive to hold true beliefs about X, they will start trying to learn about X, and their beliefs will become more accurate. That part isn't very interesting.

The more interesting parts of your idea, I think, are the notions that

(1) In the absence of incentives to have true beliefs about X, people don't just have no beliefs about X, but in fact tend to have beliefs that are wrong.

(2) In the presence of incentives to have wrong beliefs about X, people tend to adopt those wrong beliefs.

I'm less convinced that these are true in general. I do think they are true of many people if we define "belief" as "an opinion that a person expresses". But whether that's a reasonable definition of belief is unclear: I think the people for whom (1) and (2) hold are often the same people who don't care whether their expressed opinions are correct. In that case the observation reduces to "if people don't care about saying true things, they will say whatever they are incentivized to say", which isn't surprising.

For the average LessWrong reader, I'm not convinced (1) and (2) are accurate. The observation that people's beliefs tend to align with their incentives might instead be a selection effect: people who already hold belief X gravitate towards positions that reward them for holding it, rather than adopting X because they are rewarded for it.