Comments

Comment by aprilsr on Has "politics is the mind-killer" been a mind-killer? · 2019-03-17T05:05:30.032Z · score: 8 (7 votes) · LW · GW

I feel like “Politics is the Mind-Killer” made two points that came out pretty clearly to me and, I’d assume, most other people.

  1. It is very hard to discuss politics rationally.
  2. Therefore, avoid political examples (or use historical ones) when discussing rationality.

For example, Eliezer would advocate against saying “Hey, those stupid [political party] people made a huge mistake in supporting [candidate] in the 20XX election. Let’s learn from their mistake,” unless you were quite confident people could discuss the rationality and not the politics.

I think a lot of the “might”s and “could”s were left out mainly for emphasis. Unless you have a strong reason to believe that someone will be able to be rational about politics, you can very safely assume they won’t be. “You have to support every argument on one side,” for example, is basically saying that most people don’t understand the nuance of calling an argument flawed while agreeing with its conclusion. I very commonly see people make horribly incorrect arguments for positions I strongly support, but pointing out the flaws is rarely received well by people who lack rationality skills.

While the conclusions you drew from the post were obviously harmful, I feel like very few people interpreted it that way.

Comment by aprilsr on In what ways are holidays good? · 2018-12-28T06:38:48.986Z · score: 4 (3 votes) · LW · GW

I think you've summarized the question we're trying to answer pretty well. Does Daniel want to go on vacations? We don't know. How would one go about deciding whether they want to go on vacations? You seem to be missing the fact that one might be unsure about their preferences.

Comment by aprilsr on Quantum immortality: Is decline of measure compensated by merging timelines? · 2018-12-12T00:45:13.550Z · score: 1 (1 votes) · LW · GW

This assumes that there's some point where things sharply cut off between being me and not being me. I think it makes more sense for my utility function to care more about something the more similar it is to me. The existence of a single additional memory means pretty much nothing, and I still care a lot about most human minds. Something entirely alien I might not care about at all.
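To make the contrast concrete, here’s a rough sketch with made-up notation, assuming a simple additive utility over minds $x$ with welfare weights $w(x)$:

$$U_{\text{sharp}} = \sum_{x} \mathbf{1}[x = \text{me}]\, w(x) \qquad \text{vs.} \qquad U_{\text{graded}} = \sum_{x} \operatorname{sim}(x, \text{me})\, w(x), \quad \operatorname{sim} \in [0,1].$$

The sharp-cutoff picture corresponds to the indicator version; under the graded version, a single extra memory barely moves $\operatorname{sim}(x, \text{me})$, so there’s no boundary at which I suddenly stop caring.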

Even if this actually raises my utility, it does so by changing my utility function. Instead of helping the people I care about, it makes me care about different people.