post by [deleted] · GW · 0 comments

This is a link post for


Comments sorted by top scores.

comment by [deleted] · 2021-11-04T01:11:28.784Z · LW(p) · GW(p)

This is known as the orthogonality thesis: the idea that intelligence and rationality don't dictate your values. I don't have time right now to explain the whole thing, but it's discussed extensively in the Sequences if you want to read more. I think it's pretty widely accepted around here as well.

Replies from: Viliam, Yoav Ravid, None
comment by Viliam · 2021-11-04T09:10:14.127Z · LW(p) · GW(p)

My "intuition pump" is to imagine a superintelligent gigantic spider. Not some alien with human values in a spider body, but actual spider that was 'magically' increased and given IQ 500.

comment by Yoav Ravid · 2021-11-04T06:46:28.299Z · LW(p) · GW(p)

The Orthogonality Thesis [? · GW] tag is a good place to start.

comment by [deleted] · 2021-11-04T19:24:41.027Z · LW(p) · GW(p)

Replies from: Dagon, None
comment by Dagon · 2021-11-04T20:12:03.021Z · LW(p) · GW(p)

I think it scales, and applies to any type of intelligence. It doesn't seem that more intelligent humans are particularly more altruistic (though they tend to be richer, and so less obvious about their motivations). There's no reason (that I see) to think that even further intelligence would make humans care about less-intelligent groups more (or less) than they do now.

Replies from: None
comment by [deleted] · 2021-11-04T22:59:59.040Z · LW(p) · GW(p)
comment by [deleted] · 2021-11-04T22:46:31.107Z · LW(p) · GW(p)

The orthogonality thesis usually comes up in the context of AI, because that's where it actually matters, but the overarching idea applies to any mind. Making something smarter does not give it morals.

And no, I bet that the psychopaths would use their newfound powers to blend in and manipulate people better. Overt crime would drop, and subtler harm would go up. That's what happens in the real world across the real intelligence gradient.

I'm not a sociopath, but I was a sociopath-lite before transitioning (minimal emotion, sadistic streak, almost no empathy). I once sat and listened to my girlfriend pour her heart out in extreme emotional pain and I just did not care. I wanted her to shut up and let me get back to my game. She was annoying.

Telling 2016!raven to reason her way into morals is like if I told you to reason your way into seeing gamma rays. It's just not gonna happen. Sure, you can approximate it, but that's not the same.

A psychopath can restrain themselves if there's a reason (like a threat of jail), but making them smarter reduces the need to hide. If you want them to do good, you need to fix their mind -- in my case, that meant correcting my fucked up hormone system. I have no idea where to even start for a real psychopath, but there's no reason to think that mere intelligence would help.

Replies from: None
comment by [deleted] · 2021-11-04T23:11:01.635Z · LW(p) · GW(p)
comment by Astor · 2021-11-04T22:29:09.000Z · LW(p) · GW(p)

One concept in my moral system rests on the question of how you would respond to permanent retaliation if you went rogue. Could you fend off an endless attack on your wellbeing, provoked by doing things that other people hate? In a world with many extremely intelligent beings this could be very difficult, and even in a world where you are the only bad Super-Einstein it would at least be tiresome (or resource-inefficient), so a superintelligent individual might prefer a situation where they do not need to defend themselves indefinitely. This is similar in outcome to Wait-But-Why's concept of the cudgel (browser search for "cudgel"). Ultimately this concept relies heavily on there being at least some possibility of inflicting a small but ineradicable pain on a Super-Einstein, so in my opinion it is not really applicable to a singularity event. But it could be useful for slower developments.

Replies from: None, None
comment by [deleted] · 2021-11-04T23:32:03.655Z · LW(p) · GW(p)

Replies from: Astor
comment by Astor · 2021-11-05T09:57:21.520Z · LW(p) · GW(p)

Pain can also be defined for non-biological beings. For me it is just a word indicating something undesirable hardwired into your being, and maybe there is something undesirable for everything in the universe. One rather metaphysical candidate could be a kind of inertia (described as the resistance of any physical object to any change in its velocity). So you could argue that if you understand the movement of an entity (more concretely, its goals), you could find a way to harm it (with another movement), which would result in "pain" for the entity. This concept is still very anthropocentric, so I am not sure whether the change in the movement could lead to, or already be understood as, a positive outcome for humanity. Or maybe it is not registered at all.

Replies from: None
comment by [deleted] · 2021-11-05T11:47:26.210Z · LW(p) · GW(p)
comment by [deleted] · 2021-11-04T23:21:03.542Z · LW(p) · GW(p)