post by [deleted]

This is a link post for


Comments sorted by top scores.

comment by quanticle · 2023-04-14T07:14:55.133Z · LW(p) · GW(p)

We value morality because of evolution. Not because it's rational.

Why are those two things mutually exclusive? We understand that a² + b² = c² is true for the legs of a right triangle because we have brains that are the result of evolution. Does that make the Pythagorean Theorem "irrational" or untrue, somehow?

comment by the gears to ascension (lahwran) · 2023-04-13T00:27:48.678Z · LW(p) · GW(p)

What should the target be instead of morality, then?

Replies from: utebaypi
comment by Jorterder (utebaypi) · 2023-04-13T02:49:12.004Z · LW(p) · GW(p)

The AI should find the true target using rationality. That target might not exist, which would mean there is no real positive utility. But it is rational to search for it, because searching does no harm, and it might have benefit in case the target exists.

Replies from: TAG
comment by TAG · 2023-04-13T10:19:41.907Z · LW(p) · GW(p)

Harm and benefit for whom? If humans are the problem according to objective morality, that's bad news for us. If a powerful AI discovers that objective morality is egoism... that's also bad news for us.

comment by TAG · 2023-04-13T10:20:55.326Z · LW(p) · GW(p)

Because morality is irrational

Rationality has to start or end somewhere.

comment by baturinsky · 2023-04-13T05:59:28.485Z · LW(p) · GW(p)

Any terminal goal is irrational.