comment by quanticle · 2023-04-14T07:14:55.133Z · LW(p) · GW(p)
We value morality because of evolution. Not because it's rational.
Why are those two things mutually exclusive? We understand that a² + b² = c² is true for the legs of a right triangle, because we have brains that are the result of evolution. Does that make the Pythagorean Theorem "irrational" or untrue, somehow?
comment by the gears to ascension (lahwran) · 2023-04-13T00:27:48.678Z · LW(p) · GW(p)
What should be the target instead of morality, then?
↑ comment by Jorterder (utebaypi) · 2023-04-13T02:49:12.004Z · LW(p) · GW(p)
AI should find the true target using rationality. It might not exist, which would mean there is no real positive utility. But it is rational to search for it, because searching has no harm, and it might have benefit in case the target exists.
comment by baturinsky · 2023-04-13T05:59:28.485Z · LW(p) · GW(p)
Any terminal goal is irrational.