twkaiser's Shortform
post by twkaiser · 2023-03-24T05:11:13.519Z · LW · GW · 2 comments
comment by twkaiser · 2023-03-24T05:11:13.728Z · LW(p) · GW(p)
How NOT to align AI #34.
What is humanity aligned to? Suppose, hypothetically, that evolution has aligned humans toward the following goal: "Your DNA is the most important substance in the universe; therefore, maximize the amount of similar DNA in the universe." We might then align AGI to the analogous goal: "Human (or similar) DNA is the most important substance in the universe; therefore, maximize the amount of human or similar DNA in the universe."
Wait, I'm pretty sure there is already a rule #34 on this, brb.