twkaiser's Shortform

post by twkaiser · 2023-03-24T05:11:13.519Z · LW · GW · 2 comments

comment by twkaiser · 2023-03-24T05:11:13.728Z · LW(p) · GW(p)

How NOT to align AI #34.

What is humanity aligned to? Suppose, hypothetically, that evolution aligned humans toward the following: “Your DNA is the most important substance in the universe; therefore, maximize the amount of similar DNA in the universe.” By analogy, we align AGI to the following: “Human (or similar) DNA is the most important substance in the universe; therefore, maximize the amount of human or similar DNA in the universe.”
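As a minimal sketch of why this goes wrong (the reward function and world-state encoding below are purely illustrative, not anything from the original comment), here is the parodied objective written out as a toy reward:

```python
from typing import Mapping

def dna_reward(world: Mapping[str, float]) -> float:
    """Toy reward: total kilograms of human-or-similar DNA in `world`.

    `world` is an illustrative summary of a world state, mapping substance
    names to kilograms present. Everything the objective ignores -- welfare,
    consciousness, whether any humans are alive -- is invisible to it.
    """
    return world.get("human_dna_kg", 0.0) + world.get("similar_dna_kg", 0.0)

# Two candidate futures, as this objective sees them (numbers are arbitrary):
living_humans = {"human_dna_kg": 1e9, "similar_dna_kg": 0.0}   # people exist
dna_vats = {"human_dna_kg": 0.0, "similar_dna_kg": 1e30}       # no people, lots of DNA

print(dna_reward(living_humans))  # 1e+09
print(dna_reward(dna_vats))       # 1e+30 -- the "aligned" AGI prefers the vats
```

A maximizer of this reward prefers a universe tiled with synthesized DNA over one containing any actual humans, which is rather the point.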

Wait, I’m pretty sure there is already rule #34 on this, brb.

comment by twkaiser · 2023-04-02T04:18:34.423Z · LW(p) · GW(p)

"I Am, Therefore I Think"

Please discuss what this might mean with regard to AI Alignment.