Simpler explanations of AGI risk

post by Seth Herd · 2023-05-14T01:29:29.289Z · LW · GW · 9 comments

We're getting a shot at presenting our concerns about AI X-risk to the general public. It would be useful to have a brief presentation that plays well with less-technical people, or technical people who don't want to listen to a half hour of explanation just at that moment. The other goal here is to avoid polarization [LW · GW] with a gentle approach. We don't want AGI risk to become polarized like the climate change "debate" did.

This is my suggestion for a conversation template, based on personal success. I'm hoping others chip in ideas and say what's worked for them.

This approach has worked for me in conversation, but only when I also get the emotional tone right. Logic is emotional for everyone, and people without strong rationalist ambitions are even more prone to think with their feelings, so the tone matters as much as the argument.

This approach is intended for casual conversations, or for times when you have the floor but don't want to overstay your welcome.

When the conversation gets sidetracked into details, steering it back to the top level, with epistemic modesty, seems useful. Ask something like "Can you really be sure that something smarter than us won't outsmart us somehow? I wish I could be sure, but I'm not." Or say something like "It just seems like we shouldn't trust something that thinks differently than we do, especially if its goals are programmed or trained in without our really knowing how to do that." This presents you as being on the same team, and at the same level, as the person you're talking to.

This set of suggestions is offered with low certainty. I'm no expert at persuasion, but I have researched it a bit, and researched cognitive biases a lot.

I also tried to make a similar set of simple presentations as an accordion-style FAQ [LW · GW], to provide as a link instead of in conversation.

So, how could the above be better? Or is my premise mistaken?

9 comments

Comments sorted by top scores.

comment by Mitchell_Porter · 2023-05-14T02:23:01.314Z · LW(p) · GW(p)

The other goal here is to avoid polarization  

Opinion just within tech already seems pretty polarized, or rather, all over the place. You have doomers, SJWs, accelerationists, deniers... And avoiding all forms of polarization, at all scales, seems impossible. People naturally form opposing alliances. Is there a particular polarization that you especially want to prevent?

Replies from: Seth Herd
comment by Seth Herd · 2023-05-14T05:37:01.448Z · LW(p) · GW(p)

I agree that opinions are already divided in the tech community. I'm less sure about the emotional and communication dynamics, so I think it might be important not to make that divide worse, and instead to make it easier for people to cross it.

I think most nontechnical people aren't polarized yet, and they probably get a vote, figuratively and literally. So trying to avoid polarizing them might still be worthwhile.

Replies from: Mitchell_Porter
comment by Mitchell_Porter · 2023-05-14T19:53:49.105Z · LW(p) · GW(p)

I'm still very vague about what you want to prevent. You want non-technical people to all agree on something? To be mild rather than passionate, if they do disagree? Are you aiming to avoid political polarisation, specifically? Do you just want people to agree that there's a problem, but not necessarily agree on the solution?

Replies from: Seth Herd
comment by Seth Herd · 2023-05-14T20:02:38.820Z · LW(p) · GW(p)

Yes, it's fair to say that I'd like people to disagree mildly rather than passionately, if they do disagree. Belief in human-caused climate change actually decreased among half of the US population even as evidence accumulated, due to polarization effects. And I think those effects could be deadly here, since having a lot of people disagree might well produce no regulatory action whatsoever.

I don't think this is likely to polarize along existing political lines, and thank goodness. But it is a pretty important issue that people are passionate about, and that creates a strong potential for polarization.

comment by M. Matter · 2023-05-30T17:08:04.828Z · LW(p) · GW(p)

In a similar way, it would be helpful to find ways to overcome the Bystander Effect. That is, building awareness is necessary but not sufficient. Awareness without a sense of agency breeds hopelessness and fatalistic disengagement. So, an important next step, beyond what you discuss here, is to say, "And here are things we can do." I hope that list of things extends beyond "write your representatives and donate money." It seems cruel to tell people about a problem without hinting at ways they can act to mitigate it, even from completely outside the spheres of academia, venture capital, or the tech industry. I wonder whether any such ways exist.

Replies from: Seth Herd
comment by Seth Herd · 2023-05-30T17:48:49.440Z · LW(p) · GW(p)

Good point.

I think it's useful to separate the request to act from the argument itself. Feeling like you'd have to change your life if you allow yourself to believe there's a problem will activate motivated reasoning to preserve your current beliefs.

But feeling hopeless about the future, or even helpless, will do the same thing. So I'd alter this to include something along the lines of "there are things everyone can do to help, like asking for good public policies".

I think I did include an optimistic statement to head off hopelessness in both of those short treatments, but helplessness is important too.

comment by Ben Smith (ben-smith) · 2023-05-21T00:58:39.286Z · LW(p) · GW(p)

Well written; I really enjoyed this. This is not really on topic, but I'd be curious to read an "idiot's guide", or maybe an "autist's guide", on how to avoid sounding condescending.

Replies from: Seth Herd
comment by Seth Herd · 2023-05-21T01:20:56.227Z · LW(p) · GW(p)

Aw, thanks!

I think that not sounding condescending is absolutely critical to having good discussions on this (and many other obscure and technical topics).

I have had a lifelong journey of going from sounding condescending way too much, to sounding less condescending, at least when I remember to try. I don't know if I'm a bit on the autism spectrum, or was just raised to value logic and winning arguments over social skills.

I think a lot of it is tone of voice and timing. I'm not going to get those by acting, so I just try to adopt a soft and patient emotional tone, and continually remind myself that the person I'm talking to hasn't thought about this topic nearly as much, and I probably sound like an idiot when I talk about other people's favorite topics. Finding points of agreement and voicing them before moving on to points of disagreement is key. So is not expecting to change someone's mind in the moment. I think offering ideas and perspectives, and letting people think them through is how people learn and change beliefs.

comment by Seth Herd · 2023-05-14T19:12:23.615Z · LW(p) · GW(p)

I also want to recommend the FAQ at the r/ControlProblem subreddit for similar purposes. It's well-written and more succinct than any other resource I know of.

If anyone has seen other good, brief writeups that can be adapted for conversation, I'd love to hear about them.