Banish the Clippy-creating Bias Demon!

post by Stuart_Armstrong · 2013-01-18T14:57:15.907Z · LW · GW · Legacy · 37 comments


I posted in Practical Ethics, arguing that if we mentally anthropomorphised certain risks, then we'd be more likely to give them the attention they deserved. Slaying the Cardiovascular Vampire, defeating the Parasitic Diseases Death Cult, and banishing the Demon of Infection... these stories give a mental picture of the actual good we're doing when combating these issues, and the bad we're doing by ignoring them. Imagine a politician proclaiming:

An amusing thing to contemplate - except, of course, if there were a real Cardiovascular Vampire, politicians and pundits would be falling over themselves with those kinds of announcements.

The field of AI is already over-saturated with anthropomorphisation, so we definitely shouldn't be imagining Clippy as some human-like entity that we can heroically combat, with all the rules of narrative applying. Still, it can't hurt to dream up a hideous Bias Demon in its misshapen (though superficially plausible) lair, cackling in glee as someone foolishly attempts to implement an AI design without the proper safety precautions, smiling serenely as prominent futurists dismiss the risk... and dissolving, hit by the holy water of increased rationality and proper AI research. Those images might help us make the right emotional connection to what we're achieving here.

37 comments

Comments sorted by top scores.

comment by Luke_A_Somers · 2013-01-18T19:08:39.699Z · LW(p) · GW(p)

:facepalm:

This is dark artsy to the point of self-parody. The outcome seems to me highly dependent on the parity of the number of meta levels the viewer goes to.

comment by Qiaochu_Yuan · 2013-01-18T19:54:31.804Z · LW(p) · GW(p)

I would suggest that this is a useful thing to do on an individual level (to adjust for scope insensitivity and so forth) but a terrible thing to do on a group level (because it's mind-killing). Smells too much like the Yellow Peril for my taste.

The Anthropomorphization Cannon is a powerful weapon, and if it were to fall into the wrong hands...

Replies from: patrickmclaren
comment by patrickmclaren · 2013-01-19T06:34:41.909Z · LW(p) · GW(p)

I feel that this position could be equally argued if the scopes were switched, given the following motivation.

...if we mentally anthropomorphised certain risks, then we'd be more likely to give them the attention they deserved. -- OP

For example, here's a harmless :-) play on your comment, all the while keeping the above maximization criterion in mind.

I would suggest that this is a useful thing to do on a group level (because it's mind-killing; take Yellow Peril for example) but a terrible thing to do on an individual level (to adjust for scope insensitivity and so forth).

comment by Emile · 2013-01-18T15:28:24.516Z · LW(p) · GW(p)

Many religions do anthropomorphize evil - the devil may not actually exist, but we may all be better off if we talk about him as if he did.

I suspect that there are quite a few things like this, where religion is kinda right, as long as you don't take it too literally. Maybe the best solution isn't to reject religion wholesale, but to reform it so that it's tacitly acknowledged that it isn't really true, a bit like Santa Claus, or professional wrestling. Arguably that may already be the attitude of many Anglicans and Unitarian Universalists.

Replies from: Peterdjones
comment by Peterdjones · 2013-01-21T18:02:26.605Z · LW(p) · GW(p)

The extreme is Bokononism.

That reminds me of when I shared an office with a Scottish Catholic atheist and a Scottish Protestant atheist, who still managed to wrangle all the time.

comment by ESRogs · 2013-01-18T22:43:04.538Z · LW(p) · GW(p)

I'm reminded of this early GiveWell post :)

"When I was younger, I loved playing video games. [...] I just liked killing bad guys. Well, more than that, I hated not killing bad guys. When Heat Man killed my guy and stood around smugly, I wanted to throw the TV across the room, and I couldn’t stop until he was dead.

What sucked about this experience was that it was all fake, and in the back of my head I knew that. In the end I felt pretty empty and lame. Enter altruism – where the bad guys are ACTUALLY BAD GUYS. [...] it’s infinitely better because it’s real. I don’t care whether the kids are cute, or whether the organizations are nice to me, or whether my friends like my decisions. As with video games, I probably spend 99% of my time frustrated rather than happy. But … Malaria Man just pisses me off. It’s that simple." http://blog.givewell.org/2007/04/03/charity-the-video-game-thats-real/

Replies from: Document
comment by Document · 2013-01-27T04:43:08.847Z · LW(p) · GW(p)

I'd play a game where scoring points or the equivalent wired tiny payments to a nonprofit of my choice.

Replies from: Alicorn
comment by Alicorn · 2013-01-27T19:46:43.103Z · LW(p) · GW(p)

You don't get to pick the nonprofit, but there's Free Rice.

Replies from: Document
comment by Document · 2013-01-27T19:53:57.246Z · LW(p) · GW(p)

I meant payments out of funds I provided, the idea being to maximize the fuzzies produced by a donation by increasing the effort expended to make it. But thanks for the link.

comment by drethelin · 2013-01-18T19:11:31.378Z · LW(p) · GW(p)

I don't think anthropomorphising al-Qaeda in the form of Osama bin Laden or demonizing Saddam Hussein was a net good for America. Framing the arguments over drug control as "The War On Drugs" has almost certainly led to the loss of billions of dollars and many lives. Do you really think encouraging this idea in general is good?

Replies from: Stuart_Armstrong
comment by Stuart_Armstrong · 2013-01-18T19:21:27.647Z · LW(p) · GW(p)

Do you really think encouraging this idea in general is good?

I'd certainly prefer if the serious risks were the anthropomorphised ones, rather than the trivial ones.

Replies from: Pavitra
comment by Pavitra · 2013-01-19T04:41:48.554Z · LW(p) · GW(p)

So it's a great idea as long as only causes you agree with get to use the superweapon?

Replies from: Desrtopa
comment by Desrtopa · 2013-01-19T05:14:43.251Z · LW(p) · GW(p)

Well, if you can't stop people from using a superweapon for bad causes, it may be an improvement to see to it that it's also used for good causes.

Replies from: Pavitra
comment by Pavitra · 2013-01-20T13:33:10.627Z · LW(p) · GW(p)

The original question was:

Do you really think encouraging this idea in general is good?

That is: assuming it is possible to reduce bad uses at the cost of also reducing good uses, should one do so?

Your reply seems to assume that the bad uses can't be reduced, which contradicts the pre-established assumptions. If you want to change the assumptions of a discussion, please include a note that you are doing so and ideally a short explanation of why you think the previous assumptions should be rejected in favor of the new ones.

Replies from: Desrtopa
comment by Desrtopa · 2013-01-20T13:52:27.882Z · LW(p) · GW(p)

I don't assume that bad uses can't be reduced, and my answer is somewhat tongue in cheek, but I do suspect that getting people to stop using this mode of thought for bad ideas would be very difficult. Getting people to apply it to good causes as well might be worse, outcome-wise, than getting them to stop applying it at all, but trying to get people to apply it to good causes might still have a better return on investment than trying to get them to stop, simply because it's easier.

Replies from: Pavitra
comment by Pavitra · 2013-01-20T13:55:31.123Z · LW(p) · GW(p)

You may be right, but I don't trust a human to only arrive at that conclusion if it's true. I think we ought to refrain from pressing D, just in case.

comment by DaFranker · 2013-01-18T17:37:20.218Z · LW(p) · GW(p)

Your example political speech makes me want to just run for office and do it.

"I now solemnly vow, on all honors, to rid our country of the vile terrorists who call themselves the Slippery Baths. If I am elected, our people shall be safe and squicky-clean once more!"

Hey, I figure it's almost worth a try. If someone could find the right Mass Media people to bribe for help, I think there's a lot of potential here.

Replies from: shminux
comment by Shmi (shminux) · 2013-01-18T17:41:36.284Z · LW(p) · GW(p)

What about the mindless roaring four-wheeled blood-thirsty flashy-eyed monsters roaming our streets?

Replies from: None, DaFranker
comment by [deleted] · 2013-01-18T19:21:34.633Z · LW(p) · GW(p)

Adjusts her prior that you are a biker in Seattle waaaaay upward

comment by DaFranker · 2013-01-18T17:49:18.523Z · LW(p) · GW(p)

I thought of them too, but they've got their filthy money-laundering hands in too many pockets and they're controlling too many people - it would be a losing battle. The triads would be a more realistic target.

Besides, they literally take our people hostage, wear ablative carbon-composite / high-tech-metal-alloy armor, and lug around gallons of flamethrower fuel. They also tend to hunt in packs¹ and establish war camps on our bridges every morning.

We'll need a lot more than one good politician and a few bribes to the media to win that war.²


  1. Most deaths involve multiple of them, IIRC.

  2. But please, if you can, I strongly encourage anyone to prove me wrong. The implication here is that lots of science and engineering and money is needed to fix the dangers and reduce the risks. The kind of science and engineering and money that Google already started doing a while ago.

Replies from: shminux
comment by Shmi (shminux) · 2013-01-18T18:34:37.867Z · LW(p) · GW(p)

Well, there is a movement afoot to tame their wild nature. Some day being trampled or squished into pulp by these creatures will be but a distant memory, as their descendants follow the path of domestication well traveled by other animals, the past perils replayed only in the highly scripted spectacle of corrida de coches.

Replies from: Desrtopa
comment by Desrtopa · 2013-01-19T05:11:56.530Z · LW(p) · GW(p)

What's probably going to be really difficult is not getting automated cars on the market, but getting all the non-automated cars off the road. An entirely automated traffic flow would be much safer than a partly automated traffic flow, but there are going to be lots of holdouts who refuse to trust an automated car over their own driving ability, or who simply can't or won't buy an up-to-date car.

Replies from: shminux, Stuart_Armstrong
comment by Shmi (shminux) · 2013-01-19T19:27:45.235Z · LW(p) · GW(p)

All good points, I addressed some of them in my previous comment on self-driving cars.

comment by Stuart_Armstrong · 2013-01-22T11:51:54.133Z · LW(p) · GW(p)

When automated cars are at 90% or so, and if statistics keep coming in on how many accidents and deaths are caused by humans versus machines, I think the pressure to go all automated will be strong. Some municipalities and states will go for it, and then it'll be hard to get anywhere with a human-driven car.

Replies from: Desrtopa
comment by Desrtopa · 2013-01-22T14:09:02.171Z · LW(p) · GW(p)

I suspect getting the prevalence as high as 90% will be pretty difficult itself.

comment by DanArmak · 2013-01-18T16:47:35.367Z · LW(p) · GW(p)

The great enemy of humanity is already anthropomorphised: it is Death Himself we do battle with, the Lord of Entropy.

Replies from: CronoDAS, ikrase
comment by CronoDAS · 2013-01-21T22:18:10.024Z · LW(p) · GW(p)

Nah, he's actually a pretty nice guy once you get to know him. He doesn't cause deaths; he's just the one who cleans up afterward. And he'd probably be grateful for a chance to retire peacefully.

The proper incarnation of entropy is the Frost Giant, not the bony-looking guy in a cape.

comment by ikrase · 2013-01-20T00:37:59.245Z · LW(p) · GW(p)

You beat me to it. I already tend to narrativise this. Other cases, though, are very risky; an alternative, striving-based narrative might be better.

comment by [deleted] · 2013-01-19T22:45:30.863Z · LW(p) · GW(p)

Isn't this basically what the saner strand of occultists do when they personify archetypes and aspects of humanity into minor deities?

comment by Shmi (shminux) · 2013-01-18T16:58:39.932Z · LW(p) · GW(p)

The field of AI is already over-saturated with anthropomorphisation

Actually, on this forum Clippymorphisation is rather prevalent.

Replies from: private_messaging
comment by private_messaging · 2013-01-20T16:21:04.187Z · LW(p) · GW(p)

Clippy is very anthropomorphic, though - it magically has a real-world goal, it equates the algorithm that it is from the inside with the hardware it sees through its eyes from the outside, and it will 'improve' that hardware, including the accuracy of its paperclip counter, without just 'improving' the counter's output. It's easy to imagine - in your mind you have the number of paperclips, counted externally, and a paperclip maximizer increases this count - and hard or impossible to actually define, let alone build.

comment by CronoDAS · 2013-01-21T22:19:03.289Z · LW(p) · GW(p)

I'm reminded of The Fable of the Dragon Tyrant...

comment by private_messaging · 2013-01-20T16:06:28.081Z · LW(p) · GW(p)

How about a bias demon where people who read far too much sci-fi, via the same biases that on a national scale produced the TSA, become overly concerned about things like Clippy, thereby creating the concept? Now that's a Clippy-creating bias demon.

Replies from: David_Gerard
comment by David_Gerard · 2013-01-20T16:50:41.613Z · LW(p) · GW(p)

Yes, the whole thing is a bit close to "but some biases are good!" No, they aren't.

comment by Dre · 2013-01-19T06:59:47.925Z · LW(p) · GW(p)

I worry that this would bias the kind of policy responses we want. I obviously don't have a study or anything, but it seems that the framing of the War on Drugs and the War on Terrorism have encouraged too much violence. Which sounds like a better way to fight the War on Terror, negotiating in complicated local tribal politics or going in and killing some terrorists? Which is actually a better policy?

I don't know exactly how this would play out in a case where no violence makes sense (like the Cardiovascular Vampire). Maybe increased research as part of a "war effort" would work. But it seems to me that this framing would encourage simple and immediate solutions, which would be a serious drawback.

comment by [deleted] · 2013-01-18T19:37:16.402Z · LW(p) · GW(p)

Religion being the only social structure known to be able to endure even a fraction of the time required, it has been proposed that religion is the least bad means of warning distant future generations about nuclear waste sites. Not money or architecture or language, but ghost stories.

comment by Rain · 2013-01-30T20:47:35.406Z · LW(p) · GW(p)

This was used in an episode of South Park.