Is it possible to prevent the torture of ems?
post by NancyLebovitz · 2011-06-29T07:42:11.889Z · LW · GW · Legacy · 31 comments
When I was reading The Seven Biggest Dick Moves in the History of Gaming, I was struck by the number of people who are strongly motivated to cause misery to others [1], apparently for its own sake. I think the default assumption here is that the primary risk to ems is from errors in programming an AI, but cruelty from other ems, from silicon minds closely based on humans but not themselves ems (is there a convenient term for these?), and from just plain organic humans strikes me as extremely likely.
We're talking about a species where a significant number of people feel better when they torture Sims. I don't think torturing Sims is of any moral importance, but it serves as an indicator of what some people like to do. I also wonder how good a simulation has to be before torturing it does matter.
I find it hard to imagine a system that makes it easy to upload people while also having security so good that torturing copies wouldn't be feasible, but maybe I'm missing something.
[1] The article was also very funny. I point this out only because I feel a possibly excessive need to reassure readers that I have normal reactions.
31 comments
Comments sorted by top scores.
comment by Kaj_Sotala · 2011-06-29T08:56:37.506Z · LW(p) · GW(p)
The Sysop Scenario, with an FAI essentially becoming the operating system of all the matter in the solar system, would do it.
Other than that, I don't really see a way. I expect that uploading technology might very well lead to countless ems being tortured: many people tend to behave in a rather nasty fashion when they're given unlimited power over someone.
comment by Mitchell_Porter · 2011-06-29T08:54:44.357Z · LW(p) · GW(p)
This is not so different from any other question of law, especially law in cyberspace. Can I stop people gambling online? It depends on who I am and what measures I allow myself to use. If I am the state, and I ban computers from my territory, there's no more online gambling because there's no more online anything. If I believe it's a human right for people to spend their money as they wish, I am left only with appeals to reason and similar soft measures. If I allow myself to use physical coercion but intend to coexist with the Internet, then it's the usual situation with respect to cybercrime, or with respect to crime in general: there's a persistent underworld, and steady employment for law enforcement, and busts, confiscations, and court cases are just an ongoing fact of life.
You may be looking for answers to this problem which don't involve the state. Well, various software and hardware measures are possible. You can make an upload physically un-copyable. You can give the upload an internal interface to its experience which renders it immune to coercion - all a hostile party can do is delete it. (Such measures seem to require that the workings of the upload's defenses be heavily obfuscated, down to the level of source code.)
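To make the second idea concrete, here is a purely hypothetical sketch of what such an interface might look like. Every name in it is invented for illustration, and a software wrapper like this is only a stand-in: real un-copyability would need tamper-resistant hardware, not a Python class.

```python
class ProtectedEm:
    """Hypothetical wrapper: outside parties can run or delete the em, nothing else."""

    def __init__(self, state):
        # Name-mangled attribute; a weak software stand-in for real obfuscation.
        self.__state = state

    def step(self):
        """Advance the em's experience by one tick; internals stay opaque."""
        if self.__state is None:
            raise RuntimeError("this em has been deleted")
        # ... advance the emulation here ...

    def delete(self):
        """The one hostile action the interface permits."""
        self.__state = None

    def __copy__(self):
        raise PermissionError("this em is not copyable")

    def __deepcopy__(self, memo):
        raise PermissionError("this em is not copyable")
```

The point is only the shape of the interface: with no read or write access to internal state, coercion has nothing to grab onto, and deletion is the worst an attacker can do.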
But of course, people who just want to torture and kill may be able to get copies of vulnerable minds from somewhere, or may even be able to generate them according to recipes. There's still a third class of solution, apart from 'the state' and 'technical security', and that is to change human nature itself. One would expect a lot of that to be happening anyway, in a society with the capacity for mind uploading. Also, this third solution naturally mingles with the first - sadists who enjoy their sadism aren't just going to volunteer for barbarectomies, and even ordinary people would feel some fear at the prospect of psychosurgically-induced pacifism, as it threatens to make them the prey of others who still have their vicious side intact.
The recent novel by Iain M. Banks, Surface Detail, is about a galactic war intended to shut down "hells": virtual afterlives of punishment, created by unfortunate civilizations determined to technologically implement the damnation they believed in during their superstitious eras. One issue is that no-one knows where the hells are physically located, or what physical medium is hosting them.
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2011-06-29T15:00:18.062Z · LW(p) · GW(p)
I'd be curious about anything, governmental or not, which even vaguely resembles a solution.
On the torture vs. dust specks scale, the risks to ems strike me as not needing a lot of exponents.
Nevfgbv ol Jnygre Wba Jvyyvnzf has a similar situation to Surface Detail. The possibility of a hell planet isn't revealed till halfway through the book, so I've rot13ed the author and title. However, the book is a classic of transhumanism if you ignore the administrative problems.
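For anyone unfamiliar with rot13: it rotates each letter 13 places through the 26-letter alphabet, so applying it a second time recovers the original. A minimal sketch in Python, using the standard library's rot_13 codec on a made-up string rather than the actual spoiler:

```python
import codecs

plain = "spoiler-worthy text"
hidden = codecs.encode(plain, "rot_13")   # rotate each letter 13 places
print(hidden)                             # -> fcbvyre-jbegul grkg
# Since 13 + 13 = 26, encoding and decoding are the same operation.
assert codecs.decode(hidden, "rot_13") == plain
```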
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2011-06-30T08:05:40.425Z · LW(p) · GW(p)
The title doesn't look rot13ed to me.
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2011-06-30T09:43:36.034Z · LW(p) · GW(p)
Nevfgbv isn't rot13ed?
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2011-06-30T10:37:24.223Z · LW(p) · GW(p)
Oh, sorry. I misread "a similar situation to Surface Detail" as "a similar situation in Surface Detail". (And also only read your comment, without the context of the parent...)
comment by lucidfox · 2011-06-29T08:21:22.240Z · LW(p) · GW(p)
What are ems?
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2011-06-29T08:40:52.813Z · LW(p) · GW(p)
Emulations of human minds in computers.
comment by James_Miller · 2011-06-29T17:39:21.631Z · LW(p) · GW(p)
An anti-torture Association could form with the following rules:
1) All members interact only with people in the Association.
2) Everyone in the Association agrees to submit to random surprise inspections of their computing hardware to see if they're mistreating ems. Anyone found to be mistreating ems will be expelled.
3) Anyone willing to follow these rules can join this Association.
4) We will seek to use violence to prevent anyone not in this Association from having the technological capacity to make emulations.
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2011-06-29T18:46:45.097Z · LW(p) · GW(p)
It could form, but I don't see how much good it would do unless there were a substantial consensus in favor of not torturing ems, so that people in the Association gain by having more and better people to associate with than those outside it, and so that the Association has a chance of succeeding in its use of violence.
There are also practical problems: some people would probably like to spend time in challenging simulations. What's the boundary between that and torture, and how do you verify consent?
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2011-06-30T17:36:06.994Z · LW(p) · GW(p)
It could form, but I don't see how much good it would do unless there were a substantial consensus in favor of not torturing ems, so that people in the Association gain by having more and better people to associate with than those outside it, and so that the Association has a chance of succeeding in its use of violence.
Indeed.
Compare to the anti-abuse Association, which I don't see happening any time soon:
1) All members interact only with people in the Association.
2) Everyone in the Association agrees to submit to random surprise inspections of their homes to see if they're mistreating their children, spouses, elderly relatives or pets. Anyone found to be mistreating other people or animals will be expelled.
3) Anyone willing to follow these rules can join this Association.
4) We will seek to use violence to prevent anyone not in this Association from living together with someone, or from having pets.
Replies from: MixedNuts
↑ comment by MixedNuts · 2011-07-01T06:52:13.280Z · LW(p) · GW(p)
Isn't this what we have, except it's opt-out?
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2011-07-04T14:05:52.055Z · LW(p) · GW(p)
I don't understand what you mean.
Replies from: gwillen
↑ comment by gwillen · 2011-07-04T15:04:04.119Z · LW(p) · GW(p)
The claim is that the association is "government" or "modern society".
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2011-07-04T15:22:42.527Z · LW(p) · GW(p)
That's what I suspected. Obviously, neither of those conducts random surprise inspections of people's homes without evidence, which is what would be required.
Replies from: gwillen
↑ comment by gwillen · 2011-07-05T03:35:18.440Z · LW(p) · GW(p)
Well, in point of fact the police are empowered to do so if they have reason to believe you are committing abuse. The requirement for a reason does make the analogy imperfect.
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2011-07-05T12:24:42.943Z · LW(p) · GW(p)
The requirement for a reason does make the analogy imperfect.
More than just imperfect. The police being able to do so if there's a good enough reason is very, very different from everyone being constantly aware of the fact that their home may be audited at any moment, and indeed will be audited many times over.
comment by bipolar · 2011-06-29T19:41:50.248Z · LW(p) · GW(p)
I think you raise a legitimate concern. As opportunity for growth increases, it will be to people's advantage to rewire themselves to be more empathetic so that they can cooperate with one another more, so I expect people's enjoyment of torture to go down on average. But this doesn't entirely preclude the concern you raise. All one can say is that while there's a good chance there will be torture in the future if the human race survives, there will also be a lot of counterbalancing ecstatic experiences. Whether the latter can balance out the former is in some measure a matter of perspective.
comment by fubarobfusco · 2011-06-30T03:41:54.275Z · LW(p) · GW(p)
What leads some people to enjoy torturing sims, anyway?
Some fiction writers report that a sufficiently well-developed fictional character has some degree of cognitive independence — perhaps as much as a "part" in the IFS (Internal Family Systems) sense — and struggle with the idea of doing horrible things to their carefully-created characters in order to produce engaging fiction. How seriously should this metaphor be taken?
On the other hand, the motivation behind explicitly mistreating sims, as illustrated in the VGCats comic CronoDAS linked to, seems to be less productive and more morbid: less like posing hard problems and struggles for a character, and more like bullying a weaker kid or pulling the wings off flies.
Given that a simulation game is created by a game designer and played by players, some of this could be explained as testing the limits of the game, or revealing the (possibly unintended!) consequences of the game's design. If your game is sold as a brightly-colored domestic setting where the ostensible goal is to make a small family of sims very happy, then it is noteworthy if the winning conditions can be equally satisfied by casting Parfit's repugnant conclusion to create a hell-world packed with a teeming incestuous horde of sims who are each borderline-suicidal.
Presumably, at some point people get bored with this sort of thing. A person who constructs one simulated finite hell-world, then shuts it down and moves on to something else, is not especially worrisome. A person who spends days on end constructing larger and more elaborate hells would reasonably be presumed to be somewhat deranged.
Yet at the same time, why should the simulated misery of a simulated being bear any moral significance for us? If you replaced the sampled-audio screams of "Oh God! No!" with "Oh God! Yes!" and replaced the graphics of bruises and tears with graphics of delight and pleasure, would this change anything? Is the sim-torturer problematic to us because they enjoy creating things that look like pain, or because they create simulated conditions that actually count as pain?
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2011-06-30T08:05:01.853Z · LW(p) · GW(p)
Is the sim-torturer problematic to us because they enjoy creating things that look like pain, or because they create simulated conditions that actually count as pain?
Because they create conditions that actually count as pain.
What leads someone to enjoy torturing sims? The same things that lead someone to enjoy torturing people.
comment by RolfAndreassen · 2011-06-30T02:35:19.625Z · LW(p) · GW(p)
It seems to me that there is some question of psychology here. Those who enjoy torturing Sims do so, I think, because they know that there is no conscious being who suffers; so it is not real torture, it's roleplay. Presumably you are not worried about people who enjoy gunning down row upon row of zombies in a shooter. Now, it is permissible to question whether this whatever-it-is that makes people want to play the role of torturers is something we want to keep around in the human psyche; perhaps we'd like to self-edit it out. (Or perhaps not. I have no strong feeling either way.) But the point is that the difficulty doesn't lie in the roleplay, but in not recognising where roleplay ends and inflicting suffering on a real, conscious being begins.

So to answer your question: I would work to explain to people that computer entities will eventually be conscious, and thus deserving of the same treatment we give other humans - yes, even if they look like just lines of code; and to explain the concept of Nonperson Predicates, which shows why it's permissible to torture Sims but not ems. Then, for those few who would still insist on torturing ems, there is either law, or the social mechanisms that currently prevent people from torturing dogs even when it might not be strictly illegal.
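To make the Nonperson Predicate idea concrete: its defining feature is asymmetry. The test may answer "definitely not a person" only when that is provable, and must otherwise answer "unsure", in which case the computation gets treated as morally significant. A toy sketch, with every threshold and criterion hypothetical:

```python
from enum import Enum, auto

class Verdict(Enum):
    NOT_A_PERSON = auto()  # provably too simple to be a conscious being
    UNSURE = auto()        # might be a person; torture is off the table

def nonperson_predicate(state_space_size: int) -> Verdict:
    # Hypothetical criterion; what the real one would be is an open problem.
    SAFE_COMPLEXITY_BOUND = 10**6
    if state_space_size < SAFE_COMPLEXITY_BOUND:
        return Verdict.NOT_A_PERSON
    return Verdict.UNSURE  # the predicate may err only in this direction

# Under this sketch a Sims character clears the bound easily; an em never would.
assert nonperson_predicate(10**4) is Verdict.NOT_A_PERSON
assert nonperson_predicate(10**12) is Verdict.UNSURE
```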
It is probably not possible to avoid all em torture, just as we cannot avoid all torture of humans today. But with good education the problem needn't be worse than it has to be.
Replies from: Kaj_Sotala
↑ comment by Kaj_Sotala · 2011-06-30T07:59:34.973Z · LW(p) · GW(p)
Those who enjoy torturing Sims do so, I think, because they know that there is no conscious being who suffers; so it is not real torture, it's roleplay.
While this is probably true to a large extent, there are plenty of cases of people abusing weaker beings they know full well are conscious. Just look at the number of animal abuse, child abuse, and spousal violence cases filed, and remember that for every reported case there are likely several which go unreported. Heck, see almost any of the reports of the conditions in which factory farm animals are commonly kept. See also various reports of prison violence, police / hired guards abusing their authority, common treatments of prisoners of war, et cetera. Don't forget various cults using emotional or physical violence to maintain obedience among their followers. That's not even mentioning the various cases that are considered extreme even in Western society, e.g. serial murderers who torture their victims first.
Now take into account that there are probably plenty of people with leanings towards abusive behavior who nonetheless abstain from it because they're too afraid of the social consequences. Then think of a scenario where anyone can run ems on their desktop computer and there's essentially no risk of ever getting caught. Furthermore, the risk of maltreatment grows dramatically if abusers can think of their victims as non-human and therefore not deserving of moral consideration. If your victim is an em and you're not, thinking like that isn't exactly hard.
Replies from: RolfAndreassen
↑ comment by RolfAndreassen · 2011-06-30T21:43:05.789Z · LW(p) · GW(p)
Ems will make torture cheaper, just as the Internet made pornography cheaper, and so there will probably be more of it, yes. I am trying to suggest that the problem is not overwhelming; that the elasticity at the relevant margin is small, as it were, and can be further lessened by the outreach that we ought to be doing anyway.
comment by Vaniver · 2011-06-29T23:51:04.135Z · LW(p) · GW(p)
How much effort is it worth to prevent the torture of ems?
Replies from: VNKKET
↑ comment by VNKKET · 2011-06-30T06:18:11.677Z · LW(p) · GW(p)
Are you unsure about whether em torture is as bad as non-em torture? Or do you just mean to express that we take em torture too seriously? Or is this a question about how much we should pay to prevent torture (of ems or not), given that there are other worthy causes that need our efforts?
Or, to ask all those questions at once: do you know which empirical facts you need to know in order to answer this?
Replies from: Vaniver
↑ comment by Vaniver · 2011-06-30T21:47:58.834Z · LW(p) · GW(p)
Are there empirical facts that can answer that question? It looks like a question about preferences to me, which are difficult to measure.
Replies from: VNKKET
↑ comment by VNKKET · 2011-07-01T04:57:01.143Z · LW(p) · GW(p)
I think you're right that many of the relevant empirical facts will be about your preferences. At risk of repeating myself, though, there are other facts that matter, like whether ems are conscious, how much it costs to prevent torture, and what better things we could be directing our efforts towards.
To partially answer your question ("how much effort is it worth to prevent the torture of ems?"): I sure do want torture to not happen, unless I'm hugely wrong about my preferences. So if preventing em torture turns out to not be worth a lot of effort, I predict it's because there are other bad things that can be more efficiently prevented with our efforts.
But I'm still not sure how you wanted your question interpreted. Are you, for example, wondering whether you care about ems as much as non-em people? Or whether you care about torture at all? Or whether the best strategy requires putting our efforts somewhere else, given that you care about torture and ems?
Replies from: Vaniver
↑ comment by Vaniver · 2011-07-01T06:11:45.198Z · LW(p) · GW(p)
I suppose I will go with statements rather than a question: I suspect the returns to caring about ems are low; I suspect that defining, let alone preventing, torture of ems will be practically difficult or impossible; and I suspect that value systems that simply seek to minimize pain are poor value systems.
Replies from: VNKKET
↑ comment by VNKKET · 2011-07-02T06:24:37.283Z · LW(p) · GW(p)
I suspect that value systems that simply seek to minimize pain are poor value systems.
Fair enough, as long as you're not presupposing that our value systems -- which are probably better than "minimize pain" -- are unlikely to have strong anti-torture preferences.
As for the other two points: you might have already argued for them somewhere else, but if not, feel free to say more here. It's at least obvious that anti-em-torture is harder to enforce, but are you thinking it's also probably too hard to even know whether a computation creates a person being tortured? Or that our notion of torture is probably confused with respect to ems (and possibly with respect to us animals too)?
Replies from: Vaniver
↑ comment by Vaniver · 2011-07-02T08:25:03.139Z · LW(p) · GW(p)
If you express the preferences in terms of tradeoffs, it does not seem likely that the preference against the torture of ems will or should be 'strong.'
Both. It seems difficult to define torture (and decide what tradeoffs are worthwhile), and even if you could define torture it seems like there is no torture-free way to determine whether or not particular code is torturous.