Brain-in-a-vat Trolley Question

post by nick012000 · 2012-12-30T03:22:51.335Z · LW · GW · Legacy · 20 comments


Just saw this on another forum. I figured I'd repost it here, since it'd be interesting to see you guys' answers to it.

Consider the following case:

On Twin Earth, a brain in a vat is at the wheel of a runaway trolley. There are only two options that the brain can take: the right side of the fork in the track or the left side of the fork. There is no way in sight of derailing or stopping the trolley and the brain is aware of this, for the brain knows trolleys. The brain is causally hooked up to the trolley such that the brain can determine the course which the trolley will take.

On the right side of the track there is a single railroad worker, Jones, who will definitely be killed if the brain steers the trolley to the right. If the railman on the right lives, he will go on to kill five men for the sake of killing them, but in doing so will inadvertently save the lives of thirty orphans (one of the five men he will kill is planning to destroy a bridge that the orphans' bus will be crossing later that night). One of the orphans that will be killed would have grown up to become a tyrant who would make good utilitarian men do bad things. Another of the orphans would grow up to become G.E.M. Anscombe, while a third would invent the pop-top can.

If the brain in the vat chooses the left side of the track, the trolley will definitely hit and kill a railman on the left side of the track, "Leftie," and will hit and destroy ten beating hearts on the track that could (and would) have been transplanted into ten patients in the local hospital who will die without donor hearts. These are the only hearts available, and the brain is aware of this, for the brain knows hearts. If the railman on the left side of the track lives, he too will kill five men, in fact the same five that the railman on the right would kill. However, "Leftie" will kill the five as an unintended consequence of saving ten men: he will inadvertently kill the five men rushing the ten hearts to the local hospital for transplantation. A further result of "Leftie's" act would be that the busload of orphans will be spared. Among the five men killed by "Leftie" are both the man responsible for putting the brain at the controls of the trolley and the author of this example. If the ten hearts and "Leftie" are killed by the trolley, the ten prospective heart-transplant patients will die, and their kidneys will be used to save the lives of twenty kidney-transplant patients, one of whom will grow up to cure cancer, and one of whom will grow up to be Hitler. There are other kidneys and dialysis machines available; however, the brain does not know kidneys, and this is not a factor.

Assume that the brain's choice, whatever it turns out to be, will serve as an example to other brains-in-vats and so the effects of his decision will be amplified. Also assume that if the brain chooses the right side of the fork, an unjust war free of war crimes will ensue, while if the brain chooses the left fork, a just war fraught with war crimes will result. Furthermore, there is an intermittently active Cartesian demon deceiving the brain in such a manner that the brain is never sure if it is being deceived.

QUESTION: What should the brain do?

[ALTERNATIVE EXAMPLE: Same as above, except the brain has had a commissurotomy, and the left half of the brain is a consequentialist and the right half is an absolutist.]

20 comments

Comments sorted by top scores.

comment by JonathanLivengood · 2012-12-30T07:04:43.573Z · LW(p) · GW(p)

Seems to me that curing cancer swamps out everything else in the story. Supposing that World War 2 was entirely down to Hitler, the casualties came to about 60–80 million. By comparison, back-of-the-envelope calculations suggest that around 1.7 million people die from cancer each year* in the U.S., E.U., and Japan taken together. See the CDC numbers and the Destatis numbers (via Google's public data explorer), which I used as a baseline for the 1.7 million figure.

That means that within a generation or two, the cancer guy would have saved as many lives (to speak with the vulgar) as the Hitler guy would have killed. Plus, the cancer guy would have improved quality of life for a lot more people than that. Maybe we have to go another couple of generations to balance life-years if the Hitler casualties are all young and the cancer savings are all old. But under the assumption that a solution to cancer is very unlikely without the cancer guy, the right decision seems clearly to be to steer the trolley left (see the quick arithmetic sketch after the footnote).

* Things get more complicated if we suppose that the Hitler guy will bring about a new world war and attempted genocide, which might involve full-on nuclear war, rather than a sort of repeat of real-Hitler's consequences. I am choosing to understand the Hitler guy as being responsible for 60 or 80 million deaths -- or make the number a bit larger, like 100 million, if you like.
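A minimal sketch of the break-even arithmetic, using only the comment's own ballpark figures (the 80 million death toll and the 1.7 million annual cancer deaths are the comment's estimates, not independently sourced data):

```python
# Rough break-even check for the argument above, using the comment's
# own ballpark figures (assumptions, not independently sourced data).
hitler_deaths = 80e6            # upper-end casualty estimate attributed to "the Hitler guy"
cancer_deaths_per_year = 1.7e6  # approx. annual cancer deaths in the U.S., E.U., and Japan

years_to_break_even = hitler_deaths / cancer_deaths_per_year
print(f"Lives saved catch up after about {years_to_break_even:.0f} years")
# -> about 47 years, i.e. within "a generation or two"
```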

Replies from: TrE
comment by TrE · 2012-12-30T08:21:35.122Z · LW(p) · GW(p)

It's also conceivable that, with his compelling story of kidney failure and a life saved by transplantation, Hitler gets admitted to art school and creates beautiful landscape paintings for the rest of his life. At the same time, the person who cures cancer may also accidentally create a virus that destroys every single living cell on Earth within one week of its accidental release into the environment. The only person surviving would be a brain-emulation prototype, which copies itself and rebuilds human society such that nobody dies or feels pain anymore.

Replies from: JonathanLivengood
comment by JonathanLivengood · 2012-12-30T20:05:25.928Z · LW(p) · GW(p)

Oh, the under-specification! ;)

comment by Desrtopa · 2012-12-30T15:52:59.625Z · LW(p) · GW(p)

I feel like criticizing this as a moral dilemma would be somewhat missing the point, so I'm just going to say that as a joke, it tries too hard.

Replies from: FiftyTwo
comment by FiftyTwo · 2012-12-30T20:34:15.798Z · LW(p) · GW(p)

I don't get it. Is the joke that philosophical thought experiments are sometimes overcomplicated? If so, that's not exactly a deep or controversial insight.

Replies from: gjm
comment by gjm · 2012-12-30T22:19:38.578Z · LW(p) · GW(p)

No, I think the joke is just "see how many philosophical thought-experiment cliches we can pack into a short space". Some people find that sort of thing funny, some don't.

Replies from: Armok_GoB
comment by Armok_GoB · 2013-01-16T18:15:48.940Z · LW(p) · GW(p)

Needs more extremely graphic descriptions of infinite torture.

comment by palladias · 2012-12-30T04:21:41.668Z · LW(p) · GW(p)

OK, if this is getting a thread, I really can't resist promoting the trolleyology t-shirt my roommates and I made. Ever wonder who's running all these ethically problematic trolleys? It's the Metaphysical Transit Authority!

comment by Richard_Kennaway · 2012-12-30T11:08:26.636Z · LW(p) · GW(p)

Original source.

Replies from: TrE
comment by TrE · 2012-12-30T23:13:16.693Z · LW(p) · GW(p)

They were the ones who put the 'troll' into 'trolley'.

comment by A1987dM (army1987) · 2012-12-30T10:26:15.964Z · LW(p) · GW(p)

Left, because that would kill the author of the example. :-) (Does this comment fall within the scope of the new censorship policy?)

Replies from: DanielLC
comment by DanielLC · 2012-12-30T18:28:47.943Z · LW(p) · GW(p)

If he goes right, he'll spare "Leftie", who will kill five men, including the author. If he goes left, he'll spare Jones, who will kill the same five men. The author dies either way.

Replies from: army1987
comment by A1987dM (army1987) · 2012-12-30T21:39:11.454Z · LW(p) · GW(p)

Right. Epic reading comprehension fail on my part. -_-'

Replies from: Fadeway
comment by Fadeway · 2013-01-01T13:58:21.070Z · LW(p) · GW(p)

Don't blame yourself; blame the author (which you kinda sorta did, but then didn't).

comment by PECOS-9 · 2012-12-30T03:55:11.928Z · LW(p) · GW(p)

I think this belongs in an open thread, not a discussion post.

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2012-12-30T11:04:00.535Z · LW(p) · GW(p)

This is a joke, right?

comment by [deleted] · 2012-12-30T03:40:54.849Z · LW(p) · GW(p)

This is stupidly contrived. Normally I try to actually answer the question and engage the problem at its heart, but this is too much. You won't be nerd-sniping me today.

i.e., flip a coin.

Replies from: pragmatist, army1987
comment by pragmatist · 2012-12-30T06:42:52.350Z · LW(p) · GW(p)

This is stupidly contrived.

That's the joke.

comment by A1987dM (army1987) · 2012-12-30T10:33:26.910Z · LW(p) · GW(p)

Least convenient possible world: if you try to make a non-deterministic choice, Omega kills the discoverer of the cure for cancer and the inventor of the pop-top can, kills 3^^^3 kittens, and an unjust war fraught with war crimes will ensue, on top of whatever else your decision causes as described in the scenario.