# Torture Simulated with Flipbooks

post by Amanojack · 2011-05-26T01:00:12.649Z · score: 9 (21 votes) · LW · GW · Legacy · 33 comments

What if the brain of the person you most care about were scanned, and the entirety of that person's mind and utility function at this moment were printed out on paper, and then several more "clock ticks" of their mind, its states changing exactly as they would if the person were being horribly tortured, were printed out as well, into a gigantic book? And then the book were flipped through, over and over again. Fl-l-l-l-liiiiip! Fl-l-l-l-liiiiip!

Would this count as simulated torture? If so, would you care about stopping it, or is it different from computer-simulated torture?

comment by MinibearRex · 2011-05-26T01:29:53.682Z · score: 20 (22 votes) · LW(p) · GW(p)

Consider a slightly different thought experiment. Suppose some government, as it tortured a prisoner, used a brain scanner to take a series of, say, 200 pictures of that person's brain, spaced one second apart. Those pictures are then printed out, and they can be "flipped through" in the manner you describe. But suppose instead that you simply took the pictures, stacked them, and put them in a box in a warehouse somewhere. Does that count as simulated torture? Even better, what if you took the entire box up in an airplane and scattered the pictures out the window, so that the wind blew them all over the world? What's the difference here?

While flipping through the printouts creates the illusion of seeing something happen, it is not a simulation. When you watch a movie on DVD, you are not watching some alternate universe where the movie's events are actually occurring, with the photons transmitted into your own universe to be projected out of your TV. It's an optical illusion. Nothing about "flipping" the printouts does anything special, other than create an illusion within your own brain. You are not going to be able to absorb all the data on the sheets anyway (at least not at regular flipping speeds), so there is no chance of your own brain being used as the computing substrate.

Rather, the actual simulated torture in this scenario would be the actual calculations to generate the printouts. That does count as torture, and I would pay to stop it. But once the calculations have been done, it doesn't matter how many pieces of paper come out of the printer.

comment by jasonmcdowell · 2011-05-26T02:57:53.361Z · score: 4 (4 votes) · LW(p) · GW(p)

What you've said makes sense to me, that the flipbooks do not constitute a calculation. However, it feels like there is a fuzzy boundary somewhere nearby, similar to the fuzzy boundary of what constitutes life. Maybe there is an information-theory explanation which relates the two.

If the flipbooks contain enough information to continue the calculation, then they are the same as a backup. OK, so a flipbook is a series of closely spaced backups. What constitutes a calculation? I've read about these things, but I've never tried to work it out for myself before.

A backup is a static result of a calculation. Static results are static. They don't count as alive, they don't count as a calculation.

What counts as a calculation? I'm getting stuck. Let's say we do the calculation as a state machine. You have static states that are updated according to certain rules. State 1 determines/causes state 2. The calculation is implemented somewhere. So there are patterns of matter/energy that represent the states and represent the arithmetic needed to change states. I guess the calculation is here?
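The state-machine picture above can be sketched in a few lines of Python (a toy illustration with a made-up update rule, not a model of any real physics): the update rule plays the role of the "arithmetic needed to change states", and the recorded list of states is exactly the flipbook, i.e. a series of closely spaced backups.

```python
# A toy deterministic state machine: each static state is updated by a
# fixed rule, so state 1 determines state 2, which determines state 3, ...
def step(state):
    """Update rule: an arbitrary stand-in for 'the laws of physics'."""
    return (state * 3 + 1) % 17

def run(initial, ticks):
    """Run the calculation, recording every intermediate state.
    The returned list is the 'flipbook': a series of closely spaced
    backups, each one sufficient to resume the calculation."""
    states = [initial]
    for _ in range(ticks):
        states.append(step(states[-1]))
    return states

flipbook = run(2, 5)
# Resuming from any page of the flipbook reproduces the rest of the run,
# which is what makes each page a backup rather than a mere picture.
resumed = run(flipbook[2], 3)
assert resumed == flipbook[2:]
```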

comment by DanielLC · 2011-05-26T16:22:05.513Z · score: 3 (3 votes) · LW(p) · GW(p)

It can't be that it's static. Time doesn't exist, at least, not as a basic part of physics.

Still, it seems like the universe is doing the calculation. After all, where else would the output come from?

This makes me wonder, if we were in a universe exactly like this one, except that the laws of physics specified everything exactly, and it matching this universe was a total coincidence, would people have subjective experience?

comment by endoself · 2011-05-26T17:11:15.030Z · score: 1 (1 votes) · LW(p) · GW(p)

This makes me wonder, if we were in a universe exactly like this one, except that the laws of physics specified everything exactly, and it matching this universe was a total coincidence, would people have subjective experience?

The problem here is the 'total coincidence'. This is analogous to watching a video of someone being tortured that was randomly generated. No one is being harmed, and it only seems like it because of a massive coincidence. There is still enough data to specify the brain state in your scenario, so, given our current knowledge about the brain, it is more likely to have conscious experience than the videotape one. Even with the naive concept of time, it would be very hard to define what constitutes a calculation, and it looks even harder without one.

comment by hairyfigment · 2011-05-27T19:59:53.567Z · score: 0 (0 votes) · LW(p) · GW(p)

What does it mean, "specified everything exactly"?

This sounds like a slightly modified (at most) version of timeless physics. A function would deterministically assign arrows to each N-dimensional set of coordinates without even looking at its neighborhood, and this function just happens to define an N+2D surface. Points on the surface would still show the relationships we call causality, and by assumption they still exhibit the functional equivalent of our consciousness.

If you mean Many-Worlds does not apply to that reality, well, we don't know for sure that our reality doesn't work by Bohmian mechanics. Maybe we should give this a larger probability. In that case I don't see myself changing my own probability of having subjective experience.

comment by jasonmcdowell · 2011-05-26T03:20:26.143Z · score: 2 (2 votes) · LW(p) · GW(p)

Other points that tickle my mind:

1. The uniqueness of a calculation matters. Running the same program twice doesn't give you a new result.

2. Does cause and effect (and representation of state) really matter that much? (Dust theory). My answer: still confused.

As a whole, a pattern of behavior of matter/energy can be called a calculation when State 1 causes State 2. When this happens, we can at least point to the calculation. With dust, states do not cause other states, and states can have different representations.

Right now (for at least the next minute) I don't think calculations exist. There must be some kind of illusion here. Related stuff: timeless physics, static states, causality, consciousness, memory. Memory is static. Consciousness is dynamic. Flipbook pages are static. Calculations are dynamic.

comment by MinibearRex · 2011-05-27T03:09:39.131Z · score: 1 (1 votes) · LW(p) · GW(p)

The uniqueness of a calculation matters. Running the same program twice doesn't give you a new result.

Running the program a second time is definitely an ethical violation. It would be analogous to me torturing you for an hour, wiping your memories of the past hour, and then torturing you for another hour. Alternatively, if I torture "you" in this universe, and then pop on over to the adjacent universe, it's no less of a crime for me to torture your counterpart.

comment by Icelus · 2011-06-07T07:47:15.600Z · score: 0 (0 votes) · LW(p) · GW(p)

However, it feels like there is a fuzzy boundary somewhere nearby, similar to the fuzzy boundary of what constitutes life. Maybe there is a information theory explanation which relates the two.

You might find it useful thinking about computations in terms of turing machines and the tape they use: http://lesswrong.com/r/discussion/lw/5vx/torture_simulated_with_flipbooks/4b7p

comment by atucker · 2011-05-26T22:54:12.845Z · score: 0 (0 votes) · LW(p) · GW(p)

I think that the algorithm used to compute the brain states is also important.

How about a different thought experiment?

A computer program is computing pi, and stumbles upon a stream of numbers which happen to perfectly describe the brain state of a person being tortured for 3 seconds. The program is doing no neural simulation on any level; it's just happening across this sequence. Did torture happen?

The computer is doing calculations to reach the brain-state, but the calculations have nothing to do with torture.

(Another example: a computer computes $a_{n+1} = a_n + a_{n-1}$ with $a_0 = 2$, $a_1 = 4$, and stumbles across the beginning of the sequence $a_n = 2n$, since they both begin 2, 4, 6.)
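The coincidence in the parenthetical is easy to check directly (a throwaway sketch; the function names are just for illustration): the two sequences share the prefix 2, 4, 6 and then diverge.

```python
# Recurrence: a(n+1) = a(n) + a(n-1), with a(0) = 2, a(1) = 4.
def recurrence(n_terms):
    a = [2, 4]
    while len(a) < n_terms:
        a.append(a[-1] + a[-2])
    return a[:n_terms]

# The even-numbers sequence a(n) = 2n, starting from n = 1.
def evens(n_terms):
    return [2 * n for n in range(1, n_terms + 1)]

print(recurrence(5))  # [2, 4, 6, 10, 16]
print(evens(5))       # [2, 4, 6, 8, 10]
# The shared prefix 2, 4, 6 is a coincidence of the two rules; the
# recurrence "stumbles across" the start of the evens without ever
# computing anything about even numbers.
```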

comment by MinibearRex · 2011-05-27T03:16:52.644Z · score: -1 (1 votes) · LW(p) · GW(p)

A computer program is computing pi, and stumbles upon a stream of numbers which happen to perfectly describe the brain state of a person being tortured for 3 seconds. The program is doing no neural simulation on any level; it's just happening across this sequence. Did torture happen? The computer is doing calculations to reach the brain-state, but the calculations have nothing to do with torture.

I doubt it. Mind processes aren't static. A person who's been frozen isn't consciously feeling that they are frozen. They just aren't feeling. In the same way, a picture of someone is not a trapped version of them, and a recording of a tortured person's brain state isn't a tortured person itself.

Those numbers are just an output of a calculation, but there's nothing special about the order. The only way that the sequence of digits in pi could "perfectly describe" the brain state is if there is someone to interpret it as such. But there are numbers all around us. There are seven drawers on my desk. There are nine pieces of visual art in this room. Why couldn't I just interpret those numbers in such a way to describe a tortured person? The actual torture would occur if, as you were looking at the sequence of numbers, you fed them into a simulator of a human brain, and ran the simulation from there.

comment by hairyfigment · 2011-05-27T20:08:18.970Z · score: 0 (0 votes) · LW(p) · GW(p)

I think I agree with all but the last sentence. But I definitely would not feel comfortable creating more physical objects that might exhibit probabilistic causality if you knew their source, and that perfectly describe a person feeling torture.

I think I could justify drawing a line between this and the process of flipping, despite the flips creating patterns of photons, because the fact that papers in a box have different patterns of ink on them should have similar effects on the world around them.

comment by atucker · 2011-05-26T22:33:21.461Z · score: 0 (0 votes) · LW(p) · GW(p)

I pretty much agree with everything said here.


comment by Zack_M_Davis · 2011-05-26T01:21:53.596Z · score: 15 (23 votes) · LW(p) · GW(p)

I'm starting to sympathize with PlaidX's complaint. If what you really want to ask is, "Could a flipbook be conscious?" then why not just say that? The torture is completely irrelevant.

comment by Vladimir_Nesov · 2011-05-26T02:29:37.015Z · score: 14 (18 votes) · LW(p) · GW(p)

Asking whether the simulation is morally relevant is putting the question as a decision problem, rather than as classification by a poorly specified concept.

comment by prase · 2011-05-26T14:21:09.838Z · score: 3 (3 votes) · LW(p) · GW(p)

But it is reasonable to expect that most decisions will be based solely on the poorly specified classification.

comment by jasonmcdowell · 2011-05-26T02:00:40.835Z · score: 3 (5 votes) · LW(p) · GW(p)

I'm reminded of a story in Orion's Arm where a superintelligence is simulated with pencil and paper. This depiction isn't a flipbook, of course. In the story, a bunch of volunteer baseline humans carried out the algorithm of a superintelligence, doing the arithmetic by hand on pieces of paper. They did it as a hobby.

After searching for a while, I found the story.

comment by wedrifid · 2011-05-26T09:25:26.619Z · score: 13 (15 votes) · LW(p) · GW(p)

Would this count as simulated torture? If so, would you care about stopping it, or is it different from computer-simulated torture?

Cute idea!

The computations involved in producing the flipbook count as a torture sim. Flipping through the flipbooks just counts as some sort of twisted torture-porn fetish.

comment by Icelus · 2011-06-07T07:18:23.613Z · score: 0 (0 votes) · LW(p) · GW(p)

Agreed. Actual computations have to be performed, and I think a useful mental model is of a Turing machine and tape, and of figuring out what in a situation is part of each.

comment by [deleted] · 2011-05-26T08:33:23.856Z · score: 7 (7 votes) · LW(p) · GW(p)

.

comment by jfm · 2011-05-27T15:56:07.428Z · score: 0 (0 votes) · LW(p) · GW(p)

It makes me think of "Poor little clams, snap, snap, snap".

comment by Armok_GoB · 2011-05-26T09:10:50.083Z · score: 5 (5 votes) · LW(p) · GW(p)

My instant, cached response is "Obviously not, it's no different from just letting it lie there unflipped; it's not computation, it's not a simulation, it's not conscious." ... And that's true, as an answer to the question "is this torture".

BUT, reading the comments and thinking about it a bit more, together with the notion that every computation exists and that measure is what counts... And that one very likely candidate for what measure is, is the K-complexity of outputting that mind/experience by specifying the laws of physics + a position within that universe where the mind happens to be... It suddenly seems not as implausible that by acting as POINTERS to FINDING the mind's computation, either by looking at the pages or by the photons going all over the universe when it's flipped, this act could indeed increase the measure of the tortured brain state, and thus the decision problem might be that the book flipping is indeed a bad thing.

This actually sounds very plausible, but it requires a conjunction of several assumptions to work... Still, depending on the amount of flipping involved and the cost, the remaining probability is enough to reasonably shift decisions.

If this is true, it leads to all sorts of interesting counter-intuitive implications for what kind of physical implementations have which amount of moral weight. At the most extreme end of possibility things like simulating a mind on a Babbage engine made from some rare mix of heavy metals not found in nature might be of greater moral weight than a billion minds running on an opaque architecture implemented in carbon. And I don't consider that a reductio, just immensely counter-intuitive. (and that was an extreme example meant to shock for demonstration, it's unlikely to be THAT bad)

This post made me actually THINK. Upvoted.

comment by ata · 2011-05-26T06:07:15.842Z · score: 3 (5 votes) · LW(p) · GW(p)

Flipping through a book doesn't repeat the computation. I'd say that the process that was used to generate the tortured brain states was the conscious part of this scenario.

comment by ArisKatsaris · 2011-05-26T09:36:02.685Z · score: 4 (4 votes) · LW(p) · GW(p)

If I think to myself 4*5 = 20, does this fail to "repeat the calculation" because I have it cached in my brain instead of having to calculate 5+5+5+5 = 10+5+5 = 15+5 = 20?

Does this mean that if a computer has some values cached instead of physically needing to bang together particles every single time in order to measure the results, this will likewise fail to repeat the "computation"?
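The cached-versus-recomputed distinction ArisKatsaris is pointing at is the same one memoization makes explicit in software. A minimal Python sketch (the counter is only there to show which path actually ran):

```python
import functools

call_count = 0

@functools.lru_cache(maxsize=None)
def multiply(a, b):
    """Compute a * b by repeated addition -- the 'banging particles
    together' path. Cached calls skip this body entirely."""
    global call_count
    call_count += 1
    total = 0
    for _ in range(b):
        total += a
    return total

multiply(5, 4)     # first call: the addition loop actually runs
multiply(5, 4)     # second call: answered from cache, no computation
print(call_count)  # 1 -- the repeated-addition work happened exactly once
```

Whether a cache lookup "counts" as re-running the computation is exactly the question at issue; the code only shows that the two paths are physically different processes.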

comment by Icelus · 2011-06-07T07:38:44.982Z · score: 0 (0 votes) · LW(p) · GW(p)

Yes, it fails to repeat the computation, simply because there is no machine doing active computation.

Although whether it's okay to use cached values to make a person in a sim think they were tortured is a moral quandary to me. Highly relevant is this LW post linked to elsewhere in this thread here.

comment by jasonmcdowell · 2011-05-26T01:51:40.838Z · score: 3 (7 votes) · LW(p) · GW(p)

I'd say the torture happened once. Even if you make more flipbooks and it changes the measure of the subjective experience, there is only one unique experience. The experience doesn't know if it happened before.

Once the system is closed, I'd think it is morally the same for the experience to be simulated once or many times.

You're no more torturing them again than you are killing them again and again when the flipbook finishes its calculation.

comment by DanielVarga · 2011-05-27T00:43:30.148Z · score: 1 (1 votes) · LW(p) · GW(p)

I happen to agree with you 100%, but let me note that this line of reasoning has some strange conclusions. It implies that torturing one computer-simulated consciousness is the same as torturing 100 clones of him at the same time in the same way. But when one of the simulations has an accidental bit-flip due to hardware error, it is not the same anymore. Similarly, if you torture 100 different computer-simulated consciousnesses by a deterministic process, but during the simulation two of them become identical, it means that now there are only 99 people being tortured.

comment by Icelus · 2011-06-07T07:25:47.246Z · score: 0 (0 votes) · LW(p) · GW(p)

I'm undecided on how to treat running the exact same torture sim (say, as a flipbook of instructions), but I'm leaning towards it being increasingly morally worse the more times one runs the simulation, because of one thing that sticks out to me: if you complete the torture sim and then ask the person in the sim whether they think they're a person, whether they think it's okay to torture them because they're a copy, etc., they're going to have every reason/argument a human in meatspace has against torture being done to them.

comment by Pavitra · 2011-05-27T21:33:58.004Z · score: 1 (1 votes) · LW(p) · GW(p)

This seems to reduce to the question of the moral status of independent identical copies.

comment by Icelus · 2011-06-07T07:16:40.956Z · score: 0 (0 votes) · LW(p) · GW(p)

I was thinking about this for a while and I think I have an insight. A good way to think about computation is to just go with the model of a Turing machine. (I don't know if this includes all kinds of "simulations", since it seems people are still arguing pretty heavily about whether or not the universe, or an individual section of the universe, is representable by a Turing machine, and I don't have the expertise/skill/knowledge/experience/etc. to know one way or the other.)

Though, assuming Turing machines are okay, I think it's important to distinguish between what is (and what is part of) the Turing machine and what is (and what is part of) the tape (or memory, etc.).

A person reading detailed instructions of a torture simulation has that person's brain as the Turing machine, and the instructions are the tape.

In this xkcd, where a universe is simulated on rocks, the tape is the rocks and the Turing machine isn't specified. Some type of omega-level observer would be necessary to 'run' the computation.

Is tape without a Turing machine a computation? I'd say certainly not.
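The machine/tape split can be made concrete with a minimal Turing machine in Python (a toy one-state bit-flipper, obviously nothing like a brain simulation): the transition table is the machine, the list is the tape, and the computation is the process of applying one to the other. Neither the table nor the tape, sitting in a box, is a computation by itself.

```python
# A minimal Turing machine. The transition table (the "machine") maps
# (state, symbol) -> (symbol to write, head movement, next state).
# This one flips bits until it reads a blank cell, then halts.
TABLE = {
    ("flip", 0): (1, +1, "flip"),
    ("flip", 1): (0, +1, "flip"),
    ("flip", None): (None, 0, "halt"),
}

def run(tape, state="flip", head=0):
    """Apply the table to the tape until the machine halts.
    This loop -- not the table, not the tape -- is the computation."""
    tape = list(tape) + [None]  # None marks the blank cell at the end
    while state != "halt":
        symbol, move, state = TABLE[(state, tape[head])]
        tape[head] = symbol
        head += move
    return tape[:-1]

print(run([1, 0, 1, 1]))  # [0, 1, 0, 0]
```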

I would say just having a stack of instructions (say, in a flipbook) isn't morally wrong, but that it depends highly on how likely the computation is to be run. Say there are a bunch of Roomba-like Turing machines that start running whatever they come across; it would then be very morally bad to leave around a bunch of torture-sim instruction flipbooks, since that sim would probably get run quite a lot.

Although this all assumes that answers to some pretty tough problems have been found, such as nailing down what a tortured mind consists of, and gray areas.

Gray areas like how much fidelity is required: one could design a torture sim but only run the computation at each 10-second jump, so only a fraction is actually computed, yet something 'torture-like' is going on.

I'll reply to this if I think of any others.

comment by ArisKatsaris · 2011-05-26T09:30:31.275Z · score: 0 (2 votes) · LW(p) · GW(p)

Would this count as simulated torture?

Given the way I understand the meaning of "simulation", yes.

If so, would you care about stopping it, or is it different from computer-simulated torture?

Neither. People need to stop assuming that classifying a set of calculations as a "simulation" means the simulations must automatically experience qualia.

comment by XiXiDu · 2011-05-26T09:24:53.934Z · score: 0 (0 votes) · LW(p) · GW(p)

If we assume that computation is an illusion, then we would have to annihilate or alter the actual pattern that we dislike. But does that even make sense?

Relevant Less Wrong posts:

I request a post that explains what constitutes a computation.

comment by Thomas · 2011-05-26T08:57:29.019Z · score: 0 (0 votes) · LW(p) · GW(p)

It could be your copy. Would you feel the pain then? Of course, this would be stored for you in the future. Your copy is always you and will always be you. Substrate does not matter much.

comment by Nic_Smith · 2011-05-26T02:30:32.260Z · score: -1 (1 votes) · LW(p) · GW(p)

Is this not essentially the Einstein's-brain-as-a-book problem?