Does quantum mechanics make simulations negligible?

post by homunq · 2011-08-13T01:53:17.622Z · LW · GW · Legacy · 16 comments


I've written a prior post about how I think that the Everett branching factor of reality dominates that of any plausible simulation, whether the latter is run on a von Neumann machine, on a quantum machine, or on some hybrid; and thus that the probability and utility weight that should be assigned to simulations in general is negligible. I also argued that the fact that we live in an apparently quantum-branching world could be construed as weak anthropic evidence for this idea. My prior post was down-modded into oblivion for reasons that are not relevant here (style, etc.). If I were to replace this text you're reading with a version of that idea that was more fully argued, but still stylistically neutral (unlike my prior post), would people be interested?

 

16 comments


comment by wedrifid · 2011-08-13T02:08:18.578Z · LW(p) · GW(p)

I've written a prior post about how I think that the Everett branching factor of reality dominates that of any plausible simulation, whether the latter is run on a von Neumann machine, on a quantum machine, or on some hybrid; and thus that the probability and utility weight that should be assigned to simulations in general is negligible.

I think this is mistaken. Quantum mechanics adds up to what we think of as normal in this case. The simulations split into Everett branches the same way the non-simulations split. It makes basically no difference.

Replies from: homunq
comment by homunq · 2011-08-13T15:05:36.753Z · LW(p) · GW(p)

Yes, the simulations split, but those splits tend either to give an identical in-simulation result or, in rare cases, to break the simulation altogether. That is, one necessary precondition for my idea is that the "felt measure" of being in a million identical copies of a simulation is identical to that of being in just one; and that this is not true for non-identical copies, even ones with apparently-trivial differences.

A quantum computer, or a hybrid of a conventional computer with a quantum source of entropy (random number generator), would be splitting in a true sense. However, the number of quantum bits which were effectively splitting the sim without breaking it, and thereby were multiplying the measure of the sim, would still be vastly smaller than the number of quantum bits on which "reality" depends, and which are thereby constantly multiplying the measure of "reality".

I also give some credence to the Tegmark Level IV (computable/countable) multiverse. In that sense, there are certainly an uncountable number of "seeds" for any given isomorphism class of universes, including our own Level III multiverse. Some of these may include an additional level of "simulators" which are "outside" our Level III multiverse; possibly even some "intelligent" simulators who have deliberately built the simulation in some sense. However, those "two-level" (or more) seeds are, in all probability, much rarer than seeds which simply encode the fundamental laws of our own, "simple", Level III multiverse. And when I say much rarer, I mean that, if they exist at all, the atoms of our visible universe would probably not suffice to express how much rarer in conventional notation.

comment by [deleted] · 2011-08-13T02:08:06.475Z · LW(p) · GW(p)

Well, to be frank, the idea doesn't sound very interesting to me. Yes, on a Turing machine, the universe would take a very long time to simulate. Perhaps our simulators are very patient, or perhaps they're using something more powerful than Turing machines.

It's possible I'm totally misunderstanding you, though. I know what Everett branching is, but not what the associated "factor" is, nor do I know what it would mean for one of these factors to dominate another.

Edit for typo.

Replies from: homunq
comment by homunq · 2011-08-13T15:05:33.926Z · LW(p) · GW(p)

As an analogy, imagine two bacteria. One of them represents a "sim" and the other represents "reality". Every time a quantum branching - an effectively-irrevocable "collapsed wave function" - affects one of them, it reproduces, splitting into two or more Everett branches. (Also, every time one of the realities ends up containing a non-identical sim, you get a new sim; but this is a trivial correction.) My argument is that the "branching factor" or "rate of reproduction" will be so massively bigger for reality than for any possible sim embedded in it that it would be non-trivial even to calculate or express the numbers involved; the toy sketch below gives a feel for why.
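As a toy illustration (my own, with entirely made-up branching rates; nothing here is a physical estimate): if reality undergoes many more effective branching events per unit time than a sim it contains, the ratio of their measures grows doubly exponentially.

```python
from math import log10

# Made-up numbers, purely for illustration:
b_real = 10**40  # effective branching events per tick for "reality"
b_sim = 10**6    # effective branching events per tick for an embedded sim
ticks = 100      # how long both have been running

# If each branching event doubles the branch count, a tick multiplies it
# by 2**b, so the measure ratio after `ticks` ticks is
# 2**((b_real - b_sim) * ticks). Work with its log10 to keep it finite:
log10_ratio = (b_real - b_sim) * ticks * log10(2)
print(f"measure ratio ~ 10^{log10_ratio:.3g}")  # measure ratio ~ 10^3.01e+41
```

Even the exponent of the ratio needs scientific notation to write down, which is the sense in which the numbers are "non-trivial to express".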

I am not arguing that there is not some possible hyper-reality where simulations may be more powerful. Although I think that there may be reasons to believe that the measure of that hyper-reality is negligible, that is a separate argument.

comment by homunq · 2011-08-13T14:40:51.032Z · LW(p) · GW(p)

It's interesting that both of the comments so far seem to be discounting the idea based on frankly obvious arguments. That is, people think of a counterargument and stop there. To me, the counter-counter-arguments are about equally obvious; and even if they were not, it would be wise to consider the possibility that they exist.

I'm not arguing that I'm necessarily correct here (insofar as that's even knowable currently), just that, from my perspective, people seem to be dismissing my idea too lightly. The possibility of simulation is so much a part of Less Wrong culture that there is a vocabulary of over half a dozen terms for related sub-ideas. If its very foundations are questionable, that deserves proper discussion. And "your first draft was too short", insofar as it's contributing to my down-rating, seems especially poor reasoning.

Edit: as the discussion has progressed, I no longer think this observation holds, but I do think it was valid when I first made it.

Replies from: None
comment by [deleted] · 2011-08-13T16:46:25.763Z · LW(p) · GW(p)

Well, your reasoning appears to be, "A simulation of our universe would require vastly immense computational resources. Things that require vastly immense computational resources are vanishingly unlikely. Therefore, the existence of a simulation of our universe is vanishingly unlikely." I can't think of an argument for the second premise.

ETA: well, I can think of one argument: "A universe with vastly immense computational resources would have a very high Kolmogorov complexity." This is false, however: for example, Conway's Game of Life seeded with the bits of a normal number laid out in Z-order will compute everything that can be computed.
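For concreteness, here is a minimal sketch of what "seeded in Z-order" could mean (my construction; not necessarily the exact scheme the comment had in mind). The bits of the number are laid onto a 2D Life board by de-interleaving each bit's index into (x, y) coordinates:

```python
def morton_to_xy(i: int) -> tuple[int, int]:
    """De-interleave the bits of index i into (x, y) coordinates (Z-order)."""
    x = y = 0
    bit = 0
    while i:
        x |= (i & 1) << bit  # even-position bits of i become x
        i >>= 1
        y |= (i & 1) << bit  # odd-position bits of i become y
        i >>= 1
        bit += 1
    return x, y

def seed_grid(bits: str) -> set[tuple[int, int]]:
    """Live cells of a Life board, reading a bit string in Z-order."""
    return {morton_to_xy(i) for i, b in enumerate(bits) if b == "1"}

# The first few bits of some (hypothetically normal) number:
print(sorted(seed_grid("1101")))  # [(0, 0), (1, 0), (1, 1)]
```

Since a computable normal number (e.g. the binary Champernowne constant) has a short description, the whole seed - the Life rule plus the number - stays Kolmogorov-simple even though the board eventually runs every computation.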

Replies from: homunq
comment by homunq · 2011-08-13T17:29:35.566Z · LW(p) · GW(p)

Not just "vastly immense", but "on a fundamental level, more than exists in our universe, by a factor which is almost certainly greater than zero and whose natural scale is potentially vastly immense".

If you want to argue that the simulation is happening in a different universe, then by that same argument that is a universe with a lot more stuff going on in it overall than this one. So the question becomes: why, from an anthropic perspective, aren't we experiencing that one? That is admittedly a weak argument, because if both exist then SOMEONE would still be experiencing this one. But it carries about as much weight as the argument for the existence of that computationally-superior universe, which is basically "you can't prove it doesn't exist".

PS to respond to your edit regarding Kolmogorov complexity:

  1. This is beside the point, because my original argument is not about any possible simulating universe, but about a post-singularity simulation from inside our own universe or one with similar computing power.

  2. Of course it's easy to build something that computes everything computable. It's much harder to build something that computes more than the universe - including some "meta level" capable of referring to the universe but not capable of being referred to by it (where the "simulators" live) - but does NOT compute everything computable. The former is uninteresting because it does not increase the measure of any class; and I'd argue that the latter is indeed far more Kolmogorov-complex than just the (simplest members of the class of things isomorphic to the) universe.

Replies from: None
comment by [deleted] · 2011-08-13T17:41:10.478Z · LW(p) · GW(p)

I do believe that there's very little if any evidence for a simulated universe. The question is essentially, since we also have little evidence against a simulated universe, what's our prior for the idea?

Replies from: homunq
comment by homunq · 2011-08-13T17:50:37.645Z · LW(p) · GW(p)

I believe that even if this argument is fundamentally irresolvable on empirical grounds, that does not preclude an effective resolution on logical grounds. So I think that throwing up your hands and making it just an arbitrary matter of priors — if that was your intention — is premature.

Replies from: None
comment by [deleted] · 2011-08-13T20:46:14.343Z · LW(p) · GW(p)

Well, I have nothing more to say at the moment.

comment by Armok_GoB · 2011-08-16T10:20:01.629Z · LW(p) · GW(p)

If this is true, it has an interesting application: you could control the utility weight of a simulation nearly independently of its behaviour, by making it deterministic except for one large random-data input, and switching the source of that randomness between a pseudorandom number algorithm and quantum randomness. An obvious example would be an upload of you which is deterministic and non-branching when experiencing unpleasant things, and massively branching (much more so than your biological brain) when experiencing pleasant things. (A sketch of the mechanism follows.)
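A minimal sketch of that mechanism (the interface and names are my own invention, and os.urandom is only a stand-in; it is not a quantum entropy source):

```python
import os
import random

class EntropySource:
    """Feeds random bits to a sim. The backend decides whether those bits
    come from a PRNG (no extra Everett branching) or from quantum noise
    (each bit is a genuine split)."""

    def __init__(self, quantum: bool, seed: int = 0):
        self.quantum = quantum
        self._prng = random.Random(seed)

    def bits(self, n: int) -> int:
        if self.quantum:
            # Placeholder only: os.urandom is NOT quantum randomness;
            # it just marks where a quantum RNG would be attached.
            raw = int.from_bytes(os.urandom((n + 7) // 8), "big")
            return raw & ((1 << n) - 1)
        return self._prng.getrandbits(n)

# The sim reads bits through the same interface either way, so its visible
# behaviour is unchanged; only its rate of branching differs.
pleasant_phase = EntropySource(quantum=True)
unpleasant_phase = EntropySource(quantum=False)
```

The design point is that both streams are statistically indistinguishable from inside the sim, so the switch changes (on this view) the sim's measure without changing its experience.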

comment by Manfred · 2011-08-13T17:12:29.733Z · LW(p) · GW(p)

Do I think we're in a simulation? No. But though the reasons why a perfect simulation is possible aren't necessarily obvious, they are compelling.

Quantum mechanical computation depends on the energy splitting (energy difference) between different levels - different energy levels of a quantum mechanical system oscillate relative to each other, and bigger energy splittings mean faster relative oscillations, which means you can get more computation done. So if you want to simulate a quantum system perfectly in faster than real time, you just have to make a model that is higher-energy. The cool thing is that the model doesn't necessarily have to be arranged like the actual system: quantum computers designed to simulate chemical reactions can be just a line or grid of atoms linked by light - as long as the interactions between the atoms are proportional to the interactions in the modeled system, the computer works fine. This would allow, for example, a spatially 3D universe to be simulated within a 5D universe just by making the right connections. (The standard relation is sketched below.)
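The textbook relation behind "bigger splittings mean faster oscillations" (standard quantum mechanics, added here for context; the Margolus-Levitin theorem is the usual way to make "more energy, more computation" precise):

```latex
% A superposition of two energy eigenstates evolves as
\[
  |\psi(t)\rangle \;=\; c_1\, e^{-iE_1 t/\hbar}\,|E_1\rangle
                  \;+\; c_2\, e^{-iE_2 t/\hbar}\,|E_2\rangle,
\]
% so the relative phase between the levels oscillates at
\[
  \omega \;=\; \frac{\Delta E}{\hbar}, \qquad \Delta E = E_2 - E_1.
\]
% Margolus-Levitin: a system with mean energy E above its ground state
% passes through at most 2E/(\pi\hbar) orthogonal states per second, so
% scaling every splitting by a factor k runs the same dynamics k times
% faster.
```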

Actually, now that I think about it, that may not be the heart of your post - it may be speculation about "subjective experience" rather than the practicality of simulations, which would make it even worse than I'd first thought.

Replies from: homunq
comment by homunq · 2011-08-13T17:33:51.527Z · LW(p) · GW(p)

Yes, you could in principle create a dissimilar but isomorphic quantum system to simulate reality. My argument is that the real one will take less stuff to build, by a factor large enough that "stuff" can validly be taken to mean any of matter, energy, or negentropy.

Replies from: Manfred
comment by Manfred · 2011-08-13T19:50:48.201Z · LW(p) · GW(p)

Phew, I'm relieved your argument isn't something like "a simulation would by assumption be 'grainier' than a natural universe, and so it would 'split' less often, and so have less 'subjective experience.'"

As to it being a gigantic pain in the ass to simulate an entire universe - sure, and it's unlikely that we're in a simulation. But ignoring units is typically only done when even the exponent is huge, since 10^10^10 meters is 10^(10^10 - 3) kilometers, which is still pretty much 10^10^10. On the other hand, it should only take some well-designed nanotech to keep things running, which is a factor of 10^20 at the worst, which isn't a huge exponent. It's certainly more than we have in our universe, but it's well within what we could have if we had a few extra spatial dimensions or a different history of our vacuum energy or something.

Replies from: FeepingCreature, homunq
comment by FeepingCreature · 2011-08-17T00:51:15.324Z · LW(p) · GW(p)

The interesting question is: "Do universes exist with a higher computational capacity than ours? How much higher? Orders of magnitude higher? Degrees of infinity higher? Arbitrarily higher?"

comment by homunq · 2011-08-13T22:21:24.384Z · LW(p) · GW(p)

To clarify: I mean that a sim would be either "grainier" - not in any sense that would be detectable from inside, but just in the sense that it used some pseudorandom numbers as a proxy for quantum branching - or bigger in terms of stuff, or both (because there are plenty of orders of magnitude to spread between those options).

As to "well-designed nanotech" on the order of 10^20... that's vaguely plausible, but it's also plausible that that just wouldn't be able to handle the wide varieties of quantum entanglement that matter in the world we observe. Remember, even simple facts like "light travels in a straight line" are, at root, a result of quantum interference, conceivable as infinite numbers of Feynman diagrams. While it is certainly possible to create heuristics, perhaps even perfect algorithms, to reproduce any one quantum effect like that, I'm skeptical that you can just induct from there up to the quantum soup we swim in. So I'd still guess 10^(10^x) with x>=2 (note: I had said x=10 but on second thought it's probably either impossible or easier than that).