Do you think you are a Boltzmann brain? If not, why not?
post by Jack R (Jack Ryan) · 2021-10-15T06:02:30.083Z · LW · GW · 3 comments
This is a question post.
For more on Boltzmann brains, see here.
Answers
Yes, and also no.
That is, there are Boltzmann Brains that represent my current mental state, and there are also 'normal' universes containing 'normal' brains doing the same thing, and there are probably a bunch of other things too.
All of them are me.
↑ comment by simon · 2021-10-15T09:51:56.820Z · LW(p) · GW(p)
Even if the vast majority of entities with your current mental state are Boltzmann brains, you can only expect the mental operations that carry out the conclusion "and therefore I am likely a Boltzmann brain" to operate validly in the entities in which you are not, in fact, a Boltzmann brain. That operation, therefore, would only harm the accuracy of your beliefs.
↑ comment by Jack R (Jack Ryan) · 2021-10-15T10:46:55.902Z · LW(p) · GW(p)
Why can't you have Boltzmann brains-that-carry-out-that-operation?
↑ comment by Flaglandbase · 2021-10-15T12:01:49.771Z · LW(p) · GW(p)
Because most Boltzmann brains are so ephemeral they would instantaneously collapse.
↑ comment by Jack R (Jack Ryan) · 2021-10-15T22:57:19.189Z · LW(p) · GW(p)
Agreed, I think that's a good reason. It's related to the reason I don't think I am a Boltzmann brain--most Boltzmann brains don't have the memory that they exist due to evolutionary processes, since brains with that memory are an extremely small fraction of all possible Boltzmann brains. And so it seems like the simplest explanation for me having that memory is that evolution actually happened (since the B-brain explanation is kind of wild). Though I haven't thought super carefully about this and would like to hear others' thoughts.
↑ comment by avturchin · 2021-10-15T10:50:13.560Z · LW(p) · GW(p)
If there are no real worlds, but only BBs all along, this argument doesn't work.
However, it is still not a big problem, as Dust theory still works, and for any BB there will be another BB which represents its next mental state. So from the inside it will look like a normal world. Mueller wrote a mathematical formalism for this.
↑ comment by rosyatrandom · 2021-10-18T09:06:43.851Z · LW(p) · GW(p)
Oh yes, 'real' is a fuzzy concept once you allow Boltzmann/Dust approaches. Things just... are, and can be represented by other things that also just are...
↑ comment by avturchin · 2021-10-18T12:10:45.490Z · LW(p) · GW(p)
In his article, Mueller says that no physics exists at all. Only the mathematical world exists, and dust minds are just random strings of digits.
↑ comment by rosyatrandom · 2021-10-19T10:22:08.694Z · LW(p) · GW(p)
And strings/digits themselves are just a bunch of bits in fancy clothes.
At some point, years ago, I decided that reality was basically just 'nothing', endlessly abstracted, and what can you do? :_D
↑ comment by Jack R (Jack Ryan) · 2021-10-15T09:08:13.935Z · LW(p) · GW(p)
Do you think that "most" of you are Boltzmann brains?
↑ comment by rosyatrandom · 2021-10-18T09:07:20.580Z · LW(p) · GW(p)
I'm not sure we're dealing with quantifiable abstractions here
↑ comment by Jack R (Jack Ryan) · 2021-10-18T22:53:23.370Z · LW(p) · GW(p)
I also had this thought, though I'm not sure--what kind of abstractions are we talking about?
↑ comment by rosyatrandom · 2021-10-19T10:25:27.831Z · LW(p) · GW(p)
There's probably only one kind of fundamental abstraction: can A represent B if you squint real hard? Can 'nothing' represent 'something'*? If so, perhaps that's all you need to get 'everything'.
* Like how you can build numbers up from the empty set: https://en.wikipedia.org/wiki/Set-theoretic_definition_of_natural_numbers
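For concreteness, here is a minimal sketch of that construction (the von Neumann encoding) in Python; the names `zero` and `succ` are just illustrative choices, not anything from a library.

```python
# Von Neumann naturals: each number is the set of all smaller numbers,
# built up from the empty set.

def succ(n: frozenset) -> frozenset:
    """Successor: n + 1 is defined as n ∪ {n}."""
    return n | frozenset({n})

zero = frozenset()    # 0 = {}
one = succ(zero)      # 1 = {0} = {{}}
two = succ(one)       # 2 = {0, 1}
three = succ(two)     # 3 = {0, 1, 2}

# A number's "value" is its element count, and order is set membership.
assert len(three) == 3
assert one in two and two in three
```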
No, because that's a meaningless claim about external reality. The only meaningful claims in this context are predictions.
"Do you expect to see chaos, or a well formed world like you recall seeing in the past, and why?"
The latter. Ultimately that gets grounded in Occam's razor and Solomonoff induction making the latter simpler.
I basically still endorse this, but have shifted even more in the direction of endorsing the simplicity prior: https://www.lesswrong.com/posts/yzrXFWTAwEWaA7yv5/boltzmann-brains-and-within-model-vs-between-models
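As a toy illustration of that simplicity argument: under a Solomonoff-style prior, a hypothesis whose shortest description is K bits gets prior weight 2^-K. The K values below are invented for illustration, not real complexity estimates.

```python
import math

# Invented description lengths (bits): a lawful universe needs compact
# laws plus initial conditions; a Boltzmann brain needs the laws plus a
# bit-for-bit specification of one brain's entire (false) memory state.
K_ordered = 1_000
K_boltzmann = 1_000_000

# Prior odds = 2**(-K_ordered) / 2**(-K_boltzmann); compute in log space
# to avoid floating-point underflow.
log10_odds = (K_boltzmann - K_ordered) * math.log10(2)
print(f"Prior odds, ordered world : Boltzmann brain ≈ 10^{log10_odds:.0f} : 1")
```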
This is a question similar to "am I a butterfly dreaming that I am a man?". Neither statement connects to any other empirical or logical belief, or to any prediction about future experiences. Therefore, the questions and belief-propositions are in some sense meaningless. (I'm curious whether this is a theorem in some formalized belief structure.)
For example, there's an argument about B-brains that goes: simple fluctuations are vastly more likely than complex ones; therefore almost all B-brains that fluctuate into existence will exist for only a brief moment and will then chaotically dissolve in a kind of time-reverse of their fluctuating into existence.
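A standard back-of-the-envelope way to quantify "vastly more likely": in equilibrium statistical mechanics, the probability of a spontaneous fluctuation that decreases entropy by $\Delta S$ scales as

$$P(\Delta S) \propto e^{-\Delta S / k_B},$$

so a fluctuation large enough to assemble a long-lived brain, let alone an ordered universe, is exponentially suppressed relative to a minimal, momentary one.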
Should a B-brain expect a chaotic dissolution in its near future? No, because the very concepts of physics and thermodynamics that cause it to make such predictions are themselves the results of random fluctuations. It remembers reading arguments and seeing evidence for Boltzmann's entropy theorem, but those memories are false, the product of random fluctuations.
So a B-brain shouldn't expect anything at all (conditional on its being a B-brain). That means a belief in being a B-brain isn't something that can be tied to other beliefs and questioned.
No.
Mathy answer:
Because "thinking" is an ability that implies the ability to predict future states off the world based off of previous states of the world. This is only possible because the past is lower entropy than the future and both are well below the maximum possible entropy. A Boltzman brain (on average) arises in a maximally entropic thermal bath, so "thinking" isn't a meaningful activity a Boltzman brain can engage in.
Non-mathy answer:
Unlike the majority of LW readers, I don't buy into the MWI or mathematical realism, or generally any exotic theory that allows for super-low-probability events. The universe was created by a higher power, has a beginning, a middle, and an end, and the odds of a Boltzmann brain arising in that universe are basically zero.
In addition to what DanArmak said:
Even if you, in the moment, do not have good reason to be confident that you are not a Boltzmann brain, you do have much better reason to believe that any entity you create in the future is not a Boltzmann brain.
If you wish to improve the accuracy of that entity's beliefs, you can do so by instilling that entity with a low prior of being a Boltzmann brain.
Among the entities you will create in the future is your own future self.
No, I don't. I think the argument for their existence is pretty weak at best, and if they exist and are common, so what? It's the sort of hypothesis for which no possible evidence can be given for or against and no action can be taken in any event.
Even given the (in my opinion pretty unlikely) hypotheses of their existence and ubiquity, what's the point of considering whether you're one of them? Such "observers", stretching the term to cover entities that are almost certainly unable to form thoughts, lack any consistent memories, and hallucinate through a mean lifetime of less than a millisecond, can't do anything about it anyway.
3 comments
Comments sorted by top scores.
comment by Shmi (shminux) · 2021-10-15T07:48:23.635Z · LW(p) · GW(p)
Sean Carroll talked about this just recently, in the context of Bayesianism: https://www.preposterousuniverse.com/podcast/2021/09/16/ama-september-2021/
It's at around 2:11:08, or Ctrl-F in the transcript.
comment by Viliam · 2021-10-17T19:37:07.076Z · LW(p) · GW(p)
If I am a Boltzmann brain, and I guess correctly, what do I gain?
If I am not a Boltzmann brain, and I guess incorrectly, what do I lose?
↑ comment by Jack R (Jack Ryan) · 2021-10-17T21:02:17.648Z · LW(p) · GW(p)
In general, I think it does matter what you think "actually exists" even outside of what you can observe. For instance, to me it seems like your beliefs about what "actually exists" would affect how you acausally trade, but I haven't thought about this much.