I need a protocol for dangerous or disconcerting ideas.
post by Fivehundred · 2015-07-12T01:58:51.257Z · LW · GW · Legacy · 156 comments
I have a talent for reasoning my way into terrifying and harmful conclusions. The first was modal realism as a fourteen-year-old. Of course I did not understand most of its consequences, but I disliked the fact that existence was infinite. It mildly depressed me for a few days. The next mistake was opening the door to solipsism and Brain-in-a-Vat arguments. This was so traumatic to me that I spent years in a manic depression. I could have been healed in a matter of minutes if I had talked to the right person or read the right arguments during that period, but I didn't.
LessWrong has been a breeding ground of existential crises for me. The Doomsday argument (which I thought up independently), ideas based on acausal trade (one example was already well known; one I invented myself), quantum immortality, the simulation argument, and finally my latest and worst epiphany: the potential horrible consequences of losing awareness of your reality under Dust Theory. I don't know that that's an accurate term for the problem, but it's the best I can think of.
This isn't to say that my problems were never solved; I often worked through them myself, always by refuting the horrible consequences of them to my own satisfaction and never through any sort of 'acceptance.' I don't think that my reactions are a consequence of an already depressed mind-state (which I certainly have anyway), because the moment I refute them I feel emotionally as if it never happened. It no longer wears on me. I have OCD, but if it's what's causing me to ruminate, then I think I prefer having it as opposed to irrational suppression of a rational problem. Finding solutions would have taken much longer if I hadn't been thinking about them constantly.
I've come to realize that this site, due to perhaps a confluence of problems, was extremely unhelpful in working through any of my issues, even when they were brought about by LessWrong ideas and premises. My acausal problem [1] I sent to about five or six people, and none of them had anything conclusive to say; they simply referred me to Eliezer, who didn't respond, even though this sort of thing is apparently important to him. This whole reaction struck me as disproportionate to the severity of the problem, but that was the best response I've had so far.
The next big failure was my resolution to the Doomsday argument. [2] I'm not very good yet at conveying these kinds of ideas, so I'm not sure it was entirely the fault of the Lesswrongers, but still. One of them insisted that I needed to explain how 'causality' could be violated; isn't that the whole point of acausal systems? My logic was sound, but he substituted abstract, intuitive concepts in its place. I would think that there would be something in the Sequences about that.
The other posters were only marginally more helpful. Some of them challenged the self-sampling assumption, but why even bother, if the problem I'm trying to solve requires it to be true? In the end, not one person even seemed to consider the possibility that it might work, even though it is a natural extrapolation from other ideas which are taken very seriously on LessWrong. Instead of discussing my resolution, they discussed the DA itself, or AI, or whatever they found more interesting.
Finally, we come to an absolutely terrifying idea I had a few days ago, which I naively assumed would catch the attention of any rational person. An extrapolation of Dust Theory [3] implied that you might die upon going to sleep, not immediately, but through degeneration, and that the person who wakes up in the morning is simply a different observer, who has an estimated lifespan of however long he remains awake. Rationally, anyone should therefore sign up for cryonics and then kill themselves, forcing their measure to continue into post-Singularity worlds that no longer require them to sleep (not that I would have ever found the courage to do this). [4] In the moments when I considered it most plausible I gave it no more than a 10% chance of being true (although it would have been higher if I had taken Dust Theory for granted), and it still traumatized me in a way I've never experienced before. Always during my worst moments, sleep came as a relief and escape. Now I cannot go to sleep. Only slightly less traumatizing was the idea that during sleep my mind declines enough to merge into other experiences and I awake into a world I would consider alien, with perfectly consistent memories.
My inquiries on different threads were almost completely ignored, so I eventually created my own. After twenty-four hours there were nine posts, and now there are twenty-two. All of them either completely miss the point (without realizing it) or show complete ignorance about what Dust Theory is. The idea that this requires any level of urgency does not seem to have occurred to anyone. Finally, the second part of my question, which asked about the six-year-old post "getting over Dust Theory," was completely ignored, despite that post having ninety-five comments by people who seem to already understand it themselves.
I resolved both issues, but not to my own satisfaction: while I now consider the death outcome unlikely enough to dismiss, the reality-jumping still somewhat worries me. I will not be able to go to sleep without fear for the next few months, maybe longer, and my mental and physical health will deteriorate. Professional help or a hotline is out of the question, because I will not inflict these ideas on people who are not equipped to deal with them, and also because I regard psychologists as charlatans or, at best, practitioners of a deeply unhealthy field. The only option I have to resolve the issues is talking to someone who can discuss them rationally.
This post [5] by Eliezer, however unreliable he might be, convinced me that he might actually know what he is talking about (though I still don't know how Max Tegmark's rebuttal to quantum immortality is refuted, because it seems pretty airtight to me). More disappointing is Nick Bostrom's argument that mind-duplicates will experience two subjective experiences; he does not understand the idea of measure, i.e. that we exist in all universes that account for our experiences, but more in some than others. Still, I think there has to be someone out there who is capable of following my reasoning; all the more frustrating, because the more people misapprehend my ideas, the clearer and sharper they seem to me.
Who do I talk to? How do I contact them? I doubt that going around emailing these people will be effective, but something has to change. I can't go insane, as much as that would be a relief, and I can't simply ignore it. I need someone sane to talk to, and this isn't the place to find that.
Sorry if any of this comes off as ranting or incoherent. That's what happens when someone is pushed to all extremes and beyond. I am not planning on killing myself whatsoever and do not expect that to change. I just want help.
[1] http://lesswrong.com/lw/l0y/i_may_have_just_had_a_dangerous_thought/ (I don't think that the idea is threatening anymore, though.)
[2] http://lesswrong.com/lw/m8j/a_resolution_to_the_doomsday_argument/
[3] http://sciencefiction.com/2011/05/23/science-feature-dust-theory/
[4] http://lesswrong.com/lw/mgd/the_consequences_of_dust_theory/
[5] http://lesswrong.com/lw/few/if_mwi_is_correct_should_we_expect_to_experience/7sx3
(The insert-link button is greyed out, for whatever reason.)
Comments sorted by top scores.
comment by gjm · 2015-07-12T09:36:09.328Z · LW(p) · GW(p)
There is a pattern here, and part of it looks like this. You contemplate an idea X and it bothers you. You circulate your concerns among a number of people who are good at thinking and interested in ideas like X. None of them is bothered by it; none of them seems to see it the same way as you do. And, in every case, you conclude that all those people have failed to understand your idea.
Now, I think there are two kinds of explanation for this. First, we have (to put it crudely) the ones in which you are right and everyone else is wrong.
- These ideas are so horrifying that almost everyone flinches away from them mentally before they can really engage with them. The other people you talk to about X might be able to understand it, but they won't.
- You are super-abnormally good at understanding these things, and the other people you talk to about X simply don't have the cognitive horsepower to understand it.
- X is really hard to express (in general, or for you in particular) and on these occasions you have not been successful. So, while the other people could have understood X, they haven't yet had it explained clearly enough.
And then we have (to put it crudely, again) the ones in which you are wrong and everyone else is right. They all begin "You have, for whatever reason, become unduly upset about X", and continue:
- ... Others don't feel the same, and so they don't pay as much attention to X as you think they should.
- ... Now if anyone offers their own analysis of X and it somehow conflicts with (or merely doesn't include) that feeling of upset-ness, it will seem wrong to you.
- ... Other people see that you're upset, and what they say about X is aimed at some version of X they've thought of that would justify the upset-ness. But your upset-ness actually has other causes, so they're inventing versions of X that don't match yours.
For obvious reasons you will be more inclined to endorse the first kind of explanation. But an "outside view" suggests that the second kind is more likely.
Possibly relevant: Existential Angst Factory. Your situation is clearly not exactly the same as the one described there, but you should consider the possibility that your unusually dramatic reaction to these ideas is at least partly the result of something other than being the only person who truly understands them.
Now, considering the only one of those discussions that I've been in recently: I think you are simply incorrect to say that no one who disagreed with you in the Dust Theory thread actually understands Dust Theory. What might be true, though, is that you have (so to speak) your own private version of Dust Theory, and no one understands it because you haven't explained it and have just kept saying "Dust Theory".
Replies from: ChaosMote, Fivehundred
↑ comment by ChaosMote · 2015-07-13T04:40:34.704Z · LW(p) · GW(p)
@gjm:
Just wanted to say that this is well thought out and well written - it is what I would have tried to say (albeit perhaps less eloquently) if it hadn't been said already. I wish I had more than one up-vote to give.
@Eitan_Zohar:
I would urge you to give the ideas here more thought. Part of the point here is that you are going to be strongly biased toward thinking your explanations are of the first sort and not the second. By virtue of being human, you are almost certainly biased in certain predictable ways, this being one of them. Do you disagree?
Let me ask you this: what would it take to make you change your mind, i.e. to convince you that the explanation for this pattern is one of the latter three reasons and not the former three?
Replies from: gjm
↑ comment by Fivehundred · 2015-07-12T14:06:10.984Z · LW(p) · GW(p)
X is really hard to express (in general, or for you in particular) and on these occasions you have not been successful. So, while the other people could have understood X, they haven't yet had it explained clearly enough.
This is pretty much it, although I'm frustrated at the sheer lack of engagement.
I think you are simply incorrect to say that no one who disagreed with you in the Dust Theory thread actually understands Dust Theory.
I didn't say that; I said that no one understood my specific argument and that a few just didn't understand Dust Theory.
Replies from: Dentin, None
↑ comment by Dentin · 2015-07-13T13:48:55.739Z · LW(p) · GW(p)
I'm pretty sure I understand your specific argument regarding dust theory. I'm also pretty sure that the reason I'm not upset is that I require observables to actually care about things like that. You're worried about an idea/argument that has no backing evidence, makes no observable predictions, and is unfalsifiable - no matter how horrible it sounds, it isn't sane to fret over that sort of thing.
Also, I would encourage you to spend some time on the concept of identity for yourself. Even if your idea/argument did have backing evidence, it wouldn't be horrible to me because I allow distributed identity.
↑ comment by [deleted] · 2015-07-13T01:59:19.332Z · LW(p) · GW(p)
I haven't seen you change your mind once in this entire thread.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-13T02:07:52.407Z · LW(p) · GW(p)
I'm sorry, I've tried to be as open as possible to different forms of help.
Replies from: None
↑ comment by [deleted] · 2015-07-13T21:06:56.014Z · LW(p) · GW(p)
From an outside view, I wouldn't take the time to offer my advice on this thread - I would feel that my advice would simply be shot down with the first objection that pops into your head.
When I make a thread like this asking for help, the first thing I do is set a time to try out the best piece of advice I get, even if I think that all the advice is awful. Not only does it encourage people to help me in the future, but just maybe I'll find out through experimentation that my untested assumptions were wrong.
comment by Shmi (shminux) · 2015-07-12T04:26:18.429Z · LW(p) · GW(p)
Another arm-chair diagnosis here. Clearly, far-out ideas affect you more than they affect other, equally (or more) intelligent people. This is almost certainly a flaw in how your brain functions, not an indication of the problem's severity. If it were, some of those smart people you contacted would consider it seriously. If you concede that this is a problem with your brain, then you should consult the experts on fixing the brain, not fixing the future.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T04:27:18.853Z · LW(p) · GW(p)
You're bad at arm-chair diagnostics, buddy.
I don't consider "smartness" to be a single category. Whatever my other limitations, I think I'm a very logical thinker and that I process ideas more thoroughly than others.
Replies from: James_Miller
↑ comment by James_Miller · 2015-07-12T05:33:29.662Z · LW(p) · GW(p)
Which would be harder for you to take: (1) What shminux said is true, or (2) Your fears are justified? My attempt at arm-chair diagnostics is that you have gotten yourself into a trap where either you are right, which is bad, or you are not analyzing problems as logically as you think, which is also bad.
shminux might not be right, but given your information set I think you have to assign a reasonably high probability to him being right, and not doing this is a sign you are not looking at the situation rationally, which means you should assign an even higher probability to him being right.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T14:01:14.415Z · LW(p) · GW(p)
Absolutely I would be thrilled if he was right.
shminux might not be right, but given your information set I think you have to assign a reasonably high probability to him being right, and not doing this is a sign you are not looking at the situation rationally, which means you should assign an even higher probability to him being right.
I think this is circular reasoning...?
Replies from: shminux
↑ comment by Shmi (shminux) · 2015-07-12T17:49:33.888Z · LW(p) · GW(p)
Given that multiple people pointed out that the problem might be with you and not with the universe, consider assigning a non-small probability that this is indeed the case and talk to the people who can fix you. Imagine the payoff: "you would be thrilled". The alternative you have been pursuing so far clearly isn't working -- maybe it's time to face reality.
comment by Darklight · 2015-07-12T19:28:42.024Z · LW(p) · GW(p)
I think what you're doing is something that in psychology is called "Catastrophizing". In essence you're taking a mere unproven conjecture or possibility, exaggerating the negative severity of the implications, and then reacting emotionally as if this worst case scenario were true or significantly more likely than it actually is.
The proper protocol, then, is to re-familiarize yourself with Bayes' Theorem (especially the concepts of evidence and priors), compartmentalize things according to their uncertainty, and try to step back and look at your actual beliefs and how they make you feel.
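To make the roles of prior and evidence concrete, here is the standard statement of Bayes' Theorem (textbook material, added for reference rather than anything specific to this thread):

P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}

A hypothesis H with a very low prior P(H) needs evidence E that is much more likely under H than under its negation before the posterior P(H \mid E) deserves much weight; a frightening idea with no observable evidence behind it should keep roughly its prior probability.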
Rationality is more than just recognizing that something could be true; it is also assigning appropriate degrees of belief to ideas that span a wide range of certainties and probabilities. What I am seeing repeatedly from your posts about the "dangers" of certain ideas is that you're assigning far too much fear to things which other people aren't.
To use an overused quote: "Fear is the mind-killer."
Try to look at the consequences of these ideas as dispassionately as possible. You cannot control everything that happens to you, but you can, to an extent, control your response to these circumstances.
For instance, with Dust Theory, you say that you gave it at most a 10% chance of being true, and it was paralyzing to you. This shouldn't be. First, you need to consider your priors and the evidence. How often in the past have you had the actual experience that Dust Theory suggests is possible and which you fear? What actual experiential evidence do you have to suggest that Dust Theory is true?
For that matter, one of the common threads of your fears seems to be that "you" cease to exist and are replaced by a different "you" or that "you" die. But the truth is already the case that people are constantly changing. The "you" from 10 years ago will be made up of different atoms than the "you" 10 years from now by virtue of the fact that our cells are constantly dying and being replaced. The thoughts we have also change from moment to moment, and our brains adjust the strengths of the connections between neurons in order to learn, such that our past and future brains gradually diverge.
The only thing that really connects our past, present, and future selves is causality, in the sense that our past selves lead to our future selves when you follow the arrow of time. Therefore, what you -think- is a big deal, really isn't.
This doesn't mean you shouldn't care about your future selves. In fact, in the same way that you should care about the experiences of all sentient beings because those experiences are real, you should care about the experiences of your future selves.
But don't worry so much about things that you cannot control, like whether or not you'll wake up tomorrow because of Dust Theory. I cannot see how worrying about this possibility will make it any more or less likely to occur. For all we know the sun could explode tomorrow. There is a non-zero possibility of that happening because we don't know everything. But the probability of that happening, given our past experience with the sun, is very very very low, and as such behaving as if it will happen is completely irrational. Act according to what is MOST likely to happen, and what is MOST likely true, given the information you have right now. Maximize the Expected Utility. Expected is the key word here. Don't make plans based on mere hopes or fears unless they are also expected. In statistics, expectation is commonly associated with the mean or average. Realistically, what will happen tomorrow will probably be a very average day.
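As a toy illustration of that expectation (with made-up numbers, purely to show the arithmetic): suppose an ordinary day has utility 1 and probability 0.999999, while some catastrophe has utility -1000 and probability 0.000001. Then

E[U] = \sum_i p_i \, U(o_i) = 0.999999 \cdot 1 + 0.000001 \cdot (-1000) \approx 0.999

so the expectation is dominated by the ordinary outcome, and planning your life around the catastrophe is not warranted.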
That is being rational.
Hope that helps!
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-13T02:04:12.808Z · LW(p) · GW(p)
For instance, with Dust Theory, you say that you gave it at most a 10% chance of being true, and it was paralyzing to you. This shouldn't be. First, you need to consider your priors and the evidence.
No, I rated the death outcome as having a 10% chance of being true. But now I rate it much lower.
How often in the past have you had the actual experience that Dust Theory suggests is possible and which you fear? What actual experiential evidence do you have to suggest that Dust Theory is true?
This:
There will be some kind of natural selection among dust world-lines, which will result in more stable ones, and most likely I am already in such a line. In this line, dreaming is built such that it will not result in important shifts of reality. And it is true: dreaming is not an unconscious state. I start to have dreams immediately when I fall asleep. So dreaming is built so as not to interrupt some level of consciousness.
Basically, the fact that we shift only a little bit accounts for our observations in ways that other cosmological theories can't.
For that matter, one of the common threads of your fears seems to be that "you" cease to exist and are replaced by a different "you" or that "you" die. But the truth is already the case that people are constantly changing. The "you" from 10 years ago will be made up of different atoms than the "you" 10 years from now by virtue of the fact that our cells are constantly dying and being replaced. The thoughts we have also change from moment to moment, and our brains adjust the strengths of the connections between neurons in order to learn, such that our past and future brains gradually diverge.
Er, you don't understand the problem. I was worried about my subjective self dying.
Replies from: Dentin, Darklight
↑ comment by Dentin · 2015-07-15T04:03:40.538Z · LW(p) · GW(p)
I suspect part of the issue here is that your concept of subjective self isn't constructed to be compatible with these kinds of thought experiments, or with the idea that reality may be forking and terminating all the time. I can say that because mine -is- compatible with such things, and as a result pretty much all of this category of problem doesn't even show up on my radar.
Assuming I had a magical copying device that could copy my body with sufficient accuracy, I could:
- use the copier to create a copy of myself, and as the copy I could do the household chores then self destruct to free up resources without worrying about my 'self' dying.
- use the copier to create a copy of myself, then as the original go do the chores/self destruct without worrying about my 'self' dying.
- if there was a resource conflict which required the destruction of a copy, I could decide that I was the 'least important' copy and self terminate without worrying about my 'self' dying.
When a person's sense of identity can do the above things, concerns about your dust scenario really don't even show up as relevant - it doesn't matter which timeline or state you end up in, so long as your self is active somewhere, you're good.
How would you treat the above situations?
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-15T04:36:17.972Z · LW(p) · GW(p)
use the copier to create a copy of myself, and as the copy I could do the household chores then self destruct to free up resources without worrying about my 'self' dying.
I wouldn't do it in the first place, since there's a fifty percent chance of me winding up doomed. But if the copy is already created then no, it would not be me dying.
use the copier to create a copy of myself, then as the original go do the chores/self destruct without worrying about my 'self' dying.
That is absolutely dying.
if there was a resource conflict which required the destruction of a copy, I could decide that I was the 'least important' copy and self terminate without worrying about my 'self' dying.
Same thing for this.
Replies from: Dentin
↑ comment by Dentin · 2015-07-15T15:28:27.752Z · LW(p) · GW(p)
That's what I figured. If anything, I'd say that this is your core issue, not dust theory. Your sense of subjective self just doesn't map well onto what it's actually possible to do, so of course you're going to get garbage results from time to time.
comment by jimrandomh · 2015-07-12T03:39:47.769Z · LW(p) · GW(p)
The next mistake was opening the door to solipsism and Brain-in-a-Vat arguments. This was so traumatic to me that I spent years in a manic depression.
Consider the possibility that the manic-depression was coincidental. When people have mental things happen for fundamentally biochemical reasons, they often misattribute them to the most plausible seeming non-biochemical cause they can think of. Exposure to ideas can exacerbate an existing problem, but it is unlikely that the lowest-hanging fruit here has anything at all to do with the ideas themselves. Instead of looking at how you engage with stressful ideas, consider looking into other aspects of your life which might reduce your resilience.
With that said...
You started with a set of values and preferences and an ontology. When you encountered dust theory, you discovered that one of the definitions used to define your values - the notion of personal identity - wasn't fully coherent. You then tried to substitute a different definition in its place - an alternative notion of personal identity, which might not carry across a sleep/wake cycle. This alternate notion of identity is not the thing you care about. A small philosophically-minded portion of your brain has decided that it is what you care about, and is now in conflict with the other parts of your brain which don't accept the altered values. Listen to them; while those brain-parts aren't good at explaining things, they have knowledge and in this case they are right.
Replies from: None, Fivehundred, buybuydandavis, TheAncientGeek
↑ comment by [deleted] · 2015-07-12T16:17:31.398Z · LW(p) · GW(p)
Replies to the comment you are now reading accurately describe my ideas so the original post has been replaced by this disclaimer to spare your time :)
Replies from: drethelin
↑ comment by drethelin · 2015-07-12T23:00:18.133Z · LW(p) · GW(p)
How would the app know? You would need some sort of automatic system that scans every parking spot to see if there is a car currently in it.
Replies from: faul_sname, None
↑ comment by faul_sname · 2015-07-13T10:03:48.378Z · LW(p) · GW(p)
How difficult would this be, out of curiosity, keeping in mind that you don't need 100% accuracy? I can think of a couple approaches, though probably nothing that would be supported by any revenue model I can think of off the top of my head.
↑ comment by [deleted] · 2015-07-13T03:07:22.953Z · LW(p) · GW(p)
or just crowdsource it :)
Replies from: drethelin
↑ comment by drethelin · 2015-07-13T04:33:16.179Z · LW(p) · GW(p)
This is putting the cart before the horse. A crowdsourced app that requires users to accurately report which parking spots are free, and when, will only work once it has a lot of users. But it can't get users unless it's a useful app.
Replies from: None
↑ comment by [deleted] · 2015-07-13T05:35:24.130Z · LW(p) · GW(p)
Unless it's built upon existing platforms that map out where paid parking spots are, so that users already benefit from a service from the app. Parking spot owners have an incentive to report their spots to get parkers. Not to mention shopping centres and other businesses want to attract people to the area.
↑ comment by Fivehundred · 2015-07-12T04:04:01.205Z · LW(p) · GW(p)
Consider the possibility that the manic-depression was coincidental. When people have mental things happen for fundamentally biochemical reasons, they often misattribute them to the most plausible seeming non-biochemical cause they can think of.
I have. It definitely isn't. It may have been exacerbated by biochemical causes, but it wasn't caused by them alone. (Sertraline did help me, just never as much as nullifying an existential problem.)
You then tried to substitute a different definition in its place - an alternative notion of personal identity, which might not carry across a sleep/wake cycle.
So you accept the argument?
This alternate notion of identity is not the thing you care about. A small philosophically-minded portion of your brain has decided that it is what you care about, and is now in conflict with the other parts of your brain which don't accept the altered values. Listen to them; while those brain-parts aren't good at explaining things, they have knowledge and in this case they are right.
I have no idea what you are trying to say, beyond "listen to your instincts because they are more suited for the real world than your intellect."
Replies from: ChristianKl, MarsColony_in10years, jimrandomh
↑ comment by ChristianKl · 2015-07-12T12:31:17.552Z · LW(p) · GW(p)
I have. It definitely isn't. It may have been exacerbated by biochemical causes, but it wasn't caused by them alone. (Sertraline did help me, just never as much as nullifying an existential problem.)
The fact that taking drugs for your mental issues doesn't nullify your concerns about existential problems in no way implies that your worries about those problems don't come as a result of mental health issues.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T14:29:46.103Z · LW(p) · GW(p)
Sure, but I can say that I wouldn't be depressed at all if not for those existential problems. I mean, I would be depressed but in a general, background sort of way.
Replies from: ChristianKl
↑ comment by ChristianKl · 2015-07-12T14:43:03.649Z · LW(p) · GW(p)
You can say that, and of course it seems true to you. It's just like how it feels true to the schizophrenic that the CIA is out to get him, and that his paranoia is due to the CIA trying to get him rather than to the fact that he's schizophrenic.
Psychological research in general suggests that people are quite good at finding ways to rationalize their emotions. There is a strong outside view suggesting that rationalizations are usually not the root cause.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T14:47:29.643Z · LW(p) · GW(p)
It's just like how it feels true to the schizophrenic that the CIA is out to get him, and that his paranoia is due to the CIA trying to get him rather than to the fact that he's schizophrenic.
I've considered it at various points over the last seven years. I think I've justified it properly.
The nature of outside views is that they are going to be wrong eventually.
Replies from: ChristianKl
↑ comment by ChristianKl · 2015-07-12T14:54:38.426Z · LW(p) · GW(p)
I think I've justified it properly.
Of course you do, as the pressures for internal mental consistency are very strong.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T14:55:34.764Z · LW(p) · GW(p)
This isn't an argument, it's Descartes' demon.
Replies from: ChristianKl
↑ comment by ChristianKl · 2015-07-12T18:49:07.951Z · LW(p) · GW(p)
Understanding mental biases and how our brain plays tricks on us is a core part of LW. It has less to do with logical argument than with modern psychological research.
It's no easy skill to notice when your emotions prevent you from thinking clearly about an issue.
Saying "The nature of outside views is that they are going to be wrong eventually." is also very peculiar. If I'm testing gravity by repeating scientific experiments whereby I drop balls, I'm engaging in the outside view. Science is all about the outside view instead of subjective experience.
When one is subject to a mental illness that is generally known to make one think irrationally about an issue, it's useful not to trust one's own reasoning and instead to seek help for the mental illness from trustworthy people. Bootstrapping trust isn't easy. There are valid reasons why you might not trust the average psychologist enough to trust his judgement over your own.
The general approach is to find trustworthy in-person friends. For LW-type ideas, you find them at LW meetups. You likely don't want to pull all your information from people at a LW meetup, but if your LW friends say that you are irrational about an issue, your mainstream psychologist tells you that you are irrational about the issue, and your other social contacts also tell you that you are irrational, then no matter how strongly it feels like you are right, you should assume that you aren't.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-13T01:58:54.417Z · LW(p) · GW(p)
Well, I definitely know that my depression is causally tied to my existential pessimism. I just don't know if it's the only factor, or if fixing something else will stop it for good. But as I said, I don't necessarily want to default to ape mode.
Replies from: ChaosMote
↑ comment by ChaosMote · 2015-07-13T04:02:00.730Z · LW(p) · GW(p)
I definitely know that my depression is causally tied to my existential pessimism.
Out of curiosity, how do you know that this is the direction of the causal link? The experiences you have mentioned in the thread seem to also be consistent with depression causing you to get hung up on existential pessimism.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-13T05:46:27.850Z · LW(p) · GW(p)
I go through long periods of peace, only to find my world completely shaken as I experience some fearful epiphany. And I've experienced a complete cessation of that feeling when it is decisively refuted.
Replies from: ChaosMote
↑ comment by ChaosMote · 2015-07-13T13:47:28.641Z · LW(p) · GW(p)
Okay, but at best, this shows that the immediate cause of your being shaken and coming out of it is related to fearful epiphanies. Is it not plausible that whether, at a given time, you find a particular idea horrific or are able to accept a solution as satisfying depends on your mental state?
Consider this hypothetical narrative. Let Frank (name chosen at random) be a person suffering from occasional bouts of depression. When he is healthy, he notices and enjoys interacting with the world around him. When he is depressed, he instead focuses on real or imagined problems in his life - and in particular, how stressful his work is.
When asked, Frank explains that his depression is caused by problems at work. He explains that when he gets assigned a particularly unpleasant project, his depression flares up. The depression doesn't clear up until things get easier. Frank explains that once he finishes a project and is assigned something else, his depression clears up (unless the new project is just as bad); or sometimes, through much struggle, he figures out how to make the project bearable, and that resolves the depression as well.
Frank is genuine in expressing his feelings, and correct about work problems being correlated with his depression, but he is wrong about causation between the two.
Do you find this story analogous to your situation? If not, why not?
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-14T01:14:29.988Z · LW(p) · GW(p)
I find it hard to believe. But maybe I've always been depressed and that's why I've suffered from them so badly.
↑ comment by MarsColony_in10years · 2015-07-12T21:46:59.921Z · LW(p) · GW(p)
I have no idea what you are trying to say, beyond "listen to your instincts because they are more suited for the real world than your intellect."
I think he was trying to make a map-territory distinction. You have a mental model of how your brain computes value. You also have your brain, computing value however it actually computes value. Since our values are quite complex, and likely due to a number of different physical causes, it is reasonable to conclude that our mental model is at best an imperfect approximation.
I don't think he's trying to say "listen to your heart" so much as "the map is not the territory, but both are inside your brain in this instance. Because of this, it is possible to follow the territory directly, rather than following your imperfect map of the territory."
That said, we are now a couple meta-levels away from your original question. To bring things back around, I'd suggest that you try and keep in mind that any odd, extreme predictions your mental models make may be flaws in an oversimplified model, and not real existential disasters. In some cases, this may not seem to be the case given other pieces of evidence, but hopefully in other instances it helps.
The greater the inferential distance you have to go to reach an uncomfortable conclusion, the higher the likelihood that there is a subtle logical flaw somewhere, or (much more common) some unknown-unknown that isn't even being taken into account. LessWrong tends to deal with highly abstract concepts many steps removed from observations and scientifically validated truths, so I suspect that a large fraction of such ideas will be discredited by new evidence. Consider shifting your probability estimates for such things down by an order of magnitude or more, if you have not already done so. (That last paragraph was an extremely compressed form of what should be a much larger discussion. This hits on a lot of key points, though.)
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-13T01:54:14.824Z · LW(p) · GW(p)
That does sound like reasonable advice... however I now have empirical evidence for Dust Theory. Still, most of the horrible problems in it seem to have been defused.
Replies from: Dentin
↑ comment by Dentin · 2015-07-13T13:56:45.741Z · LW(p) · GW(p)
What is your empirical evidence for dust theory?
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-14T01:17:00.103Z · LW(p) · GW(p)
Point 2: http://lesswrong.com/lw/mgd/the_consequences_of_dust_theory/ck0q
Replies from: Dentin
↑ comment by Dentin · 2015-07-15T03:50:22.902Z · LW(p) · GW(p)
That doesn't even remotely meet the bar for 'evidence' from my standpoint. At best, you could say that it's a tack-on to the original idea to make it match reality better.
Put another way, it's not evidence that makes the idea more likely, it's an addition that increases the complexity yet still leaves you in a state where there are no observables to test or falsify anything.
In common terms, that's called a 'net loss'.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-15T04:33:06.834Z · LW(p) · GW(p)
Why do we dream? Because a large number of conscious beings join the measure of beings who can. That's why we find ourselves as pre-singularity humans. I'd say that's empirical evidence.
Replies from: Dentin
↑ comment by Dentin · 2015-07-15T15:39:52.376Z · LW(p) · GW(p)
Sorry, but evidence doesn't really work that way. Even if we allow it, it is exceptionally weak evidence, and not enough to distinguish 'dust theory' from any other of the countless ideas in that same category. Again, it looks to me like a tack-on to the original idea that is needed simply to make the idea compatible with existing evidence.
As for why we dream, it's actually because of particles, forces, and biochemistry. A mundane explanation for a mundane process. No group hive mind of spirit energy or "measure of beings" required.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-15T16:06:49.808Z · LW(p) · GW(p)
Even if we allow it, it is exceptionally weak evidence, and not enough to distinguish 'dust theory' from any other of the countless ideas in that same category.
Dreaming is a very specific process that seems optimized to the scenario I described with DT. Do these other ideas predict the same?
As for why we dream, it's actually because of particles, forces, and biochemistry. A mundane explanation for a mundane process. No group hive mind of spirit energy or "measure of beings" required.
So you are saying that humans or humanlike minds are the most common type of consciousness that is mathematically possible?
Replies from: Dentin
↑ comment by Dentin · 2015-07-15T17:40:00.714Z · LW(p) · GW(p)
Dreaming is a very specific process that seems optimized to the scenario I described with DT. Do these other ideas predict the same?
"Dreaming is a very specific process that seems optimized to demonstrate the existence of a dream realm."
"Dreaming is a very specific process that seems optimized to recharge the Earth Spirit that is Mother Gaia."
"Dreaming is a very specific process by which Wyvren allows us to communicate with Legends."
So you are saying that humans or humanlike minds are the most common type of consciousness that is mathematically possible?
I have literally no idea how you could possibly draw that conclusion from the statement that dreaming has a mundane physics-based explanation. The two things aren't even remotely related.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-15T17:56:21.253Z · LW(p) · GW(p)
"Dreaming is a very specific process that seems optimized to demonstrate the existence of a dream realm."
"Dreaming is a very specific process that seems optimized to recharge the Earth Spirit that is Mother Gaia."
"Dreaming is a very specific process by which Wyvren allows us to communicate with Legends."
Dust Theory is a coherent philosophical idea that has certain logical arguments to be made for it based off of our scientific knowledge of minds and quantum theory.
I have literally no idea how you could possibly draw that conclusion from the statement that dreaming has a mundane physics-based explanation. The two things aren't even remotely related.
No, they aren't. Of course dreaming has a mundane physics-based explanation; Dust Theory predicts that as well. We just find ourselves in a universe where dreaming exists.
↑ comment by jimrandomh · 2015-07-12T04:56:09.578Z · LW(p) · GW(p)
I have. It definitely isn't. It may have been exacerbated by biochemical causes, but it wasn't caused by them alone. (Sertraline did help me, just never as much as nullifying an existential problem.)
Sertraline has insomnia listed as a very common (>10%) side effect. If you're currently on it, this is a more parsimonious explanation for your difficulty sleeping than your philosophical beliefs about how sleeping interacts with subjective experience.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T05:05:53.840Z · LW(p) · GW(p)
I'm not on it. I don't have difficulty falling asleep, it's just traumatizing to get in bed.
↑ comment by buybuydandavis · 2015-07-14T02:09:44.024Z · LW(p) · GW(p)
When people have mental things happen for fundamentally biochemical reasons, they often misattribute them to the most plausible seeming non-biochemical cause they can think of
What is likely is that the plausible cause was a cause too.
The biochemistry pushes him close to the edge, and the "plausible cause" pushes him off.
↑ comment by TheAncientGeek · 2015-07-12T11:52:36.567Z · LW(p) · GW(p)
You started with a set of values and preferences and an ontology. When you encountered dust theory, you discovered that one of the definitions used to define your values - the notion of personal identity - wasn't fully coherent.
Dust theory doesn't show anything to be incoherent, because it's only a theory. One can take its unwelcome conclusions to be a reductio ad absurdum of its premises.
Replies from: Dentin
comment by fubarobfusco · 2015-07-12T04:03:33.111Z · LW(p) · GW(p)
I am going to perpetrate a little bit of the sin of amateur psychological diagnosis over the Internet. Sorry about that.
I'm not sure that the substance of the philosophical and cosmological concepts here is what is afflicting you. After all, many people engage with cosmological horror recreationally — see, for instance, the continued popularity of writers such as Lovecraft, Stross, Banks, or the "SCP Foundation" folks.
Exposure to weird cosmological horror does not cause most humans to freak out, at least not for very long. Most people more-or-less instinctively take Egan's Law into account ("it all adds up to normality") — to the extent that this Law is only needed as a reminder for people who don't automatically do so.
It sounds like you are having trouble disengaging from these ideas. So you might want to go seek treatment specifically for anxiety. This doesn't mean "stop thinking about these issues and thereby give up any possibility of coming up with good solutions to them"; it means "become able to stop thinking about these issues when it's getting loopy and unproductive, and get back to ape mode — and remember, ape mode is acceptable; we've been living with it for a long, long time."
If Egan's Law helps, good. OTOH, if prescribed beta blockers help, good too.
On the other hand, many people have struggled with the existence of sickness and death, and not every one of them became a Buddha. Some of what you write seems to be heavily concerned with the notions of personal identity and continuity, and whether this is an illusion. This is an area in which the Buddhists seem to be way ahead of the clinical psychologists in giving people tools to deal with it.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T04:18:03.133Z · LW(p) · GW(p)
It sounds like you are having trouble disengaging from these ideas. So you might want to go seek treatment specifically for anxiety. This doesn't mean "stop thinking about these issues and thereby give up any possibility of coming up with good solutions to them"; it means "become able to stop thinking about these issues when it's getting loopy and unproductive, and get back to ape mode — and remember, ape mode is acceptable; we've been living with it for a long, long time."
Yes, you're quite right, I even had a short panic attack from reading Sam Hughes' SCP fiction. It's just that ape mode isn't acceptable all the time. When it comes to very serious issues I don't think it's acceptable at all, no matter how much I suffer.
Some of what you write seems to be heavily concerned with the notions of personal identity and continuity, and whether this is an illusion. This is an area in which the Buddhists seem to be way ahead of the clinical psychologists in giving people tools to deal with it.
Buddhism just seems like nihilism to me. Not that I know much about it. Anything you could recommend?
Replies from: fubarobfusco
↑ comment by fubarobfusco · 2015-07-12T05:04:58.297Z · LW(p) · GW(p)
When it comes to very serious issues I don't think it's acceptable at all, no matter how much I suffer.
People who work on drugs to cure horrible diseases don't spend 24/7 in an airtight suit in the lab, dropping samples on the floor because their hands are shaking. They go home and watch football and play card games and go to the kids' school play and stuff. And maybe they dream about bacteria once in a while, and maybe some of that is upsetting and some of it is informative. But being unable to disengage from the Big Problems and live your little ordinary life is not heroism, and it actively gets in the way of solving any of those Big Problems.
Buddhism just seems like nihilism to me. Not that I know much about it. Anything you could recommend?
Find a meditation teacher and spend some time doing that. Practice > theory.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T05:15:57.683Z · LW(p) · GW(p)
People who work on drugs to cure horrible diseases don't spend 24/7 in an airtight suit in the lab, dropping samples on the floor because their hands are shaking. They go home and watch football and play card games and go to the kids' school play and stuff.
If they or their kids have the horrible disease? I think they'd react differently.
But being unable to disengage from the Big Problems and live your little ordinary life is not heroism, and it actively gets in the way of solving any of those Big Problems.
Not my Big Problems; they get solved from doing just that.
Find a meditation teacher and spend some time doing that. Practice > theory.
I'm going to have to disagree. I thought you were talking about philosophy when you mentioned "notions of personal identity and continuity, and whether this is an illusion."
Replies from: evand, fubarobfusco
↑ comment by evand · 2015-07-14T16:53:11.258Z · LW(p) · GW(p)
But being unable to disengage from the Big Problems and live your little ordinary life is not heroism, and it actively gets in the way of solving any of those Big Problems.
Not my Big Problems; they get solved from doing just that.
How do you know? The question isn't whether obsessing fixes the problem; it's whether taking breaks speeds up the overall process. You don't need tons of hours to fix the problem; as you said earlier, a few minutes to explain the right insight is quite sufficient. What you actually need is the right few minutes of work, spent finding the right key insights.
Thinking longer about a problem is only helpful to the degree it produces new insights. As you've found, this can be very inefficient. If taking a break and not worrying about an unsolved problem increases the efficiency of future problem-solving even a little bit, it could well be worth it.
↑ comment by fubarobfusco · 2015-07-12T20:35:12.851Z · LW(p) · GW(p)
Find a meditation teacher and spend some time doing that. Practice > theory.
I'm going to have to disagree. I thought you were talking about philosophy when you mentioned "notions of personal identity and continuity, and whether this is an illusion."
Yes, there are various Buddhist writings about it.
No, I'm not sure that any of them make much sense without actually doing the meditation. There are certain things which are stupidly obvious and okay from a meditative point of view — like "the self is an illusion" — that are either obviously false or incredibly scary from the kind of point of view you're expressing.
I am not an expert in Buddhist practice, though, and not qualified to provide much advice. I would note that serious current Buddhist writers such as Daniel Ingram make it very clear that people should deal with big psychological and emotional problems before engaging in heavy meditation.
comment by Unknowns · 2015-07-12T05:50:17.117Z · LW(p) · GW(p)
I think what you need to realize is that it is not a question of proving that all of those things are false, but rather that it makes no difference whether they are or not. For example when you go to sleep and wake up it feels just the same whether it is still you or a different person, so it doesn't matter at all.
Replies from: bbleeker
↑ comment by Sabiola (bbleeker) · 2015-07-12T19:55:11.102Z · LW(p) · GW(p)
Also, you're changing all the time anyway, even when you're awake. You have experiences, you learn things, you accumulate memories; all things that change you.
comment by Adam Zerner (adamzerner) · 2015-07-12T04:05:22.201Z · LW(p) · GW(p)
I think there should be a discussion about the more general idea of "needing a protocol for discussing dangerous or disconcerting ideas" in addition to the discussion of this specific circumstance.
Replies from: ChristianKl
↑ comment by ChristianKl · 2015-07-12T12:29:23.159Z · LW(p) · GW(p)
If you are concerned about an idea driving you insane, the best way to deal with it is to speak in person with friends who can navigate the surrounding meme space.
In the LW memespace you find those people in LW meetups.
A Skype call can also work, but my ability to affect the emotions of a person when I'm not physically present is significantly limited. It's a lot better than text but still leaves something to be desired.
comment by turchin · 2015-07-12T10:18:31.904Z · LW(p) · GW(p)
I think that I understand your feelings. I had the same periods of existential fear about most of the things that you did. Two of them I discussed in the post about AI failure levels, which, to my surprise, didn't get any comments.
But it is also possible to have existential euphoria. The first I got when I proved to myself the idea of quantum immortality. The second was when I understood that I will become a god in my own branch of the universe. The latter needs a more complex explanation, which I will omit for now.
But as I became older, I got less and less feeling from theoretical ideas.
Of course, the most dangerous of all ideas is the idea of death itself. Everybody knows it and nobody reacts to it. Most people become bulletproof to such ideas from around six years old, when they are told that they will die, that after it nothing will happen, and that nothing can be done about it.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T15:45:50.602Z · LW(p) · GW(p)
The first I got when I proved to myself the idea of quantum immortality. The second was when I understood that I will become a god in my own branch of the universe. The latter needs a more complex explanation, which I will omit for now.
I'd be interested to hear about both of these.
comment by Lumifer · 2015-07-13T15:19:06.593Z · LW(p) · GW(p)
Today's blog post by Yvain starts:
Anxiety disorders are the most common class of psychiatric disorders. Their US prevalence is about 20%. They're also among the least recognized and least treated...
Replies from: Davidmanheim
↑ comment by Davidmanheim · 2015-07-14T18:23:30.073Z · LW(p) · GW(p)
Also, they are incredibly treatable.
And on the irrationality scale, not fixing a debilitating problem that is very fixable ranks pretty high.
comment by jimrandomh · 2015-07-12T04:59:58.900Z · LW(p) · GW(p)
the idea that during sleep my mind declines enough to merge into other experiences and I awake into a world I would consider alien, with perfectly consistent memories.
An entity with self-consistent memories is astronomically more likely to be found in a world which matches those memories, than in some mismatched world. The latter has a complexity penalty equal to all the extra mismatched complexity.
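One standard way to cash out this complexity penalty is a simplicity prior over worlds (a Solomonoff-style gloss; this framing is mine, not necessarily jimrandomh's exact argument):

P(w) \propto 2^{-K(w)}

where K(w) is the length in bits of the shortest description of world w. A world whose history mismatches your memories needs some \Delta K extra bits to specify the mismatch, and is therefore suppressed by a factor of roughly 2^{-\Delta K}; even a few hundred bits of mismatch makes such worlds astronomically less likely than the matching one.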
Replies from: DanArmak, TheAncientGeek, Fivehundred
↑ comment by DanArmak · 2015-07-12T12:54:41.775Z · LW(p) · GW(p)
That depends on the set of worlds and the measure used. For instance, some models predict that almost all minds will be Boltzmann brains that are uncorrelated with their environment beyond a small local bubble. We can't falsify these models directly; we just have to assume they're wrong to be able to use the past to predict the future.
↑ comment by TheAncientGeek · 2015-07-12T12:01:53.259Z · LW(p) · GW(p)
Under what assumptions?
↑ comment by Fivehundred · 2015-07-12T05:06:51.051Z · LW(p) · GW(p)
Yes, I'm saying that memories aren't accessed while you sleep. Don't know to what extent. I always did exist in this world, but I'm also made of many other Eitans from slightly different worlds whose experience of sleeping was identical to mine. I'm just worried about the scale of difference.
comment by TheAncientGeek · 2015-07-12T11:47:30.378Z · LW(p) · GW(p)
Finally, we come to an absolutely terrifying idea I had a few days ago, which I naively assumed would catch the attention of any rational person. An extrapolation of Dust Theory [3] implied that you might die upon going to sleep, not immediately, but through degeneration, and that the person who wakes up in the morning is simply a different observer, who has an estimated lifespan of however long he remains awake. Rationally, anyone should therefore sign up for cryonics and then kill themselves, forcing their measure to continue into post-Singularity worlds that no longer require them to sleep (not that I would have ever found the courage to do this). [4]
You say this is tremendously important, but you haven't put in the rather minimal amount of effort necessary to show how it follows from accepted premises.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-12T14:30:58.515Z · LW(p) · GW(p)
Sorry, but I often can't expend a lot of energy in the middle of a panic attack. That's why I go on sites where many people already are familiar with the premises.
Replies from: fubarobfusco
↑ comment by fubarobfusco · 2015-07-12T20:19:37.466Z · LW(p) · GW(p)
It's the panic attack, not the premises, that's the problem. Lots of people spend time thinking about metaphysics and cosmology without making themselves sick.
Thing is, it sounds like the System 2 beliefs are justifying and protecting System 1 dysfunction. ("I should feel crappy, because the conditions of conscious existence are so fucked up.")
From what I can tell, people who have been in this kind of situation and have successfully gotten out of it have done so by fixing the System 1 situation — the reaction of panic, anguish, and despair — and not by adopting new System 2 beliefs. Things that reportedly help include guided meditation, cognitive-behavioral therapy, and guided use of various psychoactive drugs (ranging from antidepressants and anxiolytics to various psychedelics, in a therapeutic context, not in the wild).
In other words, this is probably not the sort of thing that can be fixed by reading the right philosophy or the right post on the web. Although it might disappoint Hermione Granger, reading the right book is not the solution to every problem. Rather, it is probably the sort of thing that requires the personal guidance of an experienced person in fixing the System 1 reactions that are causing you pain.
Replies from: Fivehundred
↑ comment by Fivehundred · 2015-07-13T01:55:16.468Z · LW(p) · GW(p)
I don't know who to talk to, in this case.
Replies from: IlyaShpitser↑ comment by IlyaShpitser · 2015-07-13T09:52:37.526Z · LW(p) · GW(p)
A good shrink.
Serious suggestion, not a status attack. I sympathize, and wish you well.
Replies from: gjm
comment by ChristianKl · 2015-07-12T09:42:44.626Z · LW(p) · GW(p)
My logic was sound, but he substituted abstractly intuitive concepts in place of them.
Sound logic doesn't help when you start off with bad assumptions.
One of them of them insisted that I needed to explain how 'causality' could be violated; isn't that the whole point of acausal systems?
No, it isn't. The fact that the word 'acausal' is used on LW doesn't mean that the people who use it don't believe in causality. It's used when speaking about agents that use a specific decision theory.
Furthermore, you simply pointed at arguments without making your chain of reasoning explicit. Without that explicitness, it's impossible to show you where you are wrong.
I'm not very good yet at conveying these kind of ideas
Because you are not thinking clearly about the ideas. If you deeply desire to be helped, you are being lazy: there is no reason to write posts about Dust Theory without explaining what you believe Dust Theory to be. This is not the kind of discussion where it's useful to point to concepts and take them for granted.
There seem to be multiple people who got into mental health issues by thinking too much about ideas in that realm. In general, if people warn you against delving into ideas because of possible mental health risks, follow their advice.
If you want specific help, you are more likely to find it through in-person discussion at LW meetups than through emailing random persons.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T14:16:41.784Z · LW(p) · GW(p)
Sound logic doesn't help when you start off with bad assumptions.
Which I didn't.
No, it isn't. The fact that the word 'acausal' is used on LW doesn't mean that the people who use it don't believe in causality. It's used when speaking about agents that use a specific decision theory.
Yes, and my idea(s) work the same way! I don't know what acausality is formally but I've seen how it is used in other LW ideas and I'm pretty sure that it applies.
Furthermore, you simply pointed at arguments without making your chain of reasoning explicit. Without that explicitness, it's impossible to show you where you are wrong.
"You have to prove causality wrong" seems to be a similar line of thinking. And when was I not completely explicit about my reasoning? Point that out for me please.
If you want specific help, you are more likely to find it through in-person discussion at LW meetups than through emailing random persons.
I'm open to that, but I find it almost impossible to clarify thoughts outside of writing.
Replies from: ChristianKl↑ comment by ChristianKl · 2015-07-12T15:00:33.463Z · LW(p) · GW(p)
Yes, and my idea(s) work the same way! I don't know what acausality is formally but I've seen how it is used in other LW ideas and I'm pretty sure that it applies.
"I'm pretty sure that it applies" is no argument. You didn't make the argument for which you believe that's true. You just asserted it to be true. If you would have actually made the argument your post would have been longer than a paragraph and not simple referred to cached thoughts about acausal reasoning.
"You have to prove causality wrong" seems to be a similar line of thinking.
It's similar to the extent that it's not a detailed argument. It's a statement that asserts that you have the burden of proof for the thesis you make, instead of other people having to prove you wrong.
If you care about your mental health, then it's useful to demand that people who think they have found a way around causality provide extraordinary evidence for extraordinary claims, and not to treat the idea that the world runs in a causal fashion as the extraordinary claim.
I'm open to that, but I find it almost impossible to clarify thoughts outside of writing.
You didn't clarify your thoughts in writing either. http://lesswrong.com/lw/m8j/a_resolution_to_the_doomsday_argument/ is not clear writing. In person, another person can actually show you easily where you are unclear. They can actually interact with the emotions which you ignore when you are writing. Depending on their skill level, they can then debug your emotional issues.
There are emotional ugh-fields that prevent you from seeing issues that prove you wrong when you sit alone in front of your computer. You need real-world feedback from humans to show you where you are confused. If you actually thought clearly about the issue, you wouldn't have any trouble with in-person discussions of it.
That feedback is the only way to stay sane when thinking about an issue like that, when your mind blinds you from going down certain paths.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T15:07:12.278Z · LW(p) · GW(p)
If you had actually made the argument, your post would have been longer than a paragraph and would not simply have referred to cached thoughts about acausal reasoning.
Which post was this? The solution to the DA?
If you care about your mental health, then it's useful to demand that people who think they have found a way around causality provide extraordinary evidence for extraordinary claims, and not to treat the idea that the world runs in a causal fashion as the extraordinary claim.
I don't understand causality scientifically. It's like asking an evolutionary biologist to demonstrate exactly how his theory overcomes the 2nd law of thermodynamics.
You didn't clarify your thoughts in writing either. http://lesswrong.com/lw/m8j/a_resolution_to_the_doomsday_argument/ is not clear writing.
I've always been bad at initial presentations. It gets better after observing where people go wrong.
Replies from: ChristianKl↑ comment by ChristianKl · 2015-07-12T16:08:17.882Z · LW(p) · GW(p)
Which post was this? The solution to the DA?
It certainly applies to the post about the solution to DA.
I don't understand causality scientifically. It's like asking an evolutionary biologist to demonstrate exactly how his theory overcomes the 2nd law of thermodynamics.
A smart evolutionary biologist won't draw a blank when you ask him the question. He will tell you that the sun provides the earth with low-entropy energy that allows life to blossom.
If you think that you yourself don't understand causality scientifically, you shouldn't make complex, nonintuitive claims about the nature of causality.
Replies from: gjm, Fivehundred↑ comment by Fivehundred · 2015-07-12T16:34:20.754Z · LW(p) · GW(p)
It certainly applies to the post about the solution to DA.
I don't recall saying a single word about acausality. I put forward my logic.
A smart evolutionary biologist won't draw a blank when you ask him the question. He will tell you that the sun provides the earth with low-entropy energy that allows life to blossom.
Sure, but I don't claim to be an expert in anything. I just had an idea that seemed sound.
Replies from: ChristianKl↑ comment by ChristianKl · 2015-07-12T19:16:12.603Z · LW(p) · GW(p)
You didn't list all the assumptions you make. You didn't explain a causal chain showing how what you are proposing would lead to the effects that you desire.
If you had, the post would have been longer than a paragraph, as any LW post introducing a substantial new concept is.
I just had an idea that seemed sound.
You expressed an idea, and the idea felt sound to you, but you didn't make a detailed argument for it.
comment by James_Miller · 2015-07-12T02:41:58.513Z · LW(p) · GW(p)
Let X be something bad. If X is true, it is something you and nearly every other person should rightly fear. If, however, no one but you fears X, then either (1) you are mistaken, or (2) you have some special information or insight that everyone else lacks. Logically, it's almost certainly (1). So if you fear X but Eliezer, Bostrom, and Hanson don't appear to, take comfort from their lack of fear even if you don't understand it.
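One hedged way to cash out the "almost certainly (1)" step is as a Bayesian odds calculation (the numbers below are purely illustrative, not anything James_Miller specifies):

```latex
% Posterior odds of (2) "special insight" vs. (1) "mistaken", given that only you fear X:
\frac{P(2 \mid \text{only you fear } X)}{P(1 \mid \text{only you fear } X)}
  = \frac{P(\text{only you fear } X \mid 2)}{P(\text{only you fear } X \mid 1)}
    \cdot \frac{P(2)}{P(1)}

% Illustrative numbers: even a likelihood ratio of 10 in favor of (2) cannot
% rescue prior odds of 1/10^4: posterior odds ~ 10 x 10^{-4} = 10^{-3},
% still overwhelmingly favoring (1).
```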
Replies from: adamzerner, Fivehundred↑ comment by Adam Zerner (adamzerner) · 2015-07-12T04:02:24.154Z · LW(p) · GW(p)
but Eliezer, Bostrom, and Hanson don't appear to
I'm glad you qualify this with "appear to". At some point the thought occurred to me that it isn't always in their strategic interest to publicize everything, or even to be honest about everything they think. Previously I just assumed that they'd always be honest, and I sense that other people might be (unconsciously?) making the same assumption. I don't understand X well enough at all to speak to this particular situation though. I guess I'm just glad you acknowledge the possibility.
take comfort from their lack of fear even if you don't understand it.
That could be difficult to do. It shouldn't be that difficult to update one's confidence given the beliefs of these smart people, but updating your confidence and updating your comfort levels unfortunately are different things. I'm not sure how linked they are in this situation though.
↑ comment by Fivehundred · 2015-07-12T02:46:09.970Z · LW(p) · GW(p)
No, option (2) is perfectly possible given the highly counterintuitive nature of the problem. Also, haven't I just pointed out that Bostrom does not understand Dust Theory?
Replies from: James_Miller↑ comment by James_Miller · 2015-07-12T02:52:37.763Z · LW(p) · GW(p)
While (2) is possible, the probability is low. Academics love counter-intuitive problems and think about them a lot. Bostrom might not understand Dust Theory in the way that he doesn't understand the power of the Time Cube.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T02:59:59.999Z · LW(p) · GW(p)
Few people know about Dust Theory, even fewer understand it intuitively.
Replies from: Halfwitz↑ comment by Halfwitz · 2015-07-12T03:20:38.916Z · LW(p) · GW(p)
Dust theory is beautiful and terrifying, but what do you say to Egan's argument against it: http://gregegan.customer.netspace.net.au/PERMUTATION/FAQ/FAQ.html
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T03:34:18.505Z · LW(p) · GW(p)
Erm, you mean his argument that we would expect to find ourselves in a more chaotic universe? Well, it might be that such observers are less 'dense' than ones in a stable universe (I never grasped the mathematics of it), and if that's the case then I don't see how the argument works. But then the opposite problem applies: our universe is far too complex, and relies upon contingencies that are highly improbable, merely for observers to exist. The only solution for the 'Big World' is that universes like these have a high base rate, and that this really is the most common type of scenario that produces life. But that can't save Dust Theory, and probably not the Ultimate Ensemble either.
On the other hand, if Egan is right, losing mental awareness in a chaotic universe could have the opposite effect of what I first thought- propelling you into more stable worlds by virtue of your continued existence. This may explain our current observations. But that's a very cloudy line of thinking. Maybe such beings "join" with sleeping human infants if the observations match theirs. But still, this universe seems too stable; why would they have defaulted to this one? And why would the most common type of observer be similar enough to human even to have that much in common?
I'm pretty sure someone who knows what they're talking about could put this question to rest. But no one here will even understand it.
Replies from: Halfwitz↑ comment by Halfwitz · 2015-07-12T03:44:26.562Z · LW(p) · GW(p)
Well, it might be that such observers are less 'dense' than ones in a stable universe
In that case most of your measure is in stable universes and dust theory isn't anything to worry about.
But that can't be the case, as isn't the whole point of dust theory that basically any set of relations can be construed as a computation implementing your subjective experience, and this experience is self-justifying? If that's the case the majority of your measure must be dust.
Dust theory has a weird pulled-up-by-your-own-bootstraps taste to it, and I have a strong aversion to regarding it as true. Egan's argument against it is the best I can find; it's not entirely satisfying, but it should be sufficiently comforting to allow you to sleep.
Replies from: TheAncientGeek, Fivehundred↑ comment by TheAncientGeek · 2015-07-12T12:48:51.828Z · LW(p) · GW(p)
In that case most of your measure is in stable universes and dust theory isn't anything to worry about.
There are different ways of defining measure. DT guarantees that lack of continuity, and therefore low density, won't be subjectively noticeable... at least, it will look like chaotic observations, not feel like "I'm dead".
Dust theory has a weird pulled-up-by-your-own-bootstraps taste to it, and I have a strong aversion to regarding it as true. Egan's argument against it is the best I can find; it's not entirely satisfying, but it should be sufficiently comforting to allow you to sleep.
Maybe you could include:
construed as a computation BY WHOM?
Computation is a process, and not just any process, so the idea of an instantaneous computational state is questionable.
(There is a possible false dichotomy there: consciousness isn't the output of a computation that takes a lifetime to perform, but there could still be millions of computations required to generate a "specious present".)
↑ comment by Fivehundred · 2015-07-12T03:58:06.714Z · LW(p) · GW(p)
But that can't be the case, as isn't the whole point of dust theory that basically any set of relations can be construed as a computation implementing your subjective experience, and this experience is self-justifying?
Not necessarily to you. It doesn't have to make much sense to you at all. But our observations are orderly, and that is something that can't be explained by the majority of our measure being dust. Why would it default to this?
If you make Egan's assumption, I think it is an extremely strong argument.
Replies from: Halfwitz↑ comment by Halfwitz · 2015-07-12T04:04:21.357Z · LW(p) · GW(p)
If you make Egan's assumption, I think it is an extremely strong argument.
Why don't you buy it?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T04:11:12.635Z · LW(p) · GW(p)
I don't reject it; I simply think that Dust Theory based on this assumption is so unlikely that we may as well assume the opposite: that some patterns can be more common (have more measure) than others.
Replies from: Halfwitz↑ comment by Halfwitz · 2015-07-12T04:22:00.363Z · LW(p) · GW(p)
I'm confused. What were you referring to when you said, "on this assumption"?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T04:26:02.461Z · LW(p) · GW(p)
That you find yourself randomly selected from a pool of all conceivable observers, rather than a pool with probabilities assigned to them.
EDIT: Actually, the former option is flatly impossible, because my mindstate would jump to any conceivable one that could be generated from it. I would have an infinitesimal chance of becoming coherent enough to have anything resembling a 'thought.'
Replies from: Luke_A_Somers↑ comment by Luke_A_Somers · 2015-07-17T20:33:05.382Z · LW(p) · GW(p)
Then why would you begin to suspect that the pool of observers does not coincide with the set of minds that have a physical instantiation and dynamics? If there's a nontrivial probability distribution, then there's going to be SOME sort of rules involved, and physics gives us a really solid candidate for what those rules might be.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-18T07:02:47.701Z · LW(p) · GW(p)
the set of minds that have a physical instantiation and dynamics?
What exactly does this mean? All minds are going to find some 'justification' as to why they exist.
Replies from: TheAncientGeek↑ comment by TheAncientGeek · 2015-07-21T08:46:25.798Z · LW(p) · GW(p)
Well, they might, if they were coherent enough, transtemporally, to even have anything resembling a thought. But why would that be the case?
comment by Fivehundred · 2015-07-14T01:22:08.676Z · LW(p) · GW(p)
Thank you for all the responses. I'm trying to contact Yudkowsky; does anyone know how often he responds to his email? Or does the email address given here still work?
Replies from: Document
comment by 615C68A6 · 2015-07-13T08:25:56.383Z · LW(p) · GW(p)
Like many people I felt compelled to distinguish myself by solving your problem while playing by your rules (rules which aren't completely clear). But after all... and I guess I should offer an apology if this doesn't help, but why should any of that change anything? Picture someone who for his whole life thought he had free will, then discovered that the universe is deterministic, with all that entails for ideas like "free will" as normal people envision them. This sounds pretty similar to your situation. You discovered that you may at any point "become" or "jump" to another conscious being whose memories are consistent with your own, but whose life and environment/universe is vastly different from your own.
Then what? What are your goals anyway? How does that change of perspective affect them? How can you best act, and adjust yourself, to still pursue those goals? What else should matter to you? The waters may be a little muddier than you believed them to be before, but not so muddied that it should be impossible to move forward. Seriously, aside from the vague existential angst, elaborate how this change of perspective affects your beliefs, and what actions you think you should take to reach your goals in life (if you have a good grasp of your goals. If you don't, then you should solve that first).
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-13T10:29:57.407Z · LW(p) · GW(p)
Apology accepted.
comment by buybuydandavis · 2015-07-13T06:04:12.713Z · LW(p) · GW(p)
I have a talent for reasoning my way into terrifying and harmful conclusions.
I often worked through them myself, always by refuting the horrible consequences of them to my own satisfaction
This is a good start. Assemble your data.
Catalogue the terrifying conclusions which have troubled you, and record the life cycle of each.
Currently still terrified? If not, how was the terror resolved? How long after the thought first terrified you until the terror was resolved?
Stop reasoning and take data. Often patterns become obvious once the data is tidily assembled in place. And if you clearly and concisely assemble the data here, other people may be able to spot the patterns which you don't see - "given enough eyeballs, all bugs are shallow".
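As a sketch of what such a catalogue might look like in practice (the field names and entries below are invented for illustration, not taken from the thread):

```python
# Hypothetical catalogue of terrifying conclusions, one record per idea.
# Field names and entries are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScaryIdea:
    name: str                  # which conclusion troubled you
    onset: str                 # when the terror started
    resolved: Optional[str]    # when the terror resolved, or None if still active
    how_resolved: str          # e.g. "refuted to my satisfaction", "faded", "ongoing"

catalogue = [
    ScaryIdea("idea A", "2013-05", "2013-06", "refuted to my satisfaction"),
    ScaryIdea("idea B", "2014-11", "2014-11", "faded after a few days"),
    ScaryIdea("idea C", "2015-07", None, "ongoing"),
]

# With the records side by side, patterns (e.g. "every refutation eventually
# arrived") are easier to spot than while ruminating on a single idea.
for idea in catalogue:
    print(f"{idea.name}: {idea.how_resolved}")
```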
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-13T10:23:49.646Z · LW(p) · GW(p)
They are arbitrary. Some lasted years, some lasted days.
comment by [deleted] · 2015-10-06T17:51:06.954Z · LW(p) · GW(p)
If you think of your thought process as an accumulation of information layers on top of each other, it should not be surprising that introducing a possibly devastating new thought, one threatening the foundations of your thinking process, is counter-productive or depressive. I am speaking from my personal experience with solipsism, which did not come from exposure but from my own self-destructive thought process; I only looked it up and found out about solipsism later on. The introduction of these ideas at your pace, as you've experienced yourself, is very uncomfortable and should be moderated.
In your case, you need a solid, unshakeable foundation which you should defend no matter what. This may sound a bit religious, but consider religion for the example's sake: an exposure to a potentially threatening concept is overwritten with the religious concepts that are accepted duly by the observer. This is an overwriting mechanism, and you need it in the most critical cases.
However, overwriting everything simply eliminates all the useful information that is out there, or even the potentially threatening information from which you can extract useful bits of further information. My personal method of coping with these thoughts is simply to accept the possibility that anything presented to you could be real. It is similar to the old days of Stonehenge druids gazing at the wide sky showing the ultimate expression of infinity to humans: stars. The sight of infinite stars lining up on a clear sky is a humbling experience, and being humble comes with the admission that you simply cannot grasp everything. When you start leaving an open door to the problems presented to you, instead of trying to tackle every one of them, you accept that they are beyond you and you let them go. That way they can't affect you in depth. You can see similar thought processes in Middle Eastern kismet (https://en.wiktionary.org/wiki/kismet) and Far Eastern fatalism. Fatalism is actually a very useful tool for reducing anxiety such as yours, because it allows the person to move on instead of being crushed under the infinite possibilities of encounters.
For example, a person brought up in a culture where the usage of Kismet is common would answer the question of whether his newest business will be successful with a single word: Kismet. It's blind faith, a psychological support that frees your mind from anxiety, just as it did for our ancestors. In the current age, we have forgotten to be humble and grown arrogant. Whenever we can't deal with the answers we come up with, we either grow anxious or get depressed. Instead, let the questions find their own answers.
Obviously I'm not suggesting a direct cloning of these ideas, but rather your extrapolation of them. As you get more and more comfortable with casting thoughts aside and strengthening your foundation, you'll see the infinite possibilities expand before your eyes.
comment by Houshalter · 2015-07-13T06:04:15.880Z · LW(p) · GW(p)
I have the exact same problem. It's nice to know I'm not alone. I've been scared to mention my fears on lesswrong because I didn't think anyone would understand.
I'm mainly concerned about the many-worlds interpretation being true. I don't take dust theory seriously. Unless I understand it wrongly, it removes causation and just assumes the information itself is what is important. I really recommend you read Causal Universes. It's one of my favorite Lesswrong posts.
I also think dust theory leads to absurd and obviously wrong conclusions. Like, how do you interpret some random collection of bits? Under different interpretations they can represent anything.
If it's true that causation doesn't matter, then it's really weird that I just happen to find myself here with memories of living in a causal universe with simple, universal laws, for the last 2 decades. I should expect to be a Boltzmann brain observing extremely high entropy.
Anyway, my main tactic of dealing with Scary Idea is just keeping my mind off it as much as possible. Thinking about it hurts me, so I don't think about it. I also think I've resolved it somewhat, but again I prefer not to think about the issue at all.
I'm really scared of the transhumanist future. If we massively increase our intelligence, or create AIs which can explain complex ideas to us, then we will be forced to deal with all of these existential issues all at once. Including ones we haven't even thought of yet or can't understand. Our simple minds allow us some cognitive dissonance to just ignore this stuff.
EDIT: Also, lighting helps a lot. I feel much worse in a dark room than in a bright one. During the daytime, in the sunlight, my mood improves even more. So I can understand why these problems occur around bedtime. Reading until I fall asleep helps. During the day I just find things to keep my mind off of it. If I started getting really terrified, I just opened a reddit page and looked at cat pictures for a while.
Replies from: 27chaos, Fivehundred↑ comment by Fivehundred · 2015-07-13T10:50:01.318Z · LW(p) · GW(p)
I actually feel worst right after waking up. Is this the same for you?
Unless I understand it wrongly, it removes causation and just assumes the information itself is what is important.
I'm not sure that's true. Why does it remove causality? It removes the 'physical' aspect of causality, but as far as I can see not much else.
I also think dust theory leads to absurd and obviously wrong conclusions. Like, how do you interpret some random collection of bits? Under different interpretations they can represent anything.
I think this needs some explaining.
Replies from: Houshalter↑ comment by Houshalter · 2015-07-14T08:45:14.415Z · LW(p) · GW(p)
I actually feel worst right after waking up. Is this the same for you?
Just lying in bed alone with nothing but my own thoughts to distract me was the problem. But darkness makes it worse. And you have to lie still to get to sleep, but if I'm having anxiety in the morning I can just get up and find something to distract me.
Possibly I need to understand dust theory more to really debate you about it. Do you have another link?
It's hard to argue against because I don't really find the idea coherent to begin with. Recording someone's brain state and then replaying it doesn't instantiate any consciousness. The causal link between the brain states has already happened. Deleting some of the frames, or tampering with them, or copying them billions of times, doesn't change anything. The chain of causation of one brain state causing another brain state has already happened. Everything else is just like a static captured image of that moment in time.
I'm talking about the arrow of time itself. Why do events in the past seem to cause things in the future and never the other way around? Causation is important for consciousness. One brain state actually causes the next brain state. Just a recording of brain states doesn't cause any further brain states.
As for interpretation: pi is believed to contain every possible sequence of digits (this is the conjectured but unproven claim that pi is normal). You can interpret them as brain states or jpg images or whatever you want. There is no meaning to it though. It's just a sequence of digits.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-14T10:52:41.107Z · LW(p) · GW(p)
Recording someone's brain state and then replaying it doesn't instantiate any consciousness.
Why? It seems like it would definitely instantiate consciousness, if you believe that two identical brain states would have separate phenomenological experiences. Or it would simply merge with the original brainstate, if you accept Dust Theory.
Deleting some of the frames, or tampering with them, or copying them billions of times, doesn't change anything.
Not subjectively, but it would continue 'on' in the Dust.
I'm talking about the arrow of time itself. Why do events in the past seem to cause things in the future and never the other way around? Causation is important for consciousness. One brain state actually causes the next brain state.
Yes, and the causal universe we live in can be represented by the Dust. The possible configurations of the Game of Life are 'real' even if there is no Game of Life being implemented physically.
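For concreteness, the Game of Life rule being invoked here is a fully specified mathematical function; a minimal sketch follows (standard Conway rules, offered only to show the rule exists abstractly, whatever one concludes about the Dust):

```python
# Minimal Conway's Game of Life step over a set of live cells.
# The rule is a pure mathematical function; nothing below depends on any
# physical implementation, which is the point being made above.
from itertools import product

def step(live: set[tuple[int, int]]) -> set[tuple[int, int]]:
    counts: dict[tuple[int, int], int] = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                counts[cell] = counts.get(cell, 0) + 1
    # A cell is live next step with exactly 3 neighbors, or 2 if already live.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker" oscillates with period 2 whether or not anyone runs this code.
blinker = {(0, 0), (1, 0), (2, 0)}
assert step(step(blinker)) == blinker
```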
As for interpretation: pi is believed to contain every possible sequence of digits (this is the conjectured but unproven claim that pi is normal). You can interpret them as brain states or jpg images or whatever you want. There is no meaning to it though. It's just a sequence of digits.
All interpretations are realized under Dust Theory.
Replies from: Houshalter↑ comment by Houshalter · 2015-07-14T11:53:26.397Z · LW(p) · GW(p)
Yes, and the causal universe we live in can be represented by the Dust. The possible configurations of the Game of Life are 'real' even if there is no Game of Life being implemented physically.
Causal universes are represented, true. But so are countless non causal universes. Causality is a very specific constraint on possible universes. If it's not required then the vast majority of universes should be non-causal, simply because the space of non-causal mathematical structures is much much larger than the space of structures which happen to meet the causality restriction just by chance.
So it's really weird that we just happen to find ourselves in a causal universe, if it's not required. See the link I posted for a better argument about this.
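A toy count makes the "much much larger" claim vivid (my formalization, assuming a finite universe of N binary cells evolving for T steps; nothing here is specific to any real physics):

```latex
% Arbitrary (unconstrained) histories: every cell is free at every step.
\#\{\text{arbitrary histories}\} = 2^{NT}

% Causal, deterministic histories under one fixed local rule are pinned down
% by the initial state alone.
\#\{\text{causal histories}\} = 2^{N}

% Fraction of histories that are causal: 2^{N} / 2^{NT} = 2^{-N(T-1)},
% which vanishes rapidly as T grows.
```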
But to even talk about that, I have to consider Dust Theory as an actual theory. If it's a theory then what predictions does it make? How does it constrain our expectations? It doesn't seem to add anything to my model of the world.
All interpretations are realized under Dust Theory.
Under what distribution? Are 50% of interpretations jpg images, or only 0.0000...001%?
What does it even mean for an interpretation to be "realized"? Some unspecified observer looks at a sequence of bits and says "this is a brain state experiencing X". Who is the observer? What does it matter how they interpret it? This idea doesn't seem remotely coherent to me, so I'm struggling to find the words to even object to it.
Recording someone's brain state and then replaying it doesn't instantiate any consciousness.
Why? It seems like it would definitely instantiate consciousness, if you believe that two identical brain states would have separate phenomenological experiences. Or it would simply merge with the original brainstate, if you accept Dust Theory.
I just don't accept this premise. A record of a brain state isn't an experience. It's just a series of bits that was caused by an actual running brain. I just don't accept that a static non-causal series of bits has any moral weight, let alone is "me". It doesn't do anything. It isn't connected to anything. It doesn't mean anything. It just "exists". It's not doing computation. It's not doing anything.
A lookup table can't be conscious. (There is a decent amount of material on lesswrong about lookup tables and philosophical zombies, if you can't find anyone to discuss dust theory with you.)
If you take that recorded brain state you can modify it. You can xor all the bits, or hash them, or treat it as a number and divide it by 20, or pad it with random bits, etc...
There is no inherent meaning to any sequence of bits, except the subjective one you give them. They are missing their color. They certainly are not conscious in any meaningful sense.
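A small sketch of the transformations listed above (the "brain state" bytes are an arbitrary stand-in invented for this example):

```python
# The same bytes under the transformations named above; each output is just
# another bit sequence, with nothing intrinsic marking one reading as "the" one.
import hashlib

state = b"\x10\x42\xfe\x07"   # hypothetical recorded "brain state"

xored = bytes(b ^ 0xFF for b in state)        # xor all the bits
hashed = hashlib.sha256(state).digest()       # hash them
divided = int.from_bytes(state, "big") // 20  # treat as a number, divide by 20
padded = state + b"\x00" * 4                  # pad with extra bytes

print(xored.hex(), hashed.hex()[:16], divided, padded.hex())
```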
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-14T15:01:18.548Z · LW(p) · GW(p)
Hmm, OK, my belief in DT is pretty well shaken.
Still, take the problem of the physical world. There's a large philosophical question of 'what' is actually out there, or whether such a question is answerable even in principle (basically we're talking about the thing-in-itself). A Dust multiverse sidesteps this completely: if you go down far enough, you'll just get some mathematical laws, akin to those found in the Game of Life, which produce our universe. Isn't that at least parsimonious?
(You still have the problem of what mathematics is fundamentally, but it's a separate issue from the physical.)
Just lying in bed alone with nothing but my own thoughts to distract me was the problem. But darkness makes it worse. And you have to lie still to get to sleep, but if I'm having anxiety in the morning I can just get up and find something to distract me.
Meant to answer this and forgot. You're right that distractions always help, but often I go from hopeless to completely optimistic in minutes, for no apparent reason at all. Is this a neurological phenomenon?
Replies from: Dentin↑ comment by Dentin · 2015-07-15T03:42:14.741Z · LW(p) · GW(p)
I get the feeling you guys should read up on timeless QM, which basically avoids all of these problems and questions by treating reality as a static 'crystal' of related events with no time component. If you're going to be talking about stuff near the floor, you might as well go all the way instead of using inaccurate hacks.
comment by NancyLebovitz · 2015-07-13T08:07:46.791Z · LW(p) · GW(p)
An extrapolation of Dust Theory [3] implied that you might die upon going to sleep, not immediately, but through degeneration, and that the person who wakes up in the morning is simply a different observer, who has an estimated lifespan of however long he remains awake.
If that were true, wouldn't a lot of people be dying in their sleep so that we'd be seeing their corpses?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-13T10:28:10.018Z · LW(p) · GW(p)
Is this facetious?
Replies from: NancyLebovitz↑ comment by NancyLebovitz · 2015-07-13T13:47:40.580Z · LW(p) · GW(p)
No.
Your line of thought started with people in general dying slowly as they go to sleep. Wouldn't this suggest that some of them should die (leaving a corpse) before they wake up?
Maybe I've missed something, but I think your argument implies that we would have to be in the extremely rare universe where everyone appears to have survived in spite of death during sleep being the default?
Or did you mean that the person (in the sense of continuity of consciousness) dies during sleep, but the body doesn't die?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-14T01:14:51.195Z · LW(p) · GW(p)
The latter.
Replies from: Dentin↑ comment by Dentin · 2015-07-15T03:36:09.675Z · LW(p) · GW(p)
So the body that gets left behind, is it a p-zombie? If not, why not?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-15T04:29:07.756Z · LW(p) · GW(p)
No, it continues on normally somewhere else in the Dust.
comment by Liso · 2015-07-13T04:43:13.251Z · LW(p) · GW(p)
You are talking about rationality and about fear. Your protocol could have several independent layers. You seem to think that your ideas produce your fear, but it could also be the opposite: your fear could produce your ideas (and it is definitely very probable that fear has an impact on your ideas, at least on their contents). So you could analyze the rational questions on lesswrong and independently address your irrational part (= fear etc.) with therapists. There could be physical or chemical reasons why you worry more than other people. Your protocol for dangerous ideas needs not only discussion but also a way to handle your emotional responses. If you want to sleep well, that could depend more on your emotional stability than on rational knowledge.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-13T05:47:42.072Z · LW(p) · GW(p)
What sort of therapy would work for me? Ruminating is probably the main cause of it. Now that I've refuted my current fears, I find that I can't wrench the quantum world out of my head. Everything I feel is now tainted by DT.
Replies from: Liso, 27chaos↑ comment by Liso · 2015-07-17T05:40:30.706Z · LW(p) · GW(p)
I am not an expert. And it has to be based on facts about your neurosystem. So you could start with several experiments (blood tests etc.). You could change your diet, sleep more, etc.
About rationality and lesswrong -> could you focus your fears on one thing? For example, forget the quantum world and focus on superintelligence? I mean, could you utilize the power you have in your brain?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-17T06:25:31.337Z · LW(p) · GW(p)
About rationality and lesswrong -> could you focus your fears on one thing? For example, forget the quantum world and focus on superintelligence? I mean, could you utilize the power you have in your brain?
Heh, no. I can't direct it.
↑ comment by 27chaos · 2015-07-15T05:39:24.221Z · LW(p) · GW(p)
You've mentioned you have a history of inventing arguments with disturbing implications. Have you ever tried to intentionally invent an argument with reassuring implications?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-15T06:39:41.463Z · LW(p) · GW(p)
It's never intentional for me. They just click into place one day and drive me into a frenzy.
Replies from: 27chaos↑ comment by 27chaos · 2015-07-15T08:03:54.511Z · LW(p) · GW(p)
The disturbing arguments might be accidents, but maybe you could create reassuring arguments on purpose? Why let bias or coincidence alone determine the outcome of your reasoning processes, when you can aim towards strategic targets instead?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-15T12:29:17.490Z · LW(p) · GW(p)
Like what? Most of my targets are the already-existing crises, and it is completely arbitrary how long it takes to find a solution.
Replies from: 27chaos
comment by Jiro · 2015-07-13T02:38:26.149Z · LW(p) · GW(p)
The next big failure was my resolution to the Doomsday argument.
Are you aware of the self-indication assumption?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-13T02:43:37.655Z · LW(p) · GW(p)
I'm aware that it is nonsense, and I also think it is off-topic. I wasn't discussing the DA, I was discussing the possibility of my solution given the DA being true. What use would I have to rehash other arguments?
comment by Fivehundred · 2015-07-13T02:10:37.154Z · LW(p) · GW(p)
Nvm...
comment by Adam Zerner (adamzerner) · 2015-07-12T03:40:43.070Z · LW(p) · GW(p)
Why think about these sorts of things?
Personally, death really messes with my mind, and I try not to think about it (and related bad things) in the short-mid term. I don't see that I'm in a position to do much to avoid death/related bad things right now, and so I don't see that there's much benefit to thinking about it right now. The cost to me is that it makes me mildly unhappy and risks moments of extreme unhappiness.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T04:08:24.259Z · LW(p) · GW(p)
You remind me of this guy.
comment by Halfwitz · 2015-07-12T03:18:58.500Z · LW(p) · GW(p)
Do you have a link to Max Tegmark's rebuttal? What I've read so far seemed like a confused dodge.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T03:20:20.790Z · LW(p) · GW(p)
https://en.wikipedia.org/wiki/Quantum_suicide_and_immortality#Max_Tegmark.27s_work
Replies from: Halfwitz↑ comment by Halfwitz · 2015-07-12T03:27:53.781Z · LW(p) · GW(p)
That doesn't seem very airtight. There is still a world where a "you" survives or avoids all forms of degradation. It doesn't matter if it's non-binary. There are worlds where you never crossed the street without looking, and very, very, very, very improbable worlds where you heal progressively. It's probably not pleasant, but it is immortality.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T03:54:47.782Z · LW(p) · GW(p)
How would I contact a version of me in another branch? It isn't me at all anymore. You can receive and experience permanent brain damage, so why would a death experience be any different? And what about sleep? If this was true it seems like you wouldn't be able to let go of any of your mental faculties at all.
Replies from: Halfwitz↑ comment by Halfwitz · 2015-07-12T03:59:50.515Z · LW(p) · GW(p)
It isn't me at all anymore.
There will be a "thread" of subjective experience that identifies with the state of you now no matter what insult or degeneration you experience. I assumed you were pro-teleporter. If you're not why are you even worried about dust theory?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-12T04:13:20.114Z · LW(p) · GW(p)
There will be a "thread" of subjective experience that identifies with the state of you now no matter what insult or degeneration you experience.
What is 'me?' I'm not an ontologically basic thing. As long as it is a process, I don't see why I wouldn't just die.
Replies from: jacob_cannell↑ comment by jacob_cannell · 2015-07-13T02:22:56.016Z · LW(p) · GW(p)
The branches wherein you die are effectively discounted, because there is no future you who will remember your current self. The same applies to lesser degree to branches where you suffer brain/memory damage, to varying partial degree.
The problem with the whole QM suicide/immortality is that it assumes that we shouldn't care about measure, and we shouldn't care at all about universes that lack ourselves as future observers. Both of these notions are probably wrong from the perspective of normal human utility functions.
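One way to state that disagreement precisely (my notation, a sketch rather than anything jacob_cannell wrote out):

```latex
% Normal expected utility weights every branch b by its unconditional measure p(b):
U_{\text{normal}} = \sum_{b} p(b)\, u(b)

% The quantum-suicide argument renormalizes over surviving branches S only:
U_{\text{QI}} = \sum_{b \in S} \frac{p(b)}{\sum_{b' \in S} p(b')}\, u(b),
\qquad S = \{\text{branches where you survive}\}

% Keeping the unconditional weights p(b) restores the ordinary verdict:
% branches where you die still count against the action.
```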
Replies from: Fivehundred, Halfwitz↑ comment by Fivehundred · 2015-07-13T02:33:50.586Z · LW(p) · GW(p)
The branches wherein you die are effectively discounted, because there is no future you who will remember your current self. The same applies to lesser degree to branches where you suffer brain/memory damage, to varying partial degree.
Why? What is so irreducible about my memories?
Replies from: jacob_cannell↑ comment by jacob_cannell · 2015-07-13T02:44:43.275Z · LW(p) · GW(p)
Well, think through an example: imagine the future world where 'your' brain contains someone else's memories and personality tomorrow instead of your own. Compare that to the future world where your body contains someone else's skin pattern on the right arm (a similar amount of physical matter/information replacement).
In the first world 'you' (the bio software mind I am currently speaking to) ceases to exist, whereas in the second world 'you' remains.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-13T03:05:51.669Z · LW(p) · GW(p)
I don't understand. I'm asking about irreducibility.
Replies from: jacob_cannell↑ comment by jacob_cannell · 2015-07-13T06:30:10.018Z · LW(p) · GW(p)
I don't understand then - what do you mean by irreducibility of memories?
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-13T10:25:36.442Z · LW(p) · GW(p)
I mean it is required for quantum immortality. Other than reducing their destruction to a binary event, how can they continue on?
Replies from: jacob_cannell↑ comment by jacob_cannell · 2015-07-13T21:26:00.446Z · LW(p) · GW(p)
Still don't understand your point/question - what do you mean by "reducing destruction to a binary event"? Earlier I mentioned that destruction/survival isn't binary at all.
The idea is that there are always some branches in which a version of yourself survives. Survival is not binary; there are different degrees of 'survival'.
Replies from: Fivehundred↑ comment by Fivehundred · 2015-07-14T01:19:02.088Z · LW(p) · GW(p)
Yes, there are always some branches. But you can only follow one at a time. If you are in a branch in which your skull is being crushed, you are not likely to jump to a branch where you are totally fine.
Replies from: jacob_cannell↑ comment by jacob_cannell · 2015-07-14T16:50:03.017Z · LW(p) · GW(p)
There always exists some tiny subset of branches where you survive.
BTW, I don't completely buy the argument - as I mentioned earlier, measure is important and it works out to normality of probability. If my skull is being crushed, most of the branches past that point don't contain me. I care about the whole set, so the fact that I always survive in some tiny rare branches is not much of a consolation.
↑ comment by Halfwitz · 2015-07-13T02:32:24.104Z · LW(p) · GW(p)
I'm with Yvain on measure, I just can't bring myself to care.
Replies from: jacob_cannell↑ comment by jacob_cannell · 2015-07-13T02:41:37.317Z · LW(p) · GW(p)
Relative measure matters, but it's equivalent to probability and thus adds up to normality.
comment by Halfwitz · 2015-07-12T03:07:24.227Z · LW(p) · GW(p)
I'm pretty much immune to infinity angst. PM the "dust theory" problem to me. I'm curious how it could be worse psychologically than modal realism, as AFAICT dust theory implies that all subjective experiences exist, so I'm unsure how it could differ in terms of psychological impact.