How can I get over my fear of becoming an emulated consciousness?

post by James Dowdell (james-dowdell) · 2024-07-07T22:02:43.520Z · LW · GW · 8 comments

Contents

  Summary
  Lead-Up / Premise
  The Scenario
  Request for Comment

Summary

In the last few months I've been suffering from bouts of abject terror at the thought of experiencing reality as a thinking being only - one with no senses of any kind and no agency - presumably as an unintended consequence of asking to have my mind uploaded and emulated in a computer.  I don't have anybody to talk to about this who can recommend practical mitigations, so I'm hoping to request help from the LessWrong community in the form of recommended reading.

Lead-Up / Premise

This is related and relevant, I promise: in theory I love the idea of dirigibles and airships.  I hate that we spend so much energy (literally) and focus on fixed wing aircraft and drones, when we could just be drifting along gracefully through the air.  Airship Ventures' "Eureka" airship (while it was flying) was one of my favorite things about Silicon Valley.

My wife knew this, and for my 30th birthday, arranged for us to go on a surprise hot air balloon tour of Napa Valley.  I was so excited getting into the basket, I was grinning ear to ear!  But the moment the basket lifted off the ground - and somehow it was only then that it hit me - I remembered I had a pretty serious fear of heights, and ended up collapsed on the floor of the basket, shaking pretty much throughout the whole flight.

Jumping topic but not point: I have been extremely keen, for decades, under a mechanistic "I am my brain" assumption, to destructively scan my brain and upload my mind into a computer, where it would not be subject to biological death and could be backed up and made quite durable.  This desire was a natural evolution of my experience with NES and SNES video game emulation in the 1990s, and the idea is well treated by pop sci-fi as well as more academic treatments on sites such as this one.  We don't have the ability to do something like this yet, and as I approach 40 I grow concerned that it may not be solved in my lifetime.  But I'm "reasonably hopeful" that, between progress on brain scanning at the Allen Institute and the work by Robert McIntyre at Nectome and the Brain Preservation Foundation more generally, if I live a typical lifespan (with about 35 years of life left) I will have the option to undergo physician-assisted suicide with something like Robert's aldehyde-stabilized cryopreservation protocol, with a successful brain scan and mind upload to follow many years after my death, once the technology becomes available.

If that plays out the way I predict, I would likely become one of the first people to be uploaded; and as a software engineer, I can tell you that any new system has bugs.  But surely it can't be too bad?

One more jump, to bring it all together.  As I've gotten older, I've suffered from pretty serious migraine attacks that at their worst look more like a substantial stroke.  I'll go blind, or lose the ability to read or speak, and sometimes end up in the hospital.  In a recent attack a few months ago, I was in bed alone, and I lost my hearing, my sight, most coherent thought, and my ability to move my body.  In some ways it mirrored the stories one hears of "locked-in" patients.  To the extent that I could think coherent thoughts, I was terrified.  When the attack finally subsided and I began to regain control of my senses and my body, I wept profoundly.  At the time, I had been afraid I might die - but that was, oddly, not what was really scaring me.

The real issue was that I had suddenly, quite viscerally, discovered a potential failure case for mind upload that seems substantially worse to me than dying.  Like the incident with the hot air balloon, something I had been looking forward to purely in the abstract now alarms me as the details become more real.  Starting immediately after the migraine attack I described, I keep reliving a modified version of it as an intrusive thought that hits out of nowhere a few times a month, sending my pulse through the roof and me into the height of suffering.

The Scenario

What has me so scared?

Most of us think of "getting uploaded" as something like the movie The Matrix, where you're in a simulated body that can see and hear and walk around.  But what if you just end up as a "virtual brain in a vat", literally just a software program running on a server, with no "body" hooked up - no vision, no hearing, no sense of touch, no pain, no fingers to wiggle or tongue to wag, no taste, nothing.  

I can write all this theoretically with no problem.  But when I go to actually imagine existing like this - truly visualize and realize it - I get hit with runaway panic and terror.  Even in a pitch-dark room, even in a sensory-deprivation chamber, you still feel your heartbeat, you still sense your breath going in and out.  Ironically, the techniques I've learned to calm down in panic situations - "focus on the breath" mindfulness meditation - only make everything worse here, because you end up focusing on the lack of breath, which the body recognizes as fundamentally wrong and worth panicking about.

I've realized there's also a quite literally "grounding" sense one gets from feeling pressure somewhere in the body.  If you're lying in bed, you feel it where your butt and your head and your legs make contact with the bed.  When you're standing, it's all in your feet.  You can feel the wind blow; or, when swimming, you can feel the encumbrance of the liquid around you.  All of that is, eerily, also missing in this scenario.

(I'm uncertain about the vestibular system, but I also worry that in this scenario, without proper input from the inner ear, one's experience could be extreme dizziness and nausea.)

These "lack of sensations" alone - no heart, no breath, no pressure or touch - when vividly imagined, seem to be enough to trigger a dehumanizing animal panic within me.  The human body interprets all this as something being severely wrong, and the experience of that is pure torture.

But it gets worse.  Even if you can get over that - and I'm not sure you can - there's a lot else that's "wrong" too.  There's no sense of time: no sun, no clock.  There's nothing to do - you have no body, so you can't snap your fingers or tap your foot, stand up or sit down.  Even someone with locked-in syndrome, stuck in place on the bed, has something going on: gurgling in the tummy, air or blood moving this way or that.  A sense of pain, such as an unscratchable itch, seems like it would be torture; and yet, in this world of absolutely no input whatever, it seems that even that itch would be preferable to having nothing at all.

For me, when I imagine this state of being, it's agony just experiencing it for a few seconds.  And then it occurs to me: if my well-meaning children successfully implement my desire never to die by having me uploaded, and "turn me on" like this with sufficient data and power backups but a lack of care; or if something else goes wrong, with the technicians involved not bothering to check whether the upload succeeded in setting up a fully virtualized existence complete with at least emulated body sensations, or not otherwise checking from time to time that this remains the case - then I could get stuck existing like this for an unendurably long, hellish period.  Probably not for eternity, but possibly for many decades at least, and worse if I get simulated faster than realtime.

Request for Comment

These experiences of the last few months have me questioning whether I shouldn't in fact "just die" and not pursue upload, for fear of getting stuck in the scenario I have been vividly experiencing.  I'm requesting advice and insight on a few points:

8 comments


comment by Seth Herd · 2024-07-09T05:11:11.329Z · LW(p) · GW(p)

You've come to the right place; I agree (and I think most here would) that such a thing is possible. However, I think it's also quite unlikely. There are many better things to worry about, including in the domain of brain upload downsides.

This really sounds like you're suffering PTSD from your migraine experience, and projecting that fear into the future.

You have plenty of time to decide whether to do a brain upload, should one become available. If and when you get a chance to decide that, you can weigh how probable the different failure modes are. I think you'll decide this one is incredibly unlikely.

I think you should be seeking treatment for PTSD. I don't know what the best current forms of treatment are. I'd start looking into that if I were you.

This should not play much role in your thinking, because you should be thinking about more realistic things, particularly addressing your PTSD. But if you were an accurately simulated brain with no sensory inputs, you would fairly quickly start to hallucinate sensory inputs. Sensory deprivation produces dreamlike hallucinations fairly quickly despite people being awake. There are deep reasons for this: you still have the primary sensory cortices, they don't like to stay quiet, and they and the bidirectionally linked higher brain areas will form patterns similar to the ones you've learned from experience. That's what dreams are: the default behavior of the brain without sensory input. Theoretical reasons aside, that's what empirically happens when people reduce their sensory inputs for a half hour or so while remaining awake.

But the plausibility of such a thing happening is not the crux of this problem, because even if that were how the brain behaved, that failure mode would be incredibly unlikely. PTSD is the root of your problem, almost certainly. Debunking this belief for yourself rationally is part of the solution, but not the whole thing. Get that PTSD treated, and not just by the first suggestion you happen across. It's a big deal, and it's worth simultaneously pursuing several routes to treatment. It's not something I have much knowledge of, so I'm not going to suggest anything in particular; but treatments are available.

Good luck to you.

comment by green_leaf · 2024-07-08T13:50:46.958Z · LW(p) · GW(p)

This is not an obviously possible failure mode of uploads - it would require that you get uploaded correctly, but the computer doesn't feed you any sensory input and just keeps running your brain without it. Why would something like that happen?

comment by Nick_Tarleton · 2024-07-08T17:32:08.953Z · LW(p) · GW(p)

if my well-meaning children successfully implement my desire never to die by having me uploaded, and "turn me on" like this with sufficient data and power backups but a lack of care; or if something else goes wrong, with the technicians involved not bothering to check whether the upload succeeded in setting up a fully virtualized existence complete with at least emulated body sensations, or not otherwise checking from time to time that this remains the case

These don't seem like plausible scenarios to me. Why would someone go to the trouble of running an upload, but be this careless? Why would someone running an upload not try to communicate with it at all?

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2024-07-09T20:50:13.853Z · LW(p) · GW(p)

I don't want to aggravate the OP's problem, but a reason that immediately occurs to me is that the people running the upload have a statutory duty to do so but don't actually care. Consider the treatment of the elderly incapable in some care homes.

Replies from: Seth Herd
comment by Seth Herd · 2024-07-09T21:34:09.744Z · LW(p) · GW(p)

Making it a statutory obligation to run them with no legal obligation for their quality of life would be so insane and shortsighted that not even the US legal system would (probably) do it. Let's make sure it doesn't.

comment by noggin-scratcher · 2024-07-08T11:27:59.675Z · LW(p) · GW(p)

Point of curiosity: do you happen to have posted about this scenario on the subreddit /r/NoStupidQuestions/ ?

Because someone has (quite persistently returning on many different accounts to keep posting about it)

Replies from: Seth Herd
comment by Seth Herd · 2024-07-09T05:15:18.051Z · LW(p) · GW(p)

If I were the person asking the question, I don't think I'd appreciate this question. It feels a little like doxxing. If they were different accounts that didn't share a name, they're meant to be anonymous and so private.

Replies from: noggin-scratcher
comment by noggin-scratcher · 2024-07-09T11:25:31.785Z · LW(p) · GW(p)

I'll accept that concern as well-intentioned, but I think it's misplaced.

I've offered zero detail of any of the accounts I've seen posting about mind uploads (I don't have the account names recorded anywhere myself, so couldn't share if I wanted to), and those accounts were in any case typically throwaway usernames that posted only once or a few times, so had no other personal detail attached to be doxxed with. They were only recognisable as the same returning user because of the consistent subject matter.

Genuinely just curious about whether the people I have encountered suffering intrusive fears about their mind being uploaded are in fact one person in different contexts, or if this is a more widespread thing than I expected.