Mind uploading from the outside in

post by Alexandros · 2015-11-29T02:05:07.228Z · LW · GW · Legacy · 10 comments

Most discussion of uploading approaches it from the inside out: put simply, a biological person undergoes a disruptive procedure that digitises their mind, and the digital mind then continues the person's timeline as a digital existence, with all that entails.

What stands out here is the disruptive nature of the jump from biological to digital being. Not only is such a transformation a huge step to undergo, but few things in reality operate in such binary terms; more commonly, things happen gradually.

Being an entrepreneur with a keen interest in the future, I both respect audacious visions and study how they come to be realised. Very rarely does progress come from someone investing a bunch of resources in a black-box process that ends in a world-changing breakthrough. Much more commonly, massive innovations are realised through a process of iteration and exploration, fuelled by a need that motivates people to solve thousands of problems, big and small. Massive trends interact with other innovations to open up opportunities that, when exploited, accelerate innovation further. Every successful startup and technology, from Facebook to Tesla and from mobile phones to modern medicine, can be understood in these terms.

With this lens in mind, how might uploading be realised? Here is one potential timeline, barring an AI explosion or existential catastrophe.

It is perhaps useful to explore the notion of being "above/below the API". A slew of companies have formed, often described as "Uber for X" or "Airbnb for Y", that meet our needs through a computer system such as a laptop or a mobile phone app. The app might issue a call to a server via an API, and that server may delegate the task to some other system, often powered by other humans. The original issuer of the command then gets their need met while minimising direct contact with other humans, the traditional way of having our needs covered. It is crucial to understand that API-mediated interactions win because they are superior to their traditional alternative: once they became possible, it was only natural for them to proliferate. As an example, compare the experience of hailing a taxi with using Uber.
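
To make the pattern concrete, here is a minimal sketch of the "above the API" side of such an interaction; the service, endpoint and fields are entirely hypothetical, for illustration only:

```python
# Minimal sketch of an API-mediated request (hypothetical service and fields).
# The requester talks only to the platform's API; "below the API", the platform
# delegates fulfilment to some other system, often another human.
import requests  # real HTTP library; the endpoint below is made up

API_BASE = "https://api.example-rides.com/v1"  # hypothetical "Uber for X" service

def request_ride(pickup: str, dropoff: str) -> dict:
    """Issue a request 'above the API' and return the platform's response."""
    resp = requests.post(
        f"{API_BASE}/rides",
        json={"pickup": pickup, "dropoff": dropoff},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. {"ride_id": "...", "driver": "...", "eta_min": 4}

# A human driver 'below the API' receives the matched job through their own app
# and fulfils it in the physical world; the two humans never negotiate directly.
```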

And so computer systems are inserted into human-to-human interactions. This post is composed on a computer, through which I will publish it in a digital location, where it might be seen by others. If I am to hear others' responses to it, those too will be mediated by APIs. Whenever a successful new API is launched, fortunes are made and lost. An entire industry, venture capital, exists to fund efforts to bring new APIs into existence, each new API making life easier for its users than what came before, and adding yet another layer of APIs.

As APIs flood interpersonal space, humans gain superpowers. Physical presence matters less and less, and a person anywhere in the connected world can communicate and effect change anywhere else. And with APIs comes control over personal space and time. Personal safety increases, both by reducing random physical contact and by keeping us constantly connected to others who can send help if something goes wrong. The demand for connectivity and computation is pushing networks everywhere and driving the cost of hardware through the floor.

Given the trends that are in motion, what’s next? Well, if computer-mediated experience is increasing, it might grow to the point where every interaction a human has with the world around them will be mediated by computers. If this sounds absurd, think of noise-cancelling headphones. Many of us now use them not to listen to music, but to block the sound from our environment. Or consider augmented reality. If the visual field, the data pipeline of the brain, can be used to provide critical, or entertaining, context about the physical environment, who would want to forego it? Consider biofeedback: if it’s easy to know at all times what is happening within our bodies and prevent things from going wrong, who wouldn’t want to? It’s not a question of whether these needs exist, but of when technology will be able to cover them.

Once most interaction is API-mediated, the digital world switches from opt-in to opt-out. It’s not a matter of turning the laptop on, but of turning it off for a while, perhaps to enjoy a walk in nature, or for a repair. But wouldn’t you want to bring your augmented reality goggles that can tell you the story of each tree, and ensure you’re not exposed to any pathogens as you wander in the biological jungle? As new generations grow up in a computer-mediated world, fewer and fewer excursions into the offline will happen. Technology, after all, is what was invented after you were born. Few of us consider hunting and gathering our food or living in caves to be a romantic return to the past. When we take a step backward, perhaps to signal virtue, like foregoing vaccination or buying locally grown food, we make sure our move will not deprive us of the benefits of the modern world.

Somewhere around the time when APIs close the loop around us, or even before then, the human body will begin to be modified. Artificial limbs that are either plainly superior to their biological counterparts or better adapted to that world will make sense, and brain-computer interfaces (whether direct or via the existing senses) will become ever more permanent. Once our bodies have been replaced with mechanical parts, the brain will come next. Perhaps certain simple parts will be easy to replace with more durable, better-performing ones. Intelligence enhancement will finally be possible, by adding processing power natural selection alone could never have produced. Gradually, step by small step, the last critical biological components will be removed, a final cutting of the cord with the physical world.

Humans will have digitised themselves, not by inventing a machine that takes flesh as input and outputs ones and zeroes, nor by cyberpunk pioneers jumping into an empty digital world to populate it. We will have done it by making incremental choices, each one a sound, rational decision that was in hindsight inevitable, incorporating inventions that made sense, and in the end it will be unclear when the critical step was taken. We will have uploaded ourselves simply in the course of everyday life.

10 comments

comment by passive_fist · 2015-11-30T00:21:06.588Z · LW(p) · GW(p)

It seems like the problem of "the disruptive nature of the process from biological to digital being" still exists, except in your scenario it's only being pushed farther down the road.

You could imagine a radical outside-in modification of the human body, going from people with only their limbs replaced, to others with everything but their brain replaced, to others still with most of their brain except their cerebral cortex replaced, and even their cerebral cortex vastly modified and extended. But the seat of consciousness would still be their brain, and a person in such a state would not be considered an upload. They are not 50% uploaded or 5% uploaded or even 1% uploaded. They are zero percent uploaded, because their consciousness is still tied to some particular piece of hardware, exactly the way it was before. A true upload would not have their consciousness tied to any particular hardware - they would exist as information and could travel around at will.

A good 'test' for uploading might be the speed-of-light test. If your consciousness is capable of travelling at the speed of light, you are uploaded; otherwise you aren't. The great thing about this test is that nature provides a very clear boundary between pure information and matter. Matter can contain information but necessarily always travels slower than light, whereas pure information can and does travel at the speed of light.

Sure, it might be possible to have a 'half-uploaded' state where some of your consciousness is still tied to some hardware and some of it can move freely. But we don't yet know if such a state is possible. Actually, we don't even know if uploading is possible. It could be that your consciousness is always doomed to being tied to some hardware and it could never move at the speed of light.

Replies from: Alexandros
comment by Alexandros · 2015-11-30T03:20:04.443Z · LW(p) · GW(p)

Surely at the point at which your entire sensory input comes from the digital world you are somewhat uploaded, even if part of the processing happens in biological components. What does it mean to "travel" when you can receive sensory inputs from any point in the network? There are several Rubicons to be crossed, and transitioning from "has a tiny biological part" to "has no biological part" is another, but it's definitely smaller than "one day an ape, the next day software". What's more, what I'm arguing is not that there aren't disruptive steps, but that each step is small enough to make sense for a non-adventurous person, as a step increase in convenience. It's the Ship of Theseus of mind uploading.

Replies from: passive_fist
comment by passive_fist · 2015-11-30T03:28:58.149Z · LW(p) · GW(p)

what does it mean to "travel" when you can receive sensory inputs from any point in the network?

To be able to shorten the time it takes to become conscious of a sensory input. If the sensor is at point A and you are at distance x from that sensor, you require at least x/c time to be aware of an input from that sensor.

The whole point of travel is to have low-latency, high-bandwidth access to information that exists at some point in the universe.
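
As a back-of-the-envelope illustration of that x/c bound (the distances are approximate and chosen only for the example):

```python
# Back-of-the-envelope x/c latencies; distances are approximate.
C = 299_792_458  # speed of light, in m/s

distances_m = {
    "across a city (10 km)": 1.0e4,
    "Earth to Moon": 3.84e8,
    "Earth to Mars (average)": 2.25e11,
}

for label, x in distances_m.items():
    print(f"{label}: at least {x / C:.3g} s one way")
# roughly 33 microseconds across a city, ~1.3 s to the Moon,
# and ~750 s (about 12.5 minutes) to Mars
```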

but that each step is small enough to make sense for a non-adventurous person

It still seems to me that the step from 'being tied to a specific piece of hardware' - whether that hardware is an entirely biological brain or an enhanced biological brain - to being pure information capable of moving from hardware to hardware is a pretty big step, regardless of how it is performed. It's the very essence of digitizing something. A physical book is information tied to hardware; uploading consists of scanning the book.

Replies from: Alexandros
comment by Alexandros · 2015-11-30T03:40:23.953Z · LW(p) · GW(p)

There are still many intermediate steps. What does it mean "to be conscious of a sensory input"? Are we talking system 1 or system 2? If the brain is composed of modules, which it likely is, what if some of them are digital and able to move to where the information is, and others are not? What if the biological part's responses can be modelled well enough to be predicted digitally 99.9% of the time, such that a remote near-copy can be almost autonomous by means of optimistic concurrency, correcting course only when the verdict comes back different from what was predicted? The notion of the brain as a single indivisible unit that "is aware of an input" quickly fades away when the possibilities of software are taken into account, even when only part of you is digital.
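
A minimal sketch of that predict-then-reconcile ("optimistic concurrency") pattern, with entirely hypothetical names, just to show the mechanism:

```python
# Hypothetical sketch of the predict-then-reconcile ("optimistic concurrency")
# pattern: the remote near-copy acts on a fast local prediction immediately, and
# only rolls back when the slow authoritative answer disagrees.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class OptimisticRemote:
    predict: Callable[[str], str]        # fast local model of the slow component
    authoritative: Callable[[str], str]  # slow ground truth (e.g. the biological part)
    log: List[str] = field(default_factory=list)

    def handle(self, stimulus: str) -> str:
        guess = self.predict(stimulus)
        self.log.append(f"acted on prediction: {guess!r}")  # proceed without waiting
        truth = self.authoritative(stimulus)                # in reality this arrives later
        if truth != guess:
            self.log.append(f"rollback: {guess!r} -> {truth!r}")  # correct course
            return truth
        return guess

# If the predictor agrees with the ground truth 99.9% of the time, rollbacks are
# rare and the remote copy behaves almost autonomously.
remote = OptimisticRemote(predict=str.upper,
                          authoritative=lambda s: "NO" if s == "edge case" else s.upper())
print(remote.handle("hello"))      # prediction holds, no rollback
print(remote.handle("edge case"))  # prediction wrong, corrected to "NO"
print(remote.log)
```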

Replies from: passive_fist
comment by passive_fist · 2015-11-30T03:48:21.755Z · LW(p) · GW(p)

There are still many intermediate steps. What does it mean "to be conscious of a sensory input"? Are we talking system 1 or system 2?

The system 1/system 2 distinction is only tangentially related here.

If the brain is composed of modules, which it likely is, what if some of them are digital and able to move to where the information is and others are not?

It's irrelevant whether the brain is 'composed of modules' or not. If what you mean is whether it is possible for consciousness to be distributed, well, that's a good question. If it's possible for consciousness to be distributed, then you could imagine being 'spread out' over a very large computer network (possibly many light-years in length). But the situation becomes tricky because if, say, your 'leg' was in one star system and your 'eye' was in another, a stimulus from your eye could not cause a reaction from your leg in a time shorter than several years without violating the speed-of-light limit and causality. So either you cannot be 'spread out', or your perception of time slows down so extremely that several years seem instantaneous (just as the fraction of a second required to move your human leg seems instantaneous now).

Replies from: Alexandros
comment by Alexandros · 2015-11-30T05:35:45.492Z · LW(p) · GW(p)

I don't use the word 'consciousness', as it's a complex concept that isn't really necessary in this context. I approach a mind as an information-processing system, and information-processing systems can most certainly be distributed. What that means for consciousness depends on what you mean by consciousness, I suppose, but I would not like to start that conversation.

Replies from: passive_fist
comment by passive_fist · 2015-11-30T05:48:47.612Z · LW(p) · GW(p)

The whole idea of uploading concerns human consciousness. Specifically, transferring a human consciousness to a non-biological context. If you're not talking about human consciousness, then you're just talking about building an AI.

Replies from: Alexandros
comment by Alexandros · 2015-11-30T05:55:54.791Z · LW(p) · GW(p)

Which in turn depends on what you mean by "artificial".

Replies from: passive_fist
comment by passive_fist · 2015-11-30T20:13:39.146Z · LW(p) · GW(p)

The route to AI that you're suggesting is a plausible one; people like Nick Bostrom have talked at length about scenarios like this, where we gradually shift our 'computational substrate' to non-biological hardware over several generations. But that's not necessarily what uploading is! As I mentioned, uploading is the transfer of a consciousness from one specific piece of hardware to another. The title and wording of your post imply that you are talking about uploading, but our discussion indicates you are actually talking about building an AI, which is an entirely different concept, and everyone who is confused about this distinction would do well to understand it clearly before talking about it.

Replies from: Alexandros
comment by Alexandros · 2015-11-30T23:46:12.259Z · LW(p) · GW(p)

You appear to be arguing about definitions. I'm not interested in going down that rabbit hole.