When does technological enhancement feel natural and acceptable?
post by Gunnar_Zarncke · 2015-05-01T21:11:11.164Z · LW · GW · Legacy · 34 comments
Technology can be used and perceived in different ways, and future technology may change our lives beyond imagination. How can friendly AI technology enrich the human experience? Technology can feel like it controls us, or - if it goes well - it can feel like a natural enhancement of mind and body.
I'm interested in the ways future technology could or couldn't do this. I will explore some avenues and state my opinion on each. Make up your own mind - I'd like to hear your opinion.
Body Enhancement
The first things we'd like to get rid of are impediments and diseases. Some might like to be immortal. But that is not enhancement so much as maintenance.
People apparently like to enhance their bodies. This starts with cosmetics and doesn't end with doping. Strictly speaking, clothing could also count. We know quite well what we want here. I'd bet that people would accept body enhancements easily - especially if they are reliable, safe, and/or reversible. Fictional evidence here is the positive reception of suitably enhanced heroes. Wouldn't you like to have super strength or look like a supermodel? Such enhancement for everybody - which would otherwise be zero-sum - could also counteract the way media ideals diminish our self-image compared to the ancestral environment, where we were only one among 150 average guys.
As long as people don't change their native preferences, this should make everybody happier with themselves. If preferences are changed, bets are off again.
Mind Enhancement
Drugs can have not only pleasurable but also performance-increasing effects. Nootropics for everybody could be acceptable - if free and safe. I think that increasing brain power (speed and capacity) would feel the most natural - if it could be done.
The trouble with this is the inevitable return on cognitive investment: either exponential or chaotic changes (from interacting minds) seem to result. One tricky part is how to avoid boredom once stability has been reached.
Body Schema Extension
Body perception is flexible. The body schema (our body self-image) is known to expand to encompass tools. Tools thus become part of the body (schema) and are wielded and handled - and therefore felt - like one's own body. My impression is that this might extend to vehicles: driving a car, or probably also flying a plane, can feel like one's own movement; one knows where the car ends. I'd guess that technology with immediate feedback that can be mapped onto a (distorted or extended) body schema will likely feel natural after some time of adjustment.
Sensory Enhancement
Apparently, our senses are quite flexible. Almost any input (visual, auditory, tactile, even smell) can be mapped onto a 3D environment model with long training. This is apparently also possible for non-native senses, which is called Sensory Substitution or Sensory Augmentation, and there are already some projects building actual working devices. Once this mapping has settled into the subconscious, it feels natural. I wonder whether augmented reality systems can achieve this. Virtual reality systems are the dual of this - data mapped to the senses instead of senses mapped to data.
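To make the sensory-substitution idea more concrete, here is a minimal sketch in Python of the kind of mapping a compass belt (such as the NorthPaw mentioned in the comments) performs. The hardware callbacks `read_compass_heading` and `set_motor` are hypothetical stand-ins; the point is only that a continuous signal is remapped onto a ring of vibration motors, which, worn long enough, can fade into the subconscious.

```python
import time

NUM_MOTORS = 8  # vibration motors spaced evenly around the belt


def motor_for_heading(heading_degrees: float) -> int:
    """Map a compass heading (0-360 degrees) to the index of the nearest motor."""
    sector = 360.0 / NUM_MOTORS
    return int((heading_degrees % 360.0) / sector + 0.5) % NUM_MOTORS


def run_belt(read_compass_heading, set_motor, interval_s: float = 0.1):
    """Poll the compass and keep exactly one motor vibrating at a time.

    `read_compass_heading` and `set_motor` stand in for real hardware;
    the heading-to-motor remapping is the whole point of the sketch.
    """
    active = None
    while True:
        motor = motor_for_heading(read_compass_heading())
        if motor != active:
            if active is not None:
                set_motor(active, on=False)
            set_motor(motor, on=True)
            active = motor
        time.sleep(interval_s)
```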
Devices and Gadgets
Devices that require conscious interaction and translation into some UI often feel clumsy - no, they are clumsy. They break the flow. They require conscious effort. I think the main attraction of having "an app for that" is the feeling of control at a distance that we gain: we can do something by invoking a magical ritual to achieve an effect other people can't achieve (or can only achieve via mundane manual action). This is good and fine, but it would be even better to achieve the effect without the interaction.
There was a recent post somewhere about the best smartphone UI being just a blank screen where you could type (or dictate) what you want, and the 'UI' would figure out the context and intention. While googling unsuccessfully for that, I found this link about natural UIs:
“The real problem with the interface is that it is an interface. Interfaces get in the way. I don’t want to focus my energies on an interface. I want to focus on the job … I don’t want to think of myself as using a computer, I want to think of myself as doing my job.” -- Donald Norman in 1990
Services
Commerce, and especially the internet, provides lots of services that we use to reap the benefits of a digital society: Amazon, Netflix, online booking... But we are at the mercy of the service providers, of the power of the interface they provide, and of the cost they require. However well integrated this is, every interaction with a service means either a transaction (cost per use), a freemium choice (will I regret this later?), or ad suffering (paying with attention). This is bondage and discipline. I'd rather minimize this as a means of future technological enrichment.
Language Control
Communication - via speech and via text - is natural, and not only with people. Most programmers value the power of the command line because it allows them to combine commands in new ways that feel, to an experienced user, naturally linguistic. Why not use language to control the technology of the future? Just utter your wishes. A taste of this could be the service offered by the Magic startup.
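As a toy illustration of the "blank screen" idea above - type or dictate what you want and let the system figure out the intent - here is a minimal sketch in Python using only the standard library. The names (`register`, `handle`) and the keyword matching are made up for illustration; a real system would need actual natural-language understanding. The shape of the interface is the point: text in, effect out, no menus.

```python
from typing import Callable, Dict

# Maps a keyword that signals an intent to the handler that fulfils it.
_intents: Dict[str, Callable[[str], str]] = {}


def register(keyword: str, handler: Callable[[str], str]) -> None:
    """Register a handler for utterances containing `keyword`."""
    _intents[keyword.lower()] = handler


def handle(utterance: str) -> str:
    """Dispatch a free-form utterance to the first matching intent."""
    lowered = utterance.lower()
    for keyword, handler in _intents.items():
        if keyword in lowered:
            return handler(utterance)
    return "Sorry, I can't do that yet."


# The whole 'UI' is text in, text out - no buttons, no menus.
register("lights", lambda u: "Turning the lights " + ("off" if "off" in u.lower() else "on"))
register("weather", lambda u: "Fetching the forecast...")

if __name__ == "__main__":
    print(handle("Please turn the lights off"))         # -> Turning the lights off
    print(handle("What's the weather like tomorrow?"))  # -> Fetching the forecast...
```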
Social Interaction
We are social animals. Could we deal with digital assistants who understand and support us? Probably - if they are beyond the Uncanny Valley. But would we trust them? Only if they behave consistently with equal or lower status; otherwise, we'd justifiably feel dominated. Can this be achieved if the artificial agent is much smarter than us and thereby unavoidably controls us? Would we feel manipulated?
Slow Processes
Societal processes that affect us in ways we (feel we) have only limited control over often feel oppressive - even if they are, by some objective standard, intended for our (collective) best. Examples are health care bureaucracy, compulsory education, traffic rules, and above all, parliamentary democracy. These processes are slow in the sense that they affect and change things over longer timescales than conscious effort can easily work on; often there is no immediate or clearly attributable feedback. Such a process often feels like a force of nature, and humans have adapted quite well to forces of nature - but just because we accept something doesn't mean we feel liberated by it. I think that any slow process that changes things in complex ways we cannot follow will cause negative feelings. And many conceptions I have seen of how FAI could help us involve masterminds setting things up; people might feel manipulated. Either this is balanced by other means, or we need a correspondingly slow consciousness, or a deep understanding, to follow it.
My Transhuman Wish-List
I want to look better and be more robust - even if everybody else would look better too. I want backups and autonomous real and virtual clones of myself.
I'd like to think faster, have a perfect memory, or even access information from the web in a way that feels like recall. I'd like to be able to push conscious thought processes into the subconscious - call it deliberate, efficient, reversible habit formation.
I'd like to be able to move into machines (vehicles, robots, even buildings) and feel the machines as my extended self. I'd like to perceive more kinds of sensor input as naturally as my current senses.
I don't want to interface with devices but to command them linguistically, by thought, or completely subconsciously.
I want a consciousness that can deal with slow processes, and possibly a way to think more slowly in parallel with normal consciousness.
Open Ends
There are more areas where this reasoning can be applied, and I'd like to state some general patterns behind these areas - but my time for this post has run out.
Just two examples:
- Incremental changes are preferable to abrupt changes. People oppose changes whose consequences they cannot foresee. But compared to slow external processes, slow internal processes may be the best option.
- Enhancements that can be used subconsciously are better than those that need conscious attention (and context switches).
I'd like to give fictional evidence for each point, but here I will just point you to the Optimalverse, where some of these are played out, and to The Culture, which describes some of the effects.
EDITED: Spelling, typos.
34 comments
comment by [deleted] · 2015-05-04T10:58:33.914Z · LW(p) · GW(p)
Technology can feel like it controls us or - if it goes well - it can feel like a natural enhancement of mind and body.
Have you ever wondered why, in an age of cell phones and hand grenades, telepaths and fireball-throwing wizards in fantasy books sound cool? Somehow it seems we like to do things with our mind or body only, not relying on tools.
That is how primitive cyberpunk novels fail. I am pretty sure I don't want to replace my eyeballs with mechanical eyes. However, I am wondering whether LASIK surgery could be good. So the idea is not so much to implant machines in our body or to use them externally, but to use technology to make our own bodies high-quality and powerful. I like this idea.
Replies from: Lumifer, Jiro, ChristianKl, advancedatheist
↑ comment by Lumifer · 2015-05-06T15:09:39.905Z · LW(p) · GW(p)
I am pretty sure I don't want to replace my eyeballs with mechanical eyes.
Not even if you could adjust them to become telescopes or microscopes if need be? If you could switch to seeing in infrared or ultraviolet? Add amplification to clearly see on moonless nights with nothing but starlight?
Replies from: None
↑ comment by [deleted] · 2015-05-07T07:20:57.384Z · LW(p) · GW(p)
If this can be done inside the eyes, it can be done outside the eyes, as a removable, eyeglasses-like thing.
This is why cyberpunk never really made sense to me. Why remove the choice of putting something on or taking it off? Okay, there is an advantage in never forgetting it at home - still. Gibson's razor blades implanted under the fingernails sound cool until you realize you just gave up the option of ever being allowed on an airplane, for example.
Oh, and sometimes I hear horror stories about people who wear diamond rings for decades until they become unremovable from their fingers; when some criminal mugs them, they just cut off the finger. Extrapolate from here...
↑ comment by ChristianKl · 2015-05-04T11:26:28.591Z · LW(p) · GW(p)
That is how primitive cyberpunk novels fail. I am pretty sure I don't want to replace my eyeballs with mechanical eyes.
You might not, but that doesn't mean that nobody does. I think I have met three people face to face with implants that let them perceive magnetic fields.
Replies from: Gunnar_Zarncke
↑ comment by Gunnar_Zarncke · 2015-05-04T18:23:35.519Z · LW(p) · GW(p)
They need not be implants. The NorthPaw http://sensebridge.net/projects/northpaw/ or the feelspace belt http://feelspace.cogsci.uni-osnabrueck.de/ are cool despite being devices - precisely because they quickly fade into the subconscious.
Replies from: ChristianKl
↑ comment by ChristianKl · 2015-05-04T18:27:39.323Z · LW(p) · GW(p)
Yes, there are non-implant solutions, but that doesn't change the fact that there are people willing to use implants.
↑ comment by advancedatheist · 2015-05-06T15:00:28.786Z · LW(p) · GW(p)
I am pretty sure I don't want to replace my eyeballs with mechanical eyes.
If you have vision problems, as I do, those "mechanical eyes" sound interesting.
Fantasy appeals strongly to the adolescent mind because our bodies at that age start to change and develop new powers, so to speak - just not necessarily the kinds of powers we might want; or else our new powers still don't meet the needs of our new desires. Notice especially how fantasy appeals strongly to the sorts of boys who get pushed aside from access to girls until their 20's, or even indefinitely in more cases than we would care to admit.
Replies from: None
↑ comment by [deleted] · 2015-05-07T07:25:04.788Z · LW(p) · GW(p)
Notice especially how fantasy appeals strongly to the sorts of boys who get pushed aside from access to girls
But that is self-hating escapism mostly.
comment by advancedatheist · 2015-05-02T05:32:16.024Z · LW(p) · GW(p)
Required reading:
Man Into Superman (1972), by Robert Ettinger:
http://www.cryonics.org/images/uploads/misc/ManIntoSuperman.pdf
I read it in 1974. Ettinger anticipated a lot of things that today's transhumanists think they just discovered.
comment by Gunnar_Zarncke · 2015-05-01T21:52:16.884Z · LW(p) · GW(p)
Which enhancements would you like? "Yes" doesn't mean "always" but "as needed". Choose "Other" if unsure, if you see other choices you want to comment on or if you just want to see the answers.
[pollid:908]
[pollid:909]
Virtual clones [pollid:910]
Real clones [pollid:911]
Independent clones [pollid:912]
Think faster [pollid:913]
Think slower [pollid:914]
Perfect memory [pollid:915]
Conscious access to numeric computing resources (arithmetic, statistics) [pollid:916]
Conscious access to symbolic computing resources (logic) [pollid:917]
Conscious access to Turing-complete computing resources [pollid:918]
Recall of information from the web like own memory [pollid:919]
Conscious control over habit formation [pollid:920]
Affect emotional states in a controlled way (happiness, attention, fear...) [pollid:921]
Alter my mind in deeper ways [pollid:922]
Move my mind into or expand my mind to vehicles or other bodies [pollid:923]
Perceive radiation natively [pollid:924]
Perceive material properties natively [pollid:925]
Act via tactile control of audiovisual devices [pollid:926]
Act via tactile control with feedback via augmented senses [pollid:927]
Act via linguistic control of audiovisual devices [pollid:928]
Act via linguistic control with feedback via augmented senses [pollid:929]
Act via conscious thought control of audiovisual devices [pollid:930]
Act via conscious thought with feedback via augmented senses [pollid:931]
Interact with artificial beings [pollid:932]
Interact with artificial beings that are smarter than I [pollid:933]
Interact with artificial beings that are less smart than I [pollid:934]
Interact with artificial beings that are more powerful than I [pollid:935]
Have (parallel) consciousness which runs at time-scales of societal change [pollid:936]
Conscious access to slow processes [pollid:937]
Be consciously aware of cost-benefit trade-offs any application or usage of the above enhancements brings [pollid:938]
You may add other polls as sub comments.
Replies from: NancyLebovitz, lululu, Richard_Kennaway, DanielLC, Gunnar_Zarncke
↑ comment by NancyLebovitz · 2015-05-02T13:36:15.364Z · LW(p) · GW(p)
I want to be able to reverse aging.
What would the use be of thinking slower? Maybe for boring times?
I don't just want conscious recall of information from the web like my own memory; I want to be able to communicate (both receive and transmit) directly in hypertext - I don't know what it would be like, but it's frustrating that I can't.
If I could alter my mind in deeper ways, I'd like really good version control. I'd also like to be able to toggle between sensory extension and old-style sensory systems - there's a lot of art which is optimized for the currently standard senses.
And I'd like self-modules, so that if I wanted to experience something as though it were new to me, or as if I were at an earlier age, I could. Daniel Pinkwater (a notable author of children's books) has mentioned that he has access to what it's like to be various ages.
Replies from: Gunnar_Zarncke
↑ comment by Gunnar_Zarncke · 2019-10-09T18:52:49.075Z · LW(p) · GW(p)
What would the use be of thinking slower? Maybe for boring times?
No, though that might be useful for things like long space travel too.
I'm thinking more about the ability to perceive and act effectively on longer timescales - what Robin Hanson calls the Long View. We are not very good at noticing and consciously dealing with processes that are much slower than our attention span; we have to piece these together from episodic memory.
Sorry for the laaaaate reply. Curious whether you are still here.
↑ comment by lululu · 2015-05-28T20:27:56.179Z · LW(p) · GW(p)
I think people are SEVERELY overestimating the utility of perfect memory (74% yes, 10% no), and underestimating the value of traumatic and unpleasant experiences fading over time. Some people currently have perfect memory; it is not a good experience.
A better selective memory is a good thing. Electing to remember where you placed your keys or the name of your mailman is a good idea. Having perfect memory of all the idiotic things you said or did during your first break-up or that fight with your mom - or, more importantly, that time you were molested or almost died in combat - is a recipe for emotional disaster and severe PTSD. It's very hard to control where your mind dwells and how memories are triggered, but slow fade and nostalgic filters protect us from the worst emotional damage of long-term rumination over negative events.
In addition to good memories, every angry word, every mistake, every disappointment, every shock and every moment of pain goes unforgotten. Time heals no wounds for Price. "I don't look back at the past with any distance. It's more like experiencing everything over and over again, and those memories trigger exactly the same emotions in me. It's like an endless, chaotic film that can completely overpower me. And there's no stop button."
She's constantly bombarded with fragments of memories, exposed to an automatic and uncontrollable process that behaves like an infinite loop in a computer. Sometimes there are external triggers, like a certain smell, song or word. But often her memories return by themselves. Beautiful, horrific, important or banal scenes rush across her wildly chaotic "internal monitor," sometimes displacing the present. "All of this is incredibly exhausting," says Price.
Replies from: Gunnar_Zarncke
↑ comment by Gunnar_Zarncke · 2015-05-28T21:58:03.875Z · LW(p) · GW(p)
Insightful. But that really 'only' means that these transhumanists would also want the availability of that memory to be under conscious control.
↑ comment by Richard_Kennaway · 2015-05-05T14:14:22.382Z · LW(p) · GW(p)
Summary of results: we want everything.
↑ comment by DanielLC · 2015-05-03T04:24:22.543Z · LW(p) · GW(p)
What's the difference between "die when I want" and "immortality"? I would expect "die when I want" would mean that I keep living until I decide to die, and "immortality" would mean that I keep living, but I could totally change my mind if I want to. I'm fine with clones if we can recombine, but if we can't it would be disconcerting.
↑ comment by Gunnar_Zarncke · 2015-05-05T12:41:57.705Z · LW(p) · GW(p)
Lots of people have voted "Other" - but not always (show results) - so I wonder: what other hidden options are there?
comment by gurugeorge · 2015-05-03T23:45:59.045Z · LW(p) · GW(p)
I think that it's acceptable when it works.
What I mean is, a lot of the transhumanist stuff is predicated on these things working properly. But we know how badly wrong computers can sometimes go, and that's in everyone's experience, so much so that "switch it off and switch it on again" is part of common, everyday lore now.
Imagine being so intimately connected with a computerized thingummybob that part of your conscious processing, what makes you you, is tied up with it - and it's prone to crashing. Or hacking, or any of the other ills that can befall computery things. Potential horrorshow.
Similar for bio enhancements, etc. For example, physical enhancements like steroids, but safer and easier to use, are still a long way off, and until they come, people are just not going to go for it. We really only have a very sketchy understanding of how the body and brain work at the moment. It's developing, but it's still early days.
So ultimately, I think for the foreseeable future, people are still going to go for things that are separable, that the natural organic body can use as tools that can be put away, that the natural organic body can easily separate itself from, at will, if they go wrong.
They're not going to go for any more intimate connections until such things work much, much better than anything we've got now.
And I think it's actually debatable whether that's ever going to happen. It may be the case that there are limits on complexity, and that the "messy" quality of organics is actually the best way of having extremely complex thinking, moving objects - or that there's a trade-off between having stupid things that do massive processing well, and clever things that do simple processing well, and you can't have both in one physical (information processing) entity (but the latter can use the former as tools).
Another angle to look at this from would be the rickety nature of high IQ and/or genius - it's a toss-up whether a hyper-intelligent being is going to be of any use at all, or just go off the rails as soon as it's booted up. It's probably the same for "AI".
I don't think any of this is insurmountable, but I think people are massively underestimating the time it's going to take to get there; and we'll already have naturally evolved into quite different beings by that time (maybe as different as early hominids are from us), so by then this particular question is moot (as there will have been co-evolution with the developing tech anyway, only it will have been very gradual).
comment by Sable · 2015-05-03T20:09:45.824Z · LW(p) · GW(p)
I'm no expert in the field, but I'd like to bring up neuroplasticity. Our brains are constantly rewiring themselves as they process input, and they gradually adjust to change. My point is that I believe any enhancement could come to feel natural (although some would certainly have a steeper learning curve).
Other thoughts:
Ever read Uglies, Pretties, and Specials by Scott Westerfeld? It's set in a utopia/dystopia where massive plastic surgery is the norm - at 16, everyone chooses what they will look like (going from "Ugly" to "Pretty"), and similar changes occur at middle age, and so on. One of the points made is that there will always be something to envy - if it stops being looks, it'll become something else.
I'd take some kind of physical enhancement that removes most bodily needs - sleeping, bathroom, eating, etc. - although this is a symptom of the more general "anything that gives me more free time is good" heuristic.
I can imagine some kind of gene sequencing becoming a regular medical practice - stripping people of bad genes, or enhancing good ones.
comment by [deleted] · 2015-05-02T13:31:29.299Z · LW(p) · GW(p)
When does technological enhancement feel natural and acceptable? When it relieves us of a perceived burden. When trend leaders present it as natural and acceptable. When other choices are taken away. In short, not necessarily for rational or self-interested reasons.
Good results can come if you turn the question on its head: when does technological enhancement not feel natural and acceptable? Or, as an egoist, how can I get what I want regardless of how others feel about it?
comment by advancedatheist · 2015-05-02T06:01:13.225Z · LW(p) · GW(p)
You can have a lot of fun imagining how HEPs (highly enhanced persons) would interact with MOSHes (Mostly Original Substrate Humans), especially if the MOSHes didn't understand the nature of the interaction. Olaf Stapledon wrote an excellent short novel on that theme back in the 1930's:
Odd John:
http://gutenberg.net.au/ebooks06/0601111h.html
For example, the HEP character John Wainwright, as a boy, makes a MOSH boy fall in love with him in a homosexual way, just as an experiment.
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2015-05-02T17:00:12.098Z · LW(p) · GW(p)
John is a feral child who needs to figure out what he's doing by himself.
In the real world, there would be at least efforts to have some rules for how HEPs and MOSHes interact, even if those rules can't be enforced reliably.
Replies from: advancedatheist
↑ comment by advancedatheist · 2015-05-02T20:49:10.519Z · LW(p) · GW(p)
We don't exactly have any HEPs around now, that I know of. The first ones may live in something analogous to a state of nature, however, until new social norms emerge to regulate their behavior.
comment by ChristianKl · 2015-05-01T21:26:50.007Z · LW(p) · GW(p)
I'd like to think faster, have perfect memory or even access to information from the web in a way that feels like recall.
That's the Borg. If remembering feels the same as accessing information that other people put on the internet, it changes a lot.
Replies from: Gunnar_Zarncke
↑ comment by Gunnar_Zarncke · 2015-05-01T21:56:17.933Z · LW(p) · GW(p)
Ah, the idea is that I can recall the information that way but that I'm aware of the source - not that it feels genuinely like my own memory. But without that distinction, yes, the Borg fits.
comment by advancedatheist · 2015-05-03T00:01:34.118Z · LW(p) · GW(p)
Related article by Zoltan Istvan:
The Culture of Transhumanism Is About Self-Improvement
http://www.huffingtonpost.com/zoltan-istvan/the-culture-of-transhuman_b_7022406.html
Replies from: Dorikka
↑ comment by Dorikka · 2015-05-03T01:13:55.688Z · LW(p) · GW(p)
So my instinct is to write this guy off as a nut because it's super sketchy to try to run for president for a party that (to my very cursory knowledge) he made up to increase his book sales. Does anyone else find some value in paying attention to him or taking time to read his stuff? I'm wondering if this is a correct judgement on my part or an instinctual misfire.
Replies from: advancedatheist, ilzolende
↑ comment by advancedatheist · 2015-05-03T04:51:21.374Z · LW(p) · GW(p)
I follow his career with a perverse kind of fascination just to see how aggressive self-promotion works. Two years ago I had never heard of this guy, though he does have a media trail on the internet. Now he has figured out how to get invited to all kinds of H+ related conferences so that he can plug his novel, argue for the imminence of all kinds of radical transformations in the human condition due to allegedly accelerating technology, and make the case for a transhumanist political party in the U.S. with himself as the presidential candidate.
Case in point: He will speak at a conference in Palm Springs next month, along with several other individuals whose names you might recognize, hosted by something called the Brink Institute:
Replies from: None
↑ comment by [deleted] · 2015-05-03T06:42:01.253Z · LW(p) · GW(p)
I can say that whenever he speaks about biology his claims are an order of magnitude more inane than the usual ones I see made by others.
Replies from: JoshuaZ
↑ comment by JoshuaZ · 2015-05-07T02:17:47.980Z · LW(p) · GW(p)
Is there a standard metric for inanity of biological claims?
Replies from: None
↑ comment by [deleted] · 2015-05-07T04:32:40.607Z · LW(p) · GW(p)
Hmmm... we could define one.
We might need multiple axes, though. One for thermodynamic implausibility, one for dammit-that's-not-how-it-works-at-all / misapplication of programming concepts to chemistry, and one for do-you-realize-how-complicated-what-you-just-suggested-is.
↑ comment by ilzolende · 2015-05-03T22:48:01.306Z · LW(p) · GW(p)
I recently registered to vote and did not see his party listed as an option, even though I have never heard of the "Americans Elect Party" and it is an option. I mostly pay attention when other people mention him. Also, I kind of wish the Transhumanist Party would issue some statements about ballot issues besides "vote for Istvan".