Do Virtual Humans deserve human rights?

post by cameroncowan · 2014-09-11T19:20:14.514Z · LW · GW · Legacy · 8 comments

Slate Article


I think the idea of storing our minds in a machine so that we can keep on "living" (and I use that term loosely) is fascinating, and certainly an oft-discussed topic around here. However, in thinking about keeping our brains on a hard drive, we have to think about rights and how that all works together. Indeed, the technology may be here before we know it, so I think it's important to think about mindclones. If I create a little version of myself that can answer my emails for me, can I delete him when I'm done with him or just turn him in for a new model like I do iPhones?


I look forward to the discussion.


8 comments

Comments sorted by top scores.

comment by buybuydandavis · 2014-09-12T01:58:32.577Z · LW(p) · GW(p)

To quote William Munny in Unforgiven:

Deserve's got nothin' to do with it.

comment by Gunnar_Zarncke · 2014-09-12T06:54:40.392Z · LW(p) · GW(p)

If you're wondering why this is downvoted despite being on-topic: it doesn't have enough flesh for a topic that isn't being discussed for the first time. You could add [link] to your post and include at least a few references to existing discussions. Or just post this in the media thread.

Replies from: cameroncowan
comment by cameroncowan · 2014-09-12T16:55:45.922Z · LW(p) · GW(p)

I wasn't that concerned about it, but I honestly didn't want to bog the topic down with tedious commentary and links to other relevant discussions. It was meant to be a short-lived discussion on an independent topic. If I had wanted to do all that, I would have written an essay on the subject.

Replies from: Toggle
comment by Toggle · 2014-09-13T05:04:58.171Z · LW(p) · GW(p)

You might also get a more positive response to narrowly focused subjects within this fairly large philosophical question. Your post is a bit 'transhumanism 101', and most LW posters have long since started wrangling with these ethics on a deeper level.

As a random example: Since uploaded minds can replicate themselves easily, is there a role for representative democracies in a world where this technology is available?

Replies from: cameroncowan
comment by cameroncowan · 2014-09-14T19:46:53.347Z · LW(p) · GW(p)

That's ok, I won't be posting further.

comment by shminux · 2014-09-11T20:00:18.432Z · LW(p) · GW(p)

If I create a little version of myself that can answer my emails for me, can I delete him when I'm done with him or just turn him in for a new model like I do iPhones?

The standard Schelling point for assigning "human rights" is self-awareness. I think Eliezer calls it the "internal listener" or something like that. Maybe it is possible to create a subset of your mind without self-awareness, but intelligent enough to answer your emails the same way you would. After all, our "internal listener" is off quite often, and we don't appear visibly stupid during those times.

Replies from: skeptical_lurker
comment by skeptical_lurker · 2014-09-12T12:19:51.757Z · LW(p) · GW(p)

Pretty sure babies aren't self-aware, while chimpanzees are. Yet the majority opinion is that the former have human rights and the latter don't.

Replies from: shminux
comment by shminux · 2014-09-12T15:44:40.943Z · LW(p) · GW(p)

Right, we extend "human rights" to potentially self-aware humans (sometimes including fetuses), to no-longer-self-aware humans, and generally to anything with human DNA that appears human, but that's where the majority gets thinner. In actuality, the Schelling point is more like a fading line than a point.