Ask LW: Why no reproductive human cloning?

post by Will_Newsome · 2011-06-05T11:57:16.123Z · LW · GW · Legacy · 22 comments

From what I can tell, human cloning for the purpose of, ya know, actually cloning a person in the Dolly sense is legal in many parts of the United States. It looks hard to pull off but free of conceptual problems. It seems likely that after the first few clones are born there'll be a huge backlash and it will get banned forever. My impression is that whoever does it first would get a lot of money and tons of media attention that would be useful for getting funding for some other biotech venture. They'd get extra publicity if they put a eugenics spin on it too, which I haven't seen anyone talking about in my few Google searches. I also haven't seen anything about combining cloning with genome design/tweaking of various kinds, whether for research or for creating less-misoptimized humans; I'm not at all familiar with the science/tech there. Is there a reason no one thinks it's promising? I can't find a decent blog that covers any of the related topics.

Who's familiar with this dormant technology and its social situation? Are there good blogs that cover it? What parts of the picture am I missing?


comment by Richard_Kennaway · 2011-06-05T14:02:38.575Z · LW(p) · GW(p)

How many failures were there on the way to making Dolly? You can simply dispose of a failed sheep (or horse), but a failed human is a very different matter.

Replies from: Will_Newsome, XiXiDu
comment by Will_Newsome · 2011-06-05T14:45:36.691Z · LW(p) · GW(p)

Many, but my naive impression was that the methods could be refined quite a bit, and that most abnormalities could be noticed while the embryo was still at an abortionable stage. But you're right: without artificial wombs, which are probably very hard to engineer, I don't immediately see an intermediate step that would get the success rate to acceptable levels.

comment by XiXiDu · 2011-06-05T17:18:15.126Z · LW(p) · GW(p)

You can simply dispose of a failed sheep (or horse), but a failed human is a very different matter.

Some people don't think that you can simply dispose of sheep (or horses). What do you think of those people? Are they irrational?

Replies from: Will_Newsome, Richard_Kennaway, magfrump
comment by Will_Newsome · 2011-06-09T02:43:55.417Z · LW(p) · GW(p)

Not addressed to XiXiDu specifically.

Proposal: "Rational" is now a Banned Word. No one is allowed to use it. No one is allowed to snidely use its ironic social connotations. No one is allowed to use one of its technical meanings to sneak in connotations of general normativity-backed-by-math (e.g. 'exponential discounting is rational', 'maximizing utility is rational'). No one is allowed to call a person, group, belief, meme, ideology, or effing anything, "irrational". (If you want to use a generic vague negative social descriptor for some insane reason then use "biased" or "false" or something that sort of makes sense.)

A few of the many reasons: In all the time I've hung around SingInst people I've barely ever heard the words "rational" or "irrational". It's not done. There's no reason to ever use the words. If you want to say a person is good at thinking in one or a few ways, you list the ways. If you want to say they're good at achieving their goals or getting shit done then you say that. Reversed for lacking certain thinking skills, generalized for groups of people.

Around some parts of SingInst-y Visiting Fellows-y folk there is a notion of "sanity/sane" but it's way more nuanced than "rational" and isn't used in retarded ways.

To call a person or a group irrational is a variation on an ancient and incredibly typical pattern: one manages to commit both (a) the fundamental attribution error and (b) the sin of comparing one's (affectively biased) perception of the world to a vague, idealistic, wish-fulfilling should-world, at the same time. This justifies and thus sets the stage for some combination of immediately subsequent indignation, frustration, feelings of superiority, not wanting to help, contempt, and generally that whole class of negative emotions one can easily hold self-righteously, while spinning it all in a way that signals concern for morality and tribal allegiance. It also keeps others from pointing out how useless or counterproductive such circle-jerking is, because such criticism would be perceived as endorsement of irrationality (oppression, sexism, racism, fundamentalism, insert contemptible beliefs/preferences/memes here). This is such a typical and head-desk-causing signaling pattern (and has been forever) that surely it's written up much better somewhere else.

I would call it irrational for irony's sake (though you can tell I'm being legitimately hypocritical here, and that this is self-criticism as well as whatever else it is), but what is the "rational" should world standard against which I am comparing this general type of "reasoning"? The nameless virtue? Our best current conception of decision theory? AIXI? What we can imagine using inside view as the closest thing a human or humans or things roughly as bounded as humans are could come to approximating ideal decision theory? What is "rational"? Where the fundamental attribution error sees vague irrationality, straightforward non-socially-fueled reasoning sees the consequences of boundedness in a world most likely beyond the reach of God. Your enemies really truly aren't innately evil. There's no room in the model for blame or contempt or indignation. There's still room for them in human psychology, and if you want to use the fire of indignation for personal pursuits then I won't blame you. But when it comes to epistemic hygiene, that kind of reasoning is toxic.

(Note also the various social maneuvers one can use to make it seem as if another is trying to use the above social maneuver against one, et cetera. What I listed as a problem is a symptom, obviously.)

Calling a belief or set of memes "irrational" just looks like an ugly and badness-causing misuse of a word/concept.

(I haven't slept in a long time. I apologize for the preachy tone and aggressiveness. Or, like, there's some counterfactual world where it could have been avoided, and it is expected of me that I acknowledge that a tradeoff has been made which keeps this world from looking more like that slightly-more-optimized world, and feel sorry about that necessity, or something, so I do.)

Replies from: XiXiDu
comment by XiXiDu · 2011-06-09T08:51:39.990Z · LW(p) · GW(p)

It would help me a lot if you could tell me if you disagree with this (my definition of rationality, right, wrong and should) and this (why we utter moral propositions). If you disagree, please try to explain how I am wrong so that I can become less wrong.

comment by Richard_Kennaway · 2011-06-05T18:23:31.554Z · LW(p) · GW(p)

They have their values. Sometimes they firebomb people for it. But for most people, the issues around cloning people are of far greater gravity than for cloning sheep. When does the soul enter a clone? (I know, but we live in the same world as these people. Being right does not make the people who are wrong change their minds or go away.) At what point should a clone be considered to have the right to be brought to maturity? How many clones should be made, and of whom?

comment by magfrump · 2011-06-05T18:13:50.385Z · LW(p) · GW(p)

That seems like a case of different terminal values, not a matter of irrationality.

But the reason you can dispose of a failed sheep and not a failed human is (I assume) a legal matter as well as a moral matter.

Replies from: Will_Newsome
comment by Will_Newsome · 2011-06-09T03:37:04.666Z · LW(p) · GW(p)

That seems like a case of different terminal values, not a matter of irrationality.

It seems to me you're using "terminal values" here to mean "provisionally terminal values we'll have to work with for now because we aren't going to make much progress on morality without something like FAI". What you're literally claiming seems much too strong considering our current lack of understanding of morality (or what a "terminal value" is, for that matter). If you literally meant terminal values then I... have nothing to say, but if you mean "provisionally terminal values yada yada" then that's sort of unfortunate. Is there a short way of expressing

not a matter of irrationality.

I wrote a general screed arguing that this kind of "irrationality" is too-close-to-incoherent in a comment reply to XiXiDu. I'd agree it's not a matter of irrationality, but the framework "terminal values"+"rationality" seems to me basically inapplicable to humans in the first place (i.e., it's a crude approximation that rarely adds much and often misleads). The tone of my comment arguing such might make it unreadable but you might be interested in seeing it despite that.

Replies from: Nick_Tarleton, magfrump, Will_Newsome
comment by Nick_Tarleton · 2011-06-09T06:38:04.562Z · LW(p) · GW(p)

Is there a short way of expressing ["provisionally terminal values yada yada"?]

"moral beliefs"

(we really could use a standard term for this)

Replies from: steven0461
comment by steven0461 · 2011-06-10T20:14:53.389Z · LW(p) · GW(p)

But if we started distinguishing between provisional and real terminal values, how would I continue to use bold statements of self-expressive fiat to avoid thinking about complicated unsolved problems in ethical philosophy with potentially uncomfortable answers?

Replies from: Will_Newsome
comment by Will_Newsome · 2011-07-30T02:25:59.843Z · LW(p) · GW(p)

Oh, this one is obvious! You simply assert that everyone else's "terminal values" are actually just provisional, whereas yours are real, due to a moral miracle. Or! Become a nihilist, or perhaps an existentialist, or join a vaguely plausible religion.

comment by magfrump · 2011-06-10T18:24:47.267Z · LW(p) · GW(p)

You have a point and Nick Tarleton's response below is what I would have said if I had thought of it.

comment by Will_Newsome · 2011-06-09T06:16:49.680Z · LW(p) · GW(p)

Above comment not completed (has abrupt break in middle) due to iOS and the text entry field and the cursor not working together nicely.

comment by RadiVis · 2011-06-05T22:52:07.625Z · LW(p) · GW(p)

I think that many people have an aversion to elites seeking some form of immortality for themselves. Reproductive cloning would only enable a relatively weak form of immortality that might not be available to everyone. Therefore, some people could argue that reproductive cloning creates intolerable relative advantages for the relatively small group of people who can afford the procedure.

Even if reproductive cloning could be afforded by virtually everyone, people could still draw some kind of connection between cloning and incest, as both acts can be seen as diminishing genetic diversity. Less genetic diversity could decrease the overall level of immunity against diseases, though perhaps you could compensate for that with some genetic treatment before implantation.

Additionally, those who create a clone of themselves could be seen as egocentric, because they don't seem to be ready to mix their genetic material with someone else and prefer their own undiluted genetic code.

comment by timtyler · 2011-06-05T19:31:00.679Z · LW(p) · GW(p)

They should probably try cloning a chimpanzee first.

comment by lsparrish · 2011-06-05T15:13:26.276Z · LW(p) · GW(p)

Here's a link to an advocacy site for human cloning.

Wouldn't the eugenics argument favor selectively breeding embryos (from a large donor population) for optimal chromosome combinations, rather than cloning existing individuals?

Cloning embryos is relatively simple -- you could do preimplantation genetic selection (PGS) for the equivalent of several hundred generations, and save the stem cell line in a freezer, while raising the first few babies to see how they turn out. If they turn out well, couples (perhaps from the original donor population) could then adopt new babies from the same line.

comment by MartinB · 2011-06-05T19:41:03.414Z · LW(p) · GW(p)

Why would anyone seriously want to clone a person? That is pretty useless. Or, as a biology professor put it in a debate on the very topic: "the natural method is cheaper, easier, and way more fun."

Replies from: steven0461, knb, None, timtyler
comment by steven0461 · 2011-06-05T20:00:46.965Z · LW(p) · GW(p)

Why would anyone seriously want to clone a person?

Because then they get to pick what the baby's genome is?

comment by knb · 2011-06-05T20:42:29.651Z · LW(p) · GW(p)

Bryan Caplan has expressed interest in cloning himself. A big part of his reason why is that identical twins are usually grateful for their special bond. From Caplan's Selfish Reasons to Have More Kids

When people ask how my identical twin sons get along, I answer, "I've never seen anything like it. They are literally 'brotherly.'"

comment by [deleted] · 2011-06-27T20:10:33.082Z · LW(p) · GW(p)

Curiosity would play a large part. I know that my clone and I would be alike in certain ways and different in others, but the way I'd like to find out which is which is by cloning myself. Having a clone would illuminate what particular environmental effects make you unique.

Replies from: MartinB
comment by MartinB · 2011-06-27T20:40:52.089Z · LW(p) · GW(p)

You might be confusing cloning with actual copying. A clone starts out as a fetus and will always be n years younger than you. For general comparisons you might look into twin research, and (if it is even done yet) research into time-delayed twins, which are possible now that embryos can be fertilized, stored, and implanted later.

comment by timtyler · 2011-06-06T18:55:52.880Z · LW(p) · GW(p)

Why would anyone seriously want to clone a person?

Perhaps see my Celebrity cloning video.