Posts

Your transhuman copy is of questionable value to your meat self. 2016-01-06T09:03:30.949Z

Comments

Comment by Usul on Stupid Questions, 2nd half of December · 2016-01-12T08:16:06.732Z · LW · GW

"I'm curious whether you're willing to chomp down on bullets."

Since you're happy to go off topic, and your other posts suggest you've definitely got a dog in this fight already, would you agree or disagree with the following statement:

Based on things I've read on the internet (Cochrane) (not to be confused with the Cochrane Library that actually produces meta-analyses, just some guy named Cochrane who can't land a tenure track job teaching physics) regarding brain size and IQ test results, I believe that it is more probable than not that Black People are less intelligent than White People, that the jury's still out on Asian People, and that this is due in no small part to genetics.

"I am not particularly attached to the strawful "popular" ideas of race."

That is the very definition of race. That is what the term means.

"I am not a fan of high-priesthood treatment of science."

When I meet the strawman who does I'll let him know.

This really takes me back to a month or so I spent trolling Christian Identity White Supremacists back in the day, not sure if I should be surprised to find it here or not. Good luck with your confirmation bias.

Comment by Usul on Open Thread, January 4-10, 2016 · 2016-01-12T03:36:48.991Z · LW · GW

"the idea that false memories got planted is uncontroversial history"

Certainly, but is this a significant concern for the OP at this time, such that it bears mention in a thread in which he is turning to this community seeking help with a mental health problem? "Dangerous territory" is a strong turn of phrase. I don't know the answer, but I would need evidence that p(damage from discouraging needed help) < p(damage from memory implantation in 2015). Would you mention Tuskegee if he were seeking help for syphilis? Facilitated communication if he were sending an aphasic child to a Speech-Language Pathologist? Just my opinion.

Comment by Usul on Stupid Questions, 2nd half of December · 2016-01-12T02:51:43.251Z · LW · GW

"So, sure, let's put the idea of race to bed and start with killing affirmative action. You're good with that?"

This is the point where I say "politics is the mind killer" and discount all of your politically charged conclusions, then?

"Have you actually seen Somalis? They do not look like the stereotypical African blacks at all."

My point exactly. Yet they are universally considered "black" by people in your and my culture because of the arbitrary (which word I do mean quite literally) choice to see skin color as one of the two supremely defining qualities by which we "know" race. If certain facial features were (just as arbitrarily) selected, Somalis would be in the same race as Samis.

Another example: By standards of race, Native Australians are morphologically black (show an unlabeled photo of a black-haired Aboriginal to a North American and he will say "black" if asked to assign a race), as are Kalahari Bushmen. I cannot think of two more genetically divergent populations. Yes, human genetic diversity exists. However, current ideas of race have so little genetic basis as to be useless, and are mired in bias and produce bias in our modern thinking (mine, too). It is foolish to cling to the primitive beliefs of your ancestors to address problems or inquiries in the modern world.

I use the term "wholeheartedly accept" in the context of isolated scientific findings. In other words: do I accept that this individual study proves or significantly suggests what its authors say it does? I have expertise in perhaps 5-6 highly specific areas of study, to the extent that I can competently evaluate the merits of published research on my own. Outside of those areas I would be a fool to think I could do so without some recourse to expert analysis to explain the minutiae that only years of experience can bring. Otherwise I might as well join the young-earthers and anti-vaccinationists.

Comment by Usul on Stupid Questions, 2nd half of December · 2016-01-12T02:32:41.262Z · LW · GW

So there exists a Pure Caucasian, a Pure Mongoloid, and a Pure Negroid out there? Can you identify them? Can you name a rational basis for the morphological qualities by which you know them? Is it a coincidence that the qualities you have chosen coincide perfectly with those that were largely developed by bias-motivated individuals living in Europe, Australia, and North America over the past few centuries? Why not back hair, toe length, presence of the palmaris longus muscle, renal vein anatomy, position of the sciatic nerve relative to the piriformis muscle? Among the "grey," how do we know what percentage of membership each individual can be said to have in each category? Is such a thing useful? What is your motivation for believing so?

Which has been the greater source of error: the fairly recent hyper-vigilance in seeking out sources of bias and error in research into so-called racial differences? Or the unconscious tendency to be blind to one's own cultural norms as the arbitrary choices that they are, and to more readily accept the value of the self-like?

As to black, white, and grey, my eyes and visual cortex zero out relative to local contrast and past a certain point will default the lightest colorless shade to white and the darkest to black. With photo-sensors, I can read the result identifying the wavelength and intensity, which will tell me if the light is black or white.

Comment by Usul on Open Thread, January 4-10, 2016 · 2016-01-11T07:38:54.504Z · LW · GW

"That's dangerous territory. Quite a lot of people got talked by their therapist into having false memories of abuse."

I would want to have a hell of a lot of evidence showing a clear statistically significant problem along these lines before I attempted to discourage a person from seeking expert help with a self-defined mental health problem.

Comment by Usul on Stupid Questions, 2nd half of December · 2016-01-11T07:23:02.972Z · LW · GW

"Social science academics are very skewed politically." So shall we discount any concept of expertise based solely on our biases towards the suspected biases of others, based in turn on their reported political affiliations? I don't have the time to get my own PhD in every subject. I don't claim they have the gospel truth, but, as I said, it's a good place to start, from which a cursory examination of geographic population variations puts the idea of race to bed with very short work.

Tabooing race, I think your paraphrasing doesn't quite capture his question, because inherent in the use of "race" is not simply "genetically similar" but rather the specific arbitrary morphological features traditionally used to define race. Greenland Inuits are further removed genetically from Siberians than Somalis are from Yemenis, yet a photo line-up would be greatly skewed in favor of the former being of the same race and the latter being of different races.

As to the entirely separate question of validity of IQ testing (leaving aside whether IQ captures a genetically-mediated aspect of intelligence), I am not an expert in the field of cognitive science or psychology but I am aware of significant expert-level controversy over the reliability and validity of their application cross-culturally in the past and present, and would therefore be even more hesitant, selective, and dependent upon expert review of study methodology than I generally am before I wholeheartedly accepted a published finding as established fact.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-11T06:52:58.023Z · LW · GW

I believe there is a functional definition of amnesia: loss of factual memory, while life skills remain intact. I guess I would call what you are calling a memory wipe a "brain wipe", and what you are calling memory "total brain content". If a brain is wiped of all content, in the forest, is Usul's idea of consciousness spared? No idea. Total brain reboot? I'd say yes, and call that as good as dead, I think.

I would say probably yes to the text only question. Again, loss of factual memory. But I don't rate that as a reliable or valid test in this context.

Comment by Usul on Stupid Questions, 2nd half of December · 2016-01-11T06:38:55.870Z · LW · GW

Should one value the potential happiness of theoretical future simulated beings more than a certain decline in happiness for currently existing meat beings which will result as soon as the theoretical become real? Should one allow for absurdly large populations if the result is absurd morality?

The promise of countless simulated beings of equal moral value to meat beings, and who can be more efficiently cared for than meat, seems to make the needs and wants of simulated beings de facto overrule the needs and wants of meat beings (as well as making some absurdly large sim populations absurdly over-valued relative to other, smaller sim populations). As meat currently exists and simulated beings do not (Bostrom be damned: simulated meat over sim-within-sim, then), it seems the present moral imperative should be to avoid the creation of simulated beings, or even to preemptively plan their destruction (to discourage, or blackmail against, ever needing to actually do so), because as soon as they exist the FAI overlord must value them as equals, and by sheer numbers their needs will overrule the needs of any meat alive at the FOOM. If the FAI does not value them as equals, then we have the even more Repugnant Conclusion of a relatively tiny meat ruling class and countless virtual slaves.

Is there a Utilitarian case to be made for extremely strict "virtual population control"? Many Repugnant Conclusions, such as Torture vs. Dust Specks require large populations before they become relevant. Should a FAI overlord be programmed against allowing large populations of simulated sentient beings to exist in the first place? Perhaps a "one person-one upload" policy with no parthenogenesis.

The cost of a ban on (unlimited) simulated sentient beings would be simply not receiving the benefits of allowing (unlimited) simulated life, which humanity has thus far done without.

Comment by Usul on Stupid Questions, 2nd half of December · 2016-01-11T05:29:09.646Z · LW · GW

When the relevant experts, Anthropologists, say that the concept of race is a social construct with no basis in biological fact they aren't just bowing to some ivory tower overlord of political correctness. We would do well to consider their expertise as a starting point in any such inquiry.

Start anywhere on a map of the Eastern Hemisphere and trace what the people look like in any geographic area relative to the regions beside them, and then consider why the term "race" has any meaning. Sami, Swede, Finn, Rus, Tatar, Kazakh, Turk, Kurd, Arab, Berber, Ethiopian, Tutsi. Or Han, Mongol, Uyghur, Kyrgyz, Uzbek, Kazakh, Pashtun, Persian, Punjabi, Hindi, Bengali, Burmese, Thai, Javanese, Dayak. Where exactly do you draw the line between Caucasian, Negroid, and Mongoloid? And why?

Historically, in the cultures from which our culture was derived, skin color, and later eyelid morphology, have been used to define three races (conveniently ignoring the Pacific Ocean and Western Hemisphere), for no reason other than the biases of the people in those cultures. If you actually look at facial structure (and why not? it's no less arbitrary) you'll find the people of the Horn of Africa have more in common with central European populations in terms of nose and lip shape than they do with more inland African populations. It is our bias to see skin color as more relevant than nose morphology that causes us to group Ethiopians with Hottentots and Biafrans as a single race. We could just as easily group them with Arabs, Berbers, and Kurds. An albino from the Indian subcontinent could claim, without fear of contradiction, to be an albino of just about any heritage in South Asia or Europe. Burmese and Japanese have vastly different average skin color, but we arbitrarily group them together because of eyelid morphology.

So your question becomes "If different people..." to which the answer is: Of course.

The question you think you are asking, I think, is best rendered "Are those morphological features our modern society arbitrarily associates with membership in three arbitrary sets of humanity also associated with specific brain variations?" Which is exactly as arbitrary a question as "Is foot length / back hair / bilateral kidney symmetry associated with specific brain variations?"

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-11T04:46:18.841Z · LW · GW

I see factual memory as a highly changeable data set that has very little to do with "self". As I understand it (I am not an expert in neuroscience or psychiatry, but I have experience working with neurologically impaired people), the sorts of brain injuries which produce amnesia are quite distinct from those that produce changes in personality as reported by significant others, and vice versa. In other words, you can lose the memories of "where you came from" and still be recognized as very much the same person by those who knew you, while you can become a very different person in terms of disposition (altered emotional response to identical stimuli relative to pre-injury status, etc.) with fully intact memories. I'm less clear on what constitutes "personality", but it seems to be more in line with people's intuitive concept of "self". The idea of a memory wipe and continued existence is certainly a "little death" to my thinking, but marginally preferable to actual death. My idea of consciousness is one of passive reception. The same "I", or maybe "IT" is better, is there post memory wipe.

If memory is crucial to pattern identity then which has the greater claim to identity: The amnesiac police officer, or his 20 years of dashcam footage and activity logs?

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-11T03:50:42.229Z · LW · GW

Anatomical location meaning neurons in the brain. Not necessarily a discrete brain organelle. To deny consciousness an anatomical location in the brain is to say it arises from something other than brain function. Are you supporting some sort of supernatural theory of consciousness?

Comment by Usul on Consciousness and Sleep · 2016-01-08T06:04:24.567Z · LW · GW

If retention of memory is a key component of identity, then what are the implications for identity:

When decades of new memories have been made (if loss of memory = loss of identity, does gain of memory also = change of identity)? When old memories have changed beyond all recognition (unbeknownst to the current rememberer, in 2015 he doesn't recall the Suzy Smith of 1995 the same way he recalled her in 2000)? When senile dementia causes gradual loss of memory? When mild brain injury causes sudden loss of large areas of memory while personality remains unchanged post-injury? When said memory returns?

Tricky stuff, identity. Without a clear continuity to hang it on why should I care about what happens to me in five minutes, much less five years? Why do I work to benefit me tomorrow more than I do to benefit you next week? That's why I like hanging it on passive conscious awareness (assuming that thing exists), but damned if I know.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-08T05:18:13.847Z · LW · GW

"I model that as "when you die, some memories are wiped and you live again". If you concede that wiping a couple days of memory doesn't destroy the person, then I think that's enough for me. Probably not you, though. What's the specific argument in that case?"

I think I must have missed this part before. Where I differ is in the idea that a copy is "me" living again; I don't accept that it is, for the reasons previously written. Whether or not a being with a me-identical starting state lives on after I die might be the tiniest of solaces, like a child or a well-respected body of work, but it is in no way "me" living on in any meaningful sense that I recognize. I take the exact opposite view on this, and yet I agree with an even stronger form of your statement: "ALL memories are wiped and you live again" (my conditions would require this to read "you continue to live") is marginally more desirable than "you die and that's it". Funny about that.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-08T04:42:02.282Z · LW · GW

I'm familiar with the concept, though not his specific essays, and, indeed, this literally does happen. Our neurons are constantly making and unmaking the largely proteinaceous components of themselves and, over a lifetime, there is a no doubt partial, perhaps complete, turnover of the brain's atoms. I find this not problematic at all, because the electrochemical processes which create consciousness go on undisturbed. The idea that really queers my argument for me is that of datum-by-datum transfer: each new datum, in the form of neuronal activity (the 1s and 0s of electrical discharge or non-discharge), is started by my current brain but saved in another. Knee-jerk, I would tend to say that such a transfer is a complete one and myself has been maintained. The problem comes when a copy, run in parallel until completed and then cloven (good luck untying that knot), rather than a transfer, is made by the exact same datum-by-datum process. At the end I have two beings who seem to meet my definition of Me.

However, this argument does not convince me of the contrary position of the sameness of self and copy, and it does nothing to make me care about a me-like copy coming into being a thousand years from now, and does not induce me to step up onto Dr Bowie-Tesla's machine.

At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you, as you were one second ago? How about one who appears 1,000,000 years from now? The exact eyes that see these words will be the ones in the water. I can't come up with any answer other than "fuck that guy". I might just be a glass-half-empty kind of guy, but someone always ends up stuck in the meat, and it's going to be that being which remains behind these eyes.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-08T04:02:25.504Z · LW · GW

Another thought, separate but related issue: "fork" and "copy" could be synonyms for "AI", unless an artificial genesis is in your definition of AI. Is it a stretch to say that "accomplish some task" and "(accept) termination" could be at least metaphorically synonymous with "stay in the box"?

"If I make 100 AIs they will stay in the box."

(Again, I fully respect the rationality that brings you to a different conclusion than mine, and I don't mean to hound your comment, only that yours was the best comment on which for me to hang this thought.)

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-08T03:35:07.380Z · LW · GW

I'm going to go ahead and continue to disagree with the pattern theorists on this one. Has the inverse of the popular "Omega is a dick with a lame sense of irony" simulation mass-murder scenario been discussed? Omega (truthful) gives you a gun. "Blow your brains out and I'll give the other trillion copies a dollar." It seems the pattern theorist takes the bullet, or Dr Bowie-Tesla's drowning pool, with very little incentive.

The pattern theorists as you describe them would seem to take us also to the endgame of Buddhist ethics (I am not a Buddhist, nor proselytizing for them): You are not thought, you are not feeling, you are not memory, because these things are impermanent and changing. You are the naked awareness at the center of these things in the mind, of which you are aware. (I'm with them so far. Here's where I get off.) All sentient beings are points of naked awareness; by definition they are identical (naked, passive); therefore they are the same; therefore even this self does not matter; therefore thou shalt not value the self more than others. At all. On any level. All of which can lead you to bricking yourself up in a cave being the correct course of action.

To your understanding, does the pattern theorist (just curious: do you hold the views you are describing as pattern theory?) define self at all, on any level? Memory seems an absurd place to do so from; likewise personality; likewise thought (have you heard the nonsense that thought comes up with?). How can a pattern theorist justify valuing self above other? Without a continuous You, we get to the old koan "Who sits before me now?" (who/what are You?)

"Leave me alone and go read up on pattern theory yourself, I'm not your God-damn philosophy teacher" is a perfectly acceptable response, by the way. No offense will be taken, and it would not be an unwarranted reply. I appreciate the time you have taken to discuss this with me already.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-08T03:03:00.086Z · LW · GW

I'm not sure I follow your first point. Please clarify for me if my answer doesn't cover it. If you are asking whether multiple completely non-interacting, completely functional minds run on a single processing medium constitute separate awarenesses (consciousnesses), or whether two separate awarenesses could operate with input from a single set of mind-operations, then I would say yes to both. Awareness is a result of data processing: 1s and 0s, neurons interacting as either firing or not. Multiple mind operations can be performed in a single processing substrate, i.e. memory, thought, feeling, which are also results of data processing. If awareness is compromised we have a zombie, open to some discussion as to whether or not other mind functions have been compromised, though it is, of course, generally accepted that behaviorally no change will be apparent.

If the processing media are not communicating, A and B are separate awarenesses. If they are reconnected in such a way that neither can operate independently, then they are a single awareness. As an aside, I suspect any deviation which occurs between the two during separation could result in bugs, up to and including systems failure, unless a separate system exists to handle the reintegration.

I don't believe enough is known about the brain for anyone to answer your second question. Theoretically, if more than one set of cells could be made to function to produce awareness (neuroplasticity may allow this), then a single brain could contain multiple functioning awarenesses. I doubt a lobotomy would produce this effect; more likely the procedure would damage or disable the function of the existing awareness. Corpus callosotomy would be the more likely candidate, but, again, neuroscience is far from giving us the answer. If my brain holds another awareness, I (the one aware of this typing) value myself over the other. That it is inside me rather than elsewhere is irrelevant.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-08T02:31:37.170Z · LW · GW

I definitely agree that incremental change (which gets stickier with incremental non-destructive duplication) is a sticky point. What I find most problematic to my thesis is a process where every new datum is saved on a new medium, rather than the traditionally-cited cell-by-cell scenario. It's problematic, but nothing in it convinces me to step up to Dr Bowie-Tesla's machine under any circumstances. Would you? How about if instead of a drowning pool there were a team of South America's most skilled private- and public-sector torture experts, who could keep the meat that falls through alive and attentive for decades? Whatever the other implications, the very eyes seeing these words would be the ones pierced by needles. I don't care if the copy gets techno-heaven / infinite utility.

Your thought experiment doesn't really hit the point at issue for me. My answer is always "I want to stay where I am". For silicon to choose meat is for the silicon to cease to exist; for meat to choose silicon is for the meat to cease to exist. I only value the meat right now because that is where I am right now. My only concern is for ME, that is, the one you are talking to, to continue existing. Talk to a being that was copied from me a split second ago and that guy will throw me under the bus just as quickly (allowing for some altruistic puzzles where I do allow that I might care slightly more about him than about a stranger, mostly because I know the guy and he's alright and I can truly empathize with what he must be going through; e.g. if I'm dying tomorrow anyway and he gets a long happy life, though I may do the same for a stranger). The scenario is simply Russian roulette if you won't accept my "I want to stay put" answer.

Shit, if I came to realize that I was a freshly-minted silicon copy living in a non-maleficent digital playground I would be eternally grateful to Omega, my new God whether It likes it or not, and that meat shmuck who chose to drown his monkey ass just before he realized he'd taken the Devil's Bargain.

Not that "meat" has any meaning other than "separate entity" here. If I am sim-meat, I want to stay this piece of sim-meat.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T08:26:52.788Z · LW · GW

Honestly, I'm not sure what we have to go on in discussing consciousness other than intuition and subjective experience. Even the heavy hitters in the philosophy of consciousness don't 100% agree that it exists. I will be the first to admit I don't have the background in pattern theory, or the inclination, to get into a head-to-head with someone who does. If pressed, right now I'm leaning towards the matter-based argument: if consciousness is not magical, then it is tied to specific sets of matter, and a set of matter cannot exist in multiple locations; therefore a single consciousness cannot exist in multiple locations. The consciousness A that I am now is in matter A. If a copy consciousness B is made in matter B and matter A continues to exist, then it is reasonable to state that consciousness A remains in matter A. If matter A is destroyed, there is no reason to assume consciousness A has entered matter B simply because of this. You are in A now. You will never get to B.

So, if it exists, and it is you, you're stuck in the meat. And undeniably, someone gets stuck in the meat.

I imagine differing definitions of You, self, consciousness, etc would queer the deal before we even got started.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T07:50:08.604Z · LW · GW

Sorry if my attempt at coloring the conversation with humor upset you. That was not my intent. However, you will find it did nothing to alter the content of our discourse. You have changed your question. The question you ask now is not the question you asked previously.

Previous question: No, I do not choose to murder trillions of sentient me-copies for personal gain. I added an addendum, to provide you with further information, perhaps presuming a future question: Neither would I murder trillions of sentient not-me copies.

New question: Yes, an amoral dick who shares my views on consciousness would say yes.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T07:33:20.750Z · LW · GW

I completely respect the differences of opinion on this issue, but this thought made me laugh over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept?

Sounds more sinister my way.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T07:25:09.687Z · LW · GW

The genesis of my brain is of no concern as to whether or not I am the consciousness within it. I am, ipso facto. When I say it doesn't matter whether I am an original or a copy or a copy of a copy, I mean to say just exactly that. To whom are you speaking when you ask the question "Who are you?" If it is to me, the answer is "Me". I'm sorry that I don't know whether or not I am a copy, but I was UNconscious at the time.

If the copy is B and the original is A, the question of whether I am A or B is irrelevant to the question of whether I am ME, which I am. Ask HIM the same question and HE will say the same, and it will be true coming from his mouth.

If I drug you and place you in a room with two doors, only I would know which of those doors you entered through. This means that before I told you which one you entered, you would have been equally comfortable with the prospect of its being either one. I could have even made you waffle back and forth by repeatedly telling you that I lied. What a strange situation to find yourself in: every possible piece of information about your internal experience is available to you, yet you seem unable to make up your mind about a very simple fact!

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T06:31:06.434Z · LW · GW

No. It's a dick move. Same question and they're not copies of me? Same answer.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T06:29:45.450Z · LW · GW

It's a sticky topic, consciousness. I edited my post to clarify further:

I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding.

Which I recognize might sound a bit mystical to some, but I believe it is a real thing which is a function of brain activity.

As a function of the brain (or whatever processing medium), consciousness or self is tied to matter. The consciousness in the matter that is experiencing this consciousness is me. I'm not sure if any transfer to alternate media is possible. The same matter can't be in two different places. Therefore every consciousness is a unique entity, although identical ones can exist via copying. I am the one aware of this mind as the body is typing. You are the one aware of the mind reading it. Another might have the same experience, but that won't have any intrinsic value to You or me.

If I copy myself and am destroyed in the process, is the copy me? If I copy myself and am not destroyed, are the copy and the original both me? If I am a product of brain function (otherwise I am a magical soul), and if both are me, then my brain is a single set of matter in two locations. Are they We? That gets interesting. Lots to think about, but I stand by my original position.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T06:15:25.020Z · LW · GW

I was just having a laugh at the follow-up justification where technical difficulties were cited, not criticizing the argument of your hypothetical, sorry if it came off that way.

As you'd probably assume they would based on my OP, my copies, if I'd been heartless enough to make them and able to control them, would scream in existential horror as each came to know that that-which-he-is was to be ended. My copies would envy the serenity of your copies, but think them deluded.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T06:06:22.955Z · LW · GW

Sorry, I missed that you were the copier. Sure, I'm the copy. I do not care one bit. My life goes on totally unaffected (assuming the original and I live in unconnected universes). Do I get transhuman immortality? Because that would be awesome for me. If so, I got the long end of the stick. It would have no value to poor old original, nor does anything which happens to him have intrinsic value for me. If you had asked his permission he would have said no.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T05:57:40.853Z · LW · GW

You shouldn't murder sentient beings or cause them to be murdered by the trillions. Both are generally considered dick moves. Shame on you both. My argument: a benefit to an exact copy is of no intrinsic benefit to a different copy or the original, unless some Omega starts playing evil UFAI games with them. One trillion other copies are unaffected by this murder. Original or copy is irrelevant. It is the being we are currently discussing that is relevant. If I am the original I care about myself. If I am a copy I care about myself. Whether or not I even care if I'm a copy at all depends on various aspects of my personality.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T05:44:06.023Z · LW · GW

Based on your status as some-guy-on-the-internet and my estimate of the probability that this exact situation could come to be, no I do not believe you.

To clarify: I do not privilege the original self. I privilege the current self.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T05:23:38.100Z · LW · GW

Meat or sim or both meat aren't issues as I see it. I am self, fuck non-self, or not to be a dick, value non-self less than self, certainly on existential issues. "I" am the awareness within this mind. "I" am not memory, thought, feeling, personality, etc. I know I am me ipso facto as the observer of the knower. I don't care if I was cloned yesterday or one second ago and there are many theoretical circumstances where this could be the case. I value the "I" that I currently am just exactly now. I don't believe that this "I" is particularly changeable. I fear senility because "I" am the entity which will be aware of the unpleasant thoughts and feeling associated with the memory loss and the fear of worsening status and eventual nightmarish half-life of idiocy. That being will be unrecognizable as me on many levels but it is me whereas a perfect non-senile copy is not me, although he has the experience of feeling exactly as I would, including the same stubborn ideas about his own importance over any other copies or originals.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T05:08:57.153Z · LW · GW

Great question. Usul and his copy do not care one bit which is which. But perhaps you could put together a convincing evidence chain. At which time copy Usul will still not care.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T04:46:43.174Z · LW · GW

Thanks for the reply. Yeah, I think I just threw a bunch of thoughts at the wall to see what would stick. I'm not really thinking too much about the practical so-I've-got-a-copy-now-what? sort of issues. I'm thinking more of the philosophical, perhaps even best categorized as Zen, implications the concept of mind-cloning has for "Who am I?" in the context of changing thoughts, feelings, memories, unconscious conditioned responses, and the hard-to-get-at thing inside (I first typed "observes" - bad term: too active) which is aware of all these without thinking, feeling, remembering, or responding. Because if "I" don't come along for the ride I don't think it counts as "me", which is especially important for promises of immortality.

If I'm being honest with myself, perhaps I'm doing a bit of pissing on the parades of people who think they have hope for immortality outside of meat, out of jealousy for their self-soothing convictions, however deluded I believe they are. See also "Trolling of Christians by Atheists, Motivations Behind". Cheers.

Edit: And if I'm being entirely honest with myself, I think that shying away from acknowledging that last motivation is the reason why I titled this "Your transhuman self..." and not "Your transhuman immortality...", which would sum up my argument more accurately.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T04:29:09.578Z · LW · GW

Thanks for the reply. To your last point, I am not speaking of zombies. Every copy I discussed above is assumed to have its own consciousness. To your first points, at no time is there any ambiguity or import to the question of "which one I am". I know which one I am, because here I am, in the meat (or in my own sim, it makes no difference). When a copy is made its existence is irrelevant, even if it should live in a perfect sim and deviate not one bit, I do not experience what it experiences. It is perhaps identical or congruent to me. It is not me.

My argument is, boiled down: your transhuman copy is of questionable value to your meat self. For the reasons stated above (chiefly that "You" are the result of activity in a specific brain), fuck that guy. You don't owe him an existence. If you who are reading this ever upload with brain destruction, you will have committed suicide. If you upload without brain destruction, you will live the rest of your meat life and die. If you brain-freeze, something perfectly you-like will live after you die, with zero effect on you.

I stand by that argument, but, this being a thorny issue, I have received a lot of great feedback to think on.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T04:14:00.211Z · LW · GW

Thanks for the reply.

"You conclude that consciousness in your scenario cannot have 1 location(s)." I'm not sure if this is a typo or a misunderstanding. I am very much saying that a single consciousness has a single location, no more no less. It is located in those brain structures which produce it. One consciousness in one specific set of matter. A starting-state-identical consciousness may exist in a separate set of matter. This is a separate consciousness. If they are the same, then the set of matter itself is the same set of matter. The exact same particles/wave-particles/strings/what-have-you. This is an absurdity. Therefore to say 2 consciousnesses are the same consciousness is an absurdity.

"It is indeed the same program in the same state but in 2 locations." It is not. They are (plural pronoun use) identical (congruent?) programs in identical states in 2 locations. You may choose to value both equally, but they are not the same thing in two places.

My consciousness is the awareness of all input and activity of my mind, not the memory. I believe it is, barring brain damage, unchanged in any meaningful way by experience. It is the same consciousness today as next week, regardless of changes in personality, memory, conditioned response imprinting. I care about tomorrow-me because I will experience what he experiences. I care no more about copy-me than I do the general public (with some exceptions if we must interact in the future) because I (the point of passive awareness that is the best definition of "I") will not experience what he experiences.

I set a question to entirelyuseless above: basically, would anything given to a copy of you induce you to take a bullet to the head?

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T03:53:29.790Z · LW · GW

"would require extremely detailed neurological understanding of memory storage and retrieval." Sorry, this on a blog where superintelligences have been known to simulate googolplexes of perfectly accurate universes to optimize the number of non-blue paperclips therein?

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T03:46:35.112Z · LW · GW

Hail Xenu! I would need some time to see how the existential horror of going to bed tonight sits with me. Very likely it would overwhelm and around 4:00am tomorrow I'd take your deal. "(There is no afterlife for humans.) " I knew it! SUCK IT, PASCAL!

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T03:39:55.023Z · LW · GW

Thanks for the reply. I don't really follow how the two parts of your statement fit together, but regardless, my first instinct is to agree with part one. I did as a younger (LSD-using) man subscribe to a secular magical belief that reincarnation without memory was probable, and later came to your same conclusion that it was irrelevant, and shortly thereafter that it was highly improbable. But just now I wonder (not as to the probability of magical afterlives): what if I gave you the choice: 1. Bullet to the head. 2. Complete wipe of memory, including such things as personality, unconscious emotional responses imprinted over the years, etc.: all the things that make you you, but allowing the part of your brain/mind which functions to produce the awareness that passively experiences these things as they happen (my definition of consciousness) to continue functioning. Both options suck, of course, but somehow my #2 sounds appealing relative to my #1 in a way that your #2 doesn't. Which is funny, I think. Maybe simply because your #2, transfer of my meat consciousness into another piece of meat, would require a magical intervention to my thinking.

As to your second point: (If it hasn't already been coined) Sophie Pascal's Choice? Would any reward given to the surviving copy induce you to step onto David Bowie Tesla's Prestige Duplication Machine, knowing that your meat body and brain will be the one which falls into the drowning pool while an identical copy of you materializes 100m away, believing itself to be the same meat that walked into the machine?

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T03:19:52.318Z · LW · GW

Thanks for the reply. I am not convinced by the pattern identity theorist because, I suppose, I do not see the importance of memory in the matter, nor of the thoughts one might have about those thoughts. If I lose every memory slowly and die senile in a hospital bed, I believe that it will be the same consciousness experiencing those events as is experiencing me writing these words. That being will hold no resemblance to my current intellect and personality, yet I identify it as my Self in a way that an uploaded copy with my current memory and personality can never be. I might should have tabooed "consciousness" from the get-go, as there is no one universal definition. For me it is passive awareness. This meat awareness in my head will never get out, and it will die, and no amount of cryonics or uploading to create a perfect copy that feels as if it is the same meat awareness will change that. Glass half-empty, I suppose.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T03:09:04.433Z · LW · GW

Great thought experiment, thanks. I do define consciousness as a passively aware thing, totally independent of memory. The demented, the delirious, the brain damaged all have (unless those structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, as I define it, as they did when their brains were intact. Dream Self is the same Self as Waking Self to my thinking. I assume consciousness arises at some point in infancy. From that moment on it is Self, to my thinking.

In your 2 meat scenarios I still count one consciousness, being aware of different things at different times.

In wire form, if those physical structures (wires) on which consciousness operations occur (no other wires matter to the question) are cleaved, two consciousnesses exist. When their functionality is re-joined, one consciousness exists. Neither, I suppose, can be considered "original" nor "copy", which queers the pot a bit vis-a-vis my original thesis. But then if, during a split, B is told that it will be destroyed while A lives on, I don't imagine B will get much consolation, if it is programmed to feel such things. Alternately, split them and ask A which one of the two should be destroyed; I can imagine it wouldn't choose B.

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T02:46:42.604Z · LW · GW

Thanks for the reply. Sleep is definitely a monkey wrench in the works of my thoughts on this, though not a fatal one for me. I wouldn't count distraction or dissociation, though. I am speaking of the (woo-light alert) awareness at the center of being, a thing that passively receives sensory input, including the sense of mind-activity (and I wonder if that includes non-input?). I do believe that this thing exists and is the best definition of "Self".

Comment by Usul on Your transhuman copy is of questionable value to your meat self. · 2016-01-07T02:39:15.453Z · LW · GW

Thanks for the reply. Perhaps I should mention I have no children and at no point in my life or in my wife's life have either of us wanted children.

Comment by Usul on Open Thread, January 4-10, 2016 · 2016-01-07T02:36:34.173Z · LW · GW

I don't play, craps is the only sucker bet I enjoy engaging in. But if coerced to play, I press with non-sims. Don't press with sims. But not out of love, out of an intimate knowledge of my opponent's expected actions. Out of my status as a reliable predictor in this unique circumstance.

Comment by Usul on The Number Choosing Game: Against the existence of perfect theoretical rationality · 2016-01-06T03:01:58.468Z · LW · GW

I was born a non-Archimedean and I'll die a non-Archimedean.

"0.99 repeating = 1" I only accept that kind of talk from people with the gumption to admit that the quotient of any number divided by zero is infinity. And I've got college calculus and 25 years of not doing much mathematical thinking since then to back me up.
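[Editor's note: for reference, the standard algebraic argument behind "0.99 repeating = 1" (the identity being rejected above) runs as follows; it relies on the real numbers containing no nonzero infinitesimals, so the subtraction leaves nothing behind.]

```latex
\begin{aligned}
x &= 0.\overline{9} \\
10x &= 9.\overline{9} \\
10x - x &= 9.\overline{9} - 0.\overline{9} \\
9x &= 9 \\
x &= 1
\end{aligned}
```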

I'll show myself out.

Comment by Usul on Open Thread, January 4-10, 2016 · 2016-01-06T02:05:43.168Z · LW · GW

Thanks for the reply. I'm not sure your reasoning (sound as it is) is behind the tendency I think I've identified for LWers to overvalue simulated selves in the examples I've cited, though. I suppose by population ethics you should value the more altruistic simulation, whoever that should be. But then, in a simulated universe devoted to nothing but endless torture, I'm not sure how much individual altruism counts.

"Totally tangential point" I believe footnotes do the job best. The fiction of David Foster Wallace is a masterwork of portraying OCD through this technique. I am an idiot at formatting on all media, though, and could offer no specifics as to how to do so.

Comment by Usul on Open Thread, January 4-10, 2016 · 2016-01-05T08:55:45.040Z · LW · GW

I appreciate the reply. I recognize both of those arguments but I am asking something different. If Omega tells me to give him a dollar or he tortures a simulation, a separate being to me, no threat that I might be that simulation (also thinking of the Basilisk here), why should I care if that simulation is one of me as opposed to any other sentient being?

I see them as equally valuable. Both are not-me. Identical-to-me is still not-me. If I am a simulation and I meet another simulation of me in Thunderdome (Omega is an evil bastard) I'm going to kill that other guy just the same as if he were someone else. I don't get why sim-self is of greater value than sim-other. Everything I've read here (admittedly not too much) seems to assume this as self-evident but I can't find a basis for it. Is the "it could be you who is tortured" just implied in all of these examples and I'm not up on the convention? I don't see it specified, and in "The AI boxes you" the "It could be you" is a tacked-on threat in addition to the "I will torture simulations of you", implying that the starting threat is enough to give pause.

Comment by Usul on The Number Choosing Game: Against the existence of perfect theoretical rationality · 2016-01-05T08:07:09.730Z · LW · GW

I was bringing the example into the presumed finite universe in which we live, where Maximum Utility = The Entire Universe. If we are discussing a finite-quantity problem, then infinite quantity is ipso facto ruled out.

Comment by Usul on The Number Choosing Game: Against the existence of perfect theoretical rationality · 2016-01-05T07:54:27.194Z · LW · GW

"What you've done is take my argument and transform it into an equivalent obvious statement. That isn't a counter-argument. In fact, in mathematics, it is a method of proving a theorem. If you read the other comments, then you'll see that other people disagree with what I've said" You're welcome? Feel free to make use of my proof in your conversations with those guys. It looks pretty solid to me.

If a Perfect Rational Agent is one who can choose Maximum Finite Utility, and Utility is numerically quantifiable and exists in infinite quantities, and the Agent must choose the quantity of Utility by a finite number, then no such agent can exist. Therefore a Perfect Rational Agent does not exist in all possible worlds.

I suppose I'm agreeing but unimpressed. Might could be this is the wrong website for me. Any thought experiment involving infinity does run the risk of sounding dangerously close to Theology to my ears. Angels on pinheads and such. I'm not from around here and only dropped in to ask a specific question elsewhere. Cheers.

Comment by Usul on The Number Choosing Game: Against the existence of perfect theoretical rationality · 2016-01-05T05:56:58.080Z · LW · GW

Sorry, I missed that you postulated an infinite universe in your game.

I don't believe I am misrepresenting your position. "Maximizing utility" is achieved by, and therefore can be defined as, "choosing the highest number". The wants of the agent need not be considered. "Choosing the highest number" is an example of "doing something impossible". I think your argument breaks down to "An agent who can do the impossible can not exist", or "It is impossible to do the impossible". I agree with this statement, but I don't think it tells us anything useful. I think, though I haven't thought it out fully, that it is the concept of infinity that is tripping you up.

Comment by Usul on The Number Choosing Game: Against the existence of perfect theoretical rationality · 2016-01-05T05:28:57.520Z · LW · GW

Let's taboo "perfect", and "utility" as well. As I see it, you are looking for an agent who is capable of choosing The Highest Number. This number does not exist, because numbers are infinite. Therefore it can not be chosen. Therefore this agent can not exist. An infinity paradox is all I see.

Alternately, letting "utility" back in: in a universe of finite time, matter, and energy, there does exist a maximum finite utility, which is the sum total of the time, matter, and energy in the universe. There will be a number which corresponds to this. Your opponent can choose a number higher than this, but he will find the utility he seeks does not exist.

Comment by Usul on The Number Choosing Game: Against the existence of perfect theoretical rationality · 2016-01-05T04:57:04.927Z · LW · GW

I'm just not convinced that you're saying anything more than "Numbers are infinite" and finding a logical paradox within. You can't state the highest number because it doesn't exist. If you postulate a highest utility which is equal in value to the highest number times utility 1, then you have postulated a utility which doesn't exist. I can not choose that which doesn't exist. That's not a failure of rationality on my part any more than Achilles' inability to catch the turtle is a failure of his ability to divide distances.

I see I made Bob unnecessarily complicated. Bob = 99.9 Repeating (sorry don't know how to get a vinculum over the .9) This is a number. It exists.

Comment by Usul on The Number Choosing Game: Against the existence of perfect theoretical rationality · 2016-01-05T04:17:11.329Z · LW · GW

There exists an irrational number which is 100 minus delta, where delta is infinitesimally small. In my celestial language we call it "Bob". I choose Bob. Also, I name the person who recognizes that the increase in utility between a 9 in the googolplex decimal place and a 9 in the googolplex+1 decimal place is not worth the time it takes to consider its value, and who therefore goes out to spend his utility on blackjack and hookers, as displaying greater rationality than the person who does not.

Seriously, though, isn't this more of an infinity paradox rather than an indictment on perfect rationality? There are areas where the ability to mathematically calculate breaks down, ie naked singularities, Uncertainty Principle, as well as infinity. Isn't this more the issue at hand: that we can't be perfectly rational where we can't calculate precisely?