I was kind of going off on a speculative tangent on that last sentence. I was wondering if that feeling was somehow reward-system related, and would fuel a musician's drive to excel. Like they try to play better and better to achieve that euphoria which only comes on when they do better than they ever have, with diminishing (dopamine?) returns, but, as a side-effect, increasing their practical talent to ever higher levels. So the musical prodigy over time becomes motivated more by the tangible rewards (fame, increased income), which will never compare to the feelings that made him choose that path in the first place. It would apply to many careers if it were a valid theory.
I also had that same experience on the higher levels of Rock Band. I am not talented with any real-life musical instruments, but you say you feel that with guitar; for you personally, is that an episodic thing, or does that consistently happen when playing serious guitar? Is that something that most musicians know about, because it was exquisitely bizarre--is that the secret allure of musicians? Or does one build up a tolerance that drives one toward excellence in the hopes of catching the "high of accomplishment"?
What is it that makes consciousness, or the thing that it points to (if such a thing is not ephemeral), important? You already said that knowing the exact quantities negates the need for categorization.
Well, now it sounds like you found a useful definition of life; at what point on this spectrum, then, would you consider something conscious? Since it's processes you are looking for, there is probably some process without which you could clearly classify a thing as un-conscious.
It seems to me, though, that there are quite a few axes on which it would be hard to disturb a star's equilibrium. That still keeps it included in your definition. Also, since tungsten is not disruptive to the star's homeostasis, it has no reason to expel it. I appreciate your rational answers, because I'm actually helping you steel-man your theory; it only looks like I'm being a dork.
I agree with your correlation, but I think your definition would make stars and black holes apex predators.
Which is basically the same phrase, but without spaces between words.
You're trying to ad-hom me as a fuzzy-minded irrationalist. Please don't.
No need, you're doing a fine job of that all by yourself.
I take it that you're nitpicking my grammar because you disagree with my views.
As for what topic I am talking about, it is this: In the most practical sense, what you did yesterday has already happened. What will you do five minutes from now? Let's call it Z. Yes, as a human agent the body and brain running the program you call yourself is the one who appears to make those decisions five minutes from now, but six minutes from now Z has already happened. In this practical universe there is only one Z, and you can imagine all you like that Z could have been otherwise, but six minutes from now, IT WASN'T OTHERWISE. There may be queeftillions of other universes where a probability bubble in one of your neurons flipped a different way, but those make absolutely no practical difference in your own life. You're not enslaved to physics, you still made the decisions you made, you're still accountable according to all historical models of accountability (except for some obscure example you're about to look up on Wikipedia just to prove me wrong), and you still have no way of knowing the exact outcomes of your decisions, so you've got to do the best you can on limited resources, just like the rest of us. "Free Will" is just a place-holder until we can replace that concept with "actual understanding", and I'm okay with that. I understand that the concept of free-will gives you comfort and meaning in your life, but "I have no need of that hypothesis."
I will answer your question, but I do not understand your last statement; it looks like you retyped it several times and left all the old parts in.
I meant that with a sufficiently detailed understanding of physics, it would be meaningless to even posit the existence of (strong) free will. By meaningless here I mean a pointless waste of one's time. I was willing to clarify, but deep down I suspect that you already knew that.
Now that you mention it, a fable, by definition, requires bullshit.
and then it’s obvious that the ass can be “stuck.”
...seriously?
I think you have a good pattern going here when you classify things as "things you'd say to a..." Maybe, outside of the ritual itself, people could volunteer to be one of those positions for others without those services. Like, the Moombah would be the guy that listens to the things you'd say to a priest, without being a priest. He would listen under an oath of secrecy, to anyone who wanted to confess something. The High Glombix would listen to all the things you'd say to a therapist, without being a therapist, again under secrecy. The Vemerev would listen to all the things you're afraid to tell your friends about yourself, without judging you. It would be an accepted support group without relying on the traditional avenues, and it would also serve the purpose of getting you used to evaluating yourself and to verbally admitting your problems.
It was a property dispute, not a measurement of righteousness. The story served to illustrate Solomon's wisdom; spiritual judgment of the women was not an issue. As for my opinion, I see both of them as stupid, and only evil to the degree that stupidity influences evil.
In that case the question is less interesting, since it's just a matter of how well you can think yourself into the hypothetical in which you have to choose between, say, increasing your child's odds of surviving by 1% and the cost of, say, increasing your guilt-if-the-child-does-die by 200%.
I guess, but in real life I don't sit down with a calculator to figure that out; I'd settle for some definitive research.
Your second-order desires are fixed by your desires as a whole, trivially. But they aren't fixed by your first-order desires. So it makes sense for me to ask whether you harbor a second-order desire to change your first-order desires in this case, or whether you are reflectively satisfied with your first-order desires.
[all that quote], trivially. What I am saying is that even my "own" desires and the goals that I think are right are only what they are because of my biology and upbringing. If I seek to "debug" myself, it's still only according to a value system that is adapted to perpetuate our DNA. So to answer truthfully, I am NOT satisfied with my first-order desires, in fact I am not satisfied with being trapped in a human body, from which the first-order desires are spawned.
Then I still blame the mother in the story for not building one of those!
That is pretty neat, I wholeheartedly endorse using those, just in case. In the unlikely event that I produce more biological offspring, I will make use of that knowledge.
My desires concerning what my desires should be are also determined by my desires, so your question is not valid; it's a recursive loop. You are first assuming that I care about anything at all, secondly assuming that I experience guilt at all, and thirdly that I would care about my children. As it turns out, you are correct on all three assumptions; just keep in mind that those are not always givens among humans.
What I was saying was that in the two situations (my child dies due to SIDS), and (my child dies due to me rolling over onto him), in the first situation not only could I trick myself into believing it wasn't my fault, it's also completely possible that it really wasn't my fault, and that it had some other cause; in the second situation, there's really no question, and a very concrete way to prevent it.
To answer your unasked question, I still do not alieve that keeping my child a safe distance away while sleeping but showing love and care at all other times increases her chance of SIDS. If I was to be shown conclusive research of cause and effect between them, I would reverse my current opinion, mos' def.
I expected that. My own opinion is that if it is necessary for some reason, it's a good idea, but personally I'd rather be possibly and indirectly responsible, as one instance of a poorly understood syndrome, for my baby's death than actually be the one who crushed him.
It seems that sleeping separately very drastically decreases your chances of personally killing your baby in your sleep.
No, as you can see by the number of objections, you are not too cynical. It's closer to a sort of Proto-Bayes; stories like this show that that kind of thinking can produce wise solutions; Bayesian thinking as it is understood now is more refined.
Given the wording of the story, both women were in the practice of sleeping directly next to their babies. The other woman didn't roll over her baby because she was wicked, she rolled over her baby because it was next to her while she slept. They left out the part where the "good mother" rolled over her own baby two weeks later and everyone just threw up their hands and declared "What can we do, these things just happen, ya' know?"
Well-said. Thank you.
As I read the "Anthropic Trilemma", my response could be summed up thus: "There is no spoon."
So many of the basic arguments contained in it rest on undefined concepts, if you dig deep enough. We talk about the continuation of consciousness in the same way that we talk about a rock or an apple. The only way that a sense of self doesn't exist is the same way that a rock or an apple doesn't exist, in the strictest technical sense. To accept a human being as a classical object in the first place disqualifies a person from taking a quantum-mechanical cop-out when it comes to defining subjective experience. People here aren't saying to themselves, "Huh? Where do you get this idea that a person exists for more than the present moment? That's crazy talk!" It's just an attempt to deny the existence of a subjective experience that people actually do, um, subjectively experience.
You are correct. I was interpreting "saving the world" in this article to mean "saving the world [from supervillains]". (fixed in comment now)
The most limiting thing that you have not pointed out is that as a Superhero, you want to save the world. Saving the world [from supervillains] is by definition reactive. A Supervillain's goals have much more room for variation, and one could argue that Supervillains actually are optimizing the world, it just happens to be sub-optimal for everyone else.
t=59 minutes...
AI: Hmm, I have produced in this past hour one paperclip, and the only other thing I did was come up with the solutions for all of humanity's problems, I guess I'll just take the next minute to etch them into the paperclip...
t=2 hours...
Experimenters: Phew, at least we're safe from that AI.
It was determined to be human error on my side. Fixed.
I think it actually may have been an add-on that was intentionally (or just carelessly) installed into Firefox by another family member. I can shut it off myself. Seriously, who would download a program that explicitly promises more popups? (facepalm)
Refer to the nested comment above for the details. So nobody else here has links on those words?
The word "pay" in paragraph 1, the word "details" in paragraph 5, and the word "money" in paragraph 7. It's possible that either my computer or the LW site has some very creative adware.
Why are some of your links triggering scammish popups? Is it supposed to be some sort of humor?
I was mostly curious to see if someone else would independently arrive at my conclusions if asked the same questions, as a way to test the strength of my conclusions.
I'm not offended, that's one of my favorite games. My thought process is so different than my peers that I constantly need to validate it through "coerced replication". I know I'm on the right track when people clearly refuse to follow my train of thought because they squirm from self-reflection. Yesterday I got a person to vehemently deny that he dislikes his mother, while simultaneously giving "safer" evidence of that very conclusion, because, you know, you're supposed to love your parents.
Regarding the hard problem of consciousness, I am not even sure that it's a valid problem. The mechanics of sensory input are straightforward enough; the effects of association, learning, and memory are at least conceptually understood, even if the exact mechanics might still be fuzzy; I don't see what more is necessary. All normal-functioning humans pretty much run on the exact same OS, so naturally the experience will be nearly identical. I have a (probably untestable) theory that due to different nutritional requirements, a cat, for example, would experience the flavor of tuna almost identically to how we experience sugar. And a cat eating sugar would taste something like eating plain flour, and catnip would be like smoking crack for humans. The experience itself between different creatures can be one of several stock experiences, brought on by different stimuli, just because we all share a similar biological plan (all animals with brains, for instance).
An experience like an orgasm could be classified as something like a Level 455/293 release of relative serotonin and oxytocin levels, whereas eating chocolate causes a Level 231/118 in a specific person. If by some chance you measured the next person to have a Level 455/293 from eating chocolate, then you know that what they are experiencing is basically equivalent to an orgasm, without the mess. One human's baseline experience of blue is likely to be very similar to another's, but their individual experiences would modify it from that point. You know that they experience blue in much the same way that you know they have an anus. It's a function of the hardware. In some rare cases you might be wrong, but there's nothing mysterious about it to me.
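The "Level X/Y" idea above can be sketched as a toy model. To be clear, the two-axis serotonin/oxytocin representation and every number here are invented for illustration; real neurochemistry is nothing like this simple:

```python
# Toy formalization of the "Level X/Y" classification idea.
# All values are made-up placeholders, not measurements.
responses = {
    ("human", "orgasm"):    (455, 293),
    ("human", "chocolate"): (231, 118),
    ("outlier", "chocolate"): (455, 293),  # hypothetical person from the example
}

def same_experience(a, b, tolerance=10):
    """Treat two measured responses as 'the same experience' if every
    level matches to within the given tolerance."""
    return all(abs(x - y) <= tolerance
               for x, y in zip(responses[a], responses[b]))

# On this toy model, the outlier's chocolate is classified as
# equivalent to an orgasm, while a typical person's chocolate is not:
print(same_experience(("outlier", "chocolate"), ("human", "orgasm")))  # True
print(same_experience(("human", "chocolate"), ("human", "orgasm")))    # False
```

The design choice being illustrated is just that "same experience" becomes a comparison of measured vectors rather than a metaphysical question.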
Go ahead and tell me what your theories are, I'm sure that I'm not the only one listening. Even if we aren't enlightening anybody, I'm sure we are amusing them.
I did not comment on 3 and 4 because I thought you wanted to judge first whether I understood the first two.
But does it explain why we assign souls to ourselves? How do you justify to yourself the fact that you can personally feel your thoughts, emotions, and sensory input?
To me, yes. I think that a theory of mind is ascribed to oneself first, then extends to other people. On a beginner level, developing a theory of mind toward yourself is easy because you get nearly instant feedback. Your proprioception as a child is defined by instant and (mostly) accurate feedback for everything within your local skin-border. After realizing that you have these experiences, and seeing other humans behave just as if they also have them, and being nearly compelled by our wetware to generalize it to other animals and objects, our "grouping objects" programs group these bundles of driving behaviors into an abstract object (which is visualized subconsciously as a concrete object) which we call a soul.
You didn't define free will like I asked, but that's okay - it indicates that you are implicitly using a definition of free will which is impossible in any logical universe, and thus cannot be coherently defined without contradiction...
That's a much more coherent summary of what I meant, yes.
If it is intuitive to you that axioms can construct people, elaborate a little on a basic outline of how this might be done.
You just said it--"A universe made of axioms makes sense, right?" My existence in a universe shows that it in fact has been done, saving me the trouble of proving how.
I enjoy your conversation, but I'm not particularly on the brink of an existential crisis here. In reference to my article I am simply admitting that I am aware that it is a limitation of the human brain to be guarded against, much like not sticking my hand on a hot stove prevents tissue damage. I don't expect people to be immune from it, but we'd be better off if we were more conscious of it. Instead I brought on a flurry of angry retorts that amount to "Hey, I'm not subject to fallibility, just who do you think you are accusing me of being human?"
I'd like that, but let's stay on topic here.
What it means is that I'd be indifferent between a normal day with a 1/400 chance of being a whale, and a normal day with a guaranteed extra orgasm.
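The indifference claim above pins down an exchange rate between the two outcomes. A minimal sketch of that arithmetic, with utility units arbitrary and "one extra orgasm" normalized to 1 (the function name and numbers are mine, purely for illustration):

```python
def implied_whale_bonus(p=1/400, u_orgasm=1.0):
    """How much better than a normal day must the whale-day be?

    Indifference condition:
        p * (u_day + bonus) + (1 - p) * u_day == u_day + u_orgasm
    which simplifies to:
        p * bonus == u_orgasm
    """
    return u_orgasm / p

print(implied_whale_bonus())  # ~400: a day as a whale is worth about 400 extra orgasms
```

In other words, stating an indifference point between a gamble and a sure thing implicitly fixes the ratio of the two utilities.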
"Not tonight honey, I've determined that I have a 1/399 chance of being a whale!"
I would very much like to see things way too clearly...
1) Universe - deterministic, random or some third thing? Is there even a third option? What is a universe anyway? Is it governed by logic? Can anything not be governed by logic?
Dealing with the local, classical physics universe that my body's senses are adapted to perceive, I'd have to go with "third option" in the "time-loaf" sense. I suspect that MWI is true, so yes to random which one this is, but deterministic in its worldline. To me, logic is shorthand for what is actually permissible in nature. We just are not so good at defining the rules yet. Something can only appear to not be governed by logic through lack of proper resolution of the measurements.
2) Free will - Make a coherent definition. What does your answer to the previous question mean for free will? If you prefer to say that there is no free will, explain why (or whether) it feels like you have free will.
I think that any sufficiently detailed understanding of physics renders the existence of person-level free will meaningless. Our savanna-dwelling ancestors had no need for such an understanding. We animals ascribe agency to all kinds of wacky shit, including these bodies. Hence, the ego. I don't feel like I'm being controlled, because in the macro sense, I'm not. The universe just runs, it doesn't have feelings or a way of doing anything but what it actually does; and what it actually does determines what I am able to do.
Oh good, you did understand what I was getting at.
Foma, in other words. The concepts you mentioned are useful because they represent established behavior sets, they are what we make them. A soul is an actual false claim, and only useful when you don't realize that it is false. I don't endorse self-deception such as that, it's a slippery slope from there.
I used quotes because not only is it not a solid concept, it's not even a valid one. The point was that to think that way betrays an a priori belief in a soul.
I'd at least be happy for my clone, because if I am supposed to love my family and offspring as normal people do, I should also love someone who shares 100% of my genetic plan, so I should be glad that someone on "Team MaoShan" got a good result. In fact, I used to use this argument to justify playing the lottery, in the sense that me losing meant that another version of me in the multiverse just did win, so I should be almost as happy. That was before I started using that money to purchase an equivalent amount of chocolate every week.
I think you described it best when you said the issue was "un-asked". Everybody here may be over it, but that is just the point when it gets the chance to creep back in. It was more like as if I was walking around with a giant "BEWARE!" sign--all the other biases seem to be countered by addressing them, and this looked like a big one that was not often talked about. I figured it would be a good addition to the bias-avoidance toolkit, because if you don't include it specifically, the next world dictator (human or otherwise) will have a world-class rationality training, and will use that to rationalize the B.I.A.S. that they were never told to avoid.
I personally think that it's such an ingrained thing that it probably colors my ideas in ways that I am not aware of, so hearing more about it would be helpful to me, if I'm the only one affected. I find that hearing other theories besides my own can help in this situation.
That's a helpful, honest answer, thanks. I have a lot of empathy, but basically no sympathy in my programming. Unfortunately this extends even to my regard for my future selves. I try to avoid death in the moment and the near future, but I don't even seem to identify with my future self. So hearing something like "Well, most other people would want so and so, now you know," at least helps me understand humans.
Thanks for the critique; the dialog format would have been much more appropriate. I am actually surprised how little karma I've lost over my first article. I was fully expecting a -100 or so.
Regarding the questionable answers, I purposely got all those answers wrong to show what a "typical" guess might be, not the prevailing LessWrong opinions. I thought it was obvious enough not to point it out, and that was where I was mistaken. Sorry.
I don't fault you for your reasons, as I didn't add enough disclaimers to earn your forgiveness. If by strange you mean "unconventional" formatting, yes, I am guilty of that. I didn't feel that I was smart enough to get away with the rigid format usually found here without sounding pretentious. And I've seen articles downvoted for more petty reasons than this, so I'll take it. And think about this:
I doubt almost anyone on LW will agree with your possible answers.
If someone got every answer on a multiple-choice test wrong, wouldn't it seem suspicious to fail that much worse than chance? What I was trying to show were the "gut instinct" answers, which part of any human might give before their executive functions override it. I am aware that most readers here are more advanced than me, and that there are a few things that, despite reading all the Sequences, I must still be missing. I am trying to find out what I'm wrong about.
I agree. It would be easier if only it weren't such a powerful illusion.
Does this mean that I should not fear death, because since I can in principle be exactly reproduced, it is not fundamentally different from sleep? In a classical sense, it is this body that I actually care about preserving, not my pattern of consciousness--that's where the fear of death is coming from. And deeper, it is really my body that cares about preserving my body--not my consciousness pattern. So the problem that I am having trouble wrapping my head around is that statistics alone makes recreation of my pattern of consciousness likely; cryonics doesn't really add much more likelihood to it, in my opinion. At whatever point in the future that I am recreated by mere chance or simulation, that will be the next time "I" exist, whether it's a billion years from now, on another planet, or another universe. Neither does it stop me from dying, so what is the actual point of cryonics, since it seems to not satisfy either of its purposes?
By soul in this article, I mean a supernatural extra object, but I am aware that many people here rationally reject that notion. What I was trying to get at was that even though we understand that it is not true, many hidden thought processes take the existence of a (supernatural) soul for granted.
I am curious about the opinions of other people here about what actual physical processes would comprise a non-supernatural soul. If I replaced all of my insides with such advanced electronics that nobody else would notice a difference (without a medical examination), does that mean I have the same soul? If I had to define a soul, it would be "whatever it takes internally (mentally and physically) to cause an identical outward behavior as observed at a certain point in time". That satisfies the conditions of your definition as well, although it is not restricted to "people", and comes closer to defining what qualia actually are.
I had read that article, which this one was supposed to be a sort of follow-up to. Many people here may disagree with my example answers intellectually, but like the Zombie article points out, that doesn't stop the false intuition that it is so.
Which brings me to the very subject that I hoped to discuss: Why would you or I care whether we get revived one hundred years from now? Reading on this forum, I feel like I should care, but for some reason I don't. Reproducing a similar version of my wavefunction from second to second takes considerably less effort and resources, and I think that's the process that we intuitively care about. That's an easy place for me to draw the line between what I consider "me" and "not-me". What are your personal feelings about identity?