Knowing what you want is a prerequisite to getting what you want 2011-07-12T23:19:20.640Z · score: -3 (16 votes)


Comment by nwthomas on The Irrationality Game · 2012-04-26T06:25:21.161Z · score: 0 (0 votes) · LW · GW

The data I'm working from is that contact with certain people sometimes causes me to have mystical experiences. This has happened somewhere between 20 and 100 times, with less than a dozen people. Sometimes but not always, it happens in both directions; i.e., they also have a mystical experience as a result of the contact.

The simpler hypothesis, from a materialist point of view, is that seeing these people just tripped some switch in my brain, without any direct mind-to-mind interaction being involved. Then we can say that I also tripped such a switch in their brains in the cases where it was reciprocal. We are left with the question of why this weird psychological phenomenon happens.

The religious explanation is in many ways easier and more natural. We can say that my soul brushed up against these people's souls. It makes sense from within the religious frame of mind that this sort of thing would happen. But obviously we then run into the problems with religious views in general.

Comment by nwthomas on Humans are not automatically strategic · 2012-04-26T06:14:24.080Z · score: 2 (2 votes) · LW · GW

In my case, I don't run into "not being able to make myself pursue my goals effectively" a whole lot. What I do run into a lot is, "not being able to figure out what goals I actually want to pursue."

I think that part of what's going on is this. When I find resistance within myself to pursuing some goal (which I read into the comedian watching reruns), I take that as evidence that this goal isn't what I'm really after. I don't spend a lot of time in a state of trying to make myself do something, because of my assumption that whatever I really want to do, I won't need to make myself do. (You seem to be working under a different assumption.)

My experience is that when I hit on something I sincerely want to do, I don't find resistance in myself to doing it. I just do it. Maybe a lot of problems getting ourselves to do what we want are actually misdiagnosed problems of understanding what we want?

(I actually wrote a post on this topic a while back. Realized halfway through this comment that I was repeating what that post said; but it's what leapt into my mind when I read this, so I thought I'd press forward anyway.)

Comment by nwthomas on Knowing what you want is a prerequisite to getting what you want · 2011-07-13T06:38:43.667Z · score: 1 (1 votes) · LW · GW

"To put it another way, ACT basically says we screw up our motivation because we direct our attention to goals that are not directly connected to experiencing our terminal values... which I believe is pretty close to what you're saying here, is it not?"

It is pretty close, and even insofar as it's different, I think I agree with it. I'm not particularly a fan of the idea that we can have values over states of the external world, because it seems to me that most, if not all, of our actual terminal values are mental states. In my opinion, if you think you have a value over a state of the external world, this is probably a case where you've misunderstood what your values are.

This of course is not a necessary truth about minds-in-general. I am not suggesting that all possible minds would only value mental states; rather, I am suggesting that this happens to be true of human minds.

Taking this idea to its logical conclusion, I'd be happy to disconnect from my body and the external world, and spend an eternity exploring various possible mental states, as long as there were other disembodied consciousnesses there to share the experience with me.

Comment by nwthomas on Humans are not automatically strategic · 2011-07-12T03:49:01.470Z · score: 0 (0 votes) · LW · GW

I've found that the most helpful thing for me in achieving my goals seems to be picking the right goals to begin with. I try to find goals that I really care about with a large portion of my being, rather than goals that only a small portion of my being cares about. This requires a fair amount of introspection. What do I want? It's not an easy question; counterintuitively, we don't know what we want. But, if I know what I want, then I can get it.

I'll give a couple examples. I used to have the conscious goal, "write music." My real goals, though I did not know it, were "express myself," "experience beauty," and "be more intuitive." Now I pursue those goals directly, in more concentrated forms than I could achieve with music-writing.

I now know that I can express myself better through writing than through music, so I now pursue the goal of self-expression much more efficiently by writing.

I now know that I can experience beauty in almost anything, and so my aesthetic interests are no longer limited to the narrow domain of music.

I have integrated the goal of being more intuitive in numerous ways into my life. However, finding efficient ways of becoming more intuitive remains a challenge for me.

So I no longer make music, but I still get all of the things that I wanted out of music-making, and I get them much more efficiently and in larger quantities than I did with music-making. This was made possible by my increased knowledge of what I want.

So to recap, I think that the first step in getting what you want, is knowing what you want. If you're having trouble getting yourself to pursue your goals, maybe you're wrong about what your goals are.

Comment by nwthomas on Find yourself a Worthy Opponent: a Chavruta · 2011-07-12T02:49:19.203Z · score: 7 (7 votes) · LW · GW

I have one of these, and I highly value our relationship. My friend and I have very basic disagreements in our worldview: I'm a mystic, and he's a rationalist. We spend our time working through our differences. He's done more than anybody else to make me question and revise my views on things. I think I've become significantly more correct, and a little bit wiser, due to his influence. He's also the reason I'm here on Less Wrong.

It's actually surprisingly hard to get to a point with somebody where you respect them and listen to them, despite having fundamental disagreements in your worldview. Every disagreement wears down rapport, and so extended discussions of fundamental differences in beliefs are a challenge for a relationship. I'm grateful to my partner for the fact that he still listens to my ideas when I've said so many things that I know seemed like nonsense to him.

Comment by nwthomas on The Irrationality Game · 2011-07-04T21:47:22.040Z · score: 30 (36 votes) · LW · GW

I have met multiple people who are capable of telepathically transmitting mystical experiences to people who are capable of receiving them. 90%.

Comment by nwthomas on Parapsychology: the control group for science · 2011-07-04T18:18:23.156Z · score: 0 (0 votes) · LW · GW

Good point, with the qualifier that many people (including professional philosophers) presently find themselves unable to wrap their heads around the idea that they have no non-material consciousness. The "argument from absurdity" against materialism is alive and kicking.

Comment by nwthomas on The mathematical universe: the map that is the territory · 2011-07-04T13:09:09.010Z · score: 2 (2 votes) · LW · GW

According to this theory, so far as I can tell, the events of Star Wars literally occurred. Is that correct?

Comment by nwthomas on Reasons for being rational · 2011-07-02T02:04:45.232Z · score: 4 (4 votes) · LW · GW

This is an interesting question. I definitely agree that being a contrarian and being a conformist can both be forms of bias. However, I would add one example which suggests that conformity can in some cases be a positive instinct.

I have never studied general relativity in depth. My belief that "general relativity is right" is based on the heuristics, "most scientists believe in general relativity," and "things that most scientists believe are usually right." In part I think it's also based on the fact that I know that evidence and arguments are available which everybody claims to be very strong.

To show that most of my belief in general relativity comes from popularity-based heuristics, consider the following scenario. Somebody proposes a unified field theory (UFT-1). They claim that evidence and arguments are available which would convince me that the theory is right. Furthermore, they are the only person who believes in UFT-1. To eliminate further confounding variables, let us suppose that UFT-1 has existed for 35 years and has been examined in detail by 200 qualified physicists.

The main difference between general relativity and UFT-1, from my perspective, having never examined the arguments for either, is that most scientists believe in general relativity, and most scientists do not believe in UFT-1. Yet, I believe that general relativity is almost definitely right, I believe that UFT-1 is almost definitely wrong, and I believe that these are rational judgments.

Furthermore, these rational judgments are based almost entirely on a popularity-based heuristic: that is, the heuristic that popular beliefs are more likely to be true. To review, from the information I have, the main difference between general relativity and UFT-1 is that a lot of people believe in general relativity, and few people believe in UFT-1. Otherwise they are quite similar: both of them have been around for a while, both of them have received significant exposure, and both of them claim to have sound arguments in their favor. (The differences between these arguments cannot enter into my evaluation of the two theories, because I have not examined the arguments for either.)

This example suggests that popularity-based heuristics, telling us that popular beliefs are more likely to be true, rightly have a place in rational people's judgments.

This makes sense. The amount of thinking that the human race as a whole has done vastly exceeds the amount of thinking that I will ever do. It would make sense for me to rely on this vast repository of intelligence in choosing my own beliefs. This is related to the idea of "the wisdom of crowds."

On the other hand, popularity-based heuristics often lead us to the wrong answer. Religion is an obvious example. So we have to be careful in applying them. I'm not sure what general principles would result in our popularity-based heuristics excluding religious beliefs, but including popular scientific theories which we have not evaluated for ourselves. What do you guys think?

Comment by nwthomas on No, Really, I've Deceived Myself · 2011-06-30T06:33:07.101Z · score: 6 (6 votes) · LW · GW

I can relate to this. I had a crisis of faith about a month ago (thanks LessWrong!), and while I've "officially" stopped believing "those things," they still sometimes show up in my thinking. I am, as it were, in the midst of a complex re-architecting process. Particularly hard to eliminate are those beliefs which actually serve a functional purpose in my life. For instance, the beliefs that give me emotional support, and the beliefs that I use to decide my actions, are very hard to deal with. In these cases I need to figure out how to build a new structure which serves the same function, or figure out how to live without that function. This has required a significant amount of creativity and deep thinking.

Comment by nwthomas on The Strangest Thing An AI Could Tell You · 2011-06-27T09:55:35.374Z · score: 6 (6 votes) · LW · GW

I hope you didn't understand me as asserting this. It's certainly not something I believe.

Comment by nwthomas on The Strangest Thing An AI Could Tell You · 2011-06-27T09:39:00.260Z · score: 1 (1 votes) · LW · GW

Good inference! Or, deeply self-deceived. ;-)

Comment by nwthomas on The Strangest Thing An AI Could Tell You · 2011-06-27T09:03:53.819Z · score: 1 (3 votes) · LW · GW

The only thing that humans really care about is sex. All of our other values are an elaborate web of neurotic self-deception.

Comment by nwthomas on Crisis of Faith · 2011-06-12T07:12:49.016Z · score: 4 (4 votes) · LW · GW

Hi, Alicorn!

  1. Yes. They are drawn from the material at . The philosophy presented there is internally consistent, to the best of my understanding.

  2. There is no physical evidence. All of the "evidence" is in my head. This is a significant point.

  3. There are a variety of points in the source document which could be interpreted as designed to defend its claims against testing. This is a significant point.

  4. I am not aware of any physically testable predictions that these beliefs make. This is a significant point.

  5. The causal history of these beliefs is that I read the aforementioned document, and eventually decided that it was true, mainly on the basis of the fact that it made sense to my intuition and resonated personally with me. This is a significant point.

Thanks for asking!

Comment by nwthomas on The benefits of madness: A positive account of arationality · 2011-06-12T06:55:08.945Z · score: 0 (0 votes) · LW · GW

I am right now trying to fathom the problem of synthesizing rationality and mysticism. Would you like to correspond on this topic?

Comment by nwthomas on Crisis of Faith · 2011-06-11T10:37:20.712Z · score: 8 (8 votes) · LW · GW

For the past three days I have been repeatedly performing the following mental operation:

"Imagine that you never read any documents claimed to be produced by telepathy with extraterrestrials. Now gauge your emotional reaction to this situation. Once calm, ask yourself what you would believe about the world in this situation. Would you accept materialism? Or would you still be seeking mystical answers to the nature of reality?"

I am still asking myself this question. Why? I am struggling to figure out whether or not I am wrong.

I believe things that raise a lot of red flags for "crazy delusion." Things like:

"I came from another planet, vastly advanced in spiritual evolution relative to Earth, in order to help Earth transition from the third dimension to the fourth dimension. My primary mission is to generate as much light and love as possible, because this light and love will diffuse throughout Earth's magnetic fields and reduce the global amount of strife and suffering while helping others to achieve enlightenment. I am being aided in this mission by extraterrestrials from the fourth dimension who are telepathically beaming me aid packages of light and love."

These beliefs, and many others like them, are important to my worldview and I use them to decide my actions. Because I like to think of myself as a rational person, it is a matter of great concern to me to determine whether or not they are true.

I have come across nobody who can put forth an argument that makes me question these beliefs. Nobody except for one person: Eliezer Yudkowsky. This man did what no other could: he made me doubt my basic beliefs. I am still struggling with the gift he gave me.

This gift is that he made me realize, on a gut level, that I might be wrong, and gave me motivation to really figure out the truth of the matter.

So many intelligent people believe patently absurd things. It is so difficult to escape from such a trap once you have fallen into it. If I am deluded, I want to be one of the fortunate ones who escape from their insanity.

The thing is, I really don't know whether or not I am deluded. I have never before been so divided on any issue. Does anybody have anything they'd like to add, which might stimulate my thinking towards resolving this confusion?