Anticipation vs. Faith: At What Cost Rationality?
post by Wei Dai (Wei_Dai) · 2009-10-13T00:10:47.818Z · LW · GW · Legacy · 106 comments
Anticipation and faith are both aspects of the human decision process, in a sense just subroutines of a larger program, but they also generate subjective experiences (qualia) that we value for their own sake. Suppose you ask a religious friend why he doesn’t give up religion. He might say something like: “Having faith in God comforts me and I think it is a central part of the human experience. Intellectually I know it’s irrational, but I want to keep my faith anyway. My friends and the government will protect me from making any truly serious mistakes as a result of having too much faith (like falling into dangerous cults or refusing to give medical treatment to my children)."
Personally I've never been religious, so this is just a guess at what someone might say. But these are the kinds of thoughts I have when faced with the prospect of giving up the anticipation of future experiences (after being prompted by Dan Armak). We don't yet know for sure that anticipation is irrational, but it's hard to see how it can be patched up to work in an environment where mind copying and merging are possible; in the meantime, we have a decision theory (UDT) that seems to work fine but does not involve any notion of anticipation.
What would you do if true rationality requires giving up something even more fundamental to the human experience than faith? I wonder if anyone is actually willing to take this step, or is this the limit of human rationality, the end of a short journey across the space of possible minds?
106 comments
Comments sorted by top scores.
comment by PhilGoetz · 2009-10-13T01:13:16.244Z · LW(p) · GW(p)
You have to have a core of arational desires/goals/values. Otherwise, you're just a logic engine with nothing to prove.
Replies from: None, UnholySmoke, DanArmak
↑ comment by UnholySmoke · 2009-10-15T12:56:33.656Z · LW(p) · GW(p)
Also upvoted, and very succinctly put.
Rationality is a tool we use to get to our terminal value. And what do we do when that tool tells us our terminal value is irrational?
Never ask that question.
↑ comment by DanArmak · 2009-10-13T16:12:02.765Z · LW(p) · GW(p)
I agree, and I think that pretty much answers the post's question:
What would you do if true rationality requires giving up something even more fundamental to the human experience than faith?
"True rationality" is, pretty much by definition, the best way of achieving your goals. The above question should be written as: do you have goals that are so important, that you would agree to give up something fundamental to the human experience to achieve them?
My personal answer is: my only such goals are of the form "do not undergo a huge amount of torture/suffering".
comment by Christian_Szegedy · 2009-10-20T00:02:35.508Z · LW(p) · GW(p)
If someone came and told you, "I can't fall in love. I've never fallen in love, and I also think it is wrong, harmful, crazy and irrational to fall in love," what would you tell him?
Most people would opt for one of the following answers:
- Cool, that's great, good for you, finally a sane person!
- That's pitiful, you're missing something important!
Falling in love has a lot of parallels with real religious conviction: It's irrational. It's not supported by evidence, yet it can't be argued away. It can be harmful to the person, it is easily exploited, and often it is not mutual. :)
Replies from: RobinZ
↑ comment by RobinZ · 2009-10-20T00:11:18.038Z · LW(p) · GW(p)
As a clarification of the subjective nature of religious faith, this is most helpful ... but it doesn't really answer Wei_Dai's question.
Replies from: Christian_Szegedy
↑ comment by Christian_Szegedy · 2009-10-20T00:24:28.495Z · LW(p) · GW(p)
No, it's not a clarification, it's just an extension.
You can ask the same questions about falling in love as about religious faith. Whether you come to similar or different conclusions, you learn something on the way.
Replies from: RobinZ
↑ comment by RobinZ · 2009-10-20T02:20:16.083Z · LW(p) · GW(p)
That's ... not really accurate. With love, the question we in this community would ask is "would seeking to maintain or expand this relationship be a good idea?" With religion, it's "does this being with whom a relationship is suggested actually exist?"
Replies from: Christian_Szegedy
↑ comment by Christian_Szegedy · 2009-10-20T18:03:05.208Z · LW(p) · GW(p)
In a lot of cases, one asks later whether the person one fell in love with actually exists.
Replies from: RobinZ
comment by Jonii · 2009-10-13T00:59:46.742Z · LW(p) · GW(p)
One such thing could be a conscious mind. You want to be rational and efficient, and you're offered a deal by a local neurosurgeon to reprogram your brain so that you become the perfect Bayesian who does everything to further your values. No akrasia, no bias blind spots, no hindrance at all. The only downside of the transformation is that whatever enables our conscious experiences (be it something like self-reflection, or conflicts within the system) is lost. Wanna do it?
I'm well aware that people here might disagree about whether this is even possible in principle, but as far as I know it could be, so this could then work as an example of a huge sacrifice made to further rationality.
Edit: Also, it would seem to me that "eliminating anticipation" is a case of explaining away.
Replies from: Alicorn, MichaelVassar
↑ comment by Alicorn · 2009-10-13T01:06:31.221Z · LW(p) · GW(p)
The transformation is at least partially self-undermining, if one of your values is conscious experience.
Replies from: Eliezer_Yudkowsky
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-10-13T15:39:41.298Z · LW(p) · GW(p)
Can I get xeroxed and have the transformation performed on one randomly selected version of me before it wakes up?
Replies from: DanArmak
↑ comment by DanArmak · 2009-10-13T16:07:24.113Z · LW(p) · GW(p)
To have the "pure rational" version serve the human one? Have it achieve goals that aren't defined by ownership (e.g., 'reduce total suffering'), because it's better at those things? This sounds reasonable and I don't see why we shouldn't do this - assuming it's easier or otherwise preferable to creating a GAI from scratch to achieve these goals.
Replies from: aausch
↑ comment by aausch · 2009-10-16T02:02:06.790Z · LW(p) · GW(p)
Would you enter a lottery where a small portion (0.01%) of the entrants are selected as losers at random, and transformed without being copied?
Replies from: DanArmak
↑ comment by DanArmak · 2009-10-16T02:07:14.431Z · LW(p) · GW(p)
Being modified like this is much the same as death. I would lose personhood and the modified remains would not be serving my original goals, but those of the winners in the lottery.
So this is equivalent to asking: would I enter a lottery with a 0.01% chance of instant death and a prize otherwise? I might, depending on the prize. If the prize is service by the modified individuals, and they aren't based on myself but on other people (so they're not very well suited to advancing my personal goals), and I have to timeshare them with all the other winners (every winner receives 0.01 / 99.99 ≈ 10^-4 of one modified person's service time), then it doesn't seem worthwhile.
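To make that arithmetic explicit, here is a minimal sketch in Python. The lottery terms are the ones stipulated in this thread; the utility numbers are placeholder assumptions for illustration only, not anyone's stated values.

p_lose = 0.0001           # 0.01% of entrants are transformed without being copied
p_win = 1.0 - p_lose      # the remaining 99.99% keep their original minds

# Each winner shares the pool of transformed "servants" equally, so the
# expected fraction of one servant's time per winner is:
share_per_winner = p_lose / p_win
print(share_per_winner)   # ~0.0001, i.e. about one ten-thousandth of a servant

# Treating transformation as equivalent to death (the premise above), the
# lottery is worth entering only if the expected gain outweighs the expected loss:
u_prize = 1.0             # placeholder: utility of the prize to a winner
u_personhood = 1e6        # placeholder: utility of continued personhood
print(p_win * u_prize > p_lose * u_personhood)   # False under these assumptions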
Replies from: aausch
↑ comment by MichaelVassar · 2009-10-16T21:48:22.782Z · LW(p) · GW(p)
Not happily, but I won't be unhappy about it for long, so sure. I'm pretty sure that one such person is all the world would need, and, you know that epigram about 'what profit a man'? Well, it profit his utility function a lot.
comment by Tyrrell_McAllister · 2009-10-13T01:21:45.621Z · LW(p) · GW(p)
What would be the consequence of giving up the idea of a subjective thread of consciousness?
I wonder if believers in subjective threads of consciousness can perform a thought experiment like Chalmers' qualia-zombie thought experiment. I gather that advocates of the subjective thread hold that it is something more than just certain clumps of matter existing at different times and standing in certain causal relationships with one another. (Otherwise you couldn't decide which of two future copies of yourself gets to inherit your subjective thread.) So, advocates, does this mean that you can imagine an alternate universe in which matter is arranged in the same way as in our own throughout time, but in which no subjective threads bind certain clumps together? That is, do you think that "subjective-thread zombies" are possible in principle?
Just as in the Chalmers thought experiment, subjective-thread zombies would go around insisting that they have subjective threads. After all, their brains and lips would be participating in the same causal processes that lead you to say such things in this universe. And yet they would be wrong. They would not be saying these things because they have subjective threads, since they don't. And so, it seems, your insistence that you have a subjective thread also cannot have anything to do with whether you in fact do.
It seems that the idea of subjective-thread zombies is subject to all the problems that qualia zombies have. How do advocates of the subjective thread address or evade these problems?
comment by gjm · 2009-10-13T09:12:06.609Z · LW(p) · GW(p)
I don't see any reason why you should give up anticipation of future experiences. It's possible that in situations involving duplication and merging of minds you should accept that anticipation is an unreliable guide for decision-making, but that's not at all the same thing. (It's more like a religious person agreeing that if he gets seriously ill he should see a doctor rather than relying on his god to cure him -- which, in fact, religious people generally do.) At least for the present, the sort of switching and copying and merging and muddling of minds that would make our usual anticipation-based thinking fail badly doesn't happen to any appreciable extent.
comment by SilasBarta · 2009-10-13T02:30:24.779Z · LW(p) · GW(p)
Suppose you ask a religious friend why he doesn’t give up religion, he might say something like “Having faith in God comforts me and I think it is a central part of the human experience. Intellectually I know it’s irrational, but I want to keep my faith anyway. My friends and the government will protect me from making any truly serious mistakes as a result of having too much faith (like falling into dangerous cults or refusing to give medical treatment to my children)."
Um, no. If you were close with that friend, and he proved himself to be pretty intelligent, and he downed a few beers and you kept prying, his answer would be something more like,
"Yeah, I know all that God stuff is a load of garbage, but the public profession of faith in 'God' provides the social glue that allows a welfare-maximizing mutualist) group to form, involving people of varying intelligence levels who take this stuff literally and not literally, which grants me access to a large social network with enforcement mechanisms for the prisoner's dilemma, and the ability to trade labor for labor, such as for babysitting, at more favorable rates than cash purchases would allow. Also, it uses psychological mechanisms that allow me to believe strongly enough in my healing to invoke the placebo effect in the body, which gives me real healing. Finally, my price for joining was low enough.
"Show me an atheist group that does all that, and I'm in. *hic* [passes out]."
Replies from: PhilGoetz, thomblake, Furcas, Kaj_Sotala
↑ comment by PhilGoetz · 2009-10-13T15:22:13.242Z · LW(p) · GW(p)
No, that isn't what they would say. I was a Christian for many years. Most of them sincerely believe everything they're supposed to believe - at least in conservative churches, which I think comprise a large majority of US churches.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-13T15:36:04.686Z · LW(p) · GW(p)
No, they don't really believe it; their actions are severely suboptimal for that belief set. They might have a greater belief that they believe it, however.
Replies from: Peter_de_Blanc, GuySrinivasan, PhilGoetz, Tyrrell_McAllister, PhilGoetz, Jack
↑ comment by Peter_de_Blanc · 2009-10-13T20:28:20.465Z · LW(p) · GW(p)
Oh, come on. Show me a human whose actions aren't severely suboptimal for their belief set.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-13T20:52:03.526Z · LW(p) · GW(p)
Fair point, but here the mismatch would be the most obvious one -- if, that is, the professed beliefs accurately represented their internal predictive model of reality, which I claim they don't.
↑ comment by SarahSrinivasan (GuySrinivasan) · 2009-10-13T16:10:09.634Z · LW(p) · GW(p)
In the past I believed it, and was sad when "the flesh was weak" and I took actions severely suboptimal for that belief set. I wish I had only believed I believed it. I guess it's possible I only now believe I believed it and in fact in the past I merely believed I believed it, but I'm guessing no.
Humans are well known to be terrible at taking actions that aren't severely suboptimal for their beliefs, and for holding contradictory beliefs, and for holding beliefs which imply actions which are suboptimal for their other, contradictory beliefs.
↑ comment by Tyrrell_McAllister · 2009-10-13T17:25:31.860Z · LW(p) · GW(p)
Unless you are a person to whom faith has ever come naturally, you should be very skeptical that your mind contains an accurate model of "that belief set" or of the minds that profess it.
Faith never came naturally to me. After long interaction with "faithful" people, I've developed some tentative hypotheses about how some of them think on some things. Cautious though these hypotheses are, they are enough to rule out the applicability of your characterization to most cases.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-13T17:43:52.471Z · LW(p) · GW(p)
If I'm right and they're just cynically going through the motions, do you think they're going to tell you that? Do you think they're even going to give evidence consistent with that? Of course not! They'll just keep up the charade. They'd only admit it if you were a close friend, and they were drunk at the time, like in my example.
The social benefits break down when you make your genuine beliefs become public knowledge.
Replies from: UnholySmoke, Tyrrell_McAllister
↑ comment by UnholySmoke · 2009-10-15T13:07:37.550Z · LW(p) · GW(p)
Beware of generalising across people you haven't spent much time around, however tempting the hypothesis. Drawing a map of the city from your living room etc.
My first 18 years were spent attending a Catholic church once a week. To the extent that we can ever know what other people actually believe (whatever that means), most of them have genuinely internalised the bits they understand. Like, really.
We can call into question what we mean by 'believe', but I can't agree that a majority of the world population is just cynically going with the flow. Finally, my parish priest is one of the most intelligent people I've ever met, and he believed in his god harder/faster/whatever than I currently believe anything. Scary thought, right?
↑ comment by Tyrrell_McAllister · 2009-10-13T17:45:59.870Z · LW(p) · GW(p)
If I'm right and they're just cynically going through the motions, do you think they're going to tell you that? Do you think they're even going to give evidence consistent with that? Of course not! They'll just keep up the charade.
If it's as hard to gather evidence as you claim, then you should be all the more skeptical of your own conclusions.
ETA: And if it's so crucial to avoid letting the slightest hint of doubt creep out, then we should expect evolution to find the simplest way to keep that from happening: Make a mind with the capacity to genuinely believe this stuff.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-13T18:15:44.424Z · LW(p) · GW(p)
When anthropologists study religion, they focus mostly on the rituals, the social cohesion, the punishment of defectors (in the PD sense), and formation of authority structures, and not so much on the factual content of the adherents' purported beliefs.
My position is just: do that.
Replies from: Tyrrell_McAllister, PhilGoetz, bogus
↑ comment by Tyrrell_McAllister · 2009-10-13T19:34:53.306Z · LW(p) · GW(p)
When anthropologists study religion, they focus mostly on the rituals, the social cohesion, the punishment of defectors (in the PD sense), and formation of authority structures, and not so much on the factual content of the adherents' purported beliefs.
My position is just: do that.
I'm not suggesting that you rely only on their portrayal of their own beliefs. On the contrary, I'm suggesting long and careful observation of their behavior (including professions of belief) before you reach any confident conclusion.
And even after you've gathered many such observations, you will still be misled if you use the wrong approach in incorporating those observations into a model. Many natural-born atheists use the following fallacious approach to understanding the religious: They think to themselves, "What would it take to make me act like that and say those things? Well, for that to happen, I'd need to have the following things going on inside my mind: <...>. Therefore, those things must also be going on in the minds of theists (or at least of the intelligent ones)."
The flaw with this approach is that you're modeling the mind of a theist using the mind of a natural-born atheist, a mind which almost certainly works differently from a theist's mind when it comes to theological issues, almost by definition. That is why you should be skeptical that your mind contains an accurate model of a theist's mind.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-13T20:58:59.238Z · LW(p) · GW(p)
I'm not suggesting that you rely only on their portrayal of their own beliefs. On the contrary, I'm suggesting long and careful observation of their behavior (including professions of belief) before you reach any confident conclusion.
...And even after you've gathered many such observations, you will still be misled if you use the wrong approach in incorporating those observations into a model. Many natural-born atheists use the following fallacious approach to understanding the religious: They think to themselves, "What would it take to make me act like that and say those things? ...
Well, I'm already relying on a large data set, and I was born into a Catholic family. My theory still makes more sense. Here are some more data points:
-The parallels between religion and politics: how they force people into teams, get them to say whatever it takes to defend the team, and look for cues about whether you're on their team when they ask about your beliefs.
-The history of religious warfare. It makes no sense to view these people as going out to die for inscrutable theological doctrines, but complete sense to view their motives as the same ones they would have if you replaced the religion with some other memetic group.
Replies from: Tyrrell_McAllister
↑ comment by Tyrrell_McAllister · 2009-10-13T22:50:22.339Z · LW(p) · GW(p)
My theory still makes more sense. Here are some more data points:
-The parallels between religion and politics: how they force people into teams, get them to say whatever it takes to defend the team, and look for cues about whether you're on their team when they ask about your beliefs.
I'd say that this data point supports my position. Get an extreme left-winger or right-winger drunk and you're not going to hear them say, "yeah, those extreme political positions I espouse, I don't really think they're true. I just pretend to because of the social benefits I reap." On the contrary, you're going to hear them spout even more extreme views, views that they'd realize they ought to keep to themselves had they been sober.
-The history of religious warfare. It makes no sense to view these people as going out to die for inscrutable theological doctrines, but complete sense to view their motives as the same ones they would have if you replaced the religion with some other memetic group.
I agree. I'm not saying that every action ostensibly justified by religious beliefs is really done because of those beliefs. But that says nothing about whether those beliefs are sincerely held.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-14T19:47:30.533Z · LW(p) · GW(p)
I'd say that this data point supports my position. Get an extreme left-winger or right-winger drunk and you're not going to hear them say, "yeah, those extreme political positions I espouse, I don't really think they're true. I just pretend to because of the social benefits I reap."
It's not necessary for my claim that they think about it in those terms. But they:
a) enjoy the bonding with people "on their team" (yeah, aren't those Republicans so greedy, heh heh, not like us nosiree)
b) would take back more extreme things they said to "support their team", e.g., "Yeah, I don't really think Obama's health plan is the best thing in the world, I just want policy to move in sorta that direction and this is the best I can hope for -- of course there are flaws". Now, if you steer the conversation into a duel from the beginning, I'm sure you can get one.
I agree. I'm not saying that every action ostensibly justified by religious beliefs is really done because of those beliefs. But that says nothing about whether those beliefs are sincerely held.
No, that would be evidence that the belief in belief is sincerely held, not the belief itself. An actual belief (zeroth level) that "God's divine essence is embedded in children even before baptism" would correspond to some noticeable activity other than "let's kill the people who think God's divine essence isn't embedded in people until baptism". Yet in the history of religious wars, you saw exactly that.
Replies from: Tyrrell_McAllister
↑ comment by Tyrrell_McAllister · 2009-10-14T20:26:24.698Z · LW(p) · GW(p)
An actual belief (zeroth level) that "God's divine essence is embedded in children even before baptism" would correspond to some noticeable activity other than "let's kill the people who think God's divine essence isn't embedded in people until baptism".
Why would you think that? I see little reason to think so. I suspect that you think so because you reason, "Were I to believe that God's divine essence is embedded in children even before baptism, I would never kill people for thinking that God's divine essence isn't embedded in people until baptism. Therefore, anyone who holds that belief wouldn't kill people for that reason."
I've already tried to explain why I think that this reasoning is invalid. You're modeling how your own mind would behave under certain circumstances, and you're then extrapolating to how other minds behave under those circumstances. The problem is that the other minds are theistic, so, by definition, they differ from your mind in a way that's obviously highly relevant to how they will behave in the circumstances under consideration.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-14T20:33:48.011Z · LW(p) · GW(p)
You're still blurring the distinction between belief and belief-in-belief, or, at least, incorrectly considering them to be similar.
I'm not saying, "If I believed X, this is what I would do." I'm saying the belief X has implications for your actions, at least in some counterfactual sense, or it's not really a belief, but better called a belief-in-belief.
Imagine: I tell you I think monsters live under my bed. I tell you I think that the monsters kill whoever sleeps in the bed. I tell you I don't want to die.
I sleep in my bed.
Tomorrow, I'm going to go to a "BedMonster Study Group", a type of meeting at which many of my male friends have met their future wives.
Do you think I believe there's a monster under my bed, in the normal sense of the terms? Or do I just believe that I do?
Replies from: Tyrrell_McAllister
↑ comment by Tyrrell_McAllister · 2009-10-14T21:53:32.404Z · LW(p) · GW(p)
You should argue your case using the actual pertinent facts (i.e., the actual actions and professed beliefs of the religious), not hypothetical ones.
But, even granting your hypothetical--
If I heard you say "there's a monster under my bed" with the same earnestness and insistence that I hear when the religious profess their beliefs,
then I would strongly suspect that your mind draws conclusions from evidence in a manner very different from that in which mine does. In particular, I would expect that you reason from your beliefs to your actions very differently from how I do. I would therefore be very cautious about inferring from your actions to your actual beliefs.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-14T22:09:04.029Z · LW(p) · GW(p)
You should argue your case using the actual pertinent facts (i.e., the actual actions and professed beliefs of the religious), not hypothetical ones.
There's nothing wrong with presenting (what I consider) the same, relevant dynamic in a hypothetical context in order to make a point about the conclusions you should draw in a different one.
My hypothetical simply takes the problematic elements of the situation at hand and amplifies them. In religions, it's hard to see the disconnect between the professed beliefs and the adherent's actual internal predictive model of reality. My example made the disconnect obvious, and also showed the surrounding motives that give evidence as to what they really believe.
In such a scenario, I would conclude that the person is using the term "believe" differently than the term is normally used. You would conclude that the person knowingly puts himself in a situation where he expects to die, despite not wanting to die.
I think my conclusion is more reasonable.
Replies from: Tyrrell_McAllister
↑ comment by Tyrrell_McAllister · 2009-10-14T23:51:30.399Z · LW(p) · GW(p)
There's nothing wrong with presenting (what I consider) the same, relevant dynamic in a hypothetical context in order to make a point about the conclusions you should draw in a different one.
If the relevant dynamic in the actual situation is really the same, then why not just refer to the actual situation? If you have to "amplify" the problematic elements, then you are giving yourself the burden of proving that you haven't amplified them to the point that they yield a different conclusion than the original setting would.
In religions, it's hard to see the disconnect between the professed beliefs and the adherent's actual internal predictive model of reality. My example made the disconnect obvious, and also showed the surrounding motives that give evidence as to what they really believe.
If the disconnect is "hard to see" in the case of religion, then you ipso facto need strong evidence to establish that the disconnect exists. By moving to a situation where the disconnect is easier to see, so that less evidence is necessary, you are moving to a situation where your burden of proof is less. Therefore, establishing your claim in your hypothetical does not suffice to establish your claim in the original situation.
In such a scenario, I would conclude that the person is using the term "believe" differently than the term is normally used. You would conclude that the person knowingly puts himself in a situation where he expects to die, despite not wanting to die.
Your conclusion would be one real possibility. Another possibility is that, although he doesn't want to die, he prefers it to sleeping somewhere other than his bed. Perhaps sleeping elsewhere seems, to him, a fate worse than death. Since I'm manifestly dealing with a crazy person, that remains a real possibility, at least until I learn more about how he thinks.
The more someone professes different beliefs from yours, the more evidence there is that their mind works differently from yours in some crucial respect, and so the less credit you should give to your mental model of them.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-15T02:53:36.926Z · LW(p) · GW(p)
In approximate order of your objections to the hypothetical:
Tyrrell, it's not my fault if you can't handle reasoning from hypotheticals, but it certainly doesn't make that form of reasoning -- used in the rest of the civilized world -- off-limits. If you think the analogy I'm making doesn't hold, you can politely show where it breaks down. I had already attempted to speak to the specific situation in dispute -- about religion -- but in that case it's less obvious how one's actions aren't following from one's beliefs if the professed beliefs are the real ones.
I actually think it's obvious enough why actual beliefs in (certain religions') doctrine of eternal hellfire would logically imply a direct transition to an ascetic lifestyle or other drastic choices, and we can go that route if it keeps things in your comfort zone.
But the point is pretty simple: in the hypothetical, we can quite easily draw conclusions: either a) the person doesn't actually have an internal predictive model of reality that includes a deadly bed monster, or b) he has some kind of weird psychology.
In short, you go with b) and I go with a). Which is why I think this kvetching about hypotheticals suddenly being off-limits is moot: even when I make the situation "more favorable" to my theory, you just bite a bigger bullet, cutting off whatever implication I would have claimed follows back to the original topic of religion.
So, let's review that position:
Another possibility is that, although he doesn't want to die, he prefers it to sleeping somewhere other than his bed. Perhaps sleeping elsewhere seems, to him, a fate worse than death. Since I'm manifestly dealing with a crazy person, that remains a real possibility, at least until I learn more about how he thinks.
The more someone professes different beliefs from yours, the more evidence there is that their mind works differently from yours in some crucial respect, and so the less credit you should give to your mental model of them.
But note the difficult position you've forced yourself into. You have to believe he is obviously crazy despite:
- not being obviously crazy in any other area of his life
- the psychological unity of mankind somehow breaking for a huge class of people who have been interbreeding with the rest of humankind and for whom no one seriously suggests mandatory psychotherapy
- his mental model of other phenomena (let's reasonably suppose) being superior to yours in several areas
- his actions associated with these "beliefs" greatly helping him achieve many non-crazy personal goals: having a social network, meeting a compatible spouse, greater assurance of spousal fidelity, the feeling of belonging.
- the similarity (discussed before) between him and the uncountable historical instances of people supposedly going to great lengths for inscrutable theological doctrines, but actually protecting a meme they benefit from.
Do you see why this is an implausible chain to follow? Especially when the alternative is the majestically simple "Belief means something different in this context, something that is not an internal predictive model of reality"?
Replies from: Tyrrell_McAllister
↑ comment by Tyrrell_McAllister · 2009-10-17T17:55:56.627Z · LW(p) · GW(p)
Tyrrell, it's not my fault if you can't handle reasoning from hypotheticals, but it certainly doesn't make that form of reasoning -- used in the rest of the civilized world -- off-limits.
Hypotheticals have their uses, but they are easy to abuse. Hypotheticals are usually fine for
- making the meaning of a claim clear by putting it in a simpler context (e.g., explaining the physics of a pendulum by imagining that it has an inelastic and frictionless rod), and
- constructing counter-examples to absolute claims (e.g., "Stealing is always wrong!" "Really? What about if you washed up on shore after a shipwreck, and you had to steal food immediately or else die of starvation?").
Neither of these apply to your argument. Your claim was already clear, and I'm making no absolute claim. I'm the one claiming that there is inadequate evidence to justify your confidence in your conclusion.
Despite their uses, hypotheticals are usually useless for resolving disagreement. As I've explained, by passing to a new hypothetical situation, you only increase your burden of proof. In addition to proving the original claim, you must now show that the hypothetical doesn't differ significantly from the actual situation. Moreover, in practice, when both participants are intelligent and thoughtful, the hypothetical will almost always fail to capture the heart of the disagreement.
For example, you've said that your claim is "hard to see" and "less obvious" in the case of religion. It shouldn't surprise you to learn that the elements making it hard to see and less obvious are why I doubt your claim. Thus, by moving to a hypothetical where those elements are absent, you fail to address the source of my doubt.
I actually think it's obvious enough why actual beliefs in (certain religions') doctrine of eternal hellfire would logically imply a direct transition to an ascetic lifestyle or other drastic choices, and we can go that route if it keeps things in your comfort zone.
This is a strong claim. I'm skeptical that religious claims are precise and unambiguous enough to logically imply such things. You should be able to give a logically rigorous demonstration of this implication if it is so obvious. I will concede your point if you can do this.
But the point is pretty simple: in the hypothetical, we can quite easily draw conclusions: either a) the person doesn't actual have an internal predictive model of reality including a deadly bed monster, or b) he has some kind of weird psychology.
In short, you go with b) and I go with a). Which is why I think this kvetching about hypotheticals suddenly being off-limit is moot: even when I make the situation "more favorable" to my theory, you just bite a bigger bullet, cutting off whatever implication I would have claimed follows back to the original topic of religion.
You make a fair point, but one to which I have a response. Why did I object to your hypothetical if I reached the same conclusion within it anyway?
Naturally, if you keep adding conditions to your hypothetical, I will eventually agree that, given those conditions, your conclusion follows. But the more conditions you add, the further you take us from the actual situation, and so the less bearing the hypothetical has on the actual situation.
You had not yet added so many conditions to your hypothetical that I agreed with you, but I could already foresee that you eventually would, though it would be pointless for the reason above. Indeed, you added several such conditions below, conditions that were not present in your original formulation.
In particular, you added that the person is "not … obviously crazy in any other area of his life" and that "his mental model of other phenomena … [is] superior to yours in several areas." That is new information. It is certainly not what I would expect of the actual typical person who claims that there is a killer monster under his bed. Suppose you really did learn of some adult that he earnestly insisted that there was a killer monster under his bed. Prior to further information, wouldn't you expect that this person suffers from numerous other severe delusions?
You've now ruled that out, but only by changing the hypothetical. I'll save you work by repeating that, if you add enough such conditions, I will agree that this individual is probably lying about his beliefs. But that concession will have no bearing on what I think about actual theists.
Now to consider your list of conditions as they pertain to theists:
But note the difficult position you've forced yourself into. You have to believe he is obviously crazy despite:
- not being obviously crazy in any other area of his life
Religious beliefs appear to me to spread to other areas of the believer's life in a way consistent with what I'd expect. But I don't expect them to spread as far as you do because I don't see the "obvious" logical implications that you claim above.
- the psychological unity of mankind somehow breaking for a huge class of people who have been interbreeding with the rest of humankind and for whom no one seriously suggests mandatory psychotherapy
The psychological unity of humankind presents no problem for my hypothesis, because the overwhelming majority of people seem inclined to believe religious claims. It is you and I, natural-born atheists, who are the freaks :). We seem very rare.
- his mental model of other phenomena (let's reasonably suppose) being superior to yours in several areas
As I argued above, I would expect evolution to make minds capable of partitioning their beliefs in this way.
- his actions associated with these "beliefs" greatly helping him achieve many non-crazy personal goals: having a social network, meeting a compatible spouse, greater assurance of spousal fidelity, the feeling of belonging.
It seems to me that the easiest way to reap all these benefits of belief is to make a mind that can sincerely believe without interfering with important day-to-day conveniences. This would be harder if religion really logically implied asceticism, as you claimed. But I think that the religions we're talking about are designed (by natural memetic selection) to avoid doing that kind of thing.
- the similarity (discussed before) between him and the uncountable historical instances of people supposedly going to great lengths for inscrutable theological doctrines, but actually protecting a meme they benefit from.
This presents no particular problem. People often sincerely love their children, but they also often falsely invoke that love as their reason for doing other things, such as going to war, etc.
↑ comment by PhilGoetz · 2009-10-13T20:36:42.982Z · LW(p) · GW(p)
No; then you would have done that, rather than making an assertion about what they believed.
Perhaps you only believe that you believe that. :)
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-13T21:00:32.171Z · LW(p) · GW(p)
Well, I am doing that, in the sense of judging religions by the factors anthropologists study rather than focusing on how well I can disprove the claim that the earth is 6000 years old.
↑ comment by bogus · 2009-10-13T18:27:40.264Z · LW(p) · GW(p)
Yes. Confucianism is the prototypical example of a "religion" which has no cosmological beliefs per se, but still provides for community cohesion (i.e. protection from perceived threats), an ethical code (the analects of Confucius are often quoted as proverbs/dogmas), a focus on authority figures and so forth.
↑ comment by Jack · 2009-10-13T17:00:51.484Z · LW(p) · GW(p)
their actions are severely suboptimal for that belief set.
I agree that this is the case. However, that they don't really believe the tenets of Christianity only follows from this if we have a strictly behaviorist definition of "belief". I doubt many people hold that view, and I'm not sure why anyone should.
↑ comment by thomblake · 2009-10-13T15:04:11.593Z · LW(p) · GW(p)
I've invoked similar arguments in favor of organized religion. While atheists could in principle get together every week and sing together, I don't know any who actually do, and I think we're worse off for it. Probably part of the appeal of humanistic churches.
Replies from: LauraABJ, SilasBarta
↑ comment by LauraABJ · 2009-10-13T15:13:16.520Z · LW(p) · GW(p)
I recommend the Unitarian Universalist church. I went to one as a child, and the focus was on humanism and morality, not god and faith. The Sunday school taught a different religion every weekend, making it nearly impossible to believe any of them were true. Most of the people there didn't really believe in god anyway, but were there for the reasons you name.
Replies from: Alicorn
↑ comment by Alicorn · 2009-10-13T15:16:02.917Z · LW(p) · GW(p)
And there's the less ubiquitous Ethical Culture Society, which is even less religious than Unitarianism.
Replies from: Eliezer_Yudkowsky
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-10-13T15:35:56.001Z · LW(p) · GW(p)
I wonder, though, if an x-rationalist would get a feeling of belonging there.
Replies from: Alicorn
↑ comment by SilasBarta · 2009-10-13T15:38:36.910Z · LW(p) · GW(p)
Note the functions that I listed: the singing isn't strictly necessary; any bonding/reinforcement mechanism would work, but singing is very effective. If you could get the general mutualist functions down, then you'd have a competitive option.
Replies from: AllanCrossman
↑ comment by AllanCrossman · 2009-10-13T16:41:43.422Z · LW(p) · GW(p)
Ugh. The horrible music is the worst thing about church. Give me sermons about fire and brimstone any day.
Replies from: komponisto
↑ comment by Furcas · 2009-10-13T06:44:56.294Z · LW(p) · GW(p)
You seem to think that intelligent religious people are less crazy than dumb religious people. They're not.
Replies from: LauraABJ, SilasBarta
↑ comment by LauraABJ · 2009-10-13T12:16:59.633Z · LW(p) · GW(p)
Yes, and I would say actual faith is a cognitive error more akin to deja-vu than doublethink, in that it is a feeling of knowledge for which adequate logical justification may not exist. A friend of mine once said, "I'm sorry that I'm so bad at explaining this [the existence of God], but I just know it, and once you do too, you'll understand."
People can have experiences of faith in non-religious contexts, such as having faith (a sense of certainty or foreknowledge) that a critically ill loved one will pull through. Intuition and gut feelings may be considered faith-light, but I think certainty is part of the faith experience, and just because that certainty is false doesn't make the feeling any less real.
Replies from: Eliezer_Yudkowsky
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-10-13T15:37:57.988Z · LW(p) · GW(p)
I would say actual faith is a cognitive error more akin to deja-vu than doublethink, in that it is a feeling of knowledge for which adequate logical justification may not exist.
It looks to me like greater intelligence pulls people away from deja-vu faith and toward doublethink faith, but this is a generalization based on little data. Still, that little data seems to show that smart people who think about their religions end up with Escher-painting minds.
Replies from: LauraABJ
↑ comment by LauraABJ · 2009-10-13T16:13:46.131Z · LW(p) · GW(p)
I don't have a large enough sample either, but I think what you interpret as doublethink and 'Escher-painting minds' may be the result of rationalizing a faith that at its core is an emotional attachment to a cognitive error. The friend I mentioned probably doesn't have an IQ much below the median for the readers of this blog -- double major in biochem and philosophy at an Ivy League school, head of a libertarian club (would probably agree with Robin Hanson on almost everything).
Replies from: Eliezer_Yudkowsky
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-10-13T17:46:31.172Z · LW(p) · GW(p)
Well, yes, lots of rationalization is exactly how you end up with an Escher-painting mind. Even human beings aren't born that twisted.
See also: Occam's Imaginary Razor.
↑ comment by SilasBarta · 2009-10-13T15:56:43.580Z · LW(p) · GW(p)
How is it crazy to cynically go along with rituals for the social benefits? Risky, maybe, but crazy?
Replies from: Furcas
↑ comment by Furcas · 2009-10-13T17:10:43.829Z · LW(p) · GW(p)
It isn't crazy at all. I was saying that your intelligent, religious, and very drunk friend would never say those words, because there's no religious person who believes them. All these reasons may be the ultimate cause of religious beliefs, but that doesn't mean religious believers are aware of them, consciously or even subconsciously.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-14T22:28:24.547Z · LW(p) · GW(p)
and very drunk friend would never say those words, because there's no religious person who believes them.
Just to clarify, what do you mean by "religious" here? Do you define it by whether they're active in a church?
If so, how much money would you bet I can't find you a counterexample?
Replies from: Furcas
↑ comment by Furcas · 2009-10-15T00:22:43.346Z · LW(p) · GW(p)
I mean a person who holds self-deceptive beliefs that serve as the basis for a moral code of some sort. Church attendance is irrelevant.
I know there are some people who act religious and call themselves religious but aren't religious at all, but I don't think that's the kind of person you were talking about, since such a person couldn't benefit from the placebo effect. You're talking about the kind of person who has successfully fooled himself into holding religious beliefs, and yet is still so fully aware that it's all self-deception that he calls it "a load of garbage".
There may be real religious believers who would say something like what you've written, but I'm certain that it would just be a rationalization for them, a way to hide the ridiculousness of their beliefs behind a veneer of fake instrumental rationality.
Considering I'm currently unemployed and have very little money left in my bank account, I would bet a thousand Canadian dollars that you can't find a real religious believer who will say those words and honestly mean them.
EDIT:
And if you were talking about people who completely fake being religious, well, in my experience most of them don't ever admit to themselves that they're really atheists in their heart of hearts. I suppose there must be exceptions, though.
Replies from: Alicorn, SilasBarta
↑ comment by Alicorn · 2009-10-15T00:29:45.626Z · LW(p) · GW(p)
You can benefit from the placebo effect even if you know you're taking a placebo.
Replies from: thomblake, Furcas
↑ comment by thomblake · 2009-10-15T13:30:00.378Z · LW(p) · GW(p)
Relevant article in Wired: Placebos are getting stronger - researchers are starting to study the placebo response to see how it can be better utilized to aid in healing.
↑ comment by Furcas · 2009-10-15T00:40:24.699Z · LW(p) · GW(p)
I'm guessing that by the word "know" you mean "acknowledge that the evidence is strongly in favor of", which doesn't necessarily entail belief, as many religious believers have demonstrated.
If that isn't what you mean, I have no clue what you're talking about.
Replies from: Alicorn
↑ comment by Alicorn · 2009-10-15T01:08:16.030Z · LW(p) · GW(p)
No. I mean you can swallow a sugar pill, in full knowledge of, belief in, and acknowledgment-of-evidence-for the fact that it is a sugar pill, and still improve relative to not taking a sugar pill. It's not obvious to me why psychological "sugar pills" wouldn't work the same way.
Replies from: Furcas
↑ comment by Furcas · 2009-10-15T01:13:44.477Z · LW(p) · GW(p)
I mean you can swallow a sugar pill, in full knowledge of, belief in, and acknowledgment-of-evidence-for the fact that it is a sugar pill,
... and belief that sugar pills don't cure diseases / alleviate symptoms?
and still improve relative to not taking a sugar pill.
I thought the placebo effect had to do with belief.
Replies from: Alicorn, Psychohistorian
↑ comment by Alicorn · 2009-10-15T01:26:33.048Z · LW(p) · GW(p)
... and belief that sugar pills don't cure diseases / alleviate symptoms?
Yup, that too.
Replies from: Furcas
↑ comment by Furcas · 2009-10-15T01:28:05.446Z · LW(p) · GW(p)
Geez.
Replies from: Alicorn, SilasBarta
↑ comment by SilasBarta · 2009-10-15T03:11:36.544Z · LW(p) · GW(p)
I'm going to have to distance myself from Alicorn on this one. (Surprise, I know.) I think she's confusing the general meaning of "placebo effect" (any positive effect manifesting in a control case) with the specific meaning (curing of a condition attributable at least in part to believing a treatment to work).
The general meaning of it clearly exists and is mentality-independent. For example, after an oil spill, if you dump oil-eating bugs on one part of the affected area, and not the other (the latter being the control), oil will dissipate even in the control, just by natural processes and not because of the bugs. That's a placebo effect baseline against which to compare the bugs.
I endorse the stronger claim that the specific kind exists, and withstands conscious non-belief, so long as you use other modes to trick your body/brain into believing it. This shouldn't be surprising: your behavior is often hard to consciously modify. For example, it's easier to look confident by having a social group you belong to than by trying to control all the micromovements of muscles necessary to give off confident signals.
If that's what Alicorn meant, I apologize, she didn't err, and I agree with her.
↑ comment by Psychohistorian · 2009-10-15T01:22:52.936Z · LW(p) · GW(p)
I thought the placebo effect had to do with belief.
If we really understood the placebo effect, it wouldn't be the placebo effect.
↑ comment by SilasBarta · 2009-10-15T03:05:27.075Z · LW(p) · GW(p)
Considering I'm currently unemployed and have very little money left in my bank account, I would bet a thousand Canadian dollars that you can't find a real religious believer who will say those words and honestly mean them.
Okay, see, we're going in circles here: I'm trying to ask about the existence of someone who knows "it's all a load of garbage", heck, maybe even contributes to this very board, but cynically joins a church to get the social benefits.
And then you keep saying, no, such people don't exist, if you mean people who are also really religious. But that's the very point under discussion: how many people go through the motions of formal religions, say the right applause lights, etc., for the social benefits, while holding the conscious belief that there's literally no God in the sense the people there espouse?
And if you were talking about people who completely fake being religious, well, in my experience most of them don't ever admit to themselves that they're really atheists in their heart of hearts. I suppose there must be exceptions, though.
I don't see the difference. If you take the LW rationalist position on God, doesn't that mean you're an atheist? So what does it matter if you admit it to yourself? Is there some internal psychological ritual now? If you believe you're a duck, you're a duck...self-believer.
Replies from: Furcas
↑ comment by Furcas · 2009-10-15T03:26:56.228Z · LW(p) · GW(p)
Okay, see, we're going in circles here: I'm trying to ask about the existence of someone who knows "it's all a load of garbage", heck, maybe even contributes to this very board, but cynically joins a church to get the social benefits.
All right. I was misled by the fact that your first comment was a reply to Wei Dai, who was talking about real religious people. I thought you believed that (most?) intelligent people who say they're religious aren't really religious.
I don't see the difference. If you take the LW rationalist position on God, doesn't that mean you're an atheist? So what does it matter if you admit it to yourself?
It's the difference between your average forthright atheist and someone like Karen Armstrong, who believes that God "is merely a symbol that points beyond itself to an indescribable transcendence". If you look past the flowery language she's no more a theist than Richard Dawkins is. However, she likes to think of herself as a religious believer, so you'll never get her to admit the true reasons for her profession of belief, no matter how much alcohol she drinks, because she doesn't even admit it to herself.
Replies from: SilasBarta
↑ comment by SilasBarta · 2009-10-15T03:31:56.257Z · LW(p) · GW(p)
All right. I was misled by the fact that your first comment was a reply to Wei Dai, who was talking about real religious people. I thought you believed that (most?) intelligent people who say they're religious aren't really religious.
Aren't religious in the sense of consciously taking it all literally, correct, that's my position.
It's the difference between your average forthright atheist and someone like Karen Armstrong, who believes that God "is merely a symbol that points beyond itself to an indescribable transcendence".
So, let's see, she gets the benefit of approval from numerous religious groups by saying all the applause lights, while maintaining rationality about the literal God hypothesis.
Does that count as intelligent or foolish? I'll leave that as an exercise for the reader.
↑ comment by Kaj_Sotala · 2009-10-14T18:17:41.278Z · LW(p) · GW(p)
I still feel the occasional temptation to start believing, and it has nothing to do with social benefits or a desire to heal my body.
comment by Jonathan_Graehl · 2009-10-13T00:22:38.591Z · LW(p) · GW(p)
Do you think an intellectual argument is all you need to self-modify so drastically? Or is this just in anticipation of some future mind-surgery technology?
Replies from: Wei_Dai
↑ comment by Wei Dai (Wei_Dai) · 2009-10-13T00:56:31.246Z · LW(p) · GW(p)
That's part of what I'm asking, I guess. Does anyone value rationality for its own sake, enough to give up anticipation if it turns out to be irrational, purely on intellectual grounds? And what would you do if mind copy/merging technology becomes a reality in the future (which I assume most of us here think is more likely than not)?
Replies from: Eliezer_Yudkowsky, Psychohistorian
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-10-13T01:26:07.404Z · LW(p) · GW(p)
Does anyone value rationality for its own sake, enough to give up anticipation if it turns out to be irrational, purely on intellectual grounds?
You don't make a conscious decision to give up something like that, if it needs giving up. You learn more, see that what you once thought was sense was in fact nonsense, and in the moment of realization, you have already lost that which you never had. Really this is the wrong way to phrase the question: you should properly ask, "If the idea of anticipation is complete nonsense and all our thoughts about it are mere helpless clinging to our own confusion, would you rather know what was really going on?" and to this I answer "Yes."
If someone offered to tell me the Real Story, saying, "Once you learn the Real Story, you will lose your grasp of that which you once called 'anticipation'; the concept will dissolve, and you will find it difficult to remember why you ever once believed such a notion could be coherent; just as you once lost 'time'," I would indeed reply "Tell me, tell me!"
Replies from: Wei_Dai, CronoDAS, aausch, Jonathan_Graehl, Wei_Dai, MichaelVassar
↑ comment by Wei Dai (Wei_Dai) · 2009-10-13T09:17:26.635Z · LW(p) · GW(p)
I think when I wrote my previous response I may have missed your point somewhat. I guess what you're really saying is that, if anticipation is truly irrational, then once we sufficiently understand why it's irrational, we won't value it anymore, and it won't require any particular "effort" to give it up. Is this a better summary of your position?
If so, are you really sure it's true, that the human mind has that much flexibility and meta-rationality? Why? (Why do you believe this? And why would evolution have that much apparent foresight?)
Replies from: Eliezer_Yudkowsky
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-10-13T15:30:43.984Z · LW(p) · GW(p)
It is a better summary; and I can give no better answer than, "It's always worked that way for me before." I think the real difficulty would come for someone who was told that they had to give up anticipation, rather than seeing it for themselves in a thunderbolt of dissolving insight.
Replies from: Wei_Dai
↑ comment by Wei Dai (Wei_Dai) · 2009-10-13T18:01:35.059Z · LW(p) · GW(p)
My reasoning here is that evolution in general has very limited foresight, therefore there must be a limit to human rationality somewhere that is probably far short of ideal rationality. "It's always worked that way for me before" doesn't seem like very strong evidence in comparison to that argument.
↑ comment by CronoDAS · 2009-10-13T08:59:47.359Z · LW(p) · GW(p)
So, it might very well be the case that "time" is something you can get rid of from fundamental physics equations and still get the right answers.
But clocks still work. Even if time is only an "emergent" phenomenon (tee hee), it's still something that's there...
↑ comment by aausch · 2009-10-16T02:11:40.670Z · LW(p) · GW(p)
If someone offered to tell me the Real Story, saying, "Once you learn the Real Story, you will lose your grasp of that which you once called 'anticipation'; the concept will dissolve, and you will find it difficult to remember why you ever once believed such a notion could be coherent; just as you once lost 'time'," I would indeed reply "Tell me, tell me!"
What about this situation:
"As a significant shortcut to developing an understanding of the Real Story, you can follow a formula which begins with a forced loss of your grasp of that which you once called 'anticipation'. I can promise that, once you do understand the Real Story, you will find it difficult to remember why you ever once believed the notion of 'anticipation' could be coherent. I have never found it useful to think about reversing the shortcut formula, so I cannot promise that the process is reversible"
Replies from: MichaelVassar↑ comment by MichaelVassar · 2009-10-16T21:50:35.037Z · LW(p) · GW(p)
I'll do that too. Lots of chemicals can help with this.
↑ comment by Jonathan_Graehl · 2009-10-13T18:36:25.894Z · LW(p) · GW(p)
The possibility of losing the natural feeling of anticipation, or of time, isn't really on the table (yet). Knowing the real nature of things intellectually is always good, but does knowing that a feeling is an illusion stop it from interfering with your comfort when facing a rational decision?
Part of the thrill in bungee jumping is in the overriding. Are you saying that you can manipulate your decision-making so that counterproductive instincts fade away?
↑ comment by Wei Dai (Wei_Dai) · 2009-10-13T03:10:42.385Z · LW(p) · GW(p)
You don't make a conscious decision to give up something like that, if it needs giving up.
I don't agree with this. I tend to make all other important decisions consciously. What's so special about this one? (ETA: Also, one potential way of giving it up is to edit my brain using some future technology. I think I definitely want to make that decision consciously.)
The rest of your comment seems to be saying that you're not yet convinced that anticipation is irrational. That's fair enough, but it doesn't really address the main point of my post, which is that we regard some parts of our decision-making process as having terminal value, and may decide to keep them (as luxuries, more or less) even if we come to believe they no longer have positive instrumental value as decision subroutines.
↑ comment by MichaelVassar · 2009-10-16T21:49:28.733Z · LW(p) · GW(p)
I'd like to vote this up several times.
Replies from: Wei_Dai↑ comment by Wei Dai (Wei_Dai) · 2009-10-16T23:07:52.830Z · LW(p) · GW(p)
Can you explain why? I personally can't get used to this writing style, and it took me a few hours to figure out what Eliezer was getting at. I also don't understand why he chose a tone of high confidence about something for which he has rather flimsy evidence.
Replies from: Douglas_Knight↑ comment by Douglas_Knight · 2009-10-16T23:34:56.142Z · LW(p) · GW(p)
If you spent hours figuring out what something meant, it's probably worth writing it out in your own words. At the least, it should help people who find the first style natural to understand and communicate with you.
Replies from: Wei_Dai↑ comment by Wei Dai (Wei_Dai) · 2009-10-16T23:50:46.704Z · LW(p) · GW(p)
I did. See here.
↑ comment by Psychohistorian · 2009-10-17T01:23:59.703Z · LW(p) · GW(p)
Does anyone value rationality for its own sake, enough to give up anticipation if it turns out to be irrational, purely on intellectual grounds?
Anticipation is an experience. I don't really see how one could decide to give up anticipation because it's irrational, any more than one could decide to give up hunger just because it's irrational.
At least, so long as anticipation refers to "the good (bad) feeling one gets when thinking about an upcoming good (bad) event." I'm not really sure what else you'd mean by it, and I'm not sure how the truth could hope to destroy it.
Replies from: pengvado↑ comment by pengvado · 2009-10-17T14:25:58.541Z · LW(p) · GW(p)
I'm pretty sure that everyone here who's considering giving up "anticipation" uses that term to mean not just any thinking about future experiences, but a consistent method of assigning concrete probabilities to future experiences. And the hypothesis that experiencing such anticipation is irrational is merely a corollary of that factual hypothesis: if your probability estimates on some particular question consistently fail to describe good bets, then binding emotions to those probabilities motivates bad decisions.
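To make that concrete, here is a toy numeric sketch. The copy/merge setup and the numbers are hypothetical, invented only to show how observer-counting probabilities and policy-level evaluation can pull in opposite directions:

```python
# Toy illustration (hypothetical setup): a fair coin is flipped.
# Heads -> you are copied into 9 copies; tails -> you stay single.
# Every resulting person is offered the same bet: gain $0.20 if the
# coin was heads, lose $1.00 if it was tails. Afterwards all copies
# are merged back into one person, so duplicated winnings don't stack.

P_HEADS = 0.5
N_COPIES = 9          # copies created on heads
GAIN, LOSS = 0.20, 1.00

# Anticipation-based reasoning: "I am one of the people who woke up."
# 9 of the 10 possible wake-ups happen under heads, so the subjective
# probability of heads is 9/10.
p_subj = (P_HEADS * N_COPIES) / (P_HEADS * N_COPIES + (1 - P_HEADS))
ev_anticipation = p_subj * GAIN - (1 - p_subj) * LOSS
print(f"per-person subjective EV: {ev_anticipation:+.2f}")  # +0.08 -> accept

# Policy-level evaluation: score the policy "accept" by what happens
# to the merged survivor in each world. Heads: merging collapses the
# 9 identical gains into one +$0.20. Tails: the lone person is -$1.00.
ev_policy = P_HEADS * GAIN + (1 - P_HEADS) * (-LOSS)
print(f"whole-policy EV:          {ev_policy:+.2f}")         # -0.40 -> decline
```

On these made-up numbers, per-person anticipation says the bet is good while the policy-level evaluation says it is bad; that gap is exactly what "probabilities that fail to describe good bets" means here.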
comment by Mycroft65536 · 2009-10-15T21:12:27.219Z · LW(p) · GW(p)
Faith is easy to dismiss because it can fairly be defined as "belief without evidence".
What exactly is meant by "anticipation"?
comment by Jack · 2009-10-13T17:16:59.127Z · LW(p) · GW(p)
I haven't kept up with all the decision theory stuff, but can someone demonstrate to me that anticipation might be irrational, and not just that anticipation has to be reduced and re-understood?
Replies from: Wei_Dai↑ comment by Wei Dai (Wei_Dai) · 2009-10-13T18:29:49.016Z · LW(p) · GW(p)
I'm still trying to understand the nature of anticipation well enough either to produce a conclusive demonstration that it's irrational, or to show how it can be re-understood to fit into a rational decision process. Others are probably working on the same thing (I hope). So far, the best argument I have that anticipation might be irrational is the one I gave in the post.
comment by LauraABJ · 2009-10-13T02:07:15.160Z · LW(p) · GW(p)
I believe this idea has been played out in numerous works of science fiction (Kurt Vonnegut's Tralfamadorians, for example). It's difficult to comprehend what it would be like to know/understand the future so well as to not anticipate it, and thus I cannot say whether I would 'give it up.' I think Eliezer is right, though: there would probably be no dramatic choice in the matter; either we understand or we don't.
comment by loqi · 2009-10-14T00:22:41.350Z · LW(p) · GW(p)
This seems analogous to giving up the belief that free will is somehow ontologically basic. The experience of having made a "choice" can be arbitrarily superimposed on pretty much any action I perform. I value the experience of choosing, but recognize it as subjective fiction. Similarly, I find your suggestion
perhaps it can run some computations on the side to generate the qualia of anticipation and continuity
to be highly intuitively acceptable, but I feel that I'm missing something, perhaps a compelling counter-example.
comment by timtyler · 2009-10-13T06:40:32.719Z · LW(p) · GW(p)
No notion of "anticipation"?
That's only because it subcontracts that work to a "mathematical intuition subroutine" that allows the formation of beliefs about the likely consequences of actions.
Replies from: Wei_Dai↑ comment by Wei Dai (Wei_Dai) · 2009-10-13T07:32:50.325Z · LW(p) · GW(p)
True, but the output of the mathematical intuition subroutine is beliefs about mathematical propositions, relationships, structures, and the like, not expectations of future experiences, which is what we seem to value having. So unless you have some point that I'm not getting, your comment doesn't really seem to address my questions.
Replies from: timtyler↑ comment by timtyler · 2009-10-13T08:04:27.106Z · LW(p) · GW(p)
"Anticipation" refers to feelings associated with the contemplation of future events. All reasonable decision theories contemplate future events - so is it the associated feelings that you are querying? You are suggesting that a "zombie" or "spock" agent might perform just as well? Maybe so - though one might claim that the evaluations of the desirability of the future events such agents contemplate is closely analogous to the "feelings" of more conventionally constructed creatures.