"Is there a God" for noobs
post by loup-vaillant · 2011-03-25T00:26:56.091Z · LW · GW · Legacy · 87 comments
I am trying to write a small essay about the issue. I intend to eventually submit it to reddit (both r/religion and r/atheism), and to show it to my family. This is basic stuff. I basically want to show that:
- Either God exists, or it doesn't. Therefore, either theism or atheism is false.
- It is worthwhile to actively seek truth in this matter
- Being confident is not the same as being arrogant, or intolerant.
My hope is to be able to have meaningful discussions about the topic without being called arrogant or disrespectful. The present draft still misses an introduction, but I think I'll just state what I said above. So, what do you think? Did I miss something? Did I underestimate inferential distances? Could I use other wording? Is this just pointless? Overall, how might someone who has never read Less Wrong or Dawkins react to it?
Edits:
- Replaced "odds" by "likelihood".
- Changed my quotation to remove religious references (I kept Flat Earth, though). It is now (hopefully obviously) a full piece of fiction.
- Replaced "meta" by a small explanation of it.
- Removed "exist", in the hope of avoiding argument about the meaning of "existence".
Missing Introduction
Truth is universal
We all live in the same world. Of course, each of us perceives it differently. We don't see the same things, we don't live in the same places, we don't meet the same people. Because of that and more, we don't hold the same beliefs. But there's only one reality. If a statement is true, it is so for everyone.
For instance, I happen to wear black socks at the time of this writing. Believe it or not, that's the reality, so "Loup was wearing black socks when he wrote this" is true for everyone, including you. Even if you believe I'm lying, I am wearing black socks. You can't be absolutely certain of this fact, but a fact it is.
Now imagine I believe the Earth is flat, and you believe the Earth is (roughly) spherical. Those two beliefs are mutually contradictory. Clearly, one of us is mistaken.
We should avoid false beliefs
31st day
My captain has gone nuts. I couldn't believe it at first. He's a good leader, and got us out of many tight situations. But he wants to sail west towards India. He actually believes that the Earth has the shape of a ball. A ball. Some sort of giant orb, floating in… nothing, I suppose. I'm no navigation master, but I do know one thing: if we sail west far enough, we will all fall into the bottomless pit of despair at the edge of the world.
I tried to talk him out of this folly, but he would have none of it. I'm sorry captain, but you leave me no choice. Tonight will be your end. For the sake of the crew. Please forgive me.
Holding false beliefs is dangerous. It has consequences, sometimes innocuous, sometimes tragic. You never know until you correct a previously false belief. If you care about anything, you should try and hold only true beliefs, because one false belief can be enough to destroy what you hold dear. Incidentally, that's basically why most of the time, lying is not nice.
Flaws in reasoning are even worse: they generate or sustain false beliefs. They are also more difficult to correct. Basically, they're a reliable way to be wrong, which is potentially much more dangerous than any single wrong belief. If you find a flaw in your reasoning, eliminate it, then re-check your beliefs. If someone proposes that you adopt one, do not drink from that cup; it's poisoned.
"I don't know" is a stance
Are my socks black? Think about it for 30 seconds, look at the evidence at your disposal, then answer honestly. There are 3 kinds of answers you might produce:
- "Your socks are black." Meaning, you are reasonably sure that my socks are black.
- "Your socks are not black." Meaning, you are reasonably sure that my socks aren't black.
- "I don't know". Meaning that from your point of view, my socks could be black, or they could be of a different colour. You're not sure either way.
Note that all three answers share a common structure. They could all be phrased thus: "I estimate that the likelihood of your socks being black is X%". If X is close to 100%, you believe my socks are black. If it is close to 0%, you believe they're not. If X is, say, between 10% and 90%, then you're not sure. Anyway, you're bound to choose a value for X, and that will be your stance. It is no less respectable than any other, provided you did your best to estimate the likelihood.
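As an illustration of that common structure (a minimal sketch added here, not part of the original essay; the 10% and 90% cut-offs are arbitrary and only mirror the example above), the three verbal answers can be read off a single probability number:

    # Minimal sketch: the three verbal answers as regions of one probability scale.
    # The 10% / 90% cut-offs are arbitrary and only mirror the example above.

    def verbal_answer(p_black: float) -> str:
        # Translate an estimated probability that the socks are black into words.
        if p_black >= 0.90:
            return "Your socks are black."
        if p_black <= 0.10:
            return "Your socks are not black."
        return "I don't know."

    for estimate in (0.98, 0.50, 0.03):
        print(f"{estimate:.0%} -> {verbal_answer(estimate)}")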
Disagreement is not intolerance
Say I'm 99.9% confident that the Earth is flat, and you are 99.999% confident it is spherical. If we also know of each other's opinion, then we automatically strongly believe the other is mistaken. This is not intolerance. This is the direct consequence of our respective beliefs. If you weren't so sure that I'm wrong, you wouldn't be so sure that the Earth is spherical either. This is a matter of consistency.
There is hope however: if we are both reasonable, don't have flaws in our reasoning, have roughly equal access to evidence, and honestly attempt to reach the truth together, then we will eventually agree. At least one of us will radically change his mind.
Let's say that halfway through such a quest, you are still 99.999% confident the Earth is spherical, but I am only 60% confident. It means two things:
- I changed my mind.
- We still disagree.
This time, the disagreement is not as strong, but it is still significant: you estimate that a flat Earth is barely worth considering. I, on the other hand, think sailing west means a 40% chance of falling off the edge of the world, which is just too risky.
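As a toy illustration of this convergence claim (a sketch added for illustration, not part of the original essay; the likelihoods are invented for the arithmetic), two observers with wildly different priors who update on the same shared evidence end up close together:

    # Toy sketch: two observers with very different priors update on the same
    # shared evidence. Repeated Bayesian updates pull their estimates together.
    # The assumed likelihoods (0.8 vs 0.2 per observation) are purely illustrative.

    def update(prior: float, p_obs_if_true: float, p_obs_if_false: float) -> float:
        # One application of Bayes' rule for a single observation.
        numerator = prior * p_obs_if_true
        return numerator / (numerator + (1 - prior) * p_obs_if_false)

    p_flat_earther, p_round_earther = 0.001, 0.99999
    for _ in range(10):                        # ten shared observations
        p_flat_earther = update(p_flat_earther, 0.8, 0.2)
        p_round_earther = update(p_round_earther, 0.8, 0.2)

    print(p_flat_earther, p_round_earther)     # both end up close to 1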
No exception
These rules apply to any question. Even controversial, emotional questions. So. Is there a God?
- Either there is a God, or there isn't. Either way this is a fact. Inevitably, of atheists and theists, one group is mistaken.
- This is an important question. A wrong answer can for instance lead us to forsake our lives or our souls for naught.
- Agnosticism is less comfortable than it sounds. First, agnostics disagree with both theists and atheists. Second, any significant evidence should mostly turn them into either theists or atheists. And the importance of the question suggests they should seek such evidence.
- Many atheists are very sure there is no God, and many theists are very sure there is —even though they know of each other's opinions. Therefore, they both believe the other group is mistaken. This is not intolerance, this is consistency.
I'm worried, however, by the lack of consensus after all this time. "Is there a God" is an old and important question; as far as I know, there is plenty of widely accessible evidence, and there have been numerous debates. I suppose our thinking still has problems.
Now, is there a God? Your answer should be of the form "My estimate of the likelihood that there is a God is X%". Don't style yourself as an atheist, believer, or agnostic. Assess the evidence at your disposal (science, scriptures, what you were told…), then give your number. Just bear in mind these sanity checks:
- Your estimate may be very close to either 0% or 100%, which means you are very confident. Just to be sure, could you live up to your confidence, and say to the face of someone of the opposite opinion "you are mistaken"?
- On the other hand, your estimate may be close to 50%. Just to be sure, are you positive that the evidence at your disposal is that balanced? Is it not stronger one way or the other?
- If you want to share your estimates with friends, make sure you talk about the same idea of God. A good starting point can be "a supernatural, sentient being that created the universe, is omnipotent and omniscient". You could add "is benevolent", or "is still active in the universe", or "listens to prayers", all the way down to any particular religious dogma if you want to.
Like I said, these principles can apply to any question. Including the really scary ones, like "is there an afterlife?"
87 comments
Comments sorted by top scores.
comment by JoshuaZ · 2011-03-25T00:56:33.786Z · LW(p) · GW(p)
Either God exists, or it doesn't. Therefore, either theism or atheism is false.
Note that some versions of what people mean by God are very vague. So at that level, it may not be a meaningful question.
↑ comment by CronoDAS · 2011-03-25T01:33:33.584Z · LW(p) · GW(p)
If you want to disambiguate a bit, I like using the phrase "God of Abraham". (The simulation argument suggests that the universe may indeed have had some kind of Creator, but it clearly wasn't YHWH.)
↑ comment by FAWS · 2011-03-25T02:09:03.032Z · LW(p) · GW(p)
Clearly?
The simulation took 6 days to put together. At first YHWH started with an empty universe with just a single planet. To take a quick look around he opened the debug console and forced a light source with the command...
...The advanced editing software would recalculate everything to fit together each time he implemented a new feature to avoid discontinuities in the simulation. On the third day when he decided on the overall structure of his universe it calculated backwards to create a history consistent with the planet he had already been working on for two days, and later on it would recalculate the history of life each time he imported a new critter...
...Intrigued by how the simulation looked from inside, YHWH took a partial copy of his mind, scaled it down to fit in one of the sims and inserted it into the simulation. Since he had had some interesting interactions with some of the sims and didn't want to wipe them, he anchored the recalculation at the conception of his avatar...
↑ comment by Oscar_Cunningham · 2011-03-25T10:54:15.045Z · LW(p) · GW(p)
This story doesn't account for YHWH's legal and moral decrees, or most of his other numerous interventions.
↑ comment by Will_Newsome · 2011-03-25T09:03:58.354Z · LW(p) · GW(p)
Note that some versions of what people mean by God are very vague. So at that level, it may not be a meaningful question.
Note that all versions of what people mean by 'exist' are pretty vague too. So at that level, it may not be a meaningful question.
↑ comment by byrnema · 2011-03-26T02:10:28.702Z · LW(p) · GW(p)
I would refine that it's not a meaningful argument to have. Theists can still question whether what they mean by God exists in the way that they mean.
I think atheists make a mistake in tackling whether God exists when they don't know what theists mean by 'God' or 'exist' (and theists don't either). Atheists tend to press the meaning of the words towards overly empirical interpretations, so that they pretend theists believe in some sort of creature. In particular, arguments about a flying spaghetti monster, etc., feel entirely irrelevant to someone who is a theist. They know they are considering something different.
'Something different' doesn't mean slippery and shape-changing just so that you can never get a hold of it in an argument. It's something a linguist or a psychologist could probably get a hold of.
My impression of the best way to convert a theist, based on intuition rather than practice in the field, is to challenge the consequences of this God-that-exists.
For example, what does God do and how is he good? Try to get the theist to feel for a rough answer, and then ask again in a week. As soon as the theist articulates a consequence, they realize it's not true as they go about their day with this thought in hand and their image of what God does erodes. Until their concept of God doesn't really do anything, and he's not really that good in the sense they originally meant. Even if they still "believe in God", their beliefs don't have any consequence.
It's depressing. I wouldn't deliberately deconvert anyone.
comment by atucker · 2011-03-25T02:07:03.117Z · LW(p) · GW(p)
Before I say anything, I'd like to say that convincing other people to become atheist is really hard. Really hard. I wish you the best of luck if you want to go through with it.
I think that if someone isn't already atheist, explaining to them why they should be will need to cover a lot of ground before it's likely to work, and will probably need to be really long.
You never know until you correct a previously false belief. If you care about anything, you should try and hold only true beliefs, because one false belief can be enough to destroy what you hold dear.
This conclusion isn't obvious to non-LW readers, so you should try unpacking your thinking a little bit. I think you should stick a real-life example in here to make it more tangible; ideas like this tend to be glossed over when read casually.
one false belief can be enough to destroy what you hold dear.
Many people hold beliefs dear.
If someone proposes you to adopt one, do not drink that cup, it's poisoned.
If you have any advice about how to detect flaws in reasoning, tell them how. Most people would agree that they shouldn't adopt obviously fallacious modes of thought, but most don't know that they have cognitive biases built in.
Note that all three answers share a common structure.
Most people will not leap to probability from this.
Also, it might help to make it clearer that "I don't know" means that you spread out your probability.
There is hope however: if we are both reasonable, don't have flaws in our reasoning, have roughly equal access to evidence, and honestly attempt to reach the truth together, then we will eventually agree. At least one of us will radically change his mind.
Let's say that halfway through such a quest, you are still 99.999% confident the Earth is spherical, but I am only 60% confident. It means two things:
I changed my mind. We still disagree. This time, the disagreement is not as strong, but still significant: you estimate that flat Earth is barely worth considering. I on the other hand, think sailing West means 40% chances of falling at the edge of the world, which is just too risky.
This sweeps all of the math behind Aumann's Agreement Theorem under the carpet, and does so before people are convinced of the whole probability as belief point.
Also, there are many visceral reasons why someone disagreeing with you feels like they're attacking you, which you don't address.
Other notes:
I'd suggest that you talk about what constitutes proper evidence for a belief. Almost every religious person I know insists that something in their life shows them that God exists. I suggest mentioning (and explaining) belief in belief.
I hope that helps.
↑ comment by TheOtherDave · 2011-03-25T14:59:10.770Z · LW(p) · GW(p)
I'd suggest that you talk about what constitutes proper evidence for a belief. Almost every religious person I know insists that something in their life shows them that God exists. I suggest mentioning (and explaining) belief in belief.
One of the key pieces to this that I find is surprisingly nonobvious to a lot of people is the difference between concluding that event E is evidence for belief B on the one hand, and concluding that if I perceive E then B must be true on the other.
I am reminded of tutoring high-school physics, where an astonishing number of people have trouble with the idea that when you toss a ball in the air, it immediately starts to accelerate downward, even though it is traveling up. The idea that the acceleration vector and the velocity vector can point in opposite directions just seems to be hard for some people to wrap their brains around, and until they get it their thinking about ballistics is deeply confused.
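A tiny numeric sketch of that tossed-ball point (added for illustration; the 5 m/s initial speed is an assumption): just after release the velocity is still positive (upward) while the acceleration is already negative (downward).

    # Tossed-ball sketch: upward velocity alongside downward acceleration.
    g = 9.8     # m/s^2, magnitude of gravitational acceleration (downward)
    v0 = 5.0    # m/s, assumed initial upward speed

    for t in (0.0, 0.25, 0.5, 0.75):
        v = v0 - g * t    # still positive early on, even though a = -g the whole time
        print(f"t = {t:.2f} s   v = {v:+.2f} m/s   a = {-g:+.2f} m/s^2")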
I think something similar happens with people's understanding of the relationship between evidence and beliefs, and it is perhaps worth addressing explicitly before you get into examples where people have something emotionally at stake.
One way to short-circuit this is to train the habit of thinking in terms of confidence intervals rather than binary beliefs, but I suspect that's even more inferential steps away for most of your audience. So I recommend spending some time on this aspect explicitly.
↑ comment by atucker · 2011-03-26T02:07:35.106Z · LW(p) · GW(p)
One of the key pieces to this that I find is surprisingly nonobvious to a lot of people is the difference between concluding that event E is evidence for belief B on the one hand, and concluding that if I perceive E then B must be true on the other.
Good point, will keep in mind.
I once told my Dad that his car accident was only proof of God because he was looking to find evidence to support God, when really it was just medical science and his own body's repair mechanisms that saved him.
It had no effect.
↑ comment by Eneasz · 2011-03-25T17:32:42.401Z · LW(p) · GW(p)
Before I say anything, I'd like to say that convincing other people to become atheist is really hard. Really hard.
I have to completely disagree, although in practice this will make no difference.
Convincing other people to become atheist is so easy that you don't even have to do it. They will do it all by themselves, given one condition. That one condition is that they value knowing what's real (which includes verification of "How do I know that what I know is true?") more than they value almost anything else - including fitting comfortably into the social groups they've been raised in. If you can get them to place such a high value on truth then you don't even need to bring up the question of god. At some point they'll stumble upon it themselves and then they are trapped - they won't be able to stop until they've completely disabused themselves of the notion.
Unfortunately, getting people to care more about knowing what's real than almost anything else is really hard. Really hard. So in practice, the difficulty of the task hasn't been altered at all.
↑ comment by Lila · 2011-04-04T05:00:07.413Z · LW(p) · GW(p)
I care quite a lot about knowing what's real, but not more than almost anything else. Yet, I was still able to become atheist--by reading this website, and especially Eliezer's post Excluding the Supernatural. I was full-blown religious, and becoming atheist was very painful, and still is.
↑ comment by atucker · 2011-03-26T02:09:55.233Z · LW(p) · GW(p)
If you can get them to place such a high value on truth then you don't even need to bring up the question of god.
Agreed. Though, this seems to be more of a personality trait.
I find that most people I've met who seem prone to becoming atheistic already are, or will quickly become so once it's sufficiently supported.
People who claim to be agnostic or believe in some divine force seem swayable.
It's a rare few who seem willing to become atheistic, but are full-blown religious.
Granted, I'm trying to figure out how to deal with that, but it seems difficult.
comment by Vladimir_Nesov · 2011-03-25T15:43:33.536Z · LW(p) · GW(p)
Belief in God is just a symptom, and comparatively unimportant at that (it serves as a focus of cultural development of anti-epistemology, but probably not significantly in individual people). You should heal people's epistemology instead; the God issue will resolve on its own as a result.
Using belief in God as an example while teaching good epistemology is a bad idea, because it activates the existing anti-epistemology, which would interfere with the education. Confronting belief in God would be an advanced exercise, performed after your skills become sufficiently strong.
comment by falenas108 · 2011-03-25T01:10:47.029Z · LW(p) · GW(p)
This is meta dangerous.
Most people probably wouldn't know what that means; I suggest a change to increase accessibility.
↑ comment by Dreaded_Anomaly · 2011-03-25T01:58:17.271Z · LW(p) · GW(p)
Perhaps something like "This is dangerous at a deeper level."
comment by David_Gerard · 2011-03-25T10:35:44.304Z · LW(p) · GW(p)
I suspect the best approach is not a lengthy, coherent argument - but planting a seed of doubt. They'll object strenuously and often stupidly, but it'll stick in their minds.
Read this review of a TV show on the historicity of the Bible. When someone starts by presuming a television show about actual history and archaeology is an "attack" and says, as if it's a debate-winning argument, "there isn't much point trying to attack a religion using facts" ... you're not in the rational zone, but in the "foundations of my world are rocking somewhat" zone.
(This is why I really like the notion of planting seeds of rationality - it's like the difference between exposing someone to radiation and feeding them radioactive material. Perhaps that's not the very best analogy ;-) )
comment by Normal_Anomaly · 2011-03-25T00:49:18.265Z · LW(p) · GW(p)
Nitpick: Can you give some context or say who's speaking in the quote about spherical Earth-ism being of the devil? For a moment I thought there was supposed to be a different speaker in the second paragraph and I got confused.
Substantive comment: I think this is a good thing to have out there to show to the theists who hide behind "but God's existence isn't a scientific question!" On the other hand, don't be disappointed if you get no apparent success. I went through a phase of talking to religious people on the internet, and I never got anything out of it. From what I've heard, most people who lose their religion came to question it on their own, and then saw some atheist material.
↑ comment by jwhendy · 2011-03-25T04:12:38.207Z · LW(p) · GW(p)
I concur re. the second paragraph "quote" thingy. I kind of expected to see a footnote -- it was written as if out of another work and I wasn't sure what to make of it. Maybe clarify that it's your own prose or preface it with some other kind of intro?
comment by kpreid · 2011-03-25T12:30:08.704Z · LW(p) · GW(p)
Your “Me believing in Flat Earth” paragraph is not showing the consequences of a false belief but of a (probably strawman for your audience) religious-and-false belief; I expect it to offend rather than illustrate, and considered in the overall structure of the essay, I think it brings in the topic of religion too soon.
↑ comment by loup-vaillant · 2011-03-25T14:29:22.977Z · LW(p) · GW(p)
I wanted a situation where believing the wrong thing will obviously lead to a catastrophe. Maybe I should just kill the captain to save the crew, and leave the bishop out of it? That may do it if I make clear, say, that we're all on the same boat and I'm second in command.
↑ comment by hairyfigment · 2011-04-14T05:08:49.004Z · LW(p) · GW(p)
Christians may very well know of the Flat Earth strawman and connect it to their own stereotype of an argumentative atheist. If you include it, you have to explicitly mention that educated people at the time really knew the shape of the Earth, or else some reader will use this as an excuse to dismiss you. Probability near 1.
comment by rstarkov · 2011-03-25T11:37:06.594Z · LW(p) · GW(p)
I have found that a logical approach like this one fails much more often than it works, simply because it appears that people can manage not to trust reason, or to doubt the validity of the (more or less obvious) inferences involved.
Additionally, belief is so emotional that even people who see all the logic, and truly seem to appreciate that believing in God is completely silly, still can't rid themselves of the belief. It's like someone who knows household spiders are not dangerous in any way and yet is more terrified of them than of, say, an elephant.
Perhaps what's needed in addition to this is a separate "How to eschew the idea of god from your brain" guide. It would include practical advice collected from various self-admitted ex-believers. Importantly, I think people who have never believed should avoid contributing to such a guide unless they have reasons to believe that they have an extraordinary amount of insight into a believer's mind.
↑ comment by rstarkov · 2011-03-25T11:42:53.444Z · LW(p) · GW(p)
To expand a bit on the first paragraph, I feel that such reasonable arguments are to many people about the same as the proof of Poincaré conjecture is to me: I fully understand the proposition, but I'm not nearly smart enough to follow the proof sufficiently well to be confident it's right.
Importantly, I can also follow the outline of the proof, to see how it's intended to work, but this is of course insufficient to establish the validity of the proof.
So the only real reason I happen to trust this proof is that I already have a pre-established trust in the community who reviewed the proof. But of course the same is also true of a believer who has a pre-established trust in the theist community.
So the guide would require a section on "how to pick authorities to trust", which would explain why it's necessary (impractical to verify everything yourself) and why the scientific community is the best one to trust (highest rate of successful predictions and useful conclusions).
comment by DanielLC · 2011-03-25T05:52:33.544Z · LW(p) · GW(p)
On the other hand, your estimate may be very close to 50%. Just to be sure, are you positive that the evidence at your disposal is that balanced? It is not stronger one way or another?
This is quite feasible. It's not because the evidence towards the existence of a god is balanced. It's because the evidence towards your bias is balanced.
Each time you make an observation, you have to estimate the probability and use that to update your priors. Unfortunately, the probabilities will be in error. If the error were random, the error would be proportional to the square root of the number of observations. Unfortunately, some of the error is bias, and it correlates. The error increases exponentially with the number of observations.
That explanation is kind of shaky, but it can be shown easily. A lot of people are very certain about whether or not there is a god, but a good chunk of them are completely wrong. The error in their confidence is huge.
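A rough simulation of the contrast described here (a sketch added for illustration, not from the original comment; all numbers are arbitrary, and the evidence itself is assumed to be uninformative): purely random errors in a long series of log-odds updates partially cancel, while a small consistent bias accumulates and pushes the final estimate to an extreme.

    # Rough sketch: unbiased noise in belief updates roughly cancels out, while a
    # small systematic bias accumulates over many observations and yields extreme,
    # unwarranted confidence.
    import math
    import random

    random.seed(0)
    N = 200    # number of observations

    def final_probability(bias: float, noise_sd: float) -> float:
        log_odds = 0.0                        # start at 50%
        for _ in range(N):
            log_odds += bias + random.gauss(0.0, noise_sd)
        return 1.0 / (1.0 + math.exp(-log_odds))

    print("random error only:", round(final_probability(bias=0.0, noise_sd=0.1), 3))
    print("small consistent bias:", round(final_probability(bias=0.05, noise_sd=0.1), 3))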
comment by Cyan · 2011-03-25T00:34:20.375Z · LW(p) · GW(p)
Nitpick: you use the term "odds" as if it were a synonym of "probability".
↑ comment by loup-vaillant · 2011-03-25T00:40:03.715Z · LW(p) · GW(p)
Noted, I'll correct that. Thanks. (but first, I need to sleep).
Edit: done.
comment by Jayson_Virissimo · 2011-03-25T08:05:01.945Z · LW(p) · GW(p)
If you care about anything, you should try and hold only true beliefs, because one false belief can be enough to destroy what you hold dear.
If you provide an argument for this statement, then your article will be much more persuasive. Firstly, some of the things people "care about" are their beliefs. Secondly, there are (seemingly plausible) pragmatic arguments for holding certain beliefs regardless of their truth value. For instance, according to Harry Gensler, William James argued that:
The belief in God gives practical life benefits (courage, peace, zeal, love, etc...).
All beliefs that give practical life benefits are pragmatically justifiable.
Therefore, the belief in God is pragmatically justifiable.
comment by David_Allen · 2011-03-25T03:43:14.848Z · LW(p) · GW(p)
I'm working on a way to explain this concept to the nice strangers who stop by my house from time to time.
Written for the LW audience:
The problem with the God hypothesis is that it is indistinguishable from innumerable other stories that can be made up to explain the same phenomena, and whose validity is supported equally well by the evidence.
For example: I once made a tuna sandwich, ordinary in every way except that it had the special ability to create the universe, both past and future. It was not God, which I verified by eating it. This is the tuna sandwich hypothesis for the existence of the universe (TSH).
The TSH is superior to the God Hypothesis in two important ways: first of all it was tasty; and second, the theory is much simpler. Instead of needing all kinds of crazy special attributes like omnipotence, omnipresence, and omniscience, the tuna sandwich only requires the special ability to create the universe for all time; so it is a more likely explanation for the creation of the universe. In other words, the likelihood that God created the universe is relatively lower than that of the TSH, due to the joint probability of all the claims that the God hypothesis makes about God. All that existence you enjoy (or suffer from) is better evidence for the TSH.
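A quick arithmetic illustration of that joint-probability point (added here for illustration; the individual numbers are invented purely for the calculation): each additional attribute a hypothesis claims can only shrink the probability of the whole conjunction.

    # Conjunction sketch: every extra claimed attribute can only lower the joint
    # probability. All individual probabilities below are invented.
    p_creator = 0.01
    p_omnipotent = 0.5    # given the previous claims
    p_omniscient = 0.5    # given the previous claims
    p_benevolent = 0.5    # given the previous claims

    joint = p_creator * p_omnipotent * p_omniscient * p_benevolent
    print(joint)          # 0.00125 -- strictly smaller than p_creator alone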
This can be repeated for the other phenomena attributed to God, and instead of "tuna sandwich" you can pick anything -- for example any real number -- which makes the set of choices uncountable.
Given the other possibilities, the likelihood that the God hypothesis is true is infinitely small, which rational systems will round to zero.
I am looking for a way to make this easier for a layman to understand, and to make it less offensive to those with strongly held beliefs.
↑ comment by jwhendy · 2011-03-25T04:10:08.400Z · LW(p) · GW(p)
I definitely get these types of explanations, but don't know that they actually hold up. You need to posit a timeless tuna sandwich, and we only know about material (bounded by space-time) tuna sandwiches. It would seem that whatever you suggest will be an incredible aberration from daily experience.
Not that god doesn't suffer from the same issues... just saying that making up a universe-creating tuna sandwich doesn't, in my opinion, actually have any advantages over "timeless/spaceless being with ability to create the universe." At least that is posited to have existed before/outside creation vs. you recognizing from within it that something within it had the power to create from without.
Does that make any sense?
I think it all breaks down, and my personal answer at the moment is simply that if there is an "outside time-and-space", then no one on this side of the wall can possibly know what's on the other side unless we make some serious technological advances. Most philosophical trains of thought break down for me as soon as they say it's logically necessary for a being to have existed "before" the big bang. "Before" is utterly meaningless prior to the existence of cause and effect. We use our knowledge of causation from inside the universe to posit the type of relationship that must have existed between the big bang event and some other being or power. I don't think that flies.
In any case, I am currently more satisfied with that defense vs. trying to come up with an analogy that, in my opinion, suffers from all of the same pitfalls as the deity hypothesis and even a few more :)
↑ comment by David_Allen · 2011-03-25T04:52:46.066Z · LW(p) · GW(p)
In any case, I am currently more satisfied with that defense vs. trying to come up with an analogy that, in my opinion, suffers from all of the same pitfalls as the deity hypothesis and even a few more :)
This is of course the point. The TSH is obviously stupid, but the God hypothesis is weak against even very stupid alternatives. Some of the other innumerable alternatives may in fact strike some as more plausible than the God hypothesis, but that doesn't make them better.
↑ comment by jwhendy · 2011-03-25T21:48:30.808Z · LW(p) · GW(p)
I don't understand... why bring up an obviously flawed response if you know it's flawed? I guess I don't see the point in using something like a tuna sandwich that a theist will simply reject out of hand rather than explaining the problem with the god hypothesis directly.
Even if you insist on using the tuna sandwich... it still has more issues than the god hypothesis, so I still don't see the point, I guess.
↑ comment by David_Allen · 2011-04-04T21:47:59.122Z · LW(p) · GW(p)
I did state the problem with the God hypothesis directly; I said:
The problem with the God hypothesis is that it is indistinguishable from innumerable other stories that can be made up to explain the same phenomena, and whose validity is supported equally well by the evidence.
I gave the TSH as an example of ridiculous sounding creation theory that is actually more likely than the God hypothesis, within the space of unfalsifiable theories that the God hypothesis occupies.
To restate this: The God hypothesis is indistinguishable from a story that somebody just made up. For every case that boils down to "God did it", we can create innumerable equivalent stories that claim "X did it", for some value of X; with no way to select which story (if any) is correct.
↑ comment by jwhendy · 2011-04-04T22:30:05.827Z · LW(p) · GW(p)
Thanks for clarifying. I think we're nearly on the same page in that we agree that both hypotheses are ridiculous. The only issue I see with TSH vs. god is that god has been defined as something that is outside time/space, omni-max, etc.
Does that mean anything about the plausibility of such a thing? No way.
But... it has been defined that way, whereas a tuna sandwich will always be confined to time and space. You could fix this by being the sole witness to a tuna sandwich bursting through a rip in space-time and informing you that it was outside time and space.
God, by the apologists' definition, could actually be more plausible than the tuna sandwich you put together in your kitchen, because the tuna sandwich is made of matter and god is defined as not. Matter seems only to exist in space-time, and I see that as a strike against the TSH. It runs into objections re. the infinite regress and how it has existed long enough to create the universe from "outside" (or before, or however you want to think about it) the universe when it's made from perishable bread and fish-stuffs.
Does that make any more sense?
↑ comment by David_Allen · 2011-04-05T17:34:22.509Z · LW(p) · GW(p)
The only issue I see with TSH vs. god is that god has been defined as something that is outside time/space, omni-max, etc.
Actually, you may not be aware that mayonnaise is critical to universe creation. Since God does not contain mayonnaise the God hypothesis is less plausible than the TSH.
So you claim that existing outside space and time is necessary for the creation of the universe and I claim that mayonnaise is necessary. Do either of these claims allow us to select between the theories? I don't see how; but by adding these additional requirements we increase the complexity of the theories and reduce their relative likelihood within the set of unfalsifiable theories.
Christian apologists can make compelling arguments because in the realm of made-up-stuff there is plenty that appeals to our cognitive biases. I agree that existing outside of space and time feels like a better property of a universe creator than containing mayonnaise; but that feeling is based from our very human perspective and not from any actual knowledge about how the universe came to be the way we see it now.
↑ comment by jwhendy · 2011-04-05T17:46:50.613Z · LW(p) · GW(p)
Actually, you may not be aware that mayonnaise is critical to universe creation. Since God does not contain mayonnaise the God hypothesis is less plausible than the TSH.
:)
Christian apologists can make compelling arguments because in the realm of made-up-stuff there is plenty that appeals to our cognitive biases. I agree that existing outside of space and time feels like a better property of a universe creator than containing mayonnaise...
Now that was what I needed. As soon as you started going there above with mayonnaise-as-necessity, I started wondering if perhaps I'm just intent on the "outside-time-and-space" requirement because that's what I've always heard debated and argued.
...but that feeling is based from our very human perspective and not from any actual knowledge about how the universe came to be the way we see it now.
And then you actually went there and all is clear.
Thanks!
comment by jtk3 · 2011-04-16T21:39:31.403Z · LW(p) · GW(p)
"Now imagine I believe the Earth is flat, and you believe the earth is (roughly) spherical. Those two beliefs are mutually contradictory. Clearly, one of us is mistaken."
Nitpick: The given beliefs are contradictory but not exhaustive. At least one of the disputants is mistaken, but both could be wrong. The earth could have another shape.
I think theist and atheist can reasonably be defined to be contradictory and exhaustive. Agnostics do not affirm an alternate opinion about whether God exists, they're simply undecided.
comment by Dorikka · 2011-03-30T21:24:02.051Z · LW(p) · GW(p)
You might want to add something about asking what betting odds one would take on a proposition being objectively true to determine how much probability one assigns to it since one might find giving such odds easier than making a probability estimate from 0 to 1.
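A quick sketch of the conversion this suggestion relies on (added for illustration), between odds against a proposition and the probability assigned to it:

    # Odds <-> probability: accepting 3:1 odds against a proposition corresponds
    # to assigning it a probability of 1 / (3 + 1) = 0.25.

    def probability_from_odds_against(odds_against: float) -> float:
        return 1.0 / (odds_against + 1.0)

    def odds_against_from_probability(p: float) -> float:
        return (1.0 - p) / p

    print(probability_from_odds_against(3.0))     # 0.25
    print(odds_against_from_probability(0.25))    # 3.0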
comment by [deleted] · 2011-03-25T12:47:03.779Z · LW(p) · GW(p)
Overall, how someone who've never read Less Wrong nor Dawkings might react to that?
wince
comment by Prismattic · 2011-03-25T02:52:33.104Z · LW(p) · GW(p)
Agnosticism is less comfortable than it sounds. First, agnostics disagree with both theists and atheists. Second, any significant evidence should mostly turn them into either theists or atheists. And the importance of the question suggest they should seek such evidence.
Speaking as an actual agnostic, I don't think this is correct. The agnostic position is not "I am unsure whether God exists." The agnostic position is "The answer to this question is unknowable, and my time is better spent on questions that can be answered."
I also think you need to spell out why the question is important, since it is not obvious to me. I know classical philosophy isn't really in favor on Less Wrong, but as I see it, Euthyphro's Dilemma pretty much eliminates the relevance of God to normative ethics, and I'm not sure why else we should care.
↑ comment by [deleted] · 2011-03-25T03:04:16.859Z · LW(p) · GW(p)
Speaking as an actual agnostic, I don't think this is correct. The agnostic position is not "I am unsure whether God exists." The agnostic position is "The answer to this question is unknowable, and my time is better spent on questions that can be answered."
Could you elaborate on this? I don't see how the answer is unknowable at all. Additionally, this idea that the answer cannot be determined raises a few alarm bells because it sounds like the fallacy of gray. However, I'll gladly retract this complaint if you can explain why the question is unanswerable.
↑ comment by Prismattic · 2011-03-25T23:39:31.680Z · LW(p) · GW(p)
To clarify: there are some questions of cosmology that lend themselves to empirical refutation. We have excellent reason to accept that the universe is billions, not thousands, of years old. I'm not agnostic regarding biblical literalism; I'm agnostic regarding deism. I don't think anyone has any way of testing that empirically. I understand that others are arguing for a position based on pure reason rather than empiricism, which I find unpersuasive. Here's an analogy which I hope explains both why I say I don't know and why I say I don't care:
Suppose one day I wake to discover I am locked in a room that is pitch-black. There are many questions I would be interested in, questions I could answer with some degree of confidence even lacking any instrumentation: the dimensions of the room; the location, if any, of doors and windows; the kind of lock on the door; and, if I'm really sensitive to gravity, my altitude. What I'm not going to do, once I've concluded that there aren't any windows or other potential light source, is ponder the color of the walls, because a) there is absolutely no way I am going to be able to update my priors about it, b) it does not appear to have any practical consequences, and c) I have, presumably, a limited amount of time, and other questions are more pressing.
↑ comment by [deleted] · 2011-03-26T04:28:27.965Z · LW(p) · GW(p)
Just because you can't currently conceive of an empirical test of deism, (a) that doesn't mean that someone won't come up with one eventually, and (b) that doesn't mean that you can't make inferences about deism. The key point is this: just because something is untestable, that doesn't mean it's unknowable. However, you said that you find such arguments "unpersuasive" for three reasons, which I'll respond to individually:
a)there is absolutely no way I am going to be able to update my priors about it
You can update priors based on an argument--see the post I linked to above.
b)it does not appear to have any practical consequences, and
Even so, it is still a fact about reality and so it does have meaning even if it doesn't have practical value to you personally. Again, see the post I linked to above.
c)I have, presumably, a limited amount of time, and other questions are more pressing.
This is logically distinct from the question of whether something is unknowable. While it may be a reason for you not to care, it's not a valid argument that the question is unsolvable.
In summary: I don't think there are reasons to believe that the answer to "is there a deistic God" is unknowable. Although there might be reasons that you don't care about the answer, this is not the same thing as saying that no answer exists.
↑ comment by loup-vaillant · 2011-03-25T08:55:05.602Z · LW(p) · GW(p)
"The answer to this question is unknowable" probably means the answer won't have any effect on your observations, one way or another. In that case, you're better of assuming the simplest answer is most probable.
The problem is I don't know how to tell about Occam's Razor to a layman.
↑ comment by drethelin · 2011-03-25T03:36:57.702Z · LW(p) · GW(p)
If you don't think belief in god has real world impact on other beliefs you have then you have a very odd view of the situation. If god exists, we should want to believe he exists. If he doesn't, we should want to believe he doesn't exist. If we're unsure, just shrugging and saying "we'll never know" doesn't get us anywhere closer to the truth.
↑ comment by jwhendy · 2011-03-25T04:21:20.097Z · LW(p) · GW(p)
True... but coming from the perspective of a former-believer, I can absolutely state that it's a serious mind-fuck trying to answer the question with the utmost certainty. I can relate with the mentality, as it's come to me in phases. I have delved into study and then simply burnt out because of how many subject areas this debate covers. See my running book list.
Don't get me wrong -- I still want to answer the question, but to a degree, I have become a bit "Bleh" about it as I just don't know what will raise my confidence to such a degree that I can just live my life in peace until god himself comes down from the sky to tell me of his existence.
For the time being, I can simply state that I don't believe... but that's about it. I find it unlikely, am not satisfied by the evidence, and think there are some serious issues with Christianity in particular.
Then again, the uncertainty lingers in my mind and creates a bit of an obsession. It's been hard for me to move on with my life -- that causes me to research intensely, and that burns me out. This cycle brings me to my current state where I have tried to just accept that I simply don't believe in god and that I find it faaaar more pleasurable to do woodworking and make friends very nice cribbage boards.
Does that make any sense? I just wanted to chime in from the point of view of someone in an odd situation. I may have wrongly assumed that you perhaps have been a non-believer for quite a while. For someone coming from relatively recent belief (1.25 years ago), I have experienced the frustrations of thinking "We'll never know."
↑ comment by prase · 2011-03-25T17:14:25.047Z · LW(p) · GW(p)
The pattern "if X is true, we should want to believe X, if X is false, we should want to believe non-X" is perhaps a good rhetorical device, but I still wonder what it means practically (it would be easier without the "want to" parts). If it means "do not believe in falsities" then I agree. If it means "try to have a correct opinion about any question ever asked", it's clearly a poor advice for any agent with limited cognitive capacities.
Moreover you probably deny the existence of compartmentalisation. There are lots of religious believers who are generally sane, intelligent and right in most of their other beliefs.
↑ comment by TheOtherDave · 2011-03-25T18:55:47.207Z · LW(p) · GW(p)
If I believe something, and someone proves me wrong, what is my reaction to being proved wrong?
For most people and most subjects, it is negative... we don't like it. Being proven wrong feels like a loss of status, a defeat. We talk about "losing" an argument, for example, which invokes the common understanding that it's better to win than to lose.
I understand that pattern to be encouraging a different stance, in which if someone proves me wrong, I should instead thank them for giving me what I want: after all, X was false, so I wanted to believe not-X (though I didn't know it), and now I do in fact believe not-X. Yay!
↑ comment by prase · 2011-03-26T10:14:02.244Z · LW(p) · GW(p)
The problem I see is that once I know that X is false, I may be angry for losing the argument, but I already believe non-X. Somebody (Wittgenstein?) said that if there was a verb meaning "believing falsely", it would have no first person singular.
↑ comment by TheOtherDave · 2011-03-26T12:59:29.723Z · LW(p) · GW(p)
I don't quite see the problem.
Yes, once you've convinced me X is false, I believe non-X.
But I still have a choice: I can believe non-X and be happy about it, or believe non-X and be upset about it.
And that choice has consequences.
For example, I'm more likely in the future to seek out things that made me happy in the past, and less likely to seek out things that have upset me. So if being shown to be wrong upsets me, I'm less likely to seek it out in the future, which is a good way to stay wrong.
↑ comment by jwhendy · 2011-03-28T21:41:58.565Z · LW(p) · GW(p)
But I still have a choice: I can believe non-X and be happy about it, or believe non-X and be upset about it.
I really wonder about this -- how much control do you think you have over whether you believe ~X? I actually highly doubt that you do have a choice. You're either convinced or you're not, and belief or non-belief is the end result, but you don't choose which path you take.
Or perhaps you could clarify which beliefs you think fall in this category?
How about X = the sky is blue. Do you still think your statement holds?
- You can believe the sky is blue and be happy
- You can believe the sky is not blue and be unhappy
I don't think you have a choice about believing that the sky is blue. Were you actually able to believe whatever you wanted about the matter... I'd possibly be a) impressed, b) fascinated, or c) quite concerned :)
What do you think? The relationship between choice and belief is quite interesting to me. I've written some about it HERE.
Edit: this whole comment was based on my imagination... I'm going to leave it anyway -- I've made a mistake and been corrected. Sorry, TheOtherDave; I should have read more carefully.
↑ comment by TheOtherDave · 2011-03-29T03:44:39.736Z · LW(p) · GW(p)
Sorry, TheOtherDave; I should have read more carefully.
No worries.
And, just for the record, while I do think we have a fair amount of control over what we believe, I don't think that it works the way you're probably envisioning my having meant what you thought I said. In particular, I don't think it's a matter of exerting willpower over my beliefs, or anything like that.
If I wanted to change my belief about the color of the sky, I'd have to set up a series of circumstances that served as evidence for a different belief. (And, indeed, the sky is often grey, or white, or black. Especially in New England.) That would be tricky, and I don't think I could do it in the real world, where the color of the sky is such a pervasive and blatant thing. But for a lot of less concrete beliefs, I've been able to change them by manipulating the kinds of situations I find myself in and what I attend to about those situations.
Come to that, this is more or less the same way I influence whether I'm happy or upset.
↑ comment by jwhendy · 2011-03-31T03:30:15.899Z · LW(p) · GW(p)
Interesting -- that makes sense... though do you think you'd need to somehow perform these acts subconsciously? I guess you clarified that the sky was too obvious, but when you first wrote this, I thought that it wouldn't work if I had a "meta-awareness" of my "rigging" of circumstances to produce a given belief. I'd know I was trying to trick myself and thus it'd seem like a game.
But perhaps that's why you retracted for such a clear-cut objective case?
I'll ponder this more. I appreciate the comment. I'm sure I do this myself and often talk aloud to myself when I'm feeling something I think is irrational, say about being late to a meeting and feeling extremely self-critical or worrying about what others think. I kind of talk to myself and try to come to the conclusion that what's happened has happened and, despite the setback which led me to be late, I'm doing the best I can now and thus shouldn't be condemning myself.
Kind of like that?
↑ comment by TheOtherDave · 2011-03-31T12:45:35.025Z · LW(p) · GW(p)
IME, "subconsciously" doesn't really enter into it... I'm not tricking myself into believing something (where I would have to be unaware of the trick for it to work), I'm setting up a series of situations that will demonstrate what I want to believe. It's a little bit like training my dog, I guess... it works without reference to an explicit cognitive representation of what's being learned.
But then, I've never tried to do this for something that I actually believe to be false, as opposed to something that I either believe to be true but react to emotionally as though it were false, or something where my confidence in its truth or falsehood is low and I'm artificially bolstering one of them for pragmatic reasons.
↑ comment by jwhendy · 2011-03-31T13:37:26.313Z · LW(p) · GW(p)
Maybe I just need a concrete example :)
I do think it makes more sense now that you've added that it's [most likely] something you already believe in but are not "emotionally aligned" with. I'd be interested in an example of using this to promote action in a near 50/50 truth/falsehood estimate situation.
Thanks for the continued discussion!
↑ comment by TheOtherDave · 2011-03-31T15:03:15.416Z · LW(p) · GW(p)
My favorite example is sort of a degenerate case, and so might be more distracting than illustrative, but I'll share it anyway: a programmer friend of mine has a utility on his desktop called "placebo."
When executed, it prints out the following text over the course of about 15 seconds:
"Working.........Done."
That's all.
It's something he uses when caught up in a complex project to remind himself that he can write code projects that work, and thereby to alter his confidence level in his ability to make this code project work.
This is, of course, ridiculous: his ability to write a three-line program that generates text on a screen has no meaningful relationship to his ability to make complicated code work as intended -- that "placebo" runs is just as consistent with the current project failing as it is with it succeeding, and is therefore evidence of neither -- and in any case running it for a second time doesn't give him any new evidence that he didn't already have. It's purely a mechanism for irrationally changing his beliefs about his likely success. (That said, the choice of whether and when to use that mechanism can be a rational choice.)
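For concreteness, a plausible reconstruction of such a "placebo" utility (the friend's actual program isn't shown anywhere in the thread, so everything below is a guess at its shape):

    #!/usr/bin/env python3
    # "placebo" -- hypothetical reconstruction of the utility described above;
    # the real one is said to be about three lines, so this is only a guess.
    import sys
    import time

    sys.stdout.write("Working")
    sys.stdout.flush()
    for _ in range(9):
        time.sleep(15 / 9)    # spread the dots over roughly 15 seconds
        sys.stdout.write(".")
        sys.stdout.flush()
    print("Done.")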
↑ comment by jwhendy · 2011-03-31T15:30:38.226Z · LW(p) · GW(p)
That's great, though it probably would be helpful to have a perhaps more pertinent/universal example of something to go along with your original explanation:
...I'm setting up a series of situations that will demonstrate what I want to believe.
I think I'm still a bit lost on what category of beliefs you would use this on. It seems like they are generally subjective sorts of "flexible" beliefs; nothing concerning empirical evidence. Is that right?
More like, "I want to be happy in all circumstances, and happiness is within my control, thus I will make myself believe that event x should increase my happiness." (And then you go about "setting up a series of situations" that increases your happiness about X.)
Am I remotely close?
↑ comment by TheOtherDave · 2011-03-31T16:20:50.256Z · LW(p) · GW(p)
I would say that "I will successfully complete this project" is an empirical belief, in the sense that there's an expected observation if it's true that differs from the expected observation if it's false.
So if I set up a series of events (e.g., "placebo" executions) that alter my confidence level about that assertion, I am in fact modifying an empirical belief.
Would you disagree?
Anyway, the placebo example could be reframed as "I want to be confident about my success on this project, and my confidence is subject to my influence, thus I will act so as to increase my estimate that I'll succeed." And then I go about setting up a series of situations (e.g., "placebo" executions) that increase my estimate of success.
Which is similar to what you suggested, though not quite the same.
↑ comment by jwhendy · 2011-03-31T18:57:36.752Z · LW(p) · GW(p)
I would say that "I will successfully complete this project" is an empirical belief, in the sense that there's an expected observation if it's true that differs from the expected observation if it's false.
Yes... I now see that I could have been much clearer. That belief is testable... but only after one or the other has occurred. I more meant that you're adjusting beliefs prior to the existence of the empirical evidence needed to verify the binary outcome.
So, for you to increase/decrease your confidence toward an anticipated outcome, there's actually some other empirical evidence supporting justification to increase anticipated success -- for example, a history of completing projects of a similar skill level and time commitment/deadline.
So, in that case, it seems like we more or less agree -- you're adjusting malleable feelings to align with something you more or less know should be the case. I still don't think (and don't think you're saying this either) that you're malleably altering any beliefs about known empirical results themselves. Again, you already said about as much in discussing the color of the sky.
I guess to illustrate, substitute "this project" with "design a spacecraft suitable for sustaining human life to Pluto and back within one month." My guess is that your description of how you set up a "series of situations that increase your estimate of success" would break down, and in such a case you would not consider it advantageous to increase your confidence in an anticipated outcome of success. Or would you? Would you say that it's beneficial to always anticipate success, even if one has good reason to suspect upcoming failure? Or perhaps you only increase confidence in an outcome of success where you have good reason to already think such will occur.
In other words, you don't arbitrarily increase your confidence level simply because it can be influenced; you increase it if and when there are some other factors in place that lead you to think that said confidence should be increased.
Is that any clearer than mud? :)
↑ comment by TheOtherDave · 2011-03-31T19:36:54.060Z · LW(p) · GW(p)
Agreed that, when testable propositions are involved, I use this as a mechanism for artificially adjusting my expectations of the results of an as-yet-unperformed test (or an already performed test whose results I don't know).
Adjusting my existing knowledge of an already performed test would be... trickier. I'm not sure how I would do that, short of extreme measures like self-hypnosis.
Agreed that arbitrarily increasing my confidence level simply because I can is not a good idea, and therefore that, as you say, I increase it if there are other factors in place that lead me to think it's a good idea.
That said, those "other factors" aren't necessarily themselves evidence of likely success, which seems to be an implication of what you're saying.
To pick an extreme example for illustrative purposes: suppose I estimate that if I charge the armed guards wholeheartedly, I will trigger a mass charge of other prisoners doing the same thing, resulting in most of us getting free and some of us getting killed by the guards, and that this is the best available result. Suppose I also estimate that, if I charge the guards, I will likely be one of the ones who dies. Suppose I further estimate that I am not sufficiently disciplined to be able to charge the guards wholeheartedly while believing I will die; if I do that, the result will be a diffident charge that will not trigger a mass charge.
Given all of that, I may choose to increase my confidence level in my own survival despite believing that my current confidence level is accurate, because I conclude that a higher confidence level is useful.
Of course, a perfect rationalist would not need such a mechanism, and if one were available I would happily share my reasoning with them and wait for them to charge the fence instead, but they are in short supply.
Of course, these sorts of scenarios are rare. But it's actually not uncommon to enter into situations where I've never done X before, and I don't really know how difficult X is, so a prior probability of 50% of success/failure seems reasonable... but I also suspect that entering the situation with a 50% estimate of success will make failure more likely than entering with an 85% estimate of success... so I artificially pick a higher prior, because it's useful to do so.
So, while I would not say it's always beneficial to anticipate success, I would say that it's sometimes beneficial even if one has good reason to suspect failure.
Whether a trip to Pluto could ever be such an example, and how I might go about artificially raising my estimate of success in that case, and what the knock-on effects of doing so might be... I don't know. I can't think of a plausible scenario where it would be a good idea.
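(An illustrative sketch of the trade-off in the 50% vs. 85% example above; the payoff values and the assumed link between confidence and the probability of success are my own hypothetical numbers, not anything stated in the thread.)

```python
# Purely illustrative: the payoffs and the confidence-to-success mapping
# are assumptions for the sake of the example, not figures from the thread.

def p_success(confidence):
    # Assume the chance of success genuinely rises with the confidence
    # you walk in with (wholehearted vs. diffident effort).
    return 0.3 + 0.4 * confidence

def expected_value(confidence, win=10.0, lose=-2.0):
    p = p_success(confidence)
    return p * win + (1 - p) * lose

print(expected_value(0.50))  # entering with the "honest" 50% estimate -> 4.0
print(expected_value(0.85))  # entering with the inflated 85% estimate -> 5.68
```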
Replies from: jwhendy↑ comment by jwhendy · 2011-03-31T21:38:07.306Z · LW(p) · GW(p)
Well, this has been a lovely discussion. Thanks for the back and forth; I think we're in agreement, and your last example was particularly helpful. I think we've covered that:
- we're not talking about arbitrarily increasing confidence for no reason (just because we can)
- we're also [probably] not talking about trying to increase belief in something contrary to evidence already known (increase belief in ~X when the evidence supports X). (This is actually the category I originally thought you were referring to, hence my mention of "tricking" one's self. But I think this category is now ruled out.)
- this technique is primarily useful when emotions/motivations/feelings are not lining up with the expected outcome given the available evidence (success is likely based on prior experience, but success doesn't feel likely, and that feeling is actually increasing the likelihood of failure)
- there are even some situations where an expectation of failure would decrease some kind of utilitarian benefit, and thus one needs to act as if success is more probable even though it's not (with the caveat that improving one's rationality would make this unnecessary)
Does that about sum it up?
Thanks again.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2011-04-01T03:31:36.021Z · LW(p) · GW(p)
Works for me!
↑ comment by pedanterrific · 2011-03-28T22:02:53.085Z · LW(p) · GW(p)
Um, that's not what he actually said, you know.
It's even right there in the part you quoted.
TheOtherDave doesn't think you have a choice whether you believe X or non-X, just how you feel about your beliefs. To use your analogy, the only choice is deciding whether the fact that (you believe that) the sky is blue makes you happy or not.
Replies from: jwhendy↑ comment by jwhendy · 2011-03-29T01:58:33.013Z · LW(p) · GW(p)
Doh! You are absolutely correct. I left out a "non" in the first clause and thought that the comment was on the "adjustability" of the belief, not the adjustability of the feelings about the inevitable belief. Whoops -- thank you for the correction.
comment by David_Allen · 2011-03-25T02:12:00.424Z · LW(p) · GW(p)
Truth is universal
Actually the nature of truth is an unresolved and debated question.
There are reasons to discount the concept of absolute truth.
Take your example:
I happen to wear black socks at the time of this writing. Believe it or not, that's the reality, so "Loup was wearing blacks socks when he wrote this" is true for everyone, including you. Even if you believe I'm lying, I am wearing black socks. You can't be absolutely certain of this fact, but a fact it is.
Your statement is true as assessed from a certain context; if assessed from other contexts it might be false. For example, under a different light source your socks may appear to be blue due to illuminant metameric failure, or may even glow due to phosphorescence.
Truth can only be determined from the context of assessment.
For those that disagree I'll take your down-votes, but please also comment, pointing me to references if possible; I'm actively researching this aspect of epistemology.
Replies from: prase, CronoDAS↑ comment by prase · 2011-03-25T17:32:28.710Z · LW(p) · GW(p)
By non-universality of truth you mean that there is no function from the set of propositions formulated in a natural language to {0,1} which fulfills the expectations we have of truth? That's true (ehm...), but the reason is rather trivial: words of natural languages don't specify the meaning uniquely and some amount of interpretation is always needed to figure out the meaning of a proposition. Given that propositions can probably be formulated with arbitrary precision if needed (even if not infinitely so), the disputes about meaning can always be resolved.
Replies from: David_Allen↑ comment by David_Allen · 2011-04-04T23:22:27.829Z · LW(p) · GW(p)
...words of natural languages don't specify the meaning uniquely and some amount of interpretation is always needed to figure out the meaning of a proposition.
The need for interpretation is not limited to natural languages; it is required for any language. A context of assessment will derive meaning from a proposition based on its prior assumptions. For example a raw bit string may be interpreted to different meanings when read by different programs.
Given that propositions can probably be formulated with arbitrary precision if needed (even if not infinitely so), the disputes about meaning can always be resolved.
To resolve such disputes there must be a computable path to the resolution, and there won't always be such a path. At a fundamental level not all problems are decidable. In more practical terms, the contexts involved in the dispute must implement some system that allows for convergence for all possible inputs; this condition will not always be satisfied.
↑ comment by CronoDAS · 2011-03-25T04:28:02.453Z · LW(p) · GW(p)
http://yudkowsky.net/rational/the-simple-truth
Replies from: David_Allen↑ comment by David_Allen · 2011-03-25T06:07:20.955Z · LW(p) · GW(p)
I'm not claiming that there is no truth, or that belief alters reality; I'm claiming that truth can only be determined within the context of assessment; that there is no absolute or objective truth.
I realize that this is at odds with some of Eliezer's claims. But let me provide an example related to Eliezer's belief in mathematical realism. (Why isn't this an obvious mind projection fallacy?)
Let's say Fred performs a calculation that results in 2 + 3 = 6. This isn't just a one-time mistake: he always calculates 2 + 3 as 6; if he sees the formula written down he will say, "Hey, that's true."
Why is this clearly false statement true to Fred? It's because his mental implementation of the arithmetic abstraction is broken... from our perspective. Just because he believes that 2 + 3 = 6, it doesn't mean that it's true in any other context of assessment.
Just because we believe that 2 + 3 = 5, that doesn't mean that the statement is an absolute truth. We can only evaluate the truth of the statement by assessing it within contexts that can give the statement meaning. My mailbox doesn't do math at all, but my calculator does a fair job; although for my calculator I have to first transform the symbols into key presses.
Edited to add:
In the story you reference, the character Mark presumably dies when he jumps off a cliff believing that his beliefs would allow him to fly. Our minds are not a context of assessment for physical stuff, like our bodies or the ground; we can observe stuff by forming meaning from our senses, but we don't provide the framework that gives that stuff its existence and that allows it to interact. We are subject to the context of assessment that generates our physical reality.
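(A minimal sketch of the "context of assessment" idea using the Fred example above; the two evaluators and their names are my own illustration, not David_Allen's wording.)

```python
# Minimal illustration of "truth relative to a context of assessment":
# the same proposition, "2 + 3 = 5", is handed to two different evaluators.

def standard_add(a, b):
    return a + b

def freds_add(a, b):
    # Fred's broken implementation of the arithmetic abstraction:
    # he systematically computes 2 + 3 as 6.
    return 6 if (a, b) == (2, 3) else a + b

claim = (2, 3, 5)  # the proposition "2 + 3 = 5"

for name, add in [("standard context", standard_add), ("Fred's context", freds_add)]:
    a, b, result = claim
    print(f"{name}: '2 + 3 = 5' assessed as {add(a, b) == result}")
```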
comment by Vaniver · 2011-03-25T01:12:51.731Z · LW(p) · GW(p)
So, you're saying that if I open myself up to doubting God, the devil might snatch my soul. Nice try, but you're going to have to do better than that!
Here is perhaps the largest issue when it comes to discussing religion with someone else: are you discussing theology? Arguments like this seem to suppose that you are- I mean, why else would people profess belief in God? That rabbit hole is worth contemplating.
comment by Seremonia · 2011-03-30T10:17:24.000Z · LW(p) · GW(p)
Some people are 100% sure about something, but some are not. They assume there is no 100% proof of God. Is our future determined by a greater number of decisions made with 100% accuracy? Not always. Even in everyday life we do not always require 100% accuracy.
Analyzing and deciding is an art. Its strength lies not in reaching a 100% solution; even with our limitations, solutions or answers that fall short of 100% can lead us to overcome the problem, or even put us in the best future condition that can be earned, using only accuracy of less than 100%.
It's a matter of proficiency that can only be proven over time. It's unique and personal; that's why it is the art of making wise decisions.
When faced with possibilities, one way to face them is to follow one possibility and see whether it leads to something more definite. Diversity in the future will narrow the chances of some possibilities, and it will clarify the possibility of something (from the past) in the future.
I consider that there is a God. I have proof of the existence of God. You may try to read it at: http://lesswrong.com/r/discussion/lw/51d/there_is_god/
Replies from: Desrtopa↑ comment by Desrtopa · 2011-03-30T10:30:41.816Z · LW(p) · GW(p)
That doesn't seem to be a valid link, and I'm not sure how to parse this comment.
Replies from: Seremonia, Seremonia↑ comment by Seremonia · 2011-03-30T10:49:58.436Z · LW(p) · GW(p)
I've updated it. My article is in draft status, and I see the link as I gave it. I am not sure what link I should place here to point to my post.
Replies from: Richard_Kennaway↑ comment by Richard_Kennaway · 2011-03-30T11:53:24.601Z · LW(p) · GW(p)
If it's a draft, only you can see it. That's what a draft is, in the terms of the software that runs this site. For anyone else to see it, you must publish it, although judging by what you've written so far I don't have high hopes of the result.
You may agree or disagree, I respect that
That sounds like pre-emptively pressing the Ignore button on such disagreement. A disagreement is a problem. Some disagreements may be too trifling to be worth resolving, but disagreeing over whether there is a God is a rather large problem. It's not something to which you can just say, "well, I believe this and you believe that, and that's ok" without undermining the very activity of seeking the truth. Have you in fact anticipated such disagreement and want to avoid engaging with it?
Replies from: Seremonia, Seremonia, Seremonia, Seremonia↑ comment by Seremonia · 2011-03-30T13:08:24.169Z · LW(p) · GW(p)
Apparently I need a minimum karma of 2 in order to publish in the discussion. Could someone help me get the opportunity to publish? Thank you.
Replies from: Desrtopa↑ comment by Desrtopa · 2011-03-30T16:22:05.099Z · LW(p) · GW(p)
If you read these two articles and this sequence, I promise to upvote your comments enough that you will be able to post in the discussion section.
Some warnings:
*The sequence is kind of long.
*If you post the draft you've already written to discussion, it will probably be downvoted enough that you'll have a hard time getting back the privilege to post articles.
*This, and the downvotes you've already received, are not because anyone is intimidated by your argument, or because they feel threatened by the possibility of your being right.
*The articles and sequence may be difficult for you to understand at this point.
However, if you do all this reading, and understand it, I think you will understand why you've been downvoted, and will be able to predict how your draft will be received.
Replies from: Seremonia↑ comment by Seremonia · 2011-03-30T20:18:15.768Z · LW(p) · GW(p)
I really like that LessWrong has the spirit of seeking truth. That's why I changed the comment to conform with that spirit.
I read everything as you suggested; that's great. Thanks.
Replies from: Desrtopa↑ comment by Desrtopa · 2011-03-30T20:53:36.556Z · LW(p) · GW(p)
I'm glad you took the time to check it out. Karma disbursed.
Replies from: Seremonia↑ comment by Seremonia · 2011-03-30T21:58:56.876Z · LW(p) · GW(p)
Thank you. I have already submitted it at http://lesswrong.com/r/discussion/lw/51d/there_is_god/
↑ comment by Seremonia · 2011-03-30T12:31:56.763Z · LW(p) · GW(p)
I've edited to cancel my statement: "Some people may feel intimidated by such an argument, and this can keep turning off the post until it is collapsed (hidden). I just want to ask people of this type to be appreciative and not turn off the post so easily. I do not understand LessWrong well enough to know whether a few downvotes that turn off an article will make it invisible to all readers. But if many downvotes do not collapse an article for everyone, only for certain people, then that is certainly a very good thing."
I understand now. On LessWrong it's about sharing honestly, with the spirit of seeking truth. Thanks.
Replies from: TheOtherDave↑ comment by TheOtherDave · 2011-03-30T16:16:17.524Z · LW(p) · GW(p)
Re: the voting system:
People who like your post/comment and want more like it can upvote it.
People who dislike it or want less like it can downvote it.
Comments whose downvotes outnumber their upvotes by three or more are collapsed by default, as are all child comments under them, and people have to choose to see them.
Re: "intimidated"... I'm choosing for now to consider that word-choice an artifact of your being a non-native English speaker.