How could one (and should one) convert someone from pseudoscience?
post by Vilx- · 2015-10-05T11:53:21.478Z · LW · GW · Legacy · 54 comments
I've known for a long time that some people who are very close to me are somewhat inclined to believe in the pseudoscience world, but it always seemed pretty benign. In their everyday lives they're pretty normal people and don't do any crazy things, so this was a topic I mostly avoided and left it at that. After all - they seemed to find psychological value in it. A sense of control over their own lives, a sense of purpose, etc.
Recently, however, I found out that at least one of them seriously believes Bruce Lipton, who in essence preaches that happy thoughts cure cancer. Now I'm starting to get worried...
Thus I'm wondering - what can I do about it? This is in essence a religious question. They believe this stuff with just anecdotal proof. How do I disprove it without sounding like "Your religion is wrong, convert to my religion, it's right"? Pseudoscientists are pretty good at weaving a web of lies that sound quite logical and true.
The one thing I've come up with is to somehow introduce them to classical logical fallacies. That at least doesn't directly conflict with their beliefs. But beyond that I have no idea.
And perhaps more important is the question - should I do anything about it? The pseudoscientific world is a rosy one. You're in control of your life and your body, you control random events, and most importantly - if you do everything right, it'll all be OK. Even if I succeed in crushing that illusion, I have nothing to put in its place. I'm worried that revealing just how truly bleak the reality is might devastate them. They seem to be drawing a lot of their happiness from these pseudoscientific beliefs, either directly or indirectly.
And anyway, it's more likely that I won't succeed and will just ruin my (healthy) relationship with them. Maybe it's best just not to interfere at all? Even if they end up hurting themselves, well... it was their choice. Of course, that also means that I'll be standing idly by and allowing bullshit to propagate, which is kinda not a very good thing. However, right now they are not very pushy about their beliefs, and only talk about them if the topic comes up naturally, so I guess it's not that bad.
Any thoughts?
54 comments
Comments sorted by top scores.
comment by tailcalled · 2015-10-05T15:07:42.371Z · LW(p) · GW(p)
The one thing I've come up with is to somehow introduce them to classical logical fallacies.
That seems extremely dangerous. Most of the time, this will just make people better at rationalization, and many things that are usually considered fallacies are actually heuristics.
↑ comment by Lumifer · 2015-10-05T15:16:32.903Z · LW(p) · GW(p)
That seems extremely dangerous.
LOL. Word inflation strikes again with a force of a million atomic bombs! X-)
Are you really arguing for keeping ideologically incorrect people barefoot and pregnant, lest they harm themselves with any tools they might acquire?
↑ comment by tailcalled · 2015-10-05T15:23:13.302Z · LW(p) · GW(p)
No, but attempting to go from irrational contrarian to rational contrarian (thinking about arguments, for instance by considering fallacies, is contrarian-ish) without passing through majoritarian seems like something that could really easily backfire.
↑ comment by Lumifer · 2015-10-05T15:48:43.712Z · LW(p) · GW(p)
without passing through majoritarian seems like something that could really easily backfire.
That's meaningless hand-waving. Do you have evidence?
By the way, if it's extremely dangerous, maybe we should shut down LW -- unenlightened people can get ideas here that "could really easily backfire", couldn't they?
↑ comment by tailcalled · 2015-10-05T17:13:13.178Z · LW(p) · GW(p)
That's meaningless hand-waving. Do you have evidence?
I don't think it's fair to say that it is meaningless. Surely it must convey some, arguably a lot, of meaning. For example, it includes the advice of making people trust authorities more, and a critique of certain traditional rationalist ideas.
In terms of evidence... well, I don't have scientific evidence, but obviously I have anecdotes and some theory behind my belief. I can write up the anecdotes if you think the details would be relevant, but for now I'll just skip them, since they're just anecdotes.
The theory behind my claim can roughly be summed up in a few sentences:
Inside view was what got them into this mess to begin with.
This seems to have something to do with tribal politics, which is known for being annoying and hard to deal with. Probably best to not give them ammunition.
People who know a lot about biases don't seem to be any better at agreeing with each other (instead, they seem to argue much more), which indicates that they're not that rational.
Essentially, don't try to teach people 'hard mode' until they can at least survive 'easy mode'.
By the way, if it's extremely dangerous, maybe we should shut down LW -- unenlightened people can get ideas here that "could really easily backfire", couldn't they?
'Extremely dangerous' could be considered hyperbole; what I meant is that if you push them down into the hole of having ridiculous ideas and knowing everything about biases, you might not ever be able to get them out again.
I don't think the Sequences are that dangerous, because they spend a lot of time trying to get people to see problems in their own thinking (that's the entire point of the Sequences, isn't it?). The problem is that actually doing that is tricky. Eliezer has had a lot of community input in writing them, so he has an advantage that the OP doesn't have. Also, he didn't just focus on bias, but also on a lot of other (IMO necessary) epistemological stuff. I think they're hard to use for dismissing any opposing argument.
↑ comment by Viliam · 2015-10-06T12:49:25.522Z · LW(p) · GW(p)
I don't think the Sequences are that dangerous, because they spend a lot of time trying to get people to see problems in their own thinking (that's the entire point of the Sequences, isn't it?).
Also my limited experience from LW meetups suggests that people who come there only for the feeling of contrarianism usually avoid reading the Sequences.
Probably for the same reason they also avoid reading a serious textbook on the subjects they have strong opinions about. (I am not saying that the Sequences are a serious textbook, but rather that a dislike of textbooks also translates into a dislike of the Sequences, and probably of anything other than sensational online videos.)
Thus, ironically, despite various accusations against Eliezer and his education, the Sequences can act as a filter against crackpots. (Not a perfect filter, but still.)
↑ comment by Lumifer · 2015-10-05T17:56:03.691Z · LW(p) · GW(p)
Surely it must convey some, arguably a lot, of meaning.
You want it to, but that doesn't mean it actually happens :-/
Inside view was what got them into this mess to begin with.
Teaching people to notice fallacies explicitly pushes them into the meta (reflective) mode and promotes getting out of the inside view.
This seems to have something to do with tribal politics, which is known for being annoying and hard to deal with. Probably best to not give them ammunition.
Oh. It's even worse -- I read you as "keep 'em ignorant so they don't hurt themselves" and here you are actually saying "keep 'em ignorant because they are my tribal enemies and I don't want them to get more capable".
People who know a lot about biases don't seem to be any better at agreeing with each other (instead, they seem to argue much more), which indicates that they're not that rational.
That's... a common misunderstanding. Rational people can be expected to agree with each other on facts (because science). Rational people can NOT be expected to agree, nor do they, in fact, agree on values and, accordingly, on goals, and policies, and appropriate trade-offs, etc. etc.
Recall your original statement: "attempting to go from irrational contrarian to rational contrarian ... without passing through majoritarian seems like something that could really easily backfire". What are the alternatives? Do you want to persuade people that the mainstream is right, and once you've done that do you want to turn around and persuade them that the mainstream is wrong? You think this can't backfire?
↑ comment by tailcalled · 2015-10-05T18:18:09.508Z · LW(p) · GW(p)
Teaching people to notice fallacies explicitly pushes them into the meta (reflective) mode and promotes getting out of the inside view.
By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:
Assume people who claim to be better than the mainstream are wrong.
Pay greater attention to authority than arguments.
Avoid things that sound cultish.
etc.
Oh. It's even worse -- I read you as "keep 'em ignorant so they don't hurt themselves" and here you are actually saying "keep 'em ignorant because they are my tribal enemies and I don't want them to get more capable".
I'm actually saying that everybody, friend or foe, who engages in tribal politics, should be taught to... not engage in tribal politics. And that this should be done before we teach them the most effective arguments, because otherwise they are going to engage in tribal politics.
And why is tribal politics bad? Cuz it doesn't lead to truth/a better world, but instead to constant disagreement.
That's... a common misunderstanding. Rational people can be expected to agree with each other on facts (because science). Rational people can NOT be expected to agree, nor do they, in fact, agree on values and, accordingly, on goals, and policies, and appropriate trade-offs, etc. etc.
Sure. But most of the time, they seem to disagree on facts too.
Recall your original statement: "attempting to go from irrational contrarian to rational contrarian ... without passing through majoritarian seems like something that could really easily backfire". What are the alternatives? Do you want to persuade people that the mainstream is right, and once you've done that do you want to turn around and persuade them that the mainstream is wrong? You think this can't backfire?
I think it will backfire less.
↑ comment by Lumifer · 2015-10-05T18:40:03.403Z · LW(p) · GW(p)
By Inside View I meant focusing on object-level arguments, which a lot of bias/fallacy teaching supports. The alternative would be meta-level Outside View, where you do things like:
Assume people who claim to be better than the mainstream are wrong.
Pay greater attention to authority than arguments.
In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.
because otherwise they are going to engage in tribal politics.
Heh. Otherwise? You just said they're engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don't want to teach them anything until they stop, you just will not teach them anything, period.
↑ comment by tailcalled · 2015-10-05T19:13:18.518Z · LW(p) · GW(p)
In this case I would like to declare myself a big fan of the Inside View and express great distrust of the Outside View.
Well, that makes sense for people who know what they are talking about, are good at compensating for their biases and avoid tribal politics. Less so for people who have trouble with rationality.
Remember: I'm not against doing stuff in Inside View, but I think it will be hard to 'fix' completely broken belief systems in that context. You're going to have trouble even agreeing on what constitutes a valid argument; having a discussion where people don't just end up more polarized is going to be impossible.
Heh. Otherwise? You just said they're engaging in tribal politics anyway and I will add that they are highly likely to continue to do so. If you don't want to teach them anything until they stop, you just will not teach them anything, period.
I want to teach them to not get endlessly more radical before I teach anything else. Then I want to teach them to avoid tribalism and stuff like that. When all of that is done, I would begin working on the object-level stuff. Doing it in a different order seems doomed to failure, because it's very hard to get people to change their minds.
↑ comment by Lumifer · 2015-10-05T19:29:49.617Z · LW(p) · GW(p)
Well, that makes sense for people who ... Less so for people who have trouble with rationality.
So, is this an elites vs dumb masses framework? Quod licet Iovi, non licet bovi?
Your approach seems to boil down to "First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them". I don't think it's a good approach -- either desirable or effective. You don't start to reality-adjust weird people by performing a lobotomy. Not any more, at least.
And that's besides the rather obvious power/control issues.
↑ comment by tailcalled · 2015-10-05T19:51:29.339Z · LW(p) · GW(p)
So, is this an elites vs dumb masses framework?
For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can't really be justified in considering your opinion more accurate than the mainstream one.
Once we have the framework, the question is the cause of the dumb masses. Personally, I think it's tribal stuff, which means that I honestly believe tribalism should be solved before people can be made more rational. In my experience, tribal stuff seemed to die down when I got more accepting of majoritarianism (because if you respect majoritarianism, you can't really say "the mainstream is silencing my tribe!" without having some important conclusions to make about your tribe).
Your approach seems to boil down to "First, they need to sit down, shut up, listen to authority, and stop getting ideas into their head. Only after that we can slowly and gradually start to teach them". I don't think it's a good approach -- either desirable or effective. You don't start to reality-adjust weird people by performing a lobotomy. Not any more, at least.
It's probably not a good approach for young children or similarly open minds, but we're not working with a blank slate here. Also, it's not like the policies I propose are Dark Side Epistemology; avoiding object level is perfectly sensible if you are not an expert.
↑ comment by Lumifer · 2015-10-05T20:17:40.508Z · LW(p) · GW(p)
For contrarianism (e.g. atheism, cryonics, AI, reductionism) to make epistemological sense, you need an elites vs dumb masses framework, otherwise you can't really be justified in considering your opinion more accurate than the mainstream one.
Epistemologically, the final arbiter is reality. Besides, what do you call "mainstream" -- the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.
From the context it seems you associate "mainstream" with "dumb masses", but the popular views are often remarkably uninformed and are also actively shaped by a variety of interests (both political and commercial). I doubt just being a contrarian in some aspect lifts you into "elite" status (e.g. paleo diets, etc.)
the question is the cause of the dumb masses. Personally, I think it's tribal stuff
I don't understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??
but we're not working with a blank slate here
Which you want to wipe down to the indifferent/accepting/passive moron level before starting to do anything useful?
avoiding object level is perfectly sensible if you are not an expert.
Another claim I strongly disagree with. Following this forces you to believe everything you're told as long as sufficient numbers of people around you believe the same thing -- even though it's stupid on the object level. I think it's a very bad approach.
↑ comment by tailcalled · 2015-10-05T20:49:07.248Z · LW(p) · GW(p)
Besides, what do you call "mainstream" -- the current scientific consensus or the dominant view among the population? They diverge on a fairly regular basis.
Perhaps I focused too much on 'mainstream' when I really meant 'outside view'. Obviously, outside view can take both of these into account to different degrees, but essentially, the point is that I think teaching the person to use outside view is better, and outside view is heavily biased (arguably justifiably so) in favor of the mainstream.
I doubt just being a contrarian in some aspect lifts you into "elite" status (e.g. paleo diets, etc.)
But that's my point: a lot of different contrarian groups have what the OP calls "a web of lies that sound quite logical and true". Do you really think you can teach them how to identify such a web of lies while they are stuck in one?
Instead, I think you need to get them unstuck using outside view, and then you can teach them how to identify truth correctly.
I don't understand. Are you saying that the masses are dumb because (causative!) the tribal affiliation is strong with them??
Yes. The masses try to justify their ingroup, they don't try to seek truth.
Another claim I strongly disagree with. Following this forces you to believe everything you're told as long as sufficient numbers of people around you believe the same thing -- even though it's stupid on the object level. I think it's a very bad approach.
The way I see it is this: if I got into a debate with a conspiracy theorist, I'm sure they would have much better object-level arguments than I do; I bet they would be able to consistently win when debating me. The reason for this is that I'm not an expert on their specific conspiracy, while they know every single shred of evidence in favor of their theory. This means that I need to rely on meta-level indicators like nobody respecting holocaust deniers in order to determine the truth of their theories, unless I want to spend huge amounts of time researching them.
Sure, there are cases where I think I can do better than most people (computer science, math, physics, philosophy, gender, generally whatever I decide is interesting and start learning a lot about) and in those cases I'm willing to look at the object level, but otherwise I really don't trust my own ability to figure out the truth - and I shouldn't, because it's necessary to know a lot of the facts before you can even start formulating sensible ideas on your own.
If we take this to the extreme where someone doesn't understand truth, logic, what constitutes evidence or anything like that, I really would start out by teaching them how to deal with stuff when you don't understand it in detail, not how to deal with it when you do.
↑ comment by Lumifer · 2015-10-06T18:01:47.271Z · LW(p) · GW(p)
when I really meant 'outside view'
Let's sort out the terminology. I think we mean different things by "outside view".
As far as I understand you, for you the "outside view" means not trying to come to any conclusions on your own, but rather accept what the authorities (mainstream, experts, etc.) tell you. Essentially, when you recommend "outside view" to people you tell them not to think for themselves but rather accept what others are telling them (see e.g. here).
I understand "outside view" a bit more traditionally (see e.g. here) and treat it as a forecasting technique. Basically, when you want to forecast something using the inside view, you treat that something as 'self-propelled', in a way, you look at its internal workings and mechanisms to figure out what will happen to it. If you take the outside view, on the other hand, you treat that something as a black box that is moved primarily by external forces and so to forecast you look at these external forces and not at the internals.
Given this, I read your recommendation "teaching the person to use outside view is better" as "teach the person to NOT think for himself, but accept whatever most people around think".
I disagree with this recommendation rather strongly.
Do you really think you can teach them how to identify such a web of lies while they are stuck in one?
Why, yes, I do. In fact, I think it's the normal process of extracting oneself from "a web of lies" -- you start by realizing you're stuck in one. Of course, no one said it would be easy.
An example -- religious deconversion. How do you think it will work in your system?
Yes. The masses try to justify their ingroup, they don't try to seek truth.
Well, this theory implies some consequences. For example, it implies high negative correlation between IQ (or more fuzzy "smartness") and the strength of tribal affiliation. Do we observe it? The theory also implies that if the tribal affiliation increases (e.g. because your country got involved in a military conflict), everyone suddenly becomes much dumber. Do we observe that?
if I got into a debate with a conspiracy theorist ...I bet they would be able to consistently win when debating me.
I don't know about that. You think of winning a debate in high-school debate club terms, or maybe in TV debate terms -- the one who scores the most points with the judges wins. That's not how real life operates. The way for the conspiracy theorist to win the debate is to convince you. Unless you became a believer at the end, he did NOT win the debate. Most debates end in a draw.
otherwise I really don't trust my own ability to figure out the truth
That's probably the core difference that leads to our disagreements. I do trust my own ability (the fact that I'm arrogant should not be a surprise to anyone). Specifically, I trust my own ability more than I trust the mainstream opinion.
Of course, my opinion and the mainstream opinion coincide on a great deal of mundane things. But when they don't, I am not terribly respectful of the mainstream opinion and do not by default yield to it.
In fact, I don't see how your approach is compatible with being on LW. Let's take Alice who is a LessWrongian and is concerned about FAI risks. And let's take Bob who subscribes to your approach of deferring to the mainstream.
Alice goes: "I'm concerned about the risk of FAI."
Bob: "That's silly. You found yourself a cult with ridiculous ideas. Do you have a Ph.D. in Comp Sci or something similar? If not, you should not try have your own opinion about things you do not understand. Is the mainstream concerned about FAI? It is not. So you should not, as well."
What can Alice reply to Bob? She is, in fact, not a Ph.D. and has no particular expertise in AI.
If we take this to the extreme where someone doesn't understand truth, logic, what constitutes evidence or anything like that
I don't think you can extrapolate from very-low-IQ people to general population. By the same token, these people should not manage their own money, for example, or, in general, lead an unsupervised life.
↑ comment by tailcalled · 2015-10-06T19:09:46.433Z · LW(p) · GW(p)
I understand "outside view" a bit more traditionally and treat it as a forecasting technique.
The thing is, you can apply it more widely than just forecasting. Forecasting is just trying to figure out the future, and there's no reason you should limit yourself to the future.
Anyway, the way I see it, in the inside view, both when forecasting and when trying to figure out truth, you focus on the specific problem you are working on, try to figure out its internals, etc. In the outside view, you look at things outside the problem, like the track record of similar things (which I, in my list, called "looks like cultishness"; arguably I could have named that better), others' expectations of your success (hey bank, I would like to borrow money to start a company! what, you don't believe I will succeed?), etc. Perhaps 'outside view' isn't a good term either (which kinda justifies me calling it majoritarianism to begin with...), but whatever. Let's make up some new terms; how about calling them the helpless and the independent views?
Why, yes, I do. In fact, I think it's the normal process of extracting oneself from "a web of lies" -- you start by realizing you're stuck in one. Of course, no one said it would be easy.
Well, how often does it happen?
An example -- religious deconversion. How do you think it will work in your system?
How much detail do you want it in and how general do you want it to be? What is the starting point of the person who needs to be deconverted? Actually, to skip all these kinds of questions, could you give an example of how you would write how deconversion would work in your system?
Well, this theory implies some consequences. For example, it implies high negative correlation between IQ (or more fuzzy "smartness") and the strength of tribal affiliation. Do we observe it?
IQ != rationality. I don't know if there is a correlation, and if there is one, I don't know in which direction. Eliezer has made a good argument that higher IQ gives a wider possible range of rationality, but I don't have the evidence to support that.
Anyway, I at least notice that when people are wrong, it's often because they try to signal loyalty to their tribe (of course, there is often an opposing tribe that is correct on the question where the first one was wrong...). This is anecdotal, though, so YMMV. What do you observe? That people who have made certain answers to certain questions part of their identity are more likely to be correct?
The theory also implies that if the tribal affiliation increases (e.g. because your country got involved in a military conflict), everyone suddenly becomes much dumber. Do we observe that?
...probably? Not so much with military conflicts, because you are not doing as much politics as you are doing fighting, but I generally see that if a discussion becomes political, everybody starts saying stupid stuff.
I don't know about that. You think of winning a debate in high-school debate club terms, or maybe in TV debate terms -- the one who scores the most points with the judges wins. That's not how real life operates. The way for the conspiracy theorist to win the debate is to convince you. Unless you became a believer at the end, he did NOT win the debate. Most debates end in a draw.
But the only reason I don't get convinced is because of the helpless view (and, of course, things like tribalism, but let's pretend I'm a bounded rationalist for simplicity). In the independent view, I see lots of reasons for believing him, and I have no good counterarguments. I mean, I know that I can find counterarguments, but I'm not going to do that after the debate.
In fact, I don't see how your approach is compatible with being on LW.
Again, I believe in an asymmetry between people who have internalized various lessons on tribalism and other people. I agree that if I did not believe in that asymmetry, I would not have good epistemic reasons for being on LW (though I might have other good reasons, such as entertainment).
What can Alice reply to Bob? She is, in fact, not a Ph.D. and has no particular expertise in AI.
"Smart people like Bill Gates, Stephen Hawking and Elon Musk are worried about AI along with a lot of experts on AI."
This should also be a significant factor in her belief in AI risk; if smart people or experts weren't worried, she should not be either.
I don't think you can extrapolate from very-low-IQ people to general population. By the same token, these people should not manage their own money, for example, or, in general, lead an unsupervised life.
I've been in a high-IQ club and not all of them are rational. Take selection effects into account and we might very well end up with a lot of irrational high-IQ people.
↑ comment by Lumifer · 2015-10-06T20:07:00.381Z · LW(p) · GW(p)
there's no reason you should limit yourself to the future
Actually, there is -- the future is the only thing you can change -- but let's not sidetrack too much.
how about calling them the helpless and the independent views?
Sure, good names, let's take 'em.
[Religious deconversion]
The reason I brought it up is that there is no default "do what the mainstream does" position there. The mainstream is religious and the helpless view would tell you to be religious, too.
I don't have much experience with deconversions, but even looking at personal stories posted on LW, they seem to revolve around doubting particular elements on the object level, not on the "this belief is too weird" level.
IQ != rationality.
Well, yes, but "rationality" is not terribly well defined and is a whole other can of worms. In particular, I know how to measure IQ and I know how it's distributed in populations and sub-groups. I do NOT know how to measure the degree of rationality and what's happening with it in populations. That makes discussions of rationality as an empirical variable handwavy and... not grounded in solid data.
because they try to signal loyalty to their tribe
First, signaling loyalty could be a perfectly rational thing to do. Second, there is the issue of the difference between signaling and true beliefs -- signaling something other than what you believe is not uncommon.
the only reason I don't get convinced is because of the helpless view
No, I don't think so. You have priors, don't you? Presumably, quite strong priors about certain things? That's not the same thing as a helpless view. Besides, being convinced or not involves much more than being able to debunk every single point of the argument. Gish Gallop is not a particularly convincing technique, though it's good at scoring points :-/
I believe in an asymmetry between people
How do you know who is who? And who gets to decide? If I am talking to someone, do I first have to classify her as enlightened or unenlightened?
Smart people ... are worried about AI
That's not a winning line of argument -- its appeal to popularity can easily be shut down by pointing out that a lot more smart people are not worried, and the helpless approach tells you not to pick fringe views.
we might very well end up with a lot of irrational high-IQ people.
The basic question is, how do you know? In particular, can you consistently judge the rationality of someone of noticeably higher IQ?
↑ comment by tailcalled · 2015-10-06T22:14:58.301Z · LW(p) · GW(p)
The reason I brought it up is that there is no default "do what the mainstream does" position there. The mainstream is religious and the helpless view would tell you to be religious, too.
Of course, but you can ask what the asymmetry is between $yourcountry, the USA, Germany, Italy, Japan and Israel (or whichever group of places you prefer). These places have wildly different attitudes to religion (or, well, at least they follow somewhat different religions), with no one being in a better position for figuring out the right religion, so you can conclude that while some religion must be correct, we don't know which one.
I don't have much experience with deconversions, but even looking at personal stories posted on LW, they seem to revolve around doubting particular elements on the object level, not on the "this belief is too weird" level.
Something something selection bias.
Anyway, I don't know about religious deconversion, but I know I've had a lot of stupid political views that I've removed by using helpless view.
IIRC my brother deconverted via helpless view, but I might misremember. Either way, that would be n=1, so not that useful.
Well, yes, but "rationality" is not terribly well defined and is a whole other can of worms. In particular, I know how to measure IQ and I know how it's distributed in populations and sub-groups. I do NOT know how to measure the degree of rationality and what's happening with it in populations. That makes discussions of rationality as an empirical variable handwavy and... not grounded in solid data.
I quite like Eliezer's suggestion of using the question of MWI as a test for rationality, but I'm biased; I independently discovered it as a child. :P
First, signaling loyalty could be a perfectly rational thing to do. Second, there is the issue of the difference between signaling and true beliefs -- signaling something other than what you believe is not uncommon.
The problem here is that there isn't necessarily a difference between signaling and true beliefs. Imagine your outgroup saying the most ridiculous thing you can think of. That thing is likely a kind of signaling, but in some ways (not all, though) it acts like a belief.
You have priors, don't you?
... I can sometimes for simplicity's sake be modeled as having priors, but we're all monkeys after all. But yeah, I know what you mean.
Presumably, quite strong priors about certain things?
Sure. But if I lived in a world where most people believed the holocaust is a hoax, or a world where it was controversial whether it was a hoax but the knowledge of the evidence was distributed in the same way as it is today, I'm pretty sure I would be a holocaust denier.
(Of course, in the second case the evidence in favor of the holocaust having happened would rapidly emerge, completely crush the deniers, and send us back to the current world, but we're "preventing" this from happening, counterfactually.)
Anyway, this shows that a large part of my belief in the holocaust comes from the fact that everybody knows holocaust deniers are wrong. Sure, the evidence in favor of the holocaust is there, but I (assume I would have (I haven't actually bothered checking what the deniers are saying)) no way of dealing with the deniers' counterarguments, because I would have to dig through mountains of evidence every time.
(If holocaust deniers are actually trivial to disprove, replace them with some other conspiracy theory that's trickier.)
How do you know who is who? And who gets to decide? If I am talking to someone, do I first have to classify her as enlightened or unenlightened?
Well, most of the time, you're going to notice. Try talking politics with them; the enlightened ones are going to be curious, while the unenlightened ones are going to push specific things. Using the word 'majoritarian' for the helpless view might have made it unclear that in many cases, it's a technique used by relatively few people. Or rather, most people only use it for subjects they aren't interested in.
However, even if you can't tell, most of the time it's not going to matter. I mean, the point isn't to teach lists of biases or debate techniques to every person you talk to.
That's not a winning line of argument -- its appeal to popularity can easily be shut down by pointing out that a lot more smart people are not worried, and the helpless approach tells you not to pick fringe views.
Gates is one of the most famous people within tech, though. That's not exactly fringe.
Actually, I just re-read your scenario. I had understood it as if Alice subscribed to the helpless view. I think that in this case, Bob is making the mistake of treating the helpless view as an absolute truth rather than a convenient approximation. I wouldn't dismiss entire communities based on weak helpless-view knowledge; it would have to be either strong helpless-view knowledge (as with many conspiracy theories) or independent-view knowledge.
In the case described in the OP, we have strong independent-view knowledge that the pseudoscience stuff is wrong.
The basic question is, how do you know? In particular, can you consistently judge the rationality of someone of noticeably higher IQ?
I think so. I mean, the club even had a Seer who hosted a reasonably popular event there, so... yeah. IQ high, rationality at different levels.
Also, 'noticeably higher IQ' is ambiguous. Do you mean 'noticeably higher IQ' than I have? Because it was just an ordinary high-IQ thing, not an extreme IQ thing, so it's not like I was lower than the average of that place. I think its minimum IQ was lower than the LW average, but I might be mixing up some stuff.
↑ comment by Lumifer · 2015-10-07T15:10:10.343Z · LW(p) · GW(p)
so you can conclude that while some religion must be correct, we don't know which one.
Sorry, under the helpless approach you cannot conclude anything, much less on the basis of something as complex as a cross-cultural comparative religion analysis. If you are helpless, you do what people around you do and think what people around you think. The end.
I quite like Eliezer's suggestion of using the question of MWI as a test for rationality
Oh-oh. I'm failing this test hard.
Besides, are you quite sure that you want to make an untestable article of faith with zero practical consequences the criterion for rationality? X-/
But if I lived in a world where most people believed the holocaust is a hoax ... I'm pretty sure I would be a holocaust denier.
That's an argument against the helpless view, right? It sure looks this way.
Well, most of the time, you're going to notice.
Well, yes, I'm going to notice and I generally have little trouble figuring out who's stupid and who is smart. But that's me and my personal opinion. You, on the other hand, are setting this up as a generally applicable rule. The problem is who decides. Let's say I talk a bit to Charlie and decide that Charlie is stupid. Charlie, on the basis of the same conversation, decides that I'm stupid. Who's right? I have my opinion, and Charlie has his opinion, and how do we resolve this without pulling out IQ tests and equivalents?
It's essentially a power and control issue: who gets to separate people into elite and masses?
Do you mean 'noticeably higher IQ' than I have?
In your setup there is the Arbiter -- in your case, yourself -- who decides whether someone is smart (and so is allowed to use the independent approach) or stupid (and so must be limited to the helpless approach). This Arbiter has a certain level of IQ. Can the Arbiter judge the smartness/rationality of someone with noticeably higher IQ than the Arbiter himself?
↑ comment by tailcalled · 2015-10-07T17:30:26.208Z · LW(p) · GW(p)
Sorry, under the helpless approach you cannot conclude anything, much less on the basis of something as complex as a cross-cultural comparative religion analysis. If you are helpless, you do what people around you do and think what people around you think. The end.
It seems like we are thinking of two different views, then. Let's keep the name 'helpless view' for mine and call yours 'straw helpless view'.
The idea behind helpless view is that you're very irrational in many ways. Which ways?
You're biased in favor of your ingroups and your culture. From the inside, this feels like your ingroups simply being universally correct, but you can tell that it is a bias from the fact that your outgroups act similarly confident.
You're biased in favor of elegant convincing-sounding arguments rather than hard-to-understand data.
Your computational power is bounded, so you need to spend a lot of resources to understand things.
There are obviously more, but biases similar to those are the ones the helpless view is intended to fight.
The way it fights those biases is by not allowing object-level arguments, arguments that favor your ingroups or your culture over others, and things like that.
Instead, in helpless view, you focus on things like:
International mainstream consensus. (Things like cross-cultural analysis on opinions, what organizations like the UN say, etc.)
Expert opinions. (If the experts, preferably in your outgroup, agree that something is wrong, rule it out. Silence on the issue does not let you rule it out.)
Things that you are an expert on. (Looking things up on the web does not count as expert.)
What the government says.
(The media are intentionally excluded.)
Oh-oh. I'm failing this test hard.
evil grin
Besides, are you quite sure that you want to make an untestable article of faith with zero practical consequences the criterion for rationality? X-/
Nah, it was mostly meant as a semi-joke. I mean, I like the criterion, but my reasons for liking it are not exactly unbiased.
If I were to actually make a rationality test, I would probably look at the ingroups/outgroups of the people I make the test for, include a bunch of questions about facts where there is a lot of ingroup/outgroup bias, and look at the answers to those.
That's an argument against the helpless view, right? It sure looks this way.
Except that we live in the current world, not the counterfactual world, and in the current world the helpless view tells you not to believe conspiracy theories.
Well, yes, I'm going to notice and I generally have little trouble figuring out who's stupid and who is smart. But that's me and my personal opinion. You, on the other hand, are setting this up as a generally applicable rule. The problem is who decides. Let's say I talk a bit to Charlie and decide that Charlie is stupid. Charlie, on the basis of the same conversation, decides that I'm stupid. Who's right? I have my opinion, and Charlie has his opinion, and how do we resolve this without pulling out IQ tests and equivalents?
It's essentially a power and control issue: who gets to separate people into elite and masses?
I dunno.
For what purpose are you separating the people into elite and masses? If it's just a question of who to share dangerous knowledge to, there's the obvious possibility of just letting whoever wants to share said dangerous knowledge decide.
In your setup there is the Arbiter -- in your case, yourself -- who decides whether someone is smart (and so is allowed to use the independent approach) or stupid (and so must be limited to the helpless approach). This Arbiter has a certain level of IQ. Can the Arbiter judge the smartness/rationality of someone with noticeably higher IQ than the Arbiter himself?
I don't know, because I have a really high IQ, so I don't usually meet people with noticeably higher IQ. Do you have any examples of ultra-high-IQ people who write about controversial stuff?
↑ comment by Lumifer · 2015-10-07T18:23:03.642Z · LW(p) · GW(p)
Instead, in helpless view, you focus on things like: International mainstream consensus.
Hold on. I thought the helpless view was for the "dumb masses". They are certainly not able to figure out what the "international mainstream consensus" is. Hell, even I have no idea what it is (or even what it means).
A simple example: Western democracy. What's the "international mainstream consensus"? Assuming it exists, I would guess it says that the Western-style democracy needs a strong guiding hand lest it devolves into degeneracy and amoral chaos. And hey, if you ask the experts in your outgroup (!) they probably wouldn't be great fans of the Western democracies, that's why they are in the outgroup to start with.
I have a feeling you want the helpless view to cherry-pick the "right" advice from the confusing mess of the "international consensus" and various experts and government recommendations. I don't see how this can work well.
in the current world the helpless view tells you not to believe conspiracy theories.
Heh. You know the definitions of a cult and a religion? A cult is a small unsuccessful religion. A religion is a large successful cult.
In exactly the same way what gets labeled a "conspiracy theory" is already a rejected view. If the mainstream believes in a conspiracy theory, it's not called a "conspiracy theory", it's called a deep and insightful analysis. If you were to live in a culture where Holocaust denial was mainstream, it wouldn't be called a conspiracy theory, it would be called "what all right-minded people believe".
For what purpose are you separating the people into elite and masses?
For the purpose of promoting/recommending either the independent view or the helpless view.
↑ comment by tailcalled · 2015-10-07T19:32:20.051Z · LW(p) · GW(p)
By the way, you asked for a helpless-view deconversion. TomSka just posted one, so...
↑ comment by tailcalled · 2015-10-07T18:52:12.193Z · LW(p) · GW(p)
Hold on. I thought the helpless view was for the "dumb masses". They are certainly not able to figure out what the "international mainstream consensus" is. Hell, even I have no idea what it is (or even what it means).
The "dumb masses" here are not defined as being low-IQ, but just low-rationality. Low-IQ people would probably be better served just doing what people around them are doing (or maybe not; I'm not an expert in low-IQ people).
A simple example: Western democracy. What's the "international mainstream consensus"?
Well, one of the first conclusions to draw with helpless view is "politics is too complicated to figure out". I'm not sure I care that much about figuring out if democracy is good according to helpless view. The UN seems to like democracy, and I would count that as helpless-view evidence in favor of it.
I would guess it says that the Western-style democracy needs a strong guiding hand lest it devolves into degeneracy and amoral chaos.
I would guess that there is an ambiguously pro-democratic response. 48% of the world lives in democracies, and the places that aren't democratic probably don't agree as much on how to be un-democratic as the democratic places agree on how to be democratic.
For the purpose of promoting/recommending either the independent view or the helpless view.
Whoever does the promoting/recommending seems like a natural candidate, then.
↑ comment by Kawoomba · 2015-10-05T16:04:21.703Z · LW(p) · GW(p)
Hey! Hey. Heh. Careful there, apropos word inflation. It strikes with a force of no more than one thousand atom bombs.
Are you really arguing for keeping ideologically incorrect people barefoot and pregnant, lest they harm themselves with any tools they might acquire?
Sounds as good a reason as any!
maybe we should shut down LW
I'm not sure how much it counts, but I bet Chief Ramsay would've shut it down long ago. Betting is good, I've learned.
↑ comment by lfghjkl · 2015-10-06T10:06:30.828Z · LW(p) · GW(p)
LOL. Word inflation strikes again with a force of a million atomic bombs! X-)
Knowing About Biases Can Hurt People has already been linked in this thread here. It seems to be the steelman of tailcalled's position and I suggest you argue against it instead of trying to score cheap points by pointing out how tailcalled uses "wrong" words to express himself.
↑ comment by Lumifer · 2015-10-06T17:36:40.387Z · LW(p) · GW(p)
I am not much concerned about "wrong" words, other than that they might generate misunderstanding and confusion, but it does seem to me that tailcalled and I have real (not definitional) differences and disagreements.
I suggest you argue against it.
I argue with live, present people. If you want to point out the many ways in which I'm wrong, jump in :-) But I am not going to argue with texts -- among other things, they don't answer back.
↑ comment by Richard_Kennaway · 2015-10-05T21:56:47.279Z · LW(p) · GW(p)
That seems extremely dangerous.
Everything is dangerous.
If it works, it can be misapplied.
If it doesn't work, it displaces effort from things that do work.
↑ comment by tailcalled · 2015-10-05T22:07:47.424Z · LW(p) · GW(p)
Sure, but inside view/contrarianism/knowledge of most biases seem like things that ideally should be reserved for when you know what you're doing, which the person described in the OP probably doesn't.
↑ comment by Richard_Kennaway · 2015-10-05T21:55:23.354Z · LW(p) · GW(p)
That seems extremely dangerous.
Everything is dangerous.
If it works, it's dangerous, because it can be misapplied.
If it doesn't work, it's dangerous, because it displaces effort from things that do work.
comment by [deleted] · 2015-10-06T11:40:13.556Z · LW(p) · GW(p)
Ask them the same question I suggest you ask yourself in order to be less wrong: "What evidence or argument would convince you this specific belief is in error or inconclusive?" If the answer is 'I don't know', then consider finding out. If the answer is 'none', then it's not an evidence- or argument-based choice, so evidence or argument won't change anything. If the answer is 'this evidence / argument', then either that evidence / argument is there or it isn't. If it isn't, then the claim is provisionally true, and in no way off limits to further questioning.
Decide if you are talking to your friend or if you are talking to the abstraction of 'pseudoscience', and don't confuse the two.
comment by Crux · 2015-10-05T16:15:12.758Z · LW(p) · GW(p)
While happy thoughts seem exceedingly unlikely to cure any case of cancer by themselves, persistent unhappiness can certainly lead to many physiological changes which are associated with degenerative disease and cancer risk (stress, trouble sleeping, and so on).
As Lumifer pointed out, it's important to consider what the practical consequences are of their beliefs. If the person you're referring to simply believes that engendering a sustainably happy state of mind will decrease their cancer risk, then I doubt there's anything to worry about. But if you would expect them to refuse a proven surgical technique and instead attempt to cure themselves by hanging out with their friends and watching fun movies, then surely it would be a highly useful service to this individual to point them in a better ideological direction.
Don't introduce them to a catalog of logical fallacies. Understanding a few important logical fallacies can help people who possess a propensity to propagate new conceptions through their web of beliefs, figuring out which beliefs should stay and which should go based on their new theory. But most people don't operate in this way. Updating all the different areas of one's belief structure in accordance to a newly acquired abstract tool doesn't come naturally to most individuals. If you coax a friend down the path toward Less-Wrong-style rationality, then there may come a day where reading 37 Ways Words Can Be Wrong would be quite an enlightening experience for them. But that day is probably not today.
I wonder whether your impression that the world of pseudo-science is a rosy one, and rationality is a window into the bleak reality of human life, is the key to the frustration you're communicating here. The only language that your non-rationalist friends will appreciate is the language of concrete results. If you can't employ your ability to think rationally to become noticeably better than them at activities they pursue in a serious way, give them health advice which to them seems to miraculously clear up certain long-standing inconveniences in their life, etc., then you're not giving them any evidence that your way of thinking is better than theirs.
Use your capacity for rational thinking to succeed in concrete endeavors, and then demonstrate to them the results of your competence. One day they may ask to look under the hood--to see the source of your impressive abilities. And then the time will have come to introduce them to the abstract rationality concepts you consider important.
↑ comment by Petter · 2015-10-10T15:40:20.798Z · LW(p) · GW(p)
Stress and having trouble sleeping causes cancer?
↑ comment by Crux · 2015-10-10T17:07:38.539Z · LW(p) · GW(p)
Both conditions greatly decrease the body's ability to repair and heal. If you believe that the body has any sort of immune response to the proliferation of cancerous cells, then it would follow that stress and sleep deprivation would increase the likelihood of getting cancer.
I don't have an estimate for how much of a factor this is besides noting it as simply one more reason to make sure to avoid chronic stress and sleep deprivation.
comment by LizzardWizzard · 2015-10-05T13:36:30.447Z · LW(p) · GW(p)
This topic is valuable to me. Every time my girlfriend and I get into an argument, it ends badly and we both go on holding our prior beliefs. She blames me for being too rational (in the Hollywood sense, obviously; none of my efforts to convince her that the word has a different meaning paid off). She is absolutely sure that when you hold a skeptical position, you get trapped in it, so the entry gate for the outside view is closed and you are limiting the spectrum of possibilities. With this I'm okay, because I like to think of myself as an open-minded person and I can assume that the universe is not objective. But I'm rapidly getting tired when it comes to secret meanings of events and perfect causality and total misunderstanding of randomness. In her head there is a strange mash of esotericism, Buddhism and astrology, somehow combined with evolutionary biology in perfect harmony. I'm not sure whether I should try harder to convert her or give up. Maybe I gave up already, because it seems that open-minded people lack neuroplasticity.
↑ comment by ChristianKl · 2015-10-06T18:35:46.155Z · LW(p) · GW(p)
I'm not sure whether I should try harder to convert her or give up.
There's no reason to think that trying harder would produce bigger effects. If your goal is to convince her, it's very important to pick your battles.
I had a disagreement with my girlfriend about the health effects of the microwave. She thinks that feeding a plant only microwaved water will kill it. That happens to be a quite testable belief and we might run the experiment even if I don't believe that it will produce additional knowledge for me.
You could check whether her beliefs make predictions about the real world and then play credence calibration games with her. That trains the important rationalist skill of predicting, and she might find out that some of her beliefs don't make true predictions.
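(A minimal sketch of how such a credence calibration game could be scored, assuming you log each prediction with a stated confidence and later record whether it came true; the Brier score and the example predictions are illustrative choices, not anything prescribed in this thread.)

```python
# Minimal sketch of scoring a credence-calibration game (illustrative only).
# Each entry: (statement, stated confidence in percent, whether it came true).
predictions = [
    ("The plant watered only with microwaved water dies within a month", 80, False),
    ("It will rain next Saturday", 60, True),
    ("The package arrives before Friday", 90, True),
]

def brier_score(entries):
    """Mean squared error between stated probability and outcome; lower is better."""
    return sum((conf / 100 - (1.0 if outcome else 0.0)) ** 2
               for _, conf, outcome in entries) / len(entries)

def calibration_report(entries):
    """For each stated confidence level, show how often those predictions came true."""
    buckets = {}
    for _, conf, outcome in entries:
        buckets.setdefault(conf, []).append(outcome)
    for conf in sorted(buckets):
        outcomes = buckets[conf]
        observed = 100.0 * sum(outcomes) / len(outcomes)
        print(f"stated {conf}%: {observed:.0f}% came true ({len(outcomes)} prediction(s))")

print(f"Brier score: {brier_score(predictions):.3f}")
calibration_report(predictions)
```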
↑ comment by LizzardWizzard · 2015-10-07T17:44:02.140Z · LW(p) · GW(p)
Good point. However, most of her opinions seem to be unfalsifiable. For example, how can I tell that the Dao technique of the Inner Smile doesn't work? Maybe I'm just not smiling at my organs sincerely enough.
↑ comment by ChristianKl · 2015-10-07T21:55:07.145Z · LW(p) · GW(p)
The point isn't to try to falsify her opinions but to generally encourage her to make predictions about reality, and to make predictions about reality yourself as well.
If you look at the Dao technique of the Inner Smile, you might start by asking her whether she thinks she herself can perceive whether or not she's doing the technique sincerely.
Don't get attached to having a debate about the truth of individual beliefs; focus on actually making predictions.
comment by polymathwannabe · 2015-10-05T14:51:30.353Z · LW(p) · GW(p)
Remember that a little bit of rationality in gullible hands can have unpleasant consequences. Next thing you know, they'll be the ones accusing you of being unreasonable.
There is a range of possible courses of action open to you. On one end, you can happily live with your friends' strange ideas and not interfere at all, even when they teach the same ideas to other people or to their children. On the other end, you can make new friends whose ideas are closer to yours and let your old friends fade from your life. My personal experience has been a case-by-case mix of both approaches.
comment by entirelyuseless · 2015-10-05T13:29:19.254Z · LW(p) · GW(p)
It seems likely to me that happy thoughts would in fact reduce the rate of death by cancer, although not by very much.
comment by Richard_Kennaway · 2015-10-05T22:04:27.092Z · LW(p) · GW(p)
The discussion so far has all been hypothetical. Does anyone who has been in this situation (on either side) have any case studies to offer? I have none.
comment by DavidPlumpton · 2015-10-06T19:44:32.274Z · LW(p) · GW(p)
Possibly try asking something like: "You're good at finding points that back up your beliefs, but you also need to spend time thinking about points that might contradict your beliefs. How many contradictory points can you think of over the next five minutes?"
comment by Strangeattractor · 2015-10-06T12:57:59.428Z · LW(p) · GW(p)
One approach may be to see if you can find the scientific research that some of the hype is extrapolated from, and discuss that. In the case of Bruce Lipton, that may mean finding and discussing scientific papers about epigenetics and about the effects of low-level magnetic fields on biological systems. If you read the actual papers, and understand them enough to explain them to someone with less of a scientific background, then that could be a starting point for discussing the topics with your friends.
I'm not sure if that will help. But many things that look like pseudoscience have something real that is related to them. Talking about the real things can help separate them out from the bullshit.
Science journalism in general is pretty terrible. Someone has to go way beyond what is offered by mainstream media to have any kind of clue. It's a lot of work, and a lot to expect the average person to do, especially when so many scientific journals are paywalled.
Instead of attempting to shatter the illusion, one thing to do may be to demonstrate techniques for dealing with things when they are uncertain, and out of one's control. If it is a psychological crutch, then having techniques to replace it, rather than a new vision of the world, may remove the need for the crutch.
Some people are more open to different points of view than others. I would start with the people who are at least somewhat open to considering other ideas. And also be prepared to listen to them and find out what it is that they think is important about their beliefs. You may share more common ground than you think. Or they may have had personal experiences that are extremely different from yours. You can probably learn something.
comment by [deleted] · 2015-10-05T15:46:38.239Z · LW(p) · GW(p)
My own impression is that much of pseudoscience is framed in such a way that it cannot be disproven (it lacks falsifiability). I once had a friend discuss chakra networks with me. At one point he said: "Don't you think it's possible for the human body to exchange energy with the outside world?" This is technically true. Philosophy can sometimes make some very ridiculous arguments too, but those cannot be disproven either. I would instead recommend explaining scientific philosophy and why this has proven so successful.
↑ comment by ChristianKl · 2015-10-05T19:31:26.728Z · LW(p) · GW(p)
I would instead recommend explaining scientific philosophy and why this has proven so successful.
How would you go about explaining that to someone who thinks that Bruce Lipton is a scientist because he was a professor of anatomy at a respected university, and that to the extent other scientists disagree with him, it's simply a case of scientific controversy?
comment by ChristianKl · 2015-10-05T19:34:44.139Z · LW(p) · GW(p)
In general you have to keep in mind that if you challenge someone's beliefs but don't convince them that those beliefs are wrong, there's a good chance that you increase the strength of their beliefs.
comment by Lumifer · 2015-10-05T15:02:42.240Z · LW(p) · GW(p)
should I do anything about it?
Are there any practical consequences of these beliefs? As long as they are not telling cancer patients to skip the therapy and think happy, I don't see any harm. Trying to fix other people's beliefs just because you don't like them seems to be... not a terribly productive thing to do.
that also means that I'll be standing idly by and allowing bullshit to propagate
Have you looked at a TV screen recently..?
↑ comment by gjm · 2015-10-05T16:32:53.846Z · LW(p) · GW(p)
As long as they are not telling cancer patients to skip the therapy
If they really believe that that's the best thing for cancer patients to do, then there's a very real chance that they will do that (or, if the cancer is their own, just skip the therapy themselves). There may be value in trying to improve their thinking in advance, because once they or someone close to them actually has cancer it may be too late. (Because people don't usually make radical changes in their thinking quickly.)
Whether that outweighs the other factors here, I don't know. Especially given how reluctantly people change their minds.
↑ comment by ChristianKl · 2015-10-05T19:25:05.679Z · LW(p) · GW(p)
If they really believe that that's the best thing for cancer patients to do, then there's a very real chance that they will do that
Thinking that it's possible for some people to cure cancer via thought in no way implies that all people who try succeed in that way. The traditional response of a person with a cancer diagnosis is to try all things that promise help. Additional beliefs are needed to advise people against mainstream interventions.
Telling the person with cancer to think happy thoughts isn't harmful. It can have positive placebo effects.
↑ comment by Lumifer · 2015-10-05T16:41:18.597Z · LW(p) · GW(p)
there's a very real chance that they will do that
Define "very real". I don't think it's a serious threat -- in such situations a stern talking-to from a doctor is usually more than sufficient. To stick to one's guns in the face of opposition from the mainstream and the authority figures (like doctors) requires considerably more arrogance and intestinal fortitude than most people have. Fanatics, thankfully, are rare.
↑ comment by pianoforte611 · 2015-10-06T22:25:08.985Z · LW(p) · GW(p)
I'm not sure what country you live in, but according to a relative of mine who works in a cancer treatment centre, there are a fairly large number of patients who eschew treatment in favor of, for instance, herbal remedies. They eventually get treatment when said remedies don't work, but the cancer has gotten worse by then. It's partly false beliefs, wishful thinking, or just avoidance of the issue. Do very many people really believe that a herbal treatment is going to cure cancer and the whole medical community is stupid? No, but for many people it gives them enough to pretend that everything is going to be okay and they don't have to worry.