Why do you really believe what you believe regarding controversial subjects?
post by iarwain1 · 2015-01-04T14:32:19.829Z · LW · GW · Legacy · 38 comments
For every controversial subject I've heard of, there are always numerous very smart experts on either side. So I'm curious how it is that rational non-experts come to believe one side or the other.
So, what are your meta-arguments for going with one side or the other for any given controversial subject on which you have an opinion?
- Have you researched both sides so thoroughly that you consider yourself equal to or better than the opposing experts? If so, to what do you attribute the mistakes of your counterparts? Have you carefully considered the possibility that you are the one who's mistaken?
- Do you think that one side is more biased than the other? Why?
- Do you think that one side is more expert than the other? Why?
- Do you rely on the majority of experts? (I haven't worked out for myself if going with a majority makes sense, so if you have arguments for or against this meta-argument then please elaborate; a toy model of this question is sketched after this list.)
- Do you think that there are powerful arguments that simply haven't been addressed by the other side? To what do you attribute the fact that these arguments haven't been addressed?
- Do you have other heuristics or meta-arguments for going with one side or the other?
- Do you just remain more or less an agnostic on every controversial subject?
- Or do you perhaps admit that ultimately your beliefs are at least partially founded on non-rational reasons?
- Do you think that this whole discussion is misguided? If so, why?
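On the majority-of-experts question above, a toy model helps show both why head-counting can carry real weight and where it breaks down. The sketch below is an illustration added here, not something from the post: it assumes each expert is independently right 60% of the time (the Condorcet jury theorem setting) and then adds an illustrative 30% chance that the whole field shares a correlated bias. All numbers are placeholder assumptions.

```python
import random

def p_majority_correct(n_experts, p_each_correct, p_shared_bias=0.0, trials=20000):
    """Estimate the probability that a simple majority of experts is right.

    p_shared_bias models correlated error: with this probability the whole
    field answers from a shared (possibly mistaken) consensus instead of
    judging independently.
    """
    hits = 0
    for _ in range(trials):
        if random.random() < p_shared_bias:
            # The field moves as one bloc; treat its shared view as a coin flip.
            majority_right = random.random() < 0.5
        else:
            correct_votes = sum(random.random() < p_each_correct
                                for _ in range(n_experts))
            majority_right = correct_votes > n_experts / 2
        hits += majority_right
    return hits / trials

for n in (1, 5, 25, 101):
    independent = p_majority_correct(n, 0.6)
    correlated = p_majority_correct(n, 0.6, p_shared_bias=0.3)
    print(f"{n:>3} experts: independent {independent:.3f}   with shared bias {correlated:.3f}")
```

Under independence the majority becomes very strong evidence as the expert pool grows; with even a modest shared bias the head-count's reliability is capped. The qualitative lesson is that deferring to a majority rests almost entirely on how independent the experts' judgments are.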
- Often after research it turns out that there are a surprising number of important points on which both sides actually agree.
- It often turns out that one or both sides are not as confident about their positions as it might initially seem.
- Often there are a number of sub-issues for which some of the above meta-arguments apply even if they might not apply to the broader issues. For example, perhaps there is a vast majority of experts who agree on a certain sub-issue even while debating the broader subject.
- Occasionally the arguments ultimately boil down to things that fall outside the domain of rational debate.
- Sometimes someone may appear on the surface to be an expert, but on further research it turns out that they are relying on arguments outside their field of expertise. For example, many studies are faulty due to subtle statistical issues; the authors may be expert scientists or researchers, but that kind of statistics falls outside their domain of expertise.
- Occasionally I've come up with an argument (usually a domain-specific meta-argument of some sort) that I'm pretty sure even the experts on the other side would agree with, and for which I can give a good argument why they haven't addressed this particular argument before. Of course, I need to take my own arguments with a large grain of "I'm not really an expert on this" salt. But I've also in the past simply contacted one of the experts and asked him what he thought of my argument - and he agreed. In that particular instance the expert didn't change his mind, but the reason he gave for not changing his mind made me strongly suspect him of bias.
- For a few issues, especially some of the really small sub-issues, it's actually not all that hard to become an expert. You take out a few books from your local university library, read the latest half dozen articles published on the topic, and that's about it. Of course, even after you're an expert you should still probably take the outside view and ask why you think your expert opinion is better than the other guy's. But it's still something, and perhaps you'll even be able to contribute to the field in a meaningful way and change some others' opinions. At the very least you'll likely be in a better position to judge other experts' biases and levels of expertise.
38 comments
comment by Dustin · 2015-01-04T16:03:51.860Z · LW(p) · GW(p)
So, what are your meta-arguments for going with one side or the other for any given controversial subject on which you have an opinion?
Don't play the game!
My goal state of mind on subjects on which there truly is not an expert consensus is to acknowledge that there is no consensus and thus not choose one side or another.
If I have to make a concrete decision about one of these subjects (who or what to vote for, for example) I just choose a side while making a conscious effort to recognize that I picked a side for this one specific decision and that I did not just choose a tribe.
By "choose a side" I really mean I try to weigh all the evidence for a side as best as time and ability allow and I also try to account for the costs of choosing the wrong side for this specific decision.
The important part is remembering the danger of incorporating a side into your identity. (See Graham's essay on the idea: http://www.paulgraham.com/identity.html)
I'm still not sure it's possible to make a conscious, successful effort to not make these things a part of your identity, but it's what I strive for.
↑ comment by Vladimir_Nesov · 2015-01-05T11:59:19.907Z · LW(p) · GW(p)
By "choose a side" I really mean I try to weigh all the evidence for a side as best as time and ability allow
(I'm going to run with the hypothetical where someone meant this literally, even if you didn't.)
This should be "evidence against the side", if the choice is between primarily "for" and "against". Once you've made a tentative decision, additional evidence selected to support that decision won't change it, and so it does no useful work. There's also confirmation bias working in this direction. If on the other hand you focus on looking primarily for the opposing evidence, you may oscillate between various positions too much, but at least you'd be learning something in the process.
There is also a mode of gathering evidence where you improve understanding of the arguments already used to form your position. This understanding doesn't necessarily come with claims about how it'd sway your conclusions; it's motivated by value of information. The process of examining confirming arguments may look like gathering of more confirming evidence, even if the outcome may be the opposite.
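To make the for-versus-against point concrete, here is a minimal simulation (an added sketch, not something from this exchange) under hypothetical assumptions: the "pro" position is in fact false, and each piece of evidence independently points the right way 70% of the time. An updater who quietly drops disconfirming items ends up confident in the wrong answer, while the same updater using everything converges on the truth.

```python
import math
import random

def average_posterior(n_items, keep_only_confirming, p_informative=0.7, trials=5000):
    """Average posterior credence in the (actually false) 'pro' position
    after examining n_items pieces of evidence."""
    log_lr = math.log(p_informative / (1 - p_informative))  # weight of one item
    total = 0.0
    for _ in range(trials):
        log_odds = 0.0  # start from a prior credence of 0.5
        for _ in range(n_items):
            # Truth is 'anti', so an item supports 'pro' only 30% of the time.
            supports_pro = random.random() < (1 - p_informative)
            if keep_only_confirming and not supports_pro:
                continue  # quietly drop evidence against the chosen side
            log_odds += log_lr if supports_pro else -log_lr
        total += 1 / (1 + math.exp(-log_odds))
    return total / trials

print("update on all evidence:   ", round(average_posterior(20, False), 3))  # near 0: converges on the truth
print("update on confirming only:", round(average_posterior(20, True), 3))   # near 1: confident and wrong
```

The only difference between the two printed numbers is the filter that discards disconfirming items, which is the sense in which evidence selected to support a decision does no useful work.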
comment by Vladimir_Nesov · 2015-01-05T12:09:29.327Z · LW(p) · GW(p)
These posts at Slate Star Codex seem relevant:
Beware The Man Of One Study
But I worry that most smart people have not learned that a list of dozens of studies, several meta-analyses, hundreds of experts, and expert surveys showing almost all academics support your thesis – can still be bullshit.
The Control Group Is Out Of Control
How do I know it’s crap? Well, I use my personal judgment. How do I know my personal judgment is right? Well, a smart well-credentialed person like James Coyne agrees with me. How do I know James Coyne is smart? I can think of lots of cases where he’s been right before. How do I know those count? Well, John Ioannidis has published a lot of studies analyzing the problems with science, and confirmed that cases like the ones Coyne talks about are pretty common. Why can I believe Ioannidis’ studies? Well, there have been good meta-analyses of them. But how do I know if those meta-analyses are crap or not? Well…
comment by buybuydandavis · 2015-01-05T00:59:09.291Z · LW(p) · GW(p)
First thing to do is separate out disagreements on facts from disagreements on predictions, disagreements on predictions about interventions, and disagreements on preferences.
One thing to note about "experts". They may know their field, but statistical inference and decision theory are a part of most controversies, and next to no one knows what they are talking about there.
↑ comment by alienist · 2015-01-05T02:23:28.412Z · LW(p) · GW(p)
One thing to note about "experts". They may know their field
In a number of fields even this is dubious. In some fields the only apparent qualification for being an "expert" is to claim to be one to a reporter and then tell the reporter what he wants to hear.
↑ comment by buybuydandavis · 2015-01-05T04:25:19.790Z · LW(p) · GW(p)
Some people aren't experts at anything but being media personalities.
Then there are some fields where "expertise" is measured by group applause, without any demonstration by anyone, anywhere that any of them can actually do anything concrete.
↑ comment by is4junk · 2015-01-05T02:10:06.703Z · LW(p) · GW(p)
In addition to those separations, I wish the argument template could be separated out.
↑ comment by buybuydandavis · 2015-01-05T04:28:47.508Z · LW(p) · GW(p)
My head just isn't clicking on a meaning for your comment. Could you elaborate?
comment by Dentin · 2015-01-05T22:37:42.645Z · LW(p) · GW(p)
Have you researched both sides so thoroughly that you consider yourself equal to or better than the opposing experts? If so, to what do you attribute the mistakes of your counterparts? Have you carefully considered the possibility that you are the one who's mistaken?
In many cases, yes - I do consider myself equal to or better than the opposing experts. I also think long and hard about being the one who's wrong, and I have changed my mind on many topics as a result.
In a number of cases which are not particularly political, I can attribute the mistakes of my counterparts to lack of understanding in the field.
However, in the other cases, I can also point at a deeper, more troubling concern: an unwillingness to use computable functions, a willingness to use 'incomparables' or infinite value weightings. Risk to human life from nuclear waste can be computed in dollar terms, yet activists refuse to do so. Similarly, value of a fetus can also be computed in dollar terms, but such is never done. In my eyes, this is a clear and obvious mistake.
Do you rely on the majority of experts?
In many cases, yes. By looking at the bulk of a field, neighboring related fields, advocates and detractors, you can get a good feel for a topic:
- Human-induced climate change has the vast bulk of experts advocating for it, with a rather small minority of detractors, most of whom are below average for the expert pool.
- Anti-evolution creationists have effectively no experts in their corner.
- The vast majority of expert economists look at economic crashes through the lens of 'this is the most complicated construct mankind has ever created'. The minorities of economists who loudly advocate for singular failure cases and singular fixes are not only in the minority, but often conflict with each other in dramatic ways.
In cases where it appears that the majority is wrong, I've generally found that it's because the majority doesn't actually consist of experts on the specific topic. (See below in regard to doctors and molecular biology.) There are also situations where the pool of experts is quite small indeed, for example medical doctors with physics/nanotechnology backgrounds, where a more general pool would produce the wrong consensus.
Do you think that there are powerful arguments that simply haven't been addressed by the other side? To what do you attribute the fact that these arguments haven't been addressed?
Sometimes, yes. An excellent example is that the vast majority of doctors do not believe that aging is reversible and the diseases of old age are curable. This largely appears to be because most doctors are not molecular biologists, and do not really have the skills to evaluate what is actually possible in molecular biology (and what is expected to become possible in the next few decades.)
In this case, the 'experts' are from related fields, but are not as expert on the specific topic as would be expected.
Do you just remain more or less an agnostic on every controversial subject?
For all the subjects I have fairly strong opinions on, there are a lot on which I don't, because I lack the expertise. Even in some areas where I would be expected to have an opinion, I don't - I am a software engineer, yet I am divided on hard versus soft takeoff of AI. I lean toward soft takeoff for various reasons, but hard takeoff is definitely in the 'plausible' category.
Or do you perhaps admit that ultimately your beliefs are at least partially founded on non-rational reasons?
Absolutely. One of our goals should be to clean up instances of this as they are discovered, and attempt to change protocols to limit it in the future.
comment by alienist · 2015-01-05T02:30:15.244Z · LW(p) · GW(p)
Look at past predictions by the "experts" and compare them to what actually happened. If the experts are proposing courses of action did their previous proposals work? What is the field's analog of "I believe physicists know what they're talking about because airplanes fly"?
comment by Dentin · 2015-01-05T21:55:59.808Z · LW(p) · GW(p)
Do you have other heuristics or meta-arguments for going with one side or the other?
An awful lot of controversial subjects have one thing in common: they are complicated. Often complicated enough that a clear answer isn't always obvious. An excellent heuristic I've used for years is to prefer the side that says "it's complicated" over the side that says "it's simple". Two examples of this:
- The cause of the 2008 economic crisis: any side saying things like "clearly, X caused it" or "it's the fault of Y" or "the evil bankers did it" should be given lower weight than a side saying "there were a lot of factors, economics is complicated".
- Abortion: any side saying "abortion is murder" or "a fetus is a person too" should be given lower weight than a side saying "this is a complex problem because personhood isn't binary, our morals aren't absolute, and we make other cost/benefit tradeoffs against human life all the time".
↑ comment by Capla · 2015-01-06T22:03:47.459Z · LW(p) · GW(p)
I agree about preferring complexity in factual questions, but in moral ones? We always want to acknowledge the tradeoffs, but sometimes the moral stance is just simple.
http://www.yudkowsky.net/singularity/simplified/
comment by [deleted] · 2015-01-04T23:15:42.680Z · LW(p) · GW(p)
I think there are several levels to this question:
1. Any issue based on Politics or Religion will have experts on both sides arguing for their case based on their own personal choice and biases - these are, more often than not, arguments and discussions rooted entirely in opinion.
2. For controversial issues in Science and Economics I will try to determine if there are any biases based on point 1. In some cases this becomes visible based on the political leanings of the Scientists and / or corporate sponsorship of the study.
3. In the Maths world there don't seem to be any controversial issues over the results (which is why I kind of like maths a lot!), however there are many issues about which approach to take and how useful a result is. For those sorts of predictive issues I don't usually have a strong opinion.
So in general, it comes down to what the biases are. Everyone has an opinion, and these are (or ought to be) less relevant the deeper you get into the Science and Maths.
comment by DanArmak · 2015-01-04T15:14:49.092Z · LW(p) · GW(p)
I don't have answers, but here are a few notes.
For every controversial subject I've heard of, there are always numerous very smart experts on either side.
That's almost a tautology; if all experts were on one side, it wouldn't be controversial.
That said, many controversial subjects either don't have experts, or some sides of the controversy deny the expertise of the other sides. For instance, I don't need to refer to an expert theologian (and they are deep domain experts) to refute a particular religious belief.
There are lots of uncontroversial things I'm unsure about, because I don't trust the consensus-making process in the field. To be unsure of something because it is controversial, the experts and the consensus process have to be explicitly rational and truth-seeking; otherwise what I believe and how sure I am is only weakly correlated with what most 'experts' think on the subject. This rules out religion, politics, philosophy and most policy proposals as interesting controversies, leaving scientific and epistemological questions.
↑ comment by HalMorris · 2015-01-04T21:12:53.013Z · LW(p) · GW(p)
This rules out religion, politics, philosophy and most policy proposals as interesting controversies, leaving scientific and epistemological questions.
Slightly problematic unless you don't admit epistemology being part of philosophy. And it seems like almost as big a swamp as the rest of philosophy, though the problems seem much more worth resolving than in most of philosophy.
There is a paper "Experts: Which ones should you trust" addressing this issue by Alvin Goldman (http://philpapers.org/rec/GOLEWO -- you need JSTOR or something to actually get the article), one of the biggest names in epistemology and specifically social epistemology. Actually I don't think the article does very much to resolve the issue unfortunately. By the way, there are two schools of thought self-described as social epistemology which don't acknowledge each other except mostly to trade deprecations.
↑ comment by Douglas_Knight · 2015-01-05T18:09:13.551Z · LW(p) · GW(p)
Google Scholar is better than JSTOR. In fact, PhilPapers links to the same place, but drowning in worthless links.
↑ comment by DanArmak · 2015-01-04T22:07:12.793Z · LW(p) · GW(p)
Slightly problematic unless you don't admit epistemology being part of philosophy. And it seems like almost as big a swamp as the rest of philosophy, though the problems seem much more worth resolving than in most of philosophy.
Yes, I missed that. I meant most but not all philosophy.
↑ comment by iarwain1 · 2015-01-04T15:28:28.736Z · LW(p) · GW(p)
This rules out religion, politics, philosophy and most policy proposals as interesting controversies
And yet there are clearly some rational, truth-seeking experts on each side of many or most of these controversies.
↑ comment by DanArmak · 2015-01-04T17:22:30.869Z · LW(p) · GW(p)
If you can identify people who are rational, domain experts, honest, and incentivized to seek the truth, then I think you should trust them over the broader 'expert' community.
If such people don't agree with one another, then they should make clear why they disagree. If they fail to do this, then either they are not very good at practical rationality, or they aren't trying to make themselves clear to external observers.
If they've made it clear why they disagree, and it's a matter of domain expertise you're not qualified to judge yourself, then you're back to the outside view and head-counting. But I think this is a rare case.
comment by AmagicalFishy · 2015-01-12T03:05:08.563Z · LW(p) · GW(p)
I have a fundamental set of morals from which I build my views. They aren't explicit, but my moral decisions all form a consistent web. Sometimes one of these moral-elements must be altered because of some inconsistency it presents, and sometimes my judgement of a situation must be altered because it was inconsistent with the foundation. But ultimately, consistency is what I aim for. I know this is super vague, and for that I apologize.
So far, this has worked for me 100% of the time. There have been some sticky situations, but even those have been worked out (e.g., answering the question "Is it ok for a father to leave if his [very] mentally unstable significant other tricked him into impregnating her?" This did not happen to me, but it was a question I had to answer nonetheless.)
Perhaps to my discredit according to some LWers: I often think trying to quantify morals with numbers has enough uncertainty associated with it that it is useless.
↑ comment by gjm · 2015-01-12T11:25:18.610Z · LW(p) · GW(p)
They aren't explicit, but my moral decisions all form a consistent web.
How do you know? (Or, if the answer is "I can just tell" or something: How do you know that your consistency is any better than anyone else's?)
↑ comment by AmagicalFishy · 2015-01-12T21:35:37.956Z · LW(p) · GW(p)
Trial-and-error.
There are, of course, inconsistencies that I'm unaware of: These are known unknowns. The idea, though, is that when I'm presented with a situation, any such relevant inconsistencies come up and are eliminated (either by a change of the foundation or a change of the judgement).
That is, inconsistencies that exist but don't come up aren't relevant.
An example—extreme but illustrative: Say an element of this foundational set is "I want to 'treat everyone equally'". I interview a Blue man for a job and, upon reflecting, think very negatively of him, even though he's more qualified than others. When I review the interview as if I were a 3rd party [ignorant of any differences between Blue people and regular people], I come to the conclusion that the interview was actually pretty solid.
I now have a choice to make. Do I actually want to treat people equally? If so, then I must think differently of this Blue man, his Blue people, give him this job, and make a very conscious effort to incorporate Blue people into my "everybody" perception. This is a change in judgement. Or, maybe I don't want to treat everyone equally—maybe I want to treat everyone who's not Blue equally. This is a change in foundation (but this change in foundation would have to coincide with the other elements in the foundation-set; or those, too, would change).
But, until now, my perception of Blue people was irrelevant.
Perhaps it would have been best to say: The process by which I make moral decisions is built to maximize for consistency. A lot goes into this: everything from honing the ability to look at a situation as a 3rd party, to comparing a decision with decisions I've made in the past. As a result, there's a very practiced part of me that immediately responds to nigh all situations with "Is this inconsistent?"
(An unrelated note: Are there things in this post I could have eliminated to get the same point across, but be more succinct? I often feel as if my responses [in general] are too long.)
comment by [deleted] · 2015-01-09T22:34:23.333Z · LW(p) · GW(p)
There is a controversy in the field of nature conservation which boils down to 'what amount (& other specifics) of human intervention can be allowed in a habitat, such that it is still considered 'wild'?'. The question can sometimes be viewed as a set of 'decision - human intervention - outcome - monitoring', and this is a practical approach with classifiable results, BUT it is only applicable to certain types of habitats, usually (in Continental Europe, I think) already quite anthropogenically transformed. (There are other restrictions, too.) So when you have to decide for any particular patch of 'wilderness' the best way to preserve it, you either side with the experts who say 'touch nothing and let only Nature rule there', and the number of species in that place goes down (because there are not enough large mammals, etc.) OR you side with those who say 'traditional land use!' In this case, you get a nice, heterogeneous reserve and a positive feedback loop that will eat it sooner or later. Damned if you do, damned if you don't.
I try not to pick sides, unless I know the habitat well, but whatever I do, I seem to lose.
comment by [deleted] · 2015-01-05T02:18:13.608Z · LW(p) · GW(p)
First, I ask myself "what is the likelihood this belief necessitates actions which may cause harm that cannot be undone?" The higher the likelihood the less I am drawn to adopt the belief. Some religious and political thinkers hold beliefs that they say mandate killing people, for example.
Second, I ask myself "how much do I need to hold a belief in this area at all?" The lower the need the less I am drawn to adopt the belief. I am not a Hindu or a Republican and it matters to me not at all if a particular belief is for or against my being a Hindu or a Republican.
Third, I ask "will I regret not trying this belief more than I regret trying it?" Temporarily adopting beliefs to stress-test them has yielded fantastic positive results and cost only the shedding of prior false or incomplete beliefs.
I don't often ask these questions in such stark terms, but they are in my mind in some way.
comment by HalMorris · 2015-01-06T04:57:56.202Z · LW(p) · GW(p)
I wrote as a little part of a comment in the middle of a longish thread:
There is a paper "Experts: Which ones should you trust" addressing this issue by Alvin Goldman (you need Google Scholar or JSTOR or something to actually get the article), one of the biggest names in epistemology and specifically social epistemology. Actually I don't think the article does very much to resolve the issue unfortunately.
One article (cited in Goldman "Experts...") that I really like is John Hardwig "The Role of Trust in Knowledge", which gets at the critical need for experts to trust other experts, and illustrates with examples of scientific and mathematical accomplishment that just don't fit in one person's head.
Besides the qualities of individuals, we must ask whether a particular discipline of study has anything trustworthy to say. As Dustin said:
My goal ... on subjects on which there truly is not an expert consensus is to acknowledge that there is no consensus and thus not choose one side or another.
DanArmak spoke of the "consensus-making process in the field" which "has to be explicitly rational and truth-seeking" -- a very good point, I think, that I've tried to illustrate in "Global Warming and the Controversy: What is Scientific Consensus? Continental Drift as Example".
One point: An area of scientific study has to be tractable (this is relative to available technology -- medical science remained largely intractable until pretty recently), and there has to be a there there. See cartoon: http://xkcd.com/451/
I got the impression from some passing reference (if I didn't imagine it) that Stephen Toulmin has had some things to say about this aspect of "what makes (a) science work" (still haven't found something he wrote to confirm this); I've made some attempts at dealing with it myself in "What is A Machine? Natural Machines and Origins of Science" and "Finding Your Invisible Elephant. A Science Requires, and is Shaped by, a Tractable Subject Matter". The articles aren't as polished as I wish they were - my New Year's resolution is to do better.
comment by Pablo (Pablo_Stafforini) · 2015-01-05T21:43:24.379Z · LW(p) · GW(p)
For myself, I generally try not to have an opinion on almost any controversial issue.
Many issues that are now relatively uncontroversial where once controversial, or are still controversial in other parts of the world. Do you also suspend judgment about such issues? Is your reference class the roughly 100 billion people that have ever lived?
↑ comment by iarwain1 · 2015-01-06T01:20:42.440Z · LW(p) · GW(p)
That depends on if I think modern / first-world society has significant reason to claim epistemic superiority over their past / third-world counterparts. In most areas of thought there's a concept of progress and building on the accomplishments of the past, and to a very large degree the experts that benefited the most from that progress are concentrated in first-world countries.
↑ comment by Pablo (Pablo_Stafforini) · 2015-01-06T17:47:47.174Z · LW(p) · GW(p)
There are innumerable indicators of epistemic superiority in addition to physical or temporal location, and some of these are arguably more reliable. I'm skeptical that the topics that your society regards as "controversial" will coincide with those that you'd be warranted in suspending judgment about in deference to your epistemic superiors.
comment by ilzolende · 2015-01-10T06:37:32.181Z · LW(p) · GW(p)
Most of my beliefs* on controversial issues stem from one of the following:
- I voluntarily exposed myself to persuasive media (political magazines, fiction novels, this website) for entertainment purposes.
- A certain position is in my self-interest. (I oppose using physically painful aversive stimuli as an autism treatment without informed consent directly from the patient, not only because I conveniently have a moral system that opposes them, but also because I do not want to be electrically shocked without my consent.)
- I want to be a member of a political coalition, because they are working towards one of my goals. They also have some other goals that I didn't originally care about either way, but membership in the coalition required that I self-modify to care more about those goals, so I did so by reading lots of persuasive arguments online until I valued their goal at least slightly.
*The "beliefs" referenced don't really meet the LW definition of belief, and in terms of local concepts, are much closer to utility function differences. My coalition agrees with other groups about what the consequences of [action X] will be, but disagrees about the moral value of those consequences. A lot of members of my coalition make arguments to group A that are equivalent to arguing "You should vote this way, because then more paperclips will exist" to humans. For example, my coalition could have blamed [Organization C] for working with [near-universally politically toxic Group D], but instead they complain about how [Organization C] portrays the world as corresponding to my coalition's opponents' beliefs in their advertising materials. This will not persuade anyone who does not already accept their entire argument.
comment by Username · 2015-01-07T16:16:15.501Z · LW(p) · GW(p)
So, what are your meta-arguments for going with one side or the other for any given controversial subject on which you have an opinion?
If you think that the issue is too close to decisively call one way or the other, it would make sense to lobby for the position with the least backing, in order to provide more balance to the argument.
comment by Pablo (Pablo_Stafforini) · 2015-01-05T21:35:46.477Z · LW(p) · GW(p)
This post by Vladimir_M is indirectly relevant: Some Heuristics for Evaluating the Soundness of the Academic Mainstream in Unfamiliar Fields.
(By the way, does anyone know what happened to this user? His contributions were excellent.)
comment by someonewrongonthenet · 2015-01-05T01:09:28.779Z · LW(p) · GW(p)
Politics, social justice: this isn't actually about opinions; it's about emotions and conflicting interests. I side with whoever appeals to mine. This generally means "whoever is not feeling angry, scared, or disgusted...whoever is feeling charitable, kind, and altruistic".
Policy: I trust the sociologists and economists, in proportion to the amount of data that exists which they presumably read about. If no consensus, then no opinion. I find consensus occurs more often than people think, and isn't the real problem. In general, trust self in proportion to how much data I've personally seen vs how much others see and weight self more than others.
Dangers from AI / x-risks: I don't believe experts can exist in such an area, so I just make a guess.
Philosophy, Bayesianism, Religion: Create first principles independently, trust only self. Proceed to judge everything via said principles.
ethics & metaethics - Trust only self.
There's very few cases where my opinion on this stuff matters, though.
↑ comment by iarwain1 · 2015-01-05T01:17:09.739Z · LW(p) · GW(p)
... and weight self more than others.
Philosophy, Bayesianism, Religion: Create first principles independently, trust only self.
ethics & metaethics - Trust only self.
Why?
↑ comment by someonewrongonthenet · 2015-01-05T01:54:26.931Z · LW(p) · GW(p)
For the latter, because I'm not sure what other choice there is. See the "No Universally Compelling Arguments" posts on Lesswrong.
If the namuH race, who somehow evolved to not use parsimony, told me the more complicated hypothesis is more likely to be true, I'd have no reason to offer them for why they are wrong. But I'd still say they were wrong, because my definition of Wrong is the only one that really matters when it comes down to it (the same way that my values are all that really matter when I'm faced with the Babyeater scenario).
For the former, questions of actual policy, it's partly because I think I'm smarter than the average scientist, and so my opinion is weighty when controversy appears in situations where everyone has access to the same data and yet no one is agreeing. (I feel uncomfortable admitting that and probably wouldn't under my real name.) Certainly, the smartest scientists are leagues above me, but from a bird's-eye view I don't know who advocates what - all I get is the average opinion and a mess of data.
If I perceived someone to both have access to the same data and to be smarter than me, I sometimes do go against my better judgement - in the same spirit that I go against my own judgement when I perceive someone to have more data than me. (In actual practice I don't distinguish between "has more data" and "better at using data" - I just keep a general model of people's competencies.)
I also sometimes look at the skew of intelligence on either side - if all the people on one side of the debate give off signals of greater competence I start weighting them more.
BUT - If I was born with my current mind in a society where everyone was much smarter than me and I wasn't too arrogant to realize it, I'd probably weigh my opinion much less...but I'd still weigh myself a little more than warranted by data or competence alone, because "No Universally Compelling Arguments-therefore-you-have-no-choice-but-to-follow-yourself" seeps into everything, even when all parties involved are human. Only I have my particular highly personalized intuitions about what is parsimonious, what is morally right, what is logical, and so on, and at bottom there is no justification for those things other than "I am programmed like that".