Post/meta-skepticism I
post by Ouroborus · 2020-04-26T00:18:26.123Z · LW · GW · 10 comments
This is a link post for https://ouroborusdotblog.wordpress.com/2020/04/26/post-meta-skepticism-i/
Less Wrong has been significantly influenced by the skeptic movement. This has generally been good for the epistemic health of our community, but at the same time we should worry about whether everything we have inherited is true and beneficial. In other words, we need to apply skepticism to skepticism itself.
It's often good to start with the strengths of a movement and consider how these can become weaknesses if taken to extremes, so this is what we'll do in this post, rather than taking them at face value. The fundamental promise of skepticism is that it will prevent you from being misled by weak evidence, poor reasoning, social pressure or those who would manipulate you.
The opposite of foolishly accepting weak evidence is being excessively dogmatic about the standards of evidence you require, even when circumstances force you to decide based on inconclusive evidence. A famous spoof article jokes that we don't know parachutes are reliable because we don't have a randomised controlled trial.
Wittgenstein argued that language is nonsense insofar as it fails to paint a picture of the world, or of possible worlds (in a particular technical sense). The only trouble was that this claim itself doesn't paint a picture of the world. The logical positivists believed that beliefs were defined in terms of the experimental predictions they made; however, it is almost certainly false [LW · GW] that a chocolate cake spontaneously formed at the center of the sun just now and then dissolved, yet it isn't clear how to define this claim experimentally or in terms of mathematics. Behaviourism [LW · GW] argued that it was unscientific to ascribe emotions, beliefs or thoughts to humans as we can't directly observe these, which resulted in a greatly crippled psychology field.
Similarly, the opposite of foolishly accepting poor reasoning is being excessively dogmatic about the precision of the arguments you require. When talking about ethics, Aristotle argued that "we must not expect more precision than the subject-matter admits". Some subjects are inherently hard to talk about, and expecting too much precision can prevent us from talking about them at all. One example is when people dismiss philosophy as too vague to be useful and then end up effectively adopting a philosophical theory anyway, but without any deep thought or evaluation.
Similarly, Kieran Healy argues in Fuck Nuance that demanding more nuance can be a lazy response. He worries that you can almost universally proclaim, "But isn't it more complicated than that?", "Isn't it really both/and?", or "Aren't you leaving out [X]?". Theorisation requires some level of abstraction and simplification, and the demand for nuance interferes with this.
Additionally, we should expect that not everyone will have the capacity to produce solid reasoning even when their ideas are correct. Some people are better at discovering new ideas, while others are better at filtering them; some are good at both, but they are relatively few. If we only learn from strong filterers, we miss out on the ideas that are still on the creative edge. I resonate with Jordan Peterson's characterisation of good art as expressing that which is important, but which cannot yet be fully expressed.
Many communities have knowledge embedded in traditions that derives from long and hard practical experience. These arguments are persuasively discussed in Seeing Like a State and The Secrets of Our Success.
Then there is the issue that those who are less educated, or who are non-native speakers, are likely to be less able to present arguments to a particular standard, independently of the truth of their claims. Some people suggest we respond by not applying any significant scrutiny to the truth claims of the more marginalised, but my suggestion is only that we approach these claims with more care.
The opposite of being excessively resistant to social pressure is being contrarian: adopting non-standard positions so that you can differentiate yourself and signal your ability to think differently. But it's also a mistake to completely disregard social evidence. What people believe provides some level of Bayesian evidence, but beyond that, no-one has time to evaluate the evidence for everything. Even if it were possible, it would almost certainly be worse for your own well-being. It can be hard not to engage in this impossible task, as the alternative is admitting that you believe thesis X simply because you were told it and not because you've made up your own mind.
But beyond this, it can lead to the bad habit of turning everything into an argument, regardless of how unpopular a view is. Social capital is limited, so it's important to spend it wisely if you want to have influence.
The opposite of being naive is being too distrusting. It can be very easy to identify one or two circumstances when a particular person or organisation lied to or misled you, and then refuse to ever trust anything they say again. This may be reasonable if you can avoid them without any significant costs given your circumstances, but this isn't always possible.
10 comments
Comments sorted by top scores.
comment by ChristianKl · 2020-04-26T12:28:10.223Z · LW(p) · GW(p)
Less Wrong has been significantly influenced by the skeptic movement.
It's not clear to me that this is true. Instead of speaking about how one needs high standards of evidence, the "official" epistemology of LessWrong is Bayesianism, which doesn't have a concept of a standard of evidence.
Replies from: TAG, Ouroborus
↑ comment by TAG · 2020-04-26T14:30:28.045Z · LW(p) · GW(p)
Except that it does, because there are things that individual Bayesians won't accept as evidence.
Replies from: ChristianKl
↑ comment by ChristianKl · 2020-04-26T20:15:37.604Z · LW(p) · GW(p)
Instead of treating things as not passing a standard of evidence, the Bayesian way is to treat something as providing a low amount of evidence that doesn't cause a large update.
When I look at arguments people make on LessWrong I seldom see anybody referencing standards of evidence.
Replies from: TAG
↑ comment by TAG · 2020-04-27T14:31:25.129Z · LW(p) · GW(p)
Ideal Bayesian reasoners can put low real numbers such as 0.000001% on some piece of evidence: organic brains can't.
Real Bayesians have to round off to zero, i.e. treat some things as not evidence at all.
And they do: even though lesswrongians don't often engage in explicit discussion of what is or isn't evidence, they still have firm and fairly uniform opinions about it. You can see what happens to people with variant epistemologies by reading Less Wrong: they get downvoted, told they are not a good fit, etc.
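To make the rounding-off point concrete, here is a minimal sketch of an odds-form Bayesian update (in Python, with illustrative priors and likelihood ratios chosen for this example, not drawn from the thread). A likelihood ratio barely above 1 moves the posterior by a negligible amount, so a bounded reasoner who rounds such evidence off to "not evidence at all" ends up in practically the same place:

```python
def bayes_update(prior, likelihood_ratio):
    """Odds-form Bayes' rule: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# An ideal reasoner can track extremely weak evidence
# (a likelihood ratio barely above 1): the posterior barely moves.
print(bayes_update(0.30, 1.000001))  # ~0.30000021

# "Rounding off to zero" amounts to treating the likelihood ratio
# as exactly 1, i.e. as no evidence at all.
print(bayes_update(0.30, 1.0))       # ~0.30 (unchanged)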
↑ comment by Ouroborus · 2020-04-26T12:38:14.233Z · LW(p) · GW(p)
I'm always skeptical of the official narratives about what people in a movement believe (yes I know, it's ironic to write this in a thread about post-skepticism).
Replies from: ChristianKl
↑ comment by ChristianKl · 2020-04-26T20:23:40.853Z · LW(p) · GW(p)
If you look at big issues for this community, like dealing with x-risk where low probabilities are involved, the standard skeptic response is to say that the evidence for the x-risks isn't up to the standards of evidence.
The Sequences make arguments for positions on topics like cryonics or the Many Worlds Hypothesis that don't fall in line with those of the skeptic community but are taken based on different epistemics.
You had people in the skeptic community nominating Elon Musk for the Luddite Award. It makes a lot of sense from the perspective of the skeptic community to do that, and at the same time it's very hard from a LessWrong perspective to see how that nomination makes sense.
Replies from: Ouroborus
↑ comment by Ouroborus · 2020-04-26T21:44:20.570Z · LW(p) · GW(p)
Those are excellent points. Maybe it doesn't apply to the community as a whole, but I still think there is a greater proportion of people with the archetypal skeptical mindset in the LW community than in the general population. But in any case, my aim was to discuss the limits of skepticism; how widespread it is on LW is a side point.
comment by Rafael Harth (sil-ver) · 2020-04-26T08:03:04.000Z · LW(p) · GW(p)
I agree that for the examples you're naming (e.g., demanding strong evidence/resisting social pressure), there is a failure mode that looks like you're going too far (e.g., being excessively dogmatic/being contrarian).
However, I don't think that this failure mode actually results from identifying the underlying principle and then taking it to the extreme, and I think that's an important point to clarify. For example, in the first case, the principle I see is something like "demand strong evidence for strongly held beliefs" or even more generally "believe things only as strongly as evidence suggests." I don't think it's obvious that this principle can be taken too far. In particular, I think the following
A famous spoof article jokes that we don't know parachutes are reliable because we don't have a randomised controlled trial.
is not an example of doing that. Rather, the mistake here is something like "equating rationality with academic science." We don't have a formally conducted study on the effectiveness of parachutes, and if you think that's the only evidence that counts, you might mistrust parachutes. But, as a matter of fact, we have excellent evidence to believe that parachutes work, and believing this evidence is perfectly rational. So you cannot arrive at a mistrust of parachutes by having high standards for evidence; you can only arrive at it by being wrong about what kind of evidence does and doesn't count.
Again, I only mean this as a clarification, not as a counterpoint. It is still absolutely possible to go wrong in the ways you describe, and avoiding that is important.
comment by Pattern · 2020-04-26T02:19:56.002Z · LW(p) · GW(p)
Behaviourism [LW · GW] argued that it was unscientific to ascribe emotions, beliefs or thoughts to humans as we can't directly observe these, which resulted in a greatly crippled psychology field.
Maybe they're right (given their definition of science). Indirect observation, or subjective judgement, can also be important. (This probably requires more layers/data collection/experiment design around how people (verbally) 'assign' emotion labels.)
Wittgenstein argued that language is nonsense insofar as it fails to paint a picture of the world, or of possible worlds (in a particular technical sense). The only trouble was that this claim itself doesn't paint a picture of the world. The logical positivists believed that beliefs were defined in terms of the experimental predictions they made; however, it is almost certainly false [LW · GW] that a chocolate cake spontaneously formed at the center of the sun just now and then dissolved, yet it isn't clear how to define this claim experimentally or in terms of mathematics. Behaviourism [LW · GW] argued that it was unscientific to ascribe emotions, beliefs or thoughts to humans as we can't directly observe these, which resulted in a greatly crippled psychology field.
characterisation of good art as expressing that which is important, but which cannot yet be fully expressed.
"Was Wittgenstein an artist?"
Many communities have knowledge embedded in traditions
Evaluate based on action/recommendations rather than apparent epistem... which is "found to be correct" through said epistemological efforts.
Even if it were possible, it would almost certainly be worse for your own well-being.
I'd say it's hard to reason about impossible things, but impossible things might be better classified as nonsense.
The opposite of being naive is being too distrusting. It can be very easy to identify one or two circumstances when a particular person or organisation lied [to,] or [misled] you[,] and then refuse to ever trust anything they ever say again. This may be reasonable if [you can] avoid them without any significant costs due to our circumstances, but this isn't always possible.
If it fails once, don't use it. (Conduct your own investigation.)