Gnostic Rationality

post by Gordon Seidoh Worley (gworley) · 2017-10-11T21:44:22.144Z · LW · GW · 40 comments

Ancient Greek famously distinguished between three kinds of knowledge: doxa, episteme, and gnosis.

Doxa is basically what in English we might call hearsay. It's the stuff you know because someone told you about it. If you know the Earth is round because you read it in a book, that's doxa.

Episteme is what we most often mean by "knowledge" in English. It's the stuff you know because you thought about it and reasoned it out. If you know the Earth is round because you measured shadows at different locations and did the math showing that a round Earth is the only conclusion consistent with the results, that's episteme.
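
(As a concrete illustration of that shadow argument, here is the standard reconstruction of Eratosthenes' calculation, with his rounded historical figures; the numbers are ancient estimates, not exact. With the sun directly overhead at Syene, the shadow angle at Alexandria is θ ≈ 7.2°, and the distance between the cities is d ≈ 5000 stadia, so the circumference is:)

\[
C \;=\; \frac{360^\circ}{\theta}\, d \;=\; \frac{360^\circ}{7.2^\circ} \times 5000 \ \text{stadia} \;=\; 250{,}000 \ \text{stadia}
\]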

Gnosis has no good equivalent in English; the closest we come is when people talk about knowledge from personal experience, because gnosis is the stuff you know because you experienced it yourself. If you know the Earth is round because you traveled all the way around it or observed it from space, that's gnosis.

Often we elide these distinctions. Doxa of episteme is easily mistaken for episteme itself: if you read enough about how others reasoned something out, you may feel as though you have reasoned it out yourself. We discover this is not true when we actually try to develop episteme of something we previously knew only through doxa, as when we try to teach another person something and discover we didn't understand it as well as we thought we did. Similarly, episteme is sometimes mistaken for gnosis, because episteme may reliably get you the expected answer the way gnosis usually does, but only so long as you put in the effort to reason it out each time.

Many rationalist thinkers focus heavily on episteme and the doxastic logic used to combine facts. This is fine as far as it goes: you need to be able to develop well-formed episteme from facts if you want to have much chance of winning. But humans are notoriously bad at episteme; it takes considerable training to become good at, and it requires constant maintenance to remain accurate. We cannot hope to be rationalists if we cannot master episteme.

But there is something more if you want to walk the Way. It's not enough to know about the Way and how to walk it; you need gnosis of walking. And I know this (doxastically, epistemically, and gnostically) from listening to others describe their experiences, reasoning about epistemology, and remembering my own experience learning to walk the Way.

This would be a purely academic distinction if it weren't for the fact that I see many of my rationalist friends suffering, and I consistently find that those who suffer the most tend to be those with the least gnosis of rationality. This is further complicated because those with gnosis do not always have the most episteme, so those more skilled at epistemic rationality may reasonably ignore the doxa of gnostic rationalists as confused at best and self-deceptive at worst. And so I find myself between a rock and a hard place: I see my friends suffering, and I know (epistemically) how they can be helped, but I don't know (gnostically) how to help them.

All I know how to do is leave breadcrumbs for those whose eyes are clear enough of dust to see them, so that they can keep following the Way when they find they are no longer walking it. This, however others may perceive it, has been the motivating goal, at least for me, of what we've lately been calling "metarationality": to figure out how to help our epistemically rationalist friends learn to be gnostically rationalist. My writing and the writing of David Chapman, Kevin Simler, Sarah Perry, and others are a way to gain doxa and maybe even episteme of our gnosis, but other than perhaps Chapman's proposed curriculum, we have not really found a reliable way to guide people toward gnostic rationality.

But maybe "gnostic rationality" is a better name than "metarationality", and one more people can get behind. After all, I already see primarily epistemic rationalists engaging in practices to develop gnosis through things like comfort zone expansion (CoZE) and using the double crux in their own lives when the stakes feel high, and gnostic rationality, by the very nature of being rationality, does not work without episteme enough to judge what may help you win and what may not. Thus it is not that anyone is seeking to go beyond rationality so much as fully engage with it, not just with our words (doxa) and minds (episteme), but also with our hearts (gnosis). This is the Way of gnostic rationality.

40 comments


comment by Unnamed · 2017-10-12T00:19:47.550Z · LW(p) · GW(p)

I like the distinction between doxa, episteme, and gnosis, and the application to rationality and people attempting to walk the Way. Upvoted.

I don't understand the connection to metarationality. (This does not necessarily mean that you should try harder to explain the connection; another possible response is to just keep talking about things like doxa, episteme, and gnosis.)

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2017-10-12T19:14:49.057Z · LW(p) · GW(p)

Ultimately it's all about the same stuff, but explained from different perspectives as I develop a deeper understanding that lets me see it in different ways. I find I often knew things I didn't know I knew when I hold ideas up to different lights, and I view doing this as a key process for developing understanding.

comment by Ben Pace (Benito) · 2017-10-12T20:48:45.240Z · LW(p) · GW(p)

Hi gworley! I've moved this post off the frontpage and back to your personal user blog. This post is an edge case as it has some fine and interesting ideas in it; the reason I've moved it is because it reads to me as a post for which being interested in and able to understand all of it requires having a social connection to the rationality community and knowledge of its history (in particular, the last two paragraphs of the post, and thus much of the discussion that ensued). In general the frontpage is for discussion of interesting ideas, not for things like news, recent events, social coordination, or announcements (check out the frontpage content guidelines for more info).

comment by Vanessa Kosoy (vanessa-kosoy) · 2017-10-12T19:50:51.359Z · LW(p) · GW(p)

This essay seems perfectly clear until it reaches the phrase "It's not enough to know about the Way and how to walk it; you need gnosis of walking." I completely failed to understand it. I presume that "the Way" means "rational thinking" and we established that "gnosis" means "personal experience" (I would say "direct experience"), so what you're saying here is "you need direct experience of rational thinking?" What does it mean apart from just "you need to think rationally?"

The rest of the essay did nothing to clarify this point and thus was mostly lost on me. Maybe it's only supposed to make sense for those familiar with so-called "metarationality" and the other links? If so, it might be okay, but I wish you said so explicitly. On the other hand, if it's supposed to be an advertisement for "metarationality" and so forth, I'm afraid it does a very poor job. I finished the essay with no understanding of why those concepts are valuable, although it might be a reading comprehension failure on my part.

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2017-10-12T20:42:27.977Z · LW(p) · GW(p)

I think that's fair. I wanted to write this in one sitting so didn't take the time to develop the why here, only reference other places where I point at the why. I didn't write it to be an advertisement, although to be fair literally everything anyone writes is working to spread ideas, even if only weakly. Mostly it was that this connection to the doxa/episteme/gnosis categories became clear enough to me that I wanted to express it.

comment by sarahconstantin · 2017-10-12T17:22:37.570Z · LW(p) · GW(p)

"Gnostic rationality" is a confusing name because the first association many people will have is Gnosticism.

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2017-10-12T19:06:07.923Z · LW(p) · GW(p)

I'm personally fine with this association and in fact even find it somewhat fitting, though only in a very tenuous way.

comment by KatjaGrace · 2017-10-12T21:47:42.722Z · LW(p) · GW(p)

"It's not enough to know about the Way and how to walk it; you need gnosis of walking."

Could I have a less metaphorical example of what people need gnosis of for rationality? I'm imagining you are thinking of e.g. what it is like to carry out changing your mind in a real situation, or what it looks like to fit knowing why you believe things into your usual sequences of mental motions, but I'm not sure.

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2017-10-12T22:54:29.120Z · LW(p) · GW(p)

Yep, sounds like you got it. It's like when you quit grad school because you realize you were only staying for sunk costs, or start exercising because you believe in its benefits, and you don't have to go through the motions of explicitly figuring this out and then willing yourself into doing it. You knew it, maybe you double-check your work to make sure the dark, unobserved processes of your brain didn't make a mistake, and then you just do it because it's the most natural thing in the world, like taking a sip of water when you're thirsty.

comment by KatjaGrace · 2017-10-12T19:31:21.323Z · LW(p) · GW(p)

So a gnostically rational person with low epistemic rationality cannot figure things out by reasoning, yet experiences being rational nonetheless? Could you say more about what you mean by 'rational' here? Is it something like frequently having good judgment?

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2017-10-12T21:05:36.766Z · LW(p) · GW(p)

Mmmm, these aren't orthogonal dimensions within rationality. We wouldn't call someone a rationalist just because they happened to win all the time by making the right choices without being able to explain why; we'd probably just say they are wise or have good judgment.

By "rational" and "rationality" I want to point at the same thing Eli(ezer) does, which he also called the "winning Way". It's something like "the ability to take action that you are happy with" although I'd probably describe it in technical terms as "axiologically aligned intention".

Rationality is an almost inherently epistemic notion, because such alignment requires logical reasoning to judge, and in fact understanding thoroughly and rigorously how this works seems to be the core of the AI safety problem. Thus even if someone could be accidentally rational without being a rationalist, this is only interesting to those with sufficient epistemic rationality to assess it. So there's not really a strong sense in which someone can have lots of gnosis of rationality without also having episteme of it, because they wouldn't know rationality in any sense well enough to have gnosis of it.

comment by the gears to ascension (lahwran) · 2017-10-11T23:11:11.311Z · LW(p) · GW(p)

Ok I am now fairly confident I have understood what you mean, and I really like the name "gnostic rationality". there's still some weirdness in the possibility of gnosis and episteme even being different in the first place, but I think that even though the difference isn't particularly compressible, it's still worth making.

comment by PDV · 2017-10-12T07:53:30.203Z · LW(p) · GW(p)

[Moderation notice by Ben: I've blocked replies to this comment, as this comment reads as fairly political and quite aggressive. In particular this comment has personal attacks ('obviously and dangerously confused'), incendiary language ('cancer in the rationalist community', 'block off sound reasoning'), and little substance for gworley to respond to (or for me to even understand what exactly PDV believes and why).

PDV, there may be some valid points in there, but if you want to discuss them, you should create a comment that's less directly insulting and more likely to promote productive discussion. This is a warning notice; if we read similar comments from you in the future, your account will likely be temporarily suspended.

Thanks to users who are hitting 'report' on comments - it's super helpful to know quickly about things where moderators want to be able to watch/act.]

-----------------------------------------------------------------------------------------

I am not one of the most epistemically rational people in the community. Probably the top half, unlikely the top quarter. Even relative to me, though, you, Chapman, and your whole crowd, are very obviously and dangerously confused.

So confused that I genuinely believe that taking your ideas seriously corrodes good epistemic hygiene. They're seductive, self-reinforcing, and block off sound reasoning, like the strongest conspiracy theories. They're built into a superstructure that justifies feeling superior and ignoring counterargument, by distinguishing themselves as "gnostic" or "Kegan 5" or "postrationalist" and implying that this is better than standard-issue rationality.

They are not. They are just standard irrational thinking, dressed up in the language of the Sequences so that they can be embraced wholeheartedly while not rejecting the ingroup signalling.

They are a cancer in the rationalist community and have done more damage than RationalWiki. Only Intentional Insights and Leverage are real competition with Meaningness for organization most harmful to the rationalist project.

(I'm not sure how Simler relates to any of it, since he has not displayed any of the themes or ideas I've seen from this irrationalist cluster. If he self-identifies with it that is surprising and I will have to re-evaluate my good opinion of his work.)

Replies from: spiralingintocontrol, gworley, sarahconstantin, lahwran
comment by spiralingintocontrol · 2017-10-12T17:27:13.863Z · LW(p) · GW(p)

I'm missing a lot of context here. How is this post connected to the other things you're referring to as a "cancer" and what is wrong with those things and this post?

Meta note: I don't like that your comment has a lot of "this is bad" but not a lot of why.

edit: To be clear, I'm genuinely curious. This post is also extremely confusing and bizarre, so I would appreciate hearing your take on it as someone who is skeptical but also seems to understand what it's pointing at.

Replies from: PDV
comment by PDV · 2017-10-14T08:52:04.763Z · LW(p) · GW(p)

Insofar as I understand what it's pointing at, it is pointing at something I'd paraphrase as "logical thought is overrated". There's nuance about what exactly logical thought is being pushed aside in favor of, but that's the core piece I object to.
I object to it most strongly because it comes from an intellectual lineage that draws adherents mostly from the rationalist community and is based around disparaging logical thought and a naive view of truth in favor of various wooy "instinct/social reasoning/tradition/spirituality without understanding is good" frameworks.
And while there's value to system 1 reasoning, I think that A) CFAR is handling that quite fine with more care and purpose, and B) anything that hooks tightly into system 1 without being moderated by the strong endorsement of system 2 should be treated as BADSCARYATOMICFIRESPIDERS, even while trying to extract value from it.

comment by Gordon Seidoh Worley (gworley) · 2017-10-12T18:55:05.312Z · LW(p) · GW(p)

I'd mostly rather not respond to this, but others have made the case that defending the conversational norms we'd like to see in the community is worth the fight, so I will.

We've previously gotten to the heart of our disagreement in other forums, PDV, as being about, as I would put it, the primacy of epistemology, namely that I believe we cannot develop a fundamentally "correct" epistemology while you believe that we can. If I am mischaracterizing this, though, please correct me as this discussion happened before the time of the double crux so we lacked a formal process to ensure our good faith. I'm happy to engage in a double crux now to see what shakes out of it, especially since I always seem to figure out new things from dialogue with you. If that's something you'd like to do feel free to kick it off with a separate top-level comment on this post or as its own post.

But good faith is really why I think it necessary to write a response. I will leave it for others to judge my success, but I always strive to give my interlocutors the benefit of the doubt, to disagree in good faith, and do whatever I can to be maximally charitable to positions I think are confused. I'm willing to bet some people would even accuse me of performing too much hermeneutics on the arguments of others so that I can interpret them in ways that I can agree with. It's thus sort of hard for me to even conclude this, but your responses to me often appear to be given in bad faith.

For example, above you accuse me and my "whole crowd" of "feeling superior and ignoring counterargument". You say we are "a cancer in the rationalist community". As best I can tell this is a position based on sentiment, and since the very first time I publicly posted about these ideas I've been happy to admit that they risk creating negative sentiment and that I'm still working out how to talk about them without doing so. Certainly sentiment is important for a lot of things, but one thing it's not really important for is developing what we might call here "rational" episteme, i.e. the sort of classical, logical arguments of rationalism. Yet your comments are not directly a criticism of the sentiment I am creating, but instead seem directed at assessing the truth of my arguments. Thus I'm left with little choice but to consider these statements to be in the class of ad hominem.

Now I'm not opposed to ad hominem and snark per se: it can be fun and I'm happy to be knocked down for my mistakes. But you present them as if they are part of your argument for why I am wrong, and this is, as far as I can tell, contra the values you yourself seem to favor (a cluster we might call generically "LW rationality"). And unfortunately doing this is not simply noise to be filtered out from the rest of the argument: it creates an example of it being okay to respond to arguments via the social side channel rather than addressing the ideas head-on. And this behavior creates the sort of environment that makes people hesitant to speak for fear of social reprisal.

I am sufficiently esteemed by myself and others to weather such remarks, but directed at those with less power and esteem this behavior would read as bullying, and I at least don't want LW to be the sort of place where that's okay, especially since I suspect that sort of behavior is much of what killed LW 1.0: it's the kind of thing that made me lose interest in LW 1.0.

So I'm happy to engage in discussion around whether there is a difference between episteme and gnosis (the Anglo-Austrian-Analytic stance says "practically, no", the Continental stance says "yes"), around how the presentation of my ideas may produce negative sentiment, and even around how these ideas impact the rationalist mission (though I don't think that's well defined and agreed upon). Really I'm happy to discuss anything in good faith! What I'm not happy to do is let slide arguments presented in what I must reluctantly conclude are bad faith because it is a weed in the garden we are trying to tend here.

There are places for bad faith, for trolling, for bullying. I would like LW to not be one of those places.

Replies from: lahwran, PDV
comment by the gears to ascension (lahwran) · 2017-10-12T23:11:52.132Z · LW(p) · GW(p)

namely that I believe we cannot develop a fundamentally "correct" epistemology

U wot m8

If that's an "everything is true", then I think I disagree with it. I agree in a very vacuous sense; but I think all useful powerful reasoning processes are reachable from each other in the world we actually live in. There don't seem to be local minima in reasoning space once you simplify the world hard enough.

You can't have a "correct" epistemology in some great oracle sense, but you only get to have one overarching one anyway. Meta-rationality is still implemented on your same old brain.

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2017-10-13T18:12:34.235Z · LW(p) · GW(p)

What I really should say is that I believe we can't develop a consistent epistemology that is also complete. That is, there will be facts that cannot be reckoned within a consistent epistemology, and an epistemology that admits such facts will be inconsistent. I believe this follows directly from the incompleteness theorems, insofar as we consider epistemologies formal systems (I do, because I deal with stuff that could get in the way as part of the phenomenological layer). So you are right that I mean an epistemology cannot be correct in the way we let oracles be correct (complete and consistent).
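
(For reference, an informal statement of the theorem being invoked here, Gödel's first incompleteness theorem; treating an epistemology as a formal system of this kind is the commenter's assumption, not part of the theorem:)

\[
F \text{ consistent, effectively axiomatized, and interpreting arithmetic} \;\Longrightarrow\; \exists\, G_F :\ F \nvdash G_F \ \text{ and } \ F \nvdash \lnot G_F
\]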

I think this is worth stressing because thinking as if oracles are possible, and indeed as if they are the thing we should aspire to be like, seems natural within human thought, even though as best we can tell it is a computationally impossible achievement within our universe. I believe I read an implicit assumption to this effect in much writing, rationalist or not.

With sufficient computational resources we can act as if we are oracles, but only if we restrict ourselves to problems simple enough that the resources needed, generally exponential or worse in the problem size, are physically available to us. I expect, though, that no matter how many resources we have, we will always be interested in questions for which we lack the resources to pretend to be oracles, so addressing such issues is important both now, while we are very much limited by our brains, and in the future, when we will at least be limited by the amount of reachable energy within the universe.
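
(A toy illustration of that exponential blowup, not from the original comment: exact probabilistic reasoning over n binary unknowns means tracking a joint distribution over 2^n states, which outruns physics long before n gets large.)

\[
n = 300:\quad 2^{300} \approx 2 \times 10^{90} \;\gg\; 10^{80} \approx \text{atoms in the observable universe}
\]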

Thus we are stuck trading off between various epistemologies, the same way in mathematics we may have to use different formal systems to address different questions, as when we choose whether or not to adopt the axiom of choice and in so doing necessitate introducing heuristics to keep us away from the places where everything is true, because the system no longer keeps those things out on its own. Of course this is all part of a single computation with an epistemological telos implemented in our "same old brain"s, but that's something distinct from an epistemology, even if it approximates one.

Replies from: PDV
comment by PDV · 2017-10-14T08:37:07.133Z · LW(p) · GW(p)

The falsity of this argument follows directly from the computability of physics.

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2017-10-15T04:49:53.220Z · LW(p) · GW(p)

This seems contra our current best understanding of physics, specifically that fundamental physics operates in a nondeterministic fashion from our perspective because there is uncomputable stuff happening. Just what that looks like appears to be literally unknowable, but we have made some decent inferences as to what might be going on, hence MWI and other metaphysical theories.

comment by PDV · 2017-10-14T08:41:13.564Z · LW(p) · GW(p)

Your belief system is flawed, built on embracing not-even-wrong statements as truth. This makes every conclusion you draw suspect, and when you've confidently stated enough conclusions which bear out that suspicion, it is no longer reasonable to presume they are correct until proven otherwise. That does not constitute an ad hominem, merely updating priors in response to evidence.

comment by sarahconstantin · 2017-10-12T17:26:19.314Z · LW(p) · GW(p)

David Chapman, once you talk to him and actually double-crux, is much more sensible than your reading of him. He was an AI researcher himself.

His popular writing has a vibe of "nerds and I F*ing Love Science fans: you aren't so great!" that can be off-putting. If you ignore the implicit insult, assume it can't hurt you, and try to figure out what he literally believes is true, it's not actually irrationalist.

Replies from: Conor Moreton, PDV
comment by Conor Moreton · 2017-10-12T19:50:33.957Z · LW(p) · GW(p)

This sounds true, and is useful clarification for David Chapman in particular, but I don't think it's bad for PDV to pump against tone and signaling that will, if unchecked, reliably undermine norms of epistemic hygiene and reliably incentivize/cause people to feel proud of [behavior sets] that are roughly equally likely to emerge from "serious rationalist investigating means and modes that are outside of rationalist culture norms" and "sloppy thinker going off the rails."

In other words, I could believe you entirely about David Chapman (and basically do) and nevertheless want to upvote a policy that disincentivizes people who talk like that. If eight out of ten people who talk like that are Doing It Wrong, then in my opinion the responses are, ranked from least to most awesome:

  • Let them have the microphone and lean into the fallacy of the gray (deontological allow)

  • Hold a low bar of suspicion and reject particular people or claims ("for" with exceptions)

  • Shut them down and shut them out (deontological against)

  • Hold a high bar of suspicion and vet particular people or claims ("against" with exceptions)

I think we know enough about how things play out (see the history of General Semantics' slow unraveling and decline) to favor both the third and fourth bullets over the second. I think the second is a siren song that overestimates our ability to expunge bad habits and bad norms after they've already taken root. In my experience, those bad habits and bad norms just never go away, so it's worth being proactive in preventing them from forming in the first place.

I acknowledge that Good and Smart and Perceptive people could disagree. Wouldn't mind pointing more clearly at cruxes if that seems productive, but would ask that the other group go first.

Replies from: sarahconstantin, ChristianKl, lahwran
comment by sarahconstantin · 2017-10-12T20:19:48.064Z · LW(p) · GW(p)

Whoa whoa whoa.

This essay, which you recommend "holding to a high bar of suspicion", is basically "guilty" of contrasting "heart" and "head", and claiming that personal experience or maturity can be differently valuable than explicit reasoning.

Gordon's general corpus of work is about developing psychological maturity, which is pretty much Kegan's topic as well. It's about the squishy stuff that we call "wisdom."

I can see some possible reasons why this topic and outlook is off-putting.

First of all, it's squishy and hard to prove, by its nature. I don't think that's inherently bad -- poetry and literature are also squishy insights into human nature, and I think they're valuable.

Second of all, people like Gordon and the developmental psychologists have certain implicit presumptions about the Good Life that I'm not sure I share. They tend to favor adapting to the status quo more than changing it -- in writers like Erikson and Kohlberg, there's a lot of pro-death sentiment, and a lot of talk about cooperating with one's dominant society. They tend to talk about empathy in ways that sometimes (though not always) conflict with my beliefs about autonomy.

At its worst, the ideal of humanely accepting the "complexity" of life leads people to commit some actual harms. Siddhartha Mukherjee, the cancer researcher and physician who wrote the Pulitzer-winning The Emperor of All Maladies, is famous for his humane, compassionate, "mature" outlook, would probably get classified as having a high "developmental stage", and is a major popular promoter of the view that cancer is intrinsically incurable. In my opinion, passive acceptance that cancer cannot be cured is one of the reasons we haven't made more progress in cancer treatment. Medical progress is a real thing in the real world, and sounding wise by accepting death is not actually good for humankind.

But. I definitely don't believe in jumping down the throat of everybody who sounds vaguely developmental-psych-ish or Continental-philosophy-ish and calling on the community to shun them! That's not what people seeking truth would do.

I know the feeling of "this is scary voodoo designed to demoralize me, get it away!" I've learned that you have to take time on your own to face your fears, read the original sources behind the "voodoo", and pull apart the ideas until they're either trivially wrong (and hence not scary) or have a grain of truth (which is also not scary.)

I can see the voodoo too. I can see collectivist and mystical vibes a long way off. Let's be gentlemen anyway.

Replies from: sarahconstantin, Conor Moreton
comment by sarahconstantin · 2017-10-12T20:40:18.152Z · LW(p) · GW(p)

To give an example, my first reaction to reading Heidegger is "this is voodoo!" I still think he's a bad person with bad values. But I can clarify somewhat what I think is true in his philosophy and false in it, and I think my understanding of psychology (and potentially AI) is sounder for that struggle.

There's stuff I can't read without getting "triggered." I know my limits and proceed slowly. But ultimately, if there's something I have reason to believe has substance on topics I care about, I expect to eventually hit the books and wrestle with the ideas. I think I'll eventually get there with developmental psychology. Which is why I think people like Gordon are good to have in my noosphere.

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2017-10-12T21:07:00.423Z · LW(p) · GW(p)

D'awwww, thanks! :-)

comment by Conor Moreton · 2017-10-12T21:15:20.159Z · LW(p) · GW(p)

I disagree with very little in the above, and think that disagreement is mainly summed up with me being slightly more conservative/wary and you being slightly less so. =)

Replies from: Conor Moreton
comment by Conor Moreton · 2017-10-12T22:45:18.043Z · LW(p) · GW(p)

I do, however, object to the implication that I was, in any way, advocating "jumping down the throat of" or "calling on the community to shun" ideas that pattern-match to things that are bad-in-expectation. I think that people reading this thread will reasonably assume that that was your summary of me (since you were making that objection in response to my comment) and that this is an uncharitable strawman that clearly doesn't match the actual words in my actual comment.

Perhaps that impression would not have been left if I had more strenuously objected to the specific over-the-top stuff in PDV's post ("cancer" and so forth)? I left those out of my endorsement, but maybe that would've been clearer if I'd included specific lines against them.

comment by ChristianKl · 2017-10-13T16:30:18.084Z · LW(p) · GW(p)

Do you have a specific document in mind when you reference the history of General Semantics' slow unraveling and decline? What do you think they should have done differently?

comment by the gears to ascension (lahwran) · 2017-10-12T23:03:20.674Z · LW(p) · GW(p)

Nothing wrong with talking "like this"[1] if you're having good epistemics while you do it. I do agree that it's harder to evaluate and should get held to the same high standard that easily verified things are held to.

[1] (for values of "like this" that aren't "have bad epistemics")

Replies from: Conor Moreton, PDV
comment by Conor Moreton · 2017-10-12T23:08:51.282Z · LW(p) · GW(p)

Ehhhhhhhhr mooooostly agree? But there are social effects, like shifts in the Overton Window. I claim that I have seen high-status people with rock-solid epistemics spout frothing madness as they explored the edges of what is known, and they were taking all of the things that they were saying with heavy helpings of salt and so forth, but they weren't transparent to onlookers about the fact that they were spitballing, and then some of those onlookers went on to spout frothing madness themselves (but with less rock-solid epistemics themselves) and by the time you got three or four steps removed there were people who just thought that sort of reasoning was a central example of rationality because lots of people were doing it sans context or caveat.

Replies from: lahwran
comment by the gears to ascension (lahwran) · 2017-10-12T23:14:23.527Z · LW(p) · GW(p)

annoying but fair.

Replies from: Conor Moreton
comment by Conor Moreton · 2017-10-12T23:21:15.168Z · LW(p) · GW(p)

I agree that it is annoying. We'll patch this in the update to Humans 2.0.

comment by PDV · 2017-10-14T08:20:24.422Z · LW(p) · GW(p)

I'm not sure I believe that isn't a contradiction in terms.

comment by PDV · 2017-10-14T09:03:51.753Z · LW(p) · GW(p)

I don't know what he believes. I know only what he says. If he doesn't believe what he says, that isn't exactly a ringing endorsement, but would complicate things.
What I do know is that his entire notion of meaningness, and everything I've ever read from that blog, is anti-truth and anti-rationality. It's grounded in assertions that rationality has problems which I do not accept are problems, and it makes bald-faced assertions that are just not true (see: eternalism, aka 'Truth exists', which is asserted to be wrong because it contains divine command theory as a subset) while laying the foundation for his other arguments. Ex falso sequitur quodlibet, and his writing style has all the bad qualities of Eliezer's, so I can't bring myself to read it in enough depth to write a point-by-point rebuttal.

comment by the gears to ascension (lahwran) · 2017-10-12T15:01:33.922Z · LW(p) · GW(p)

I disagree. they seem to be vaguely trying to gesture at something that I would simply put as "use system 1 well". they're just trying to speak the language of system 1 to communicate the concept, and getting lost in concept space when attempting to.

Replies from: gworley, PDV
comment by Gordon Seidoh Worley (gworley) · 2017-10-12T19:04:14.273Z · LW(p) · GW(p)

I don't really find the S1/S2 ontology sufficiently precise to be happy using it, but I agree that in those terms you could say that one aspect of what I'm talking about is developing rationality as an S1 rather than an S2 skill. As I understand it this is what CFAR and friends are working towards, but my impression of this approach is that it's flawed because S1/S2 creates confused categories that will ultimately limit what can be accomplished that way. I can go into more depth but that's the gloss.

comment by PDV · 2017-10-12T17:23:40.438Z · LW(p) · GW(p)

There are people productively engaging with that concept. They have none of these problems. Even if it's true that the concept is important and is what they're trying to convey, accepting their framing is harmful, more harmful than any potential benefits could justify, compared with sticking to people who are grounded in true things instead of nice things.

Replies from: ChristianKl
comment by ChristianKl · 2017-10-12T17:37:45.123Z · LW(p) · GW(p)

What kind of evidence do you see that leads you to believe that people who accept Chapman's framing got harmed? Do you see anything that's distinct from you not understanding arguments that those people make while they discuss under that framing?

Additionally, it's interesting that you suggest that this post uses exactly the same framing Chapman uses. To me this post seems to break things down into different ontological concepts and thus implies a different frame. The ability to see that those are two different frames would be one of those things metarationality would supposedly help with (as it's about how systems relate to each other).

Replies from: PDV
comment by PDV · 2017-10-14T08:34:17.554Z · LW(p) · GW(p)

Chapman's entire shtick is pretending to be wise, but even worse, he's good enough at it that people take his ideas seriously. And then they spend months or years of effort building a superstructure of LW-shaped mysterianness on top of it, losing sight of the actual ability to distinguish true things from false things and/or accomplish goals.

The basic deal is that it professes to include all the goals and purpose of rationality while also using other methods. But those other methods are thinly-disguised woo, which are attractive because they're easy and comfortable, and comfortable because they're not bound to effectiveness and accuracy. It keeps the style and language of the rationalist community (the bad parts) while pushing the simple view of truth so far back in its priorities that it's just lip service.

I'll grant that this isn't quite the same flavor of anti-truth woo as Chapman. But the difference is unimportant to me.