bgaesop's Shortform

post by bgaesop · 2019-10-24T03:46:51.281Z · LW · GW · 22 comments


Comments sorted by top scores.

comment by bgaesop · 2019-10-24T03:46:51.419Z · LW(p) · GW(p)

https://slatestarcodex.com/2019/10/21/the-pnse-paper/

So, shouldn't all the rats who've been so into meditation etc. for the past decade or so be kinda panicking at the apparent fact that enlightenment is just Dunning-Kruger-ing yourself into not being able to notice your own incompetence?

Replies from: Yvain, Kaj_Sotala, DonyChristie, rsaarelm, George3d6, gworley, liam-donovan, None, Viliam
comment by Scott Alexander (Yvain) · 2019-10-27T09:24:29.090Z · LW(p) · GW(p)

I'd assumed what I posted was the LW meditator consensus, or at least compatible with it.

comment by Kaj_Sotala · 2019-10-25T06:07:27.822Z · LW(p) · GW(p)

Note that I already discussed this paper a bit at the end of my earlier post on meditation [LW · GW]: (this kind of) enlightenment removing your subjective suffering over your incompetence, while otherwise leaving most of your behavior intact, is as predicted and would still be considered valuable by many people. Also, enlightenment is only one of the things you can develop via meditation, and if you want practical benefits there are other axes that you can focus on.

Replies from: liam-donovan
comment by Liam Donovan (liam-donovan) · 2019-10-25T17:04:10.035Z · LW(p) · GW(p)

From an epistemic rationality perspective, isn't becoming less aware of your emotions and body a really bad thing? Not only does it give you false beliefs, but "not being in touch with your emotions/body" is already a stereotyped pitfall for a rationalist to fall into...


Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2019-10-25T19:36:37.319Z · LW(p) · GW(p)

Definitely. But note that according to the paper, the stress thing “was observed in a total of three participants”; he says that he then “went on to conduct other experiments” and found results along similar lines, and then gives the yoga and racism examples. So it’s not clear to me exactly how many individuals had that kind of a disconnect between their experience of stress and their objective level of stress; 3/50 at least sounds like a pretty small minority.

I'm intending to flesh out my model further in a future post, but the short version is that I don't believe the loss of awareness to be an inevitable consequence of all meditation systems - though it is probably a real risk with some. Metaphorically, there are several paths that lead to enlightenment and some of them run the risk of reducing your awareness, but it seems to me entirely possible to take safer paths.

comment by Pee Doom (DonyChristie) · 2019-10-31T01:52:16.925Z · LW(p) · GW(p)

I am currently very skeptical that the PNSE paper has anything of worth, given that Jeffery Martin's Finder's Course is basically a scam according to this review and some others. (I don't know if the paper is based on Finder's Course participants.) It would be valuable for someone to do a fact check on the paper.

comment by rsaarelm · 2019-10-28T13:30:56.509Z · LW(p) · GW(p)

If you take the paper at face value, wouldn't you expect a lot of the chronically depressed rats to be jumping at the chance to trade off the ability to remember appointments with no longer subjectively suffering from depression?

comment by George3d6 · 2019-10-27T10:26:20.935Z · LW(p) · GW(p)

I don't see why. The paper doesn't seem to challenge the assumption that the "enlightened" state, or the "reduced DMN activation" state, or whatever you want to call it, results in greater happiness.

Moreover, a lot of the effects described could also fall under aging, and might be confounded by the fact that most of these people have aged during the time they trained (e.g. not being able to remember appointments).

Finally, you don't have to take meditation/introspection practices to the point where that "enlightened" state becomes a constant in your life. I think there's a fair argument to be made that dzogchen-like practices (e.g. the kind of quackery people like Sam Harris and Loch Kelly espouse), if done in moderation, can allow you to "silence the DMN" or "separate awareness/attention/consciousness" in a selective way, rather than all the time. I.e. get to a point where your brain is able to experience something close to a small-to-medium dose of psilocybin for a short amount of time.

But, again, this final point is based on empirical quackery; then again, so is most of the "research" around meditation when you compare it to other types of research. (Note: I'm not blaming the way the research is done; I'm saying that the thing they are trying to observe, combined with the tools they have, means they will get very little insight... you can only do so many diffusion tensor MRIs on healthy people before it becomes unethical, and the data you get from them is very broad compared to, say, what you would get about kidney function from a biopsy + blood/urine testing using a broad range of separation techniques and chemical assays within reasonable cost.)

comment by Gordon Seidoh Worley (gworley) · 2019-10-24T20:07:42.378Z · LW(p) · GW(p)

Maybe. This is a very narrow definition of "enlightenment" in my opinion, in that Scott is claiming PNSE is enlightenment, whereas I would say it's one small part of it. I think of it differently: as a combination of psychological development plus some changes to how the brain operates that seemingly include PNSE, but I'm not convinced that's the whole story.

comment by [deleted] · 2019-10-31T15:02:10.472Z · LW(p) · GW(p)

Meta-point: I noticed myself almost getting defensive, trying to refute your sentiment without even reading the paper.

I don't hold you responsible for that, but a less polemical tone would probably get you a better response, at least from me.

comment by Viliam · 2019-10-24T10:16:50.178Z · LW(p) · GW(p)

I am happy that someone finally brought some skepticism about meditation into the rationalist community, in a way that won't get dismissed as "nah, u jelly, cos u have no jhanas, u full of dukkha and need some metta to remove your bad karma, bro."

I was already getting quite nervous about the lack of skepticism. Especially in a community that used to dismiss not only all religion and supernatural claims, but also all kinds of mysterious answers such as quantum woo or emergence... and suddenly went like "look, here is a religion that is totes different, because it's from the opposite side of the planet, and here is a religious practice that has all benefits and no problems, let's do it every day" and everyone seems to jump on the bandwagon, and then people start using words from foreign languages and claim to have mysterious experiences that are in principle incommunicable to mere muggles... and I'm like "what the fuck, is this still the Less Wrong I used to know, or have these people been kidnapped and brainwashed?"

To answer your question, if they have been successfully Dunning-Kruger'ed, they'll probably just be like: "nope, I have an unmediated direct perception of reality, and I know it's all okay." Also, if there is any problem with enlightenment, obviously those people Scott mentions have not been truly enlightened.

Replies from: habryka4, An1lam
comment by habryka (habryka4) · 2019-10-25T19:57:41.311Z · LW(p) · GW(p)

Hmm. Every major post I can remember that talks about meditation has a comment section full of skepticism and hesitation; I myself wrote a lot of those comments. Scott Alexander has been engaging skeptically with all the meditation stuff for at least two years. Skepticism towards meditation-related things is not a recent development, and has been pretty widespread ever since I started being active in the community around 6 years ago.

Somewhat related: who is the "they" that you are talking about? The things you say don't sound to me like the things I've heard from people arguing for meditation on LessWrong. More concretely, risks from meditation have been acknowledged and discussed since the very beginning [LW · GW], and, if anything, I think they are usually overstated. I really don't think people have described meditation as "all benefits and no problems". Again, I myself am highly skeptical of a lot of the enlightenment/meditation stuff, but the things you say really don't seem to map onto past discussions on this site.

I've also never heard anyone argue for having unmediated direct perception of reality, at least not in a way that anyone took seriously, because yes, I agree that that doesn't make sense. The argument that has been made is that meditation, to a certain degree, allows you more accurate access to your own mind and how you process information, which is a testable claim that can be disputed (and I am myself skeptical of), but is drastically different from claiming "unmediated access to reality".

I also really don't like the repeated linking to fallacies and biases. Since none of the above are responding to concrete individuals making concrete cognitive motions, it's unclear what they are supposed to indicate. My current sense is that in this comment, they do more harm than good, in the way Eliezer wrote about in "Knowing about biases can hurt people" [LW · GW].

Like, this comment feels to me scarily close to the average Reddit comment on political subreddits. There is a concrete outgroup that gets described, which then gets made fun of by putting words in its mouth that are extreme caricatures of what has actually historically been voiced, and the group is described as part of some oppressive regime that has to be toppled by the underdogs who actually know what is really going on.

I think there are reasonable criticisms to be made of how people are engaging with meditation, and a lot of it is very justified (as is the Scott article linked above), but this comment seems to drastically misconstrue opinion in the community, and then goes on to dramatically strawman dozens of people who tried to make real arguments, in a way that makes me really uncomfortable.

Replies from: Viliam, bgaesop
comment by Viliam · 2019-10-26T21:05:02.352Z · LW(p) · GW(p)

Fair points. My comment was more a result of years (looking at the "kensho" article, yep, it's already been two years) of accumulated frustration than anything else. Sorry for that.

From my perspective, the skepticism seems surprisingly mild. Imagine a parallel reality where a CFAR instructor instead said he found praying to Jesus really helpful... in ways that are impossible to describe other than by analogy ("truly looking at Jesus is like finally looking up from your smartphone"), and claimed that Jesus helps him improve CFAR exercises or understand people. -- I would have expected a reaction much stronger than "your description does not really help me start a dialogue with Jesus".

Interestingly, clone of saturn's comment [LW(p) · GW(p)] in that debate seems like a summary of the PNSE paper:

If you think of your current level of happiness or euphoria (to pick a simple example) as the output of a function with various inputs, some of these inputs can be changed through voluntarily mental actions that similarly can't be directly explained in words and aren't obvious. Things like meditating long enough with correct technique can cause people to stumble across the way to do this. Some of the inputs can be changed about as easily as wiggling your ears, while others can be much more difficult or apparently impossible, maybe analogous to re-learning motor functions after a stroke.

I may be misremembering things I read on Slate Star Codex as having read them on Less Wrong. (I wonder how to fix this. Should I keep bookmarks every time something rubs me the wrong way, so that when it happens a hundred times I can document the pattern?)

By the way, I don't think the problem with explaining meditation/enlightenment/Buddhist stuff is going to go away soon. Like, there are entire countries that have practiced this stuff for a thousand years, and... they have a hundred schools that disagree with each other, and also nothing convincing to show. Part of that is because communicating about inner content is difficult, but I believe a significant part is that self-deception is involved at some level. I don't believe that a brain as described in The Elephant in the Brain simply gets more accurate insights by doing lots of introspection regularly. (Note that in the traditional setting, those insights include remembering your previous lives. Even if no one in the rationalist community buys the part about the previous lives, they still insist that the same process -- which led other people to remember their previous lives -- leads to superior insights.)

comment by bgaesop · 2019-10-25T21:36:50.719Z · LW(p) · GW(p)

The in-person community seems much less skeptical of these things than the online community. Which isn't to say there are no skeptics, but (especially among the higher status members) it's kind of distressing to see how little skepticism there is about outright silly claims and models. At last year's CFAR reunion, for instance, there was a talk uncritically presenting chakras as a real thing, and when someone in the audience proposed doing an experiment to verify if they are real or it's a placebo effect, the presenter said (paraphrasing) "Hmm, no, let's not do that. It makes me uncomfortable. I can't tell why, but I don't want to do it, so let's not" and then they didn't.

This is extremely concerning to me, and I think it should be to everyone else who cares about the epistemological standards of this community.

Replies from: mr-hire, rsaarelm
comment by Matt Goldenberg (mr-hire) · 2019-10-29T19:47:32.245Z · LW(p) · GW(p)
At last year's CFAR reunion, for instance, there was a talk uncritically presenting chakras as a real thing, and when someone in the audience proposed doing an experiment to verify if they are real or it's a placebo effect, the presenter said (paraphrasing) "Hmm, no, let's not do that. It makes me uncomfortable. I can't tell why, but I don't want to do it, so let's not" and then they didn't.

I attended that talk and have a slightly different memory.

To my memory, the claim was "I tried this exercise related to my body, and it had a strong internal effect. Then I started playing around with other areas related to chakras, and they had really strong effects too. Try playing around with this exercise on different parts of your body, and see if there's a strong effect on you."

The second part matches my memory, and I was a bit disappointed we didn't get to do more of an experiment, but in no way were chakras "uncritically presented as a real thing."

comment by rsaarelm · 2019-10-29T07:54:37.964Z · LW(p) · GW(p)

Hmm, no, let's not do that. It makes me uncomfortable. I can't tell why, but I don't want to do it, so let's not

After 100 years of parapsychology research, it's pretty obvious to anyone with a halfway functioning outside view that any quick experiment will either be flawed or say chakras are not real. So I'm not sure whether to take this at face value (the person thinking chakras are real-real and genuinely not being able to say why they don't want to do the experiment), or as a polite-speak version of "we both know doing the experiment will show chakras aren't real and will make me lose face; you're making a status grab against me by putting me on the spot and demanding the experiment, so fuck you and fuck your experiment."

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2019-10-29T13:54:49.035Z · LW(p) · GW(p)
After 100 years of parapsychology research, it's pretty obvious to anyone with a halfway functioning outside view that any quick experiment will either be flawed or say chakras are not real

I don't know why the person didn't want to do an experiment, and I'd be willing to extend them the benefit of the doubt, but is there some particular research disproving chakras? So far I'd been going with the non-mystical chakra model [LW · GW] that

In general, if you translate all mystical statements to be talking about the internal experiences, they’ll make a lot more sense. Let’s take a few common mystical concepts and see how we can translate them.
Energy -- there are sensations that form a feeling of something liquid (or gaseous) that moves within or outside of your body. When unpacked, it’s likely to be made up of sensations of light (visual), warmth, tingling, and tension (physical). “Channeling energy” is adjusting your state of mind so as to cause these sensations to “move” in a certain way, to appear or disappear.
Chakras -- points in your body where it’s particularly easy to feel certain types of energies. It’s also particularly easy to visualize / feel the energy moving into or out of those places. “Aligning chakras” is about adjusting your state of mind so as to cause the energy to flow evenly through all chakras.
(Chakras are a great example of a model that pays rent. You can read about chakras, see what predictions people are making about your internal experience when you explore chakras, and then you can go and explore them within yourself to see if the predictions are accurate.)

... and after meditation caused me to have these kinds of persistent sensations on my forehead, I assumed that "oh, I guess that's what the forehead chakra thing is referring to". Another post suggested that experiences of "energy" correspond to conscious representations of autonomic nervous system activity, and the chakras to physiological hubs of that activity.

That has seemed sensible enough to me, but the topic hasn't seemed important enough to explore in detail; should I assume that this model is actually wrong?

Replies from: rsaarelm
comment by rsaarelm · 2019-10-29T17:42:20.407Z · LW(p) · GW(p)

The description sounded like both parties were assuming chakras involved some actual mystical energy and were doing the invisible garage dragon dance [LW · GW]. The parapsychology angle to this one is simply that, even without knowing about a specific rebuttal, chakras are a well-known mystical concept, parapsychology research has been poking at most of the obvious mystical claims, and if parapsychology had verified that some supernatural phenomenon is actually real, we'd have heard of it.

If they were talking about the non-mystical model, the first person could've just said that it's a possibly helpful visualization shorthand for doing relaxation and biofeedback exercises, and that there are no actual supernatural energies involved.

Replies from: Kaj_Sotala, Raemon
comment by Kaj_Sotala · 2019-10-29T17:55:40.262Z · LW(p) · GW(p)

Yeah, I don't know exactly what was said, but given that this was the CFAR alumni reunion, I would be willing to give the speaker the benefit of the doubt and assume a non-crazy presentation until I hear more details. Especially since a lot of things which have sounded [LW · GW] crazy and mystical have turned out to have reasonable [LW(p) · GW(p)] explanations.

comment by Raemon · 2019-10-29T19:53:48.651Z · LW(p) · GW(p)

My sense of what the person-at-the-reunion was talking about (having chatted with them a bit, although I'm not sure I understand their position well enough to speak for them) was a model where chakras roughly correspond to "application of Gendlin's Focusing, directed at particular areas of the body, turns out to yield different information."

i.e. a thing that I've heard reported by several LessWrongers is that Focusing directed at your stomach tends to give a different set of information about what your subconscious is thinking/feeling/interested in than focusing on other areas of your body, or than introspecting without any directed attention at all.

I've heard a couple people make the broader, somewhat stronger claim that each of the body-areas associated with a chakra tend to have consistent effects across people when used as introspection targets.

This doesn't seem particularly mysterious to me, although it seems reasonable to be escalatingly skeptical of:

  • "introspecting with a focus on particular body parts yields different information about what's going on with you subconsciously than generic introspection"
  • "one particular body part tends to be particular useful for this"
  • "seven body parts tend to be particularly useful for this in a way that corresponds to traditional chakras, and there's a model of how those areas relate with particular introspective techniques that tend to cause particular effects, across people"

comment by NaiveTortoise (An1lam) · 2019-10-24T19:34:02.056Z · LW(p) · GW(p)

As one data point, I'm a silent (until now) skeptic, in the sense that I got really into meditation in college (mostly of The Mind Illuminated variety) and felt like I was benefiting, but not significantly more than I did from other, more mundane activities like exercise, eating well, etc. Given the required time investment of ~1 hour a day to make progress, I just decided to stop at some point.

I don't talk about it much because I know the response will be that I never got to the point of "enlightenment" or that I needed a teacher (both possibilities that I acknowledge), but given your post, I figured I'd leave a short reply mentioning my not-necessarily-generalizable experience.