On the Resolution of Frightening Paradoxes and Inferences

post by Rukifellth · 2013-02-02T04:50:47.846Z · LW · GW · Legacy · 24 comments

As some of us might be aware, there exist ideas that harm their carriers simply by lingering in awareness. One may wonder whether these ideas are just manifestations of Obsessive-Compulsive Disorder or some other neurosis, but the difference here is the metaphysical nature of such ideas.

A person with OCD may have their thought stream painfully interrupted by whatever they've fixated upon, and be literally unable to stop. I conjecture that a person with a form of metaphysical obsession will instead have their thought stream *infected*, such that anything they value or care about is permanently devalued somehow, rather than merely pushed away; or that the subject of obsession will be *compatible* or *miscible* with one's ordinary life, so that one makes logical inferences about one's life based on those metaphysical ideas. The difference between this and regular personal development is that these inferences don't yield useful ideas or insights for improvement; rather than supplementing one's life, they *deconstruct* and *disassemble* it, by virtue of their global scope. I'll elaborate on this in the last paragraph.

I further conjecture that anybody carrying these dangerous ideas would avoid telling anyone else, out of remarkable conscience, in the belief that the ideas are unresolvable and would simply harm others. Alternatively, they may see the ideas as a revelation and try telling others, only not to be taken seriously. To reassure those in the former category: I'll present no such ideas here, because I'm not a complete dumbass. What I'm presenting is an opportunity.

Anyone who has these ideas shouldn't post them here; this article is just to gauge interest in, and need for, a group in which such ideas could be shared, knowing that:

A) Different perspectives help
B) Those with whom you're sharing aren't going to feel significantly worse for the burden
C) If a solution isn't found, *other* people are going to arrive at these ideas independently, so we might as well get it over with and post the solutions.

One may wonder how dangerous ideas could possibly exist, or think that these are just misunderstood epiphanies. To them I suggest reading about George Price, the population geneticist, journalist and chemist, whose work on the origins of altruism drove him first to give away all his possessions, then to let the homeless sleep in his house. These are all *really* generous things, but I feel that Price's mindset in doing them was tainted by *desperation*: he wanted to avoid a terrifying conclusion, namely that selflessness itself was rooted in selfishness, making its goodness non-intrinsic. The paradox seems easy to resolve from the outside, when one isn't panicking about it, and while working through a similar crisis I kind of wished I could go back in time and explain to him why interpretations like that are only *half* the story. He cut his own throat with a pair of scissors at the age of 52.

[Edited out some poorly received theatrics, which admittedly bordered on the unnecessary]

24 comments

comment by Mitchell_Porter · 2013-02-03T20:15:22.342Z · LW(p) · GW(p)

I have often considered making a post like this, but with examples. The main difficulty would lie in knowing where to draw the lines. Do you include any sort of agitated response to any sort of idea that isn't of immediate practical relevance to daily life?

Some examples just as I think of them:

Pascal freaked out about the emptiness of infinite space.

Everett's daughter felt free to kill herself because she was happy in a parallel universe.

Legions of young materialist nihilists in recent centuries have felt like they are dead inside, or otherwise detached from life or "unable to live" (etc., with many variations), and have attributed this to epiphenomenalism, determinism, or some other belief about the nature of mind, matter, and cause-and-effect.

There are numerous extreme responsibilities that a person may take on in life - ending death for all humanity, really ending starvation/poverty/war in the world, doing the best possible thing at all times, countering "existential risk" - which may lead to a harrowing existence because society and culture do not support this effort or understand it.

Even just possessing a simple insight, or apparent insight, into the nature of reality, regarding which everyone around you is oblivious, can lead to the sort of isolating obsessive internal monologue that your post seems to hint at. And this could really be anything: atheism in a religious society, a paranoid supposition about some routine aspect of human experience that you have not yet personally experienced (which is why that one is more frequent in the young), some mind-breaking philosophical concept like solipsism.

Here is a relevant earlier discussion.

Replies from: Rukifellth
comment by Rukifellth · 2013-02-03T22:49:10.905Z · LW(p) · GW(p)

The main difficulty would lie in knowing where to draw the lines. Do you include any sort of agitated response to any sort of idea that isn't of immediate practical relevance to daily life?

I would draw the line at regular obsessions, mostly because cognitive behavioural therapy at least offers some options there. I have some experience with obsessions, and those were self-contained, situationally. Not a personal example, but a preoccupation with staying away from sharp objects for fear of committing suicide, despite having neither depression nor violent impulses, is local to the situation of being around sharp objects; it isn't triggered by simply thinking about sharp objects in a room completely devoid of them.

Anything within the line would be subjects whose trigger is simply their being facts, whose consequences are reacted to quite quickly, and which are not specific to any physical situation a person may be in at the time. I'm not so sure about Pascal (based on that sentence alone, I mean), but Everett's daughter would fit the bill, assuming her belief compelled her to commit suicide rather than gave her the freedom to do it. I recall one person who actually had a similar problem here at LessWrong, not two months ago in fact. I'm surprised that I forgot about it. I guess I would call these "abstract obsessions" as opposed to "personal obsessions".

I apologize for not fleshing this out in more detail in the original post; I wasn't expecting it to generate interest from anywhere outside the hypothetical target audience, though in retrospect, I would probably have dug into it too.

EDIT: I'm reading the transcription of XiXiDu's psychology session, and this looks exactly like the class of problem I'm talking about.

comment by drethelin · 2013-02-02T19:11:35.988Z · LW(p) · GW(p)

You don't give me a very good reason to think basilisks are anything more than specific instances of OCD, depression, or other mental-illness spirals. If you think we should form a mental illness support group for LessWrongers, I wholeheartedly support that, but let's leave out basilisks until they actually come up.

Replies from: David_Gerard, Rukifellth, private_messaging
comment by David_Gerard · 2013-02-03T18:51:51.056Z · LW(p) · GW(p)

From (anecdotal-level) observation of examples, the famous LW basilisk is something that you need a string of things going wrong to be upset by: you need to believe certain Sequence memes, you need to believe they string together to imply particular things in a particular way, you need to be smart enough to understand for yourself how they imply what they do, and you need to be obsessive in just the wrong way.

The question then is what to do about it. Freedom of inquiry is absolutely necessary for science as done by mere humans to actually work, but this is not happening, for various reasons that seemed like good ideas at the time.

refs: a call to decompartmentalise, as compartmentalisation is in fact an epistemic sin; the dangers of doing so.

comment by Rukifellth · 2013-02-02T20:16:41.697Z · LW(p) · GW(p)

Wouldn't a mental illness group targeted to Lesswrongers be about basilisk-like problems, since basilisks are more prominently mentioned here?

If there's no one else who has basilisks to share, then no action ought to be taken regarding basilisks, yes. However, if other people do have basilisks, then something ought to be done, preferably in a closed environment. The thing is, right now we have no sure way of knowing if people have basilisks, because no one in their right mind would actually tell others the details without prompting.

In any case, if somebody else has a basilisk, they might come up and comment with "Yeah, let's do it". If nobody else has basilisks, then that won't happen, and only then will the group be proven unnecessary. I'd rather wait and see whether that happens or not.

Replies from: drethelin, ArisKatsaris
comment by drethelin · 2013-02-03T18:59:03.908Z · LW(p) · GW(p)

The base rate for any mental illness is hugely higher than the base rate for people who get basilisked, even within LessWrong. I think a group talking about rational, empirical, and practical ways to deal with having, or knowing people who have, various mental variances would be pretty cool. I think a basilisk group, on the other hand, would get few to no people talking in it, or get swiftly banhammered.

Replies from: None
comment by [deleted] · 2013-12-29T22:20:14.583Z · LW(p) · GW(p)

I realize it's currently 10 months later, but as someone who is of indeterminate diagnosis but very definitely not neurotypical, I would find this very, very useful. I was extremely disappointed to find that the LessWrong wiki page for "corrupted hardware" just lists a few references to Sequences referring to evolutionary psychology (an often-mildly-dubious science in the first place) rather than an abundance of references to specific hardware corruptions and how to deal with them.

Replies from: drethelin
comment by drethelin · 2013-12-30T09:18:46.486Z · LW(p) · GW(p)

I recommend you start a thread in discussion or post in the open thread about this question! People generally like to post to new "advice repository" style threads, and people like talking about themselves, and this can be both.

Replies from: None
comment by [deleted] · 2014-01-06T18:26:04.677Z · LW(p) · GW(p)

Actually, that's a pretty good idea. I think I'll do that.

comment by ArisKatsaris · 2013-02-03T18:26:48.546Z · LW(p) · GW(p)

So, let me give a hypothetical scenario; tell me how such a group would hypothetically help in that scenario.

We're on a planet controlled by mind-reading SpaceNazis. You suddenly realize through scientific research that all purple-eyed people (3 percent of the general population) are actually of SpaceJewish heritage. SpaceNazis kill people of SpaceJewish heritage. You are also purple-eyed, btw.

This is effectively a basilisk in the sense of "harmful knowledge". You're shortly up for your standard monthly mindreading by the regime. What could your contact group do to help? Brainwash you (and themselves) into forgetting it?

Replies from: Rukifellth
comment by Rukifellth · 2013-02-03T22:10:07.938Z · LW(p) · GW(p)

Assuming that the mindreaders punish people who don't turn in SpaceJews, trying to tell others about the problem is either suicide or mini-basilisk inducing, depending on that person's disposition towards me. On the other hand, if I were to ask help from a group of other purple-eyed SpaceJews who had already determined the same secret and were in no greater danger for that request, we would at least be slightly more likely to come up with a solution better than "commit suicide to protect other purple-eyed SpaceJews."

As such, the purpose of a closed group would be to negotiate the basilisk(s) in a way that doesn't create additional risk, because anyone informed of said basilisk(s) would either:

A) Already be distracted by the same basilisk, creating zero net loss
B) Be distracted by an old basilisk, the idea being that their previous distraction/non-investment in the ideas that set up the new basilisk will render them less likely to get caught in the same loop and more likely to come up with a creative solution.

As David_Gerard said,

From (anecdotal-level) observation of examples, the famous LW basilisk is something that you need a string of things going wrong to be upset by: you need to believe certain Sequence memes, you need to believe they string together to imply particular things in a particular way, you need to be smart enough to understand for yourself how they imply what they do, and you need to be obsessive in just the wrong way.

Suppose basilisk A has consequence Z, and is known by John. David, however, does not care either way about consequence Z, possibly because he already knows about basilisk B and is more concerned about consequence X; and John is in the same spot as David on the question of which basilisk is more important. Since both are already distracted by a basilisk either way, they could trade basilisks, each hoping that the other might come up with a resolution, without worrying about spreading a basilisk to somebody who would actually suffer a decreased quality of life for it.

comment by private_messaging · 2013-02-03T17:53:15.457Z · LW(p) · GW(p)

you don't give me a very good reason

He's not trying to give that reason for very good reasons...

edit: just realized it might be misinterpreted as me taking it seriously.

Look. I'm #10 on the TopCoder marathon match "impressive debuts" page, of all time; that was 4.5 years ago, when I was still a newbie, and it was the first programming contest of any kind I'd ever done. Their Elo-like ranking system, which penalizes losing to newbies, combined with a contest where you could just as well test everything offline, prompted many high-ranked contestants not to submit solutions, impairing my score bump. Trust me, I can understand the math.

I'm not concerned with it actually working, and neither should you be. I'm rather bored of this topic, and this [whatever it is] deletes counterarguments to it, which is really weird; but if you actually have some anguish (and the tales of OCD suffering are not some sort of urban legend), you can mail me and I'll talk you out of it, or try to.

comment by fubarobfusco · 2013-02-02T21:26:07.804Z · LW(p) · GW(p)

To them I ask to read about what happened to that poor man, George Price, the population geneticist, journalist and chemist, whose work on the origins of altruism drove him first to give away all his possessions, then to let the homeless sleep in his house.

After reading his Wikipedia article, it isn't clear to me that the origins-of-altruism research caused his unusual behavior (and subsequent suicide). He seems to have had a curious relationship with Christian religion; he also survived thyroid cancer — and thyroid problems can have profound emotional effects.

One might wonder whether brain changes affecting one's valuation of oneself vs. others would lead a scientific mind both to curiosity about the origins of altruism and to altruistic acts. In that case, you need not fear that studying altruism will cause you to become excessively altruistic.

Replies from: aelephant
comment by aelephant · 2013-02-03T23:44:30.575Z · LW(p) · GW(p)

It seems like he had some problems distinguishing short- & long-term effects. Yes, opening your house & letting homeless people sleep there will help them in the short term, but if it destroys your life in the long term & makes you completely unable to help them, well then it isn't the correct decision for maximizing the amount of altruism you can perform. It seems like he was trying to be a good altruist, but wasn't a very good rationalist. Apologies if that seems offensive to anyone; I'm not trying to insult the man, since he did much more good than many people do. I'm just trying to think about it in a broad sense.

Replies from: fubarobfusco
comment by fubarobfusco · 2013-02-04T00:11:33.994Z · LW(p) · GW(p)

On the other hand, he may have done a reasonable job of implementing the specific altruistic algorithm specified by Jesus ("sell your possessions, give to the poor, and follow me" — Matthew 19:21; Mark 10:21; Luke 18:22).

comment by buybuydandavis · 2013-02-03T08:45:31.853Z · LW(p) · GW(p)

there exist ideas that harm their carriers simply by lingering in awareness.

The inability to control your own mental focus is a big problem. My estimate is that it's much more a case of susceptible individuals than of generally powerful basilisk memes. Most people just aren't that interested in or moved by ideas in the first place. And those who are tend to be moved by different ones. I'm sure I'm thoroughly immune to whatever memes got Price into his altruistic tizzy.

comment by rev · 2013-02-16T20:30:43.438Z · LW(p) · GW(p)

Reposting from open thread because it's relevant to this discussion.

Are there any mechanisms on this site for dealing with mental health issues triggered by posts/topics (specifically, the forbidden Roko post)? I would really appreciate any interested posters getting in touch by PM for a talk. I don't really know who to turn to.

Replies from: Rukifellth
comment by Rukifellth · 2013-02-17T00:00:18.833Z · LW(p) · GW(p)

Not that I'm aware of. A few people in this thread actually made such an offer to me, and I corresponded with Manfred, but nothing in the way of a group.

I'll read anything you send me.

comment by Manfred · 2013-02-02T14:11:55.594Z · LW(p) · GW(p)

Feel free to share any scary ideas with me by PM. But mostly I recommend reading things like this or this.

Replies from: Rukifellth
comment by Rukifellth · 2013-02-02T15:41:31.477Z · LW(p) · GW(p)

Mine isn't a moral crisis, but thank you. The Price story was a specific example of the more general problem.

comment by Kawoomba · 2013-02-02T07:41:43.746Z · LW(p) · GW(p)

I conjecture (...) I further conjecture (...) I'll present no such ideas, because I'm not a complete dumbass (...) He cut his own throat with a pair of scissors at the age of 52. (...) So, is anyone feeling angst?

You need a campfire, some distant animals howling, and a merry band of teenagers for this kind of story to work.

Replies from: Rukifellth
comment by Rukifellth · 2013-02-02T14:53:52.094Z · LW(p) · GW(p)

Clever. I don't suppose it occurred to you that the theatrics might serve some utility?

I suppose it was overdone though, especially the title.

comment by jaekwon · 2013-05-20T17:10:45.386Z · LW(p) · GW(p)

I now have undeniable proof that it is not worthwhile to worry about acausally dangerous ideas, but you'll have to simulate me through deduction to find out what this proof is. #basilisk

Replies from: Rukifellth
comment by Rukifellth · 2013-05-21T21:30:01.672Z · LW(p) · GW(p)

I smiled, though it's starting to vex me that people still think I'm talking specifically about Roko's Basilisk.