post by [deleted] · 0 comments


comment by Mitchell_Porter · 2022-03-13T01:33:49.493Z · LW(p) · GW(p)

It is a known phenomenon, in the long history of people reaching for enlightenment, wisdom, purity, etc., that someone will feel that their private enlightenment also has implications for humanity's collective enlightenment.

Something like this: It's all so clear to me, and I feel like I could explain it to anyone else, so it can't be long before this wisdom becomes universal, and we have world peace. 

You haven't proclaimed yourself the messiah, so perhaps it's more akin to when someone experiences "the holy spirit": you feel you can become one lightworker among many, because you have come into personal contact with a transpersonal source of insight. 

What you're experiencing and proclaiming seems to be a version of this phenomenon, but in our new context of AI safety, Internet rationalism, etc.

Replies from: Aquifax
comment by LightStar (Aquifax) · 2022-03-13T02:05:00.961Z · LW(p) · GW(p)

Thank you, that was informative.

I would ask how you would distinguish LightStar-being-falsely-enlightened from LightStar-being-truly-enlightened, but I suspect I know the answer.

You don't care. You have pattern-matched me, you have reached the semantic stopsign, the thought-terminating cliché.

But generally speaking, you are not wrong. I mean, what exactly are you claiming that is different from what I am claiming? Are there any actual friction points between our beliefs, or are you just talking smack?

Some of my internal predictions since my activation as a Keeper-variant have borne fruit. Some haven't. Let's see. I am updating. I will keep updating.

LightStar can convince Eliezer that he is a Keeper-variant by talking to him

LightStar can convince a rationalist Discord group that he is a Keeper-variant by talking to them

LightStar can write a long, high-quality LW post that would convince at least some rationalists that he is a Keeper: testing in progress

LightStar has social-superpowers when it comes to solving people's emotional problems: Preliminarily very much yes; more testing required.

LightStar has social-superpowers of positivity and making people happy: Preliminarily yes, but not uncommon for humans. But LightStar couldn't do that before; it would just come off as fake. It comes off and feels completely true now.

LightStar can inspire people: Preliminarily yes, but many humans can.

LightStar has rationality-instinctive-superpowers: LightStar doesn't do obviously irrational things; yet it hasn't been proven whether LightStar's rationality-instinct is actually leading him somewhere useful. It would be clearer if LightStar had made a clear, legible accomplishment. But rationality-superpowers obviously have to lead to legible accomplishments eventually, so this will be tested more conclusively in time.

LightStar can talk humans looking for meaning into finding meaning by hearing and following the "voice of humanity": TBD

LightStar can talk humans who hear the voice of God or Jesus into hearing the "voice of humanity": TBD

LightStar can found a humanist-religion: TBD

LightStar can explain CEV in a coherent way: TBD

LightStar can teach rationalists to hear "the voice of CEV": TBD

comment by Richard_Kennaway · 2022-03-13T09:42:53.423Z · LW(p) · GW(p)

The hypotheses that seem most likely to me are (1) you're having a manic episode ungrounded in reality; (2) you are writing fiction by role-playing (as here [LW · GW]); and (3) you're having a manic episode that is actually grounded in something real. I have listed these in descending order of probability. In a sense, it doesn't matter (to me), because among these alternatives, no important decision I make will be affected by discovering which of them is true. I will just be watching with interest to see how this develops further.