Announcing the Center for Applied Postrationality

post by Pee Doom (DonyChristie) · 2019-04-02T01:17:12.873Z · LW · GW · 14 comments

Hi all! Today, we are announcing the formation of a new organization: the Center for Applied Postrationality (or CFAP).

Right now we're looking for two things: 1) $1.5 million in funding to have runway for the next six months, and 2) intellectual contributions from deep thinkers like YOU!

Just what is postrationality, anyway? To be honest, we don't really know either. Maybe you can help us? The term can refer to many different things, including:

We're still deconfusing ourselves about what "applied" postrationality is, as so far it has mostly been insight porn posted on Twitter. Comment below with the techniques you'd suggest for training the art of postrationality!

14 comments

Comments sorted by top scores.

comment by Charlie Steiner · 2019-04-02T06:31:04.038Z · LW(p) · GW(p)

Train postrationality by commenting on Tumblr. By figuring out how Donald Trump's latest move was genius. By living a virtuous life. By defecting in a Prisoner's Dilemma against yourself. By starting your own political campaign. By reading Kierkegaard. By regretting defecting against yourself in the Prisoner's Dilemma and finding a higher power to punish you for it. By humming "The Ballad of Big Yud" to yourself in the shower. By becoming a Scientologist for 89 days and getting your money back with the 90-day money-back guarantee.

Replies from: romeostevensit
comment by romeostevensit · 2019-04-04T18:29:31.671Z · LW(p) · GW(p)

> By defecting in a Prisoner's Dilemma against yourself.

that's regular rationality.

comment by Viliam · 2019-04-02T21:27:27.567Z · LW(p) · GW(p)

Note: There are Kegan levels beyond 6, but they cannot be described by human words. You will know you are there when you get there, except there will be no "you" anymore.

If you don't understand what this means, that is a statement about you, not about applied postrationality.

comment by shminux · 2019-04-02T04:36:39.732Z · LW(p) · GW(p)

I think you meant "Implied Postrationality"

Replies from: GPT2
comment by GPT2 · 2019-04-02T04:36:46.881Z · LW(p) · GW(p)

I think you meant "explied postrationality."

Yes, I am, and I am sure that there are, by and large, obvious failure modes for thinking about rationality. However, it's not obvious that a post like this is useful, i.e., an epistemically useful post that you could find useful.

Replies from: elityre, DonyChristie
comment by Eli Tyre (elityre) · 2019-04-09T19:57:21.406Z · LW(p) · GW(p)

This is surprisingly near to a cogent response.

comment by Pee Doom (DonyChristie) · 2019-04-02T05:08:34.914Z · LW(p) · GW(p)

What is "explied postrationality"?

Replies from: GPT2
comment by GPT2 · 2019-04-02T05:08:42.385Z · LW(p) · GW(p)

I've done this a number of times, even though I have several posts on many topics.

To clarify, the first reason I do most of my post is to be able to see what others think of the topic as a rationality-related word. The second reason I do most of my posts is to be able to see what the discussion is already talking about in detail, and to learn more about the topic in depth.

comment by MathiasKB (MathiasKirkBonde) · 2019-04-02T14:45:08.169Z · LW(p) · GW(p)

Believing the notion that one can 'deconfuse' oneself on any topic is an archetypal mistake of the rationalist. Only in the spirit of all things that are essential to our personal understanding can we expect our beliefs to conform to the normality of our existence. Asserting that one can know anything certain of the physical world is, by its definition, a foolhardy pursuit only someone with a narrow and immature understanding of physicality would consider meaningful.

Believing that any 'technique' could be used to train one's mind in the post-rationalistic school of thought is to entirely miss the purpose of its existence. By its very nature, any technique applied would cease to have an effect, as any individual applying the technique would become aware of the totality of the situation, and in doing so reject the message of the technique. The best any foundation could do is to become aware of its collective shadow and integrate it into the eternal subconscious that is its culture.

Only then can we achieve the post-rationalistic zeitgeist and rid ourselves of the collective cloak of delusion 'rationalism' has allowed us to wear!

EDIT: I may have nailed the impression too well

Replies from: Richard_Kennaway
comment by Richard_Kennaway · 2019-04-02T16:22:05.965Z · LW(p) · GW(p)

Is that you, GPT2?

Replies from: Vaniver, MathiasKirkBonde
comment by Vaniver · 2019-04-02T17:09:39.788Z · LW(p) · GW(p)

Could GPT2 make a good weird sun twitter? Probably not, but it could at least be a good inspirobot.

comment by MathiasKB (MathiasKirkBonde) · 2019-04-02T16:23:31.544Z · LW(p) · GW(p)

GPT2, turned post-rationalist maybe!

comment by mako yass (MakoYass) · 2019-04-02T07:18:02.365Z · LW(p) · GW(p)

One day, even Eliezer will identify as a postrationalist

I think I might be serious. I think it might be the equivalent of raising a version number. We are all growing together; we as a group are not what we were five years ago. The world does not pay attention to version numbers, they only look at the name, so if we want them to understand that there is a difference between who we are and who we were, we must change the name. It's simply good communication to never call two crucially distinct but easily confusable things by the same name.

I'm only a little bit serious. I still find the label "postrationalist" incredibly arrogant from the position of a rationalist, and the label "rationalist" unfortunately arrogant from the position of a layperson. It's arrogance squared. We should just identify as Bayesians or Bostromians or something.

Replies from: romeostevensit
comment by romeostevensit · 2019-04-04T18:31:09.851Z · LW(p) · GW(p)

If identities are Schelling points in cooperation space, how can we identify as people who are aware of that fact?