Shit Rationalists Say?

post by Eneasz · 2012-01-25T21:51:52.960Z · LW · GW · Legacy · 120 comments

I assume everyone has run across at least one of the "Shit X's Say" format of videos? Such as Shit Skeptics Say. When done right it totally triggers the in-group warm-fuzzies. (Not to be confused with the nearly identically formatted "Shit X's Say to Y's", which is mainly a way for Y's to complain about X's.)

What sort of things do Rationalists often say that trigger this sort of in-group recognition and could be popped into a short video? A few I can think of...

You should sign up for cryonics. I want to see you in the future.

…intelligence explosion…

What’s your confidence interval?

You know what they say: one man’s Modus Ponens is another man’s Modus Tollens

This may sound a bit crazy right now, but hear me out…

What are your priors?

When the singularity comes that won’t be a problem anymore.

I like to think I’d do that, but I don’t fully trust myself. I am running on corrupted hardware after all.

I want to be with you, and I don’t foresee that changing in the near future.

…Bayesian statistics…

So Omega appears in front of you…

What would you say the probability of that event is, if your beliefs are true?


Others?

120 comments

Comments sorted by top scores.

comment by Alicorn · 2012-01-25T22:39:35.331Z · LW(p) · GW(p)

The "confidence interval" line should have a percentage ("What's your 95% confidence interval?").

"You make a compelling case for infanticide."

"Can you link me to that study?"

"I think I'm going to solve psychology." ("I think I'm going to solve metaethics." "I think I'm going to solve Friendliness.")

"My elephant wants a brownie."

"Is that your true rejection?"

"I wanna be an upload!"

"Does that beat Reedspacer's Lower Bound?"

"Let's not throw all our money at the Society for Rare Diseases in Cute Puppies."

"I have akrasia."

"I'm cryocrastinating."

"Do that and you'll wind up with the universe tiled in paperclips."

"So after we take over the world..."

"I want to optimize for fungibility here."

"This looks like a collective action problem."

"We can dissolve this question." ("That's a dissolved question.")

"My model of you likes this."

"Have you read Goedel, Escher, Bach?"

"What do the statistics say about cases in this reference class?"

Replies from: Alicorn, Raemon, arundelo, CronoDAS, Karmakaiser, curiousepic, curiousepic, TimS, wedrifid
comment by Alicorn · 2012-01-25T22:54:42.668Z · LW(p) · GW(p)

"We need whiteboards."

"I'm trying paleo."

"I might write rationalist fanfiction of that."

"That's just an applause light." ("That's just a semantic stopsign." "That's just the teacher's password.")

"POLITICS IS THE MINDKILLER"

"If keeping my current job has higher expected utility than founding a startup, I wish to believe that keeping my current job has higher expected utility than founding a startup..."

"I think he's just being metacontrarian."

"Arguments are soldiers!"

"Not every change is an improvement, but every improvement is a change."

"There are no ontologically basic mental entities!"

"I'm an aspiring rationalist."

"Fun Theory!"

"The map is not the territory."

"Let's beware evaporative cooling, here."

"It's a sunk cost! Abandon it!"

"ERROR: POSTULATION OF GROUP SELECTION DETECTED"

"If you measure it and reward the measurement going up, you'll get what you measure, not what you want."

"Azathoth!"

"Death is bad."

Replies from: Alicorn
comment by Alicorn · 2012-01-25T23:16:56.262Z · LW(p) · GW(p)

This is too much fuuuuuuuun

"She's just signaling virtue."

"Money is the unit of caring."

"One-box!"

"Beliefs should constrain anticipations."

"Existential risk..."

"I'll cooperate if and only if the other person will cooperate if and only if I cooperate."

"I'm going to update on that."

"Tsuyoku naritai!"

"My utility function includes a term for the fulfillment of your utility function."

"Yeah, it's objective, but it's subjectively objective."

"I am a thousand shards of desire."

"Whoa, there's an inferential gap here that one of us is failing to bridge."

"My coherent extrapolated volition says..."

"Humans aren't agents." ("I'm trying to be more agenty." "Humans don't really have goals.")

"Wait, wait, this is turning into an argument about definitions."

"Look, just rejecting religion and astrology doesn't make someone rational."

"No, no, you shouldn't implement Really Extreme Altruism. Unless the alternative is doing it without, anyway..."

"I'll be the Gatekeeper, you be the AI."

"That's Near, this is Far."

"Don't fall into bottom-line thinking like that."

Replies from: Alicorn, Bound_up
comment by Alicorn · 2012-01-25T23:39:31.453Z · LW(p) · GW(p)

I think I'm done. If I think of any more I'll add them to this comment instead of making a new one.

"How do you operationalize that?"

"'Snow is white' is true if and only if snow is white."

"If I may generalize from one example here..."

"I'm suffering from halo effect."

"Warning: Dark Arts."

"Okay, but in the Least Convenient Possible World..."

"We want to raise the sanity waterline."

"You've fallen prey to the illusion of transparency."

"Bought some warm fuzzies today."

"What does the outside view say?"

"So the idea is that we make all scientific knowledge a sacred and closely guarded secret, so it will be treated with the reverence it deserves!"

"How could you test that belief?"

Replies from: arundelo
comment by arundelo · 2012-01-26T00:04:19.904Z · LW(p) · GW(p)

RATIONALISTS SAY ALL THE THINGS!

Replies from: Viliam_Bur
comment by Viliam_Bur · 2012-01-26T10:11:14.381Z · LW(p) · GW(p)

RATIONALISTS SAY ALL THE THINGS!

Solomonoff prior gives you 50%, that's pretty cool! :D

I hope someone will use Alicorn's (and other) quotes to make a good Eliza-bot. This could be an interesting AI challenge -- write a bot that will get positive karma on LW! If there are more bots, the bot with highest nonzero karma wins.

Replies from: Alejandro1, thomblake, dbaupp, AspiringRationalist
comment by Alejandro1 · 2012-01-26T17:38:52.540Z · LW(p) · GW(p)

As a start, I copied all Alicorn's lines into a Markov text synthesizer. Some of the best results were:

"Whoa, there's an improvement, but it's subjectively objective."

"Okay, but every change is turning into bottom-line thinking like a collective action problem."

"If keeping my current job has higher expected utility than founding a brownie."

"I think he's just the unit of you shouldn't implement Really Extreme Altruism. Unless the teacher's password."

"If I wish to optimize for Rare Diseases in paperclips."

"My elephant wants a term for infanticide."

Replies from: TheOtherDave, fubarobfusco, Randaly, Cthulhoo, Multiheaded
comment by TheOtherDave · 2012-01-26T19:16:54.524Z · LW(p) · GW(p)

I burst out laughing while reading this, so of course my officemates wanted to know what was so funny.

I cannot remember the last time the gulf of inferential distances was so very very wide.

comment by fubarobfusco · 2012-01-26T19:07:40.981Z · LW(p) · GW(p)

Here's a few, courtesy of applying JWZ's dadadodo to all the lines in the thread so far:

What does the best textbook on corrupted hardware. Dark Arts; Escher, Bach?

How could you credibly pre commit to see you as a compartmentalized belief?

I'm trying to be a cult.

Have super powers.

You've fallen prey to be condescending.

My current job has higher expected utility than you imagine; but in the sanity waterline.

Everyone is Far.

No idea is reliable? Have a lot of caring.

Conceptspace is the future.

Mysteriousness is a cult.

I'm going to be with the AI. I know the universe future.

Look, just generalize from the territory.

Everyone is bigger than you in Cute Puppies.

Emacs' M-x dissociated-press yields babble, but with some interesting words in it: "knowledgeneralize", "metacontrammer", "contrationalist", "choosequences", "the universal priory", "statististimate", "fanfused", "condescendista", "frobability", "dissolomonoff", "optimprovement", "estimagine", "cooperaterline", "pattern matchology". The only sensible sentence it's come up with is "I'm running on condescending".
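For anyone curious where coinages like "metacontrammer" come from: dissociated-press-style babble can be approximated with a character-level Markov chain, where each next character is drawn from those that followed the same short context in the input. This is a rough sketch of the idea, not Emacs's actual algorithm, and the seed sentence is just an arbitrary example:

```python
import random

def char_markov(text, order=3, length=60, seed=0):
    """Character-level Markov babble: the next character is drawn from
    the characters that followed the same `order`-length context
    somewhere in the input. Short contexts splice words together,
    which is what produces blended coinages."""
    rng = random.Random(seed)
    chain = {}
    for i in range(len(text) - order):
        # Map each length-`order` context to its observed successors.
        chain.setdefault(text[i:i + order], []).append(text[i + order])
    out = text[:order]
    for _ in range(length - order):
        successors = chain.get(out[-order:])
        if not successors:
            break
        out += rng.choice(successors)
    return out

print(char_markov("one man's Modus Ponens is another man's Modus Tollens", order=3))
```

Word-level tools like dadadodo draw whole words instead of characters, which is why their output stays in the dictionary while dissociated-press invents new words.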

Replies from: Multiheaded, thomblake, Multiheaded, jenya-lestina
comment by Multiheaded · 2012-01-26T19:15:37.629Z · LW(p) · GW(p)

No idea is reliable? Have a lot of caring.

I visualized that being said simultaneously with the middle-finger gesture.

comment by thomblake · 2012-01-26T19:36:36.203Z · LW(p) · GW(p)

the universal priory

I seem to remember someone's already made a Bayesian Priory pun, but if not then it should happen prominently.

EDIT: here

comment by Multiheaded · 2012-01-26T23:28:50.270Z · LW(p) · GW(p)

Conceptspace is the future.

Wrong. Electronic old men are.

Replies from: fubarobfusco
comment by fubarobfusco · 2012-01-26T23:55:41.120Z · LW(p) · GW(p)

To give an idea of what these look like raw, here's a paragraph of dadadodo:

What does the universe with ice cream trees. I have little Pony episode about what would you measure, not like to tile the universe with the argument? That's just signaling virtue: Death is bad. That's just a startup. I have little XML tags on corrupted hardware. Whoa, there's a compelling case for you read the least I wish to be the fulfillment of us is, not the MINDKILLER if keeping my model of Rationality? Tsuyoku naritai! So after all over the bad; result, you're running on that.

And here's a similar-sized chunk of M-x dissociated-press:

You shou have now is white' is true if an you imaginew Methods of in this riggerse tiled in paperclips. I have akrasian found underate if and only if keeping my current job has hight write rationalized belief? I cause can have you regenerate that say 'moral' It will bach? What die. You shods of Rationalith a good cause coherent extrapolass? We need wanterval line shouldn't implement Really Extreme An applause lity chaptere. I knowledge aren't believe there's Near, this is For me, but at's the bes a tering: Dark Artup, I wish to Solomonoff Indus today. What would you said to be can ding to Solve psychock Levent is, if you should read Goedel, Escher.

Of these, I rather like:

I have little XML tags on corrupted hardware.
shods of Rationalith
extrapolass

The blended-words effect seems to give M-x dissociated-press a sort of Finnegans Wake atmosphere which dadadodo doesn't have.

comment by Jenya Lestina (jenya-lestina) · 2019-02-25T23:54:11.292Z · LW(p) · GW(p)

Mysteriousness is a cult, and I am running on condescending.

comment by Randaly · 2012-02-13T17:22:14.416Z · LW(p) · GW(p)

From another generator:

"I'm going to solve metaethics." "I'm going, you're going to found the Society for infanticide."

""Snow is white" is failing to solve psychology."

"Wait, wait, "this is white" is a more technical explanation?"

"My utility function includes a semantic stopsign."

"If keeping my current job has little XML tags on it that say the Least Convenient Possible World...""

"Sure, I'd take over the sanity waterline."

"I'll be the symbol with ice cream trees."

"So after we take over the alternative universe that is the Least Convenient Possible World..."

"I want to tile the sanity waterline with the unit of a thing."

comment by Cthulhoo · 2012-01-27T10:36:58.043Z · LW(p) · GW(p)

Not totally on topic, but I tried it on Eliezer's "The 5-Second Level". Highlights include:

I won't socially kill you

Hope to reflect on consequentialist grounds

Say, what a vanilla ice cream, and not-indignation, and from green?

Associate to persuade anyone of how you were making the dreadful personal habit displays itself in a concrete example.

Rather you can't bear the 5-second level?

To develop methods of teaching rationality skills, you need more practice to get lost in verbal mazes; we will tend to have our feet on the other person.

Be sufficiently averse to the fire department and see if that suggests anything.

Replies from: DanielH
comment by DanielH · 2013-11-11T11:14:08.170Z · LW(p) · GW(p)

Be sufficiently averse to the fire department and see if that suggests anything.

I do believe it suggests libertarianism. But I can't be sure, as I can't simply "be sufficiently averse" any more than I can force myself to believe something.

Still, that one seems to be a fairly reasonable sentence. If I were to learn only that one of these had been used in an LW article (by coincidence, not by a direct causal link), I would guess it was either that one or "I won't socially kill you".

Replies from: AlexanderRM
comment by AlexanderRM · 2015-10-02T23:17:46.789Z · LW(p) · GW(p)

I would be amazed if Scott Alexander has not used "I won't socially kill you" at some point. Certainly he's used some phrase along the line of "people who won't socially kill me".

...and in fact, I checked and the original article has basically the meaning I would have expected: "knowing that even if you make a mistake, it won't socially kill you.". That particular phrase was pretty much lifted, just with the object changed.

comment by Multiheaded · 2012-01-26T17:48:43.707Z · LW(p) · GW(p)

"If I wish to optimize for Rare Diseases in paperclips..."

If we had signatures on LW, this would be mine.

comment by thomblake · 2012-01-26T16:31:50.424Z · LW(p) · GW(p)

I hope someone will use Alicorn's (and other) quotes to make a good Eliza-bot.

Surely you mean Eliezer-bot.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2012-06-01T16:30:46.351Z · LW(p) · GW(p)

Should it be made, it will of course be known as Elieza.

But in any case I think you need to keep in mind that a blank map does not correspond to a blank territory.

Replies from: thomblake, wedrifid
comment by thomblake · 2012-06-01T18:19:58.471Z · LW(p) · GW(p)

I initially read the parent in a straightforward way, but then I noticed it is also a meta-joke.

comment by wedrifid · 2012-06-02T23:06:56.270Z · LW(p) · GW(p)

But in any case I think you need to keep in mind that a blank map does not correspond to a blank territory.

Usually. It could.

Replies from: loup-vaillant
comment by loup-vaillant · 2012-10-10T16:08:14.424Z · LW(p) · GW(p)

What is your prior? (For Eliezer being empty.)

comment by NoSignalNoNoise (AspiringRationalist) · 2012-06-19T15:37:56.351Z · LW(p) · GW(p)

the bot with the highest nonzero karma wins

I'm taking bets: how long after the bots start maximizing karma until the forum is tiled with

 /~\
| _ |
|| ||
|| ||
|| ||
|| ||
|   |
 `\/'

comment by Bound_up · 2017-08-30T15:22:35.420Z · LW(p) · GW(p)

"My utility function includes a term for the fulfillment of your utility function."

Awww.... :)

comment by Raemon · 2012-01-25T22:42:04.853Z · LW(p) · GW(p)

Imagining this in my head has sold me on this being a good idea. Or at least a mildly amusing idea that will have relatively minor negative externalities. (I'm reminded of Eliezer Facts)

Replies from: gwern
comment by arundelo · 2012-06-15T17:47:40.004Z · LW(p) · GW(p)

"My model of you likes this."

Gabe Newell (of Valve Software) wrote the following in an email to Yanis Varoufakis (an economist):

Here at my company we were discussing an issue of linking economies in two virtual environments (creating a shared currency), and wrestling with some of the thornier problems of balance of payments, when it occurred to me "this is Germany and Greece", a thought that wouldn't have occurred to me without having followed your blog. Rather than continuing to run an emulator of you in my head, I thought I'd check to see if we couldn't get the real you interested in what we are doing.

Edit: And that reminds me of the Reamde character Richard Forthrast giving Zula Forthrast a job at his video game company because of her geology expertise.

comment by CronoDAS · 2012-01-26T06:50:14.612Z · LW(p) · GW(p)

"My model of you likes this."

I like this one. Mind if I actually use it?

Replies from: Alicorn
comment by Alicorn · 2012-01-26T07:15:56.397Z · LW(p) · GW(p)

Go for it. I say it to recommend things to people. (Mostly one person.)

comment by Karmakaiser · 2012-02-10T21:05:15.800Z · LW(p) · GW(p)

"You make a compelling case for infanticide."

I'm still laughing when I think about this one.

Replies from: Karmakaiser
comment by Karmakaiser · 2012-05-04T02:12:55.806Z · LW(p) · GW(p)

Update: Still laughing and using it in conversations.

comment by curiousepic · 2012-01-26T17:53:20.937Z · LW(p) · GW(p)

Sad as it is, this has potential to be effective outreach to Reddit, et al. Unless you'd like to do it yourself, or someone gives a good objection within a few days, I'll be posting it in one or more subreddits, perhaps including the GEB readthrough I'm participating in.

Replies from: Alicorn
comment by Alicorn · 2012-01-26T18:52:42.423Z · LW(p) · GW(p)

I don't use Reddit. If there's interest in turning this into a video, I'm willing to film myself speaking some of my lines, but fear composing an entire video (ideally with several speakers) would take video editing skills and resources I don't have.

Replies from: Raemon
comment by Raemon · 2012-01-26T20:16:38.697Z · LW(p) · GW(p)

I actually want to film this, except I still think it has at least a 25% chance of turning out to be a horrible idea.

comment by curiousepic · 2012-01-26T17:57:27.170Z · LW(p) · GW(p)

I came into this thread with a negative set point, because I see the "Shit X Says" meme as thoroughly without value: merely collections of stereotypes for no other purpose than to collect them. The OP confirmed this, and because my comment sorting was set to New, I scrolled through some of the comments, almost all of which continued to confirm it. Then I re-sorted by Top and saw your post, and my mind was immediately tickled. Some of these are genuinely funny, and the list in fact has value as a collection of LW memes and short rationality quotes.

comment by TimS · 2012-01-27T02:25:12.288Z · LW(p) · GW(p)

"My elephant wants a brownie."

Could someone explain this reference?

Replies from: steven0461
comment by steven0461 · 2012-01-27T03:17:31.118Z · LW(p) · GW(p)

It's a metaphor used in Jonathan Haidt's book The Happiness Hypothesis: the rider is the conscious or deliberative mind and the elephant is everything underneath.

Replies from: None
comment by [deleted] · 2012-01-27T03:25:43.712Z · LW(p) · GW(p)

More to the point, the analogy is used in one of Luke's posts.

comment by wedrifid · 2012-06-02T05:00:51.758Z · LW(p) · GW(p)

"You make a compelling case for infanticide."

Is there an original source for this one?

Replies from: Zack_M_Davis
comment by Zack_M_Davis · 2012-06-02T05:34:18.646Z · LW(p) · GW(p)

Context.

Replies from: wedrifid
comment by wedrifid · 2012-06-02T06:09:23.437Z · LW(p) · GW(p)

Thanks Zack. I had a feeling I'd seen it before but couldn't recall the details.

comment by Raemon · 2012-01-25T22:57:30.115Z · LW(p) · GW(p)

"You should read the Sequences."
"It's not like Gandhi has little XML tags on him that say 'moral' "
"It's not like natural selection put little XML tags there that say 'purpose'."
"Sure, I'd take a pill that made me bisexual."
"There's this really great fanfiction you should read."
"Oh man I wanna tile the universe with ice cream trees."
"I don't want to tile the universe with bananas and palm trees."

[Sequence that will be incredibly funny in context but is a terrible idea please for goodness' sake (literally) nobody film it]

"No, we're not a cult."
"We're not a cult or anything."
"It's not like organizations have little XML tags that say 'cult' and 'not cult'"
"Any organization with a good cause can have cult attractors"
"Look I'd explain it but there's a lot of inferential distance and I don't want to be condescending."
"We are not a cult. We are not a cult."

Replies from: Raemon
comment by Raemon · 2012-01-25T23:52:59.987Z · LW(p) · GW(p)

"You have no idea how BIG mindspace is."
"You have no idea how BIG optimization process space is."
"You have no idea how BIG thingspace is."

comment by lessdazed · 2012-01-26T01:12:15.714Z · LW(p) · GW(p)

"Did you just generalize from fictional evidence?"

"You're a one-boxer, right?" (Said with no context.)

"You'd choose specks, right?" (Said with no context.)

"Mysteriousness is not a property of a thing."

"You're running on corrupted hardware."

"Replace the symbol with the substance."

"Could you regenerate that knowledge?"

"Consider a group you feel prejudiced against, frequentists for example."

"But what's the best textbook on that subject?"

"Is that a compartmentalized belief?"

"I notice I am confused."

"Of course I have super-powers. Everyone does."

"Beliefs are properly probabilistic."

"Is that your confidence level inside or outside the argument?"

"Did you credibly pre-commit to that rule?"

"That's just what it feels like from the inside."

"Conceptspace is bigger than you imagine."

"No you don't believe you believe that."

"No, money is the unit of caring."

"If that doesn't work out for you, you can still make six figures as a programmer."

"Purpose is not an inherent property."

"You think introspection is reliable?"

"Why didn't you use log-odds?"

Bullshit Rationalists Say:

"My priors are different than yours, and under them my posterior belief is justified. There is no belief that can be said to be irrational regardless of priors, and my belief is rational under mine."

"I pattern matched what you said rather than either apply the principle of charity or estimate the chances of your not having an opinion marking you as ignorant, unreasoning, and/or innately evil."

"Rational..." (used in the title of a post on any topic.)

Shit and Bullshit Rationalists Don't Say:

"You're entitled to your opinion."

"You can't be too skeptical"

"Absence of evidence is not evidence of absence."

"Did you read what Kurzweil wrote about the Singularity?"

"100%."

"But was it statistically significant at the p<.05 level?"

"Yeah, I read all the papers cited in lukeprog's latest article."

Replies from: Karmakaiser, Alicorn, lessdazed, komponisto, None, XFrequentist, Psychosmurf
comment by Karmakaiser · 2012-01-27T15:22:11.730Z · LW(p) · GW(p)

"Yeah, I read all the papers cited in lukeprog's latest article."

A bunch of links almost no one clicks. It's like the Anti-TVTropes.

comment by Alicorn · 2012-01-26T01:34:07.805Z · LW(p) · GW(p)

"I notice I am confused."

I cannot believe I missed this one.

comment by lessdazed · 2012-01-26T23:49:14.764Z · LW(p) · GW(p)

"We played reference class tennis."

"Those are just more available to you, not actually more likely."

"Are you more an aspiring rationalist, 'aspiring rationalist,' 'aspiring' rationalist, or aspiring 'rationalist'?"

"The invisible is implied here."

"Is that a disjunctive or conjunctive event?"

"It seemed hard until I hacked away at the edges."

"You didn't time yourself thinking about it before proposing solutions?"

"I have something to protect."

"Someone should type a transcript of that."

"I don't know if that's still an open problem, I've been following the HPMOR thread instead of that one." (Said to a Philosophy professor about a philosophical problem.)

"Is there a more technical explanation?"

"Argument screens off authority."

"Go ahead and try to 'other optimize' me."

"That's one of my ugh fields."

"That's not a property, it's a dangling variable."

"ADBOC."

Shit and Bullshit Rationalists Don't Say:

"Gwern hasn't summarized any research on that."

"Did Yvain even edit that before posting?"

"What are his/her credentials?"

"That's absurd!"

"Let's hope that's true."

"I've read more papers by Scott Aaronson than just the one." "Which one?" (Both of these.)

"All I want to know is the net positive or negative votes my comments and posts have received."

"I don't have an opinion as to which explanation of Bayes' theorem I'd recommend."

comment by komponisto · 2012-01-26T03:18:08.508Z · LW(p) · GW(p)

Rationalists Don't Say:

"Absence of evidence is evidence of absence."

Wrong list.

Replies from: lessdazed
comment by lessdazed · 2012-01-26T03:25:02.951Z · LW(p) · GW(p)

Brain fart.

This is relevant.

comment by [deleted] · 2012-10-10T15:54:09.489Z · LW(p) · GW(p)

"Absence of evidence is not evidence of absence."

To be fair, this is true if you interpret "absence of evidence" as meaning "absence of evidence in either direction".

comment by XFrequentist · 2012-01-26T19:54:18.957Z · LW(p) · GW(p)

The last section is amazing!

comment by Psychosmurf · 2012-01-26T16:52:14.714Z · LW(p) · GW(p)

"100%."

Oh man, had me laughing for a good while with this one. Nice job! ^_^

comment by daenerys · 2012-01-25T22:25:52.439Z · LW(p) · GW(p)

If a "Shit Rationalists Say" thread would result in net positive utility, I want to believe that a "Shit Rationalists Say" thread would result in net positive utility.

If a "Shit Rationalists Say" thread would not result in net positive utility, I want to believe that a "Shit Rationalists Say" thread would not result in net positive utility.

Let me not become attached to beliefs I may not want.

Replies from: Nornagest, Raemon
comment by Nornagest · 2012-01-26T00:46:38.482Z · LW(p) · GW(p)

Well played.

comment by Raemon · 2012-01-25T22:29:49.466Z · LW(p) · GW(p)

This sums up my thinking.

comment by RolfAndreassen · 2012-01-25T22:25:38.014Z · LW(p) · GW(p)

"Suppose we were all playing Prisoner's Dilemma with clones of ourselves..."

I heard this said at the Ohio meetup on Sunday; Yvain commented that, of all the meetups he'd been to, ours took the longest to reach that point.

comment by Jack · 2012-01-26T01:37:48.604Z · LW(p) · GW(p)

"Oops."

Replies from: Solvent, Armok_GoB
comment by Solvent · 2012-01-26T03:04:21.892Z · LW(p) · GW(p)

That's optimistic.

comment by Armok_GoB · 2012-02-13T13:15:51.459Z · LW(p) · GW(p)

moon explodes "Oops"

comment by arundelo · 2012-01-26T00:03:09.975Z · LW(p) · GW(p)

Here's one I say a lot: "Everyone is too young to die."

comment by RomeoStevens · 2012-01-28T10:05:06.541Z · LW(p) · GW(p)

I've thought a "shit your brain says" might be a good way of compactly presenting some cognitive biases.

Replies from: MugaSofer
comment by MugaSofer · 2013-01-09T13:42:46.524Z · LW(p) · GW(p)

Did anything come of this excellent idea?

Replies from: RomeoStevens
comment by RomeoStevens · 2013-01-09T22:23:47.947Z · LW(p) · GW(p)

No, didn't realize this got so upvoted. The fad is a bit past. I still want cartoons on the level of RSA Animate for the sequences.

Replies from: BrienneYudkowsky, MugaSofer
comment by LoganStrohl (BrienneYudkowsky) · 2015-01-03T16:34:53.546Z · LW(p) · GW(p)

I think this should still happen.

Replies from: RomeoStevens, Decius
comment by RomeoStevens · 2015-01-03T23:41:20.385Z · LW(p) · GW(p)

RSA Animate sequences or "shit your brain says"? Shit your brain says could probably be pulled off fairly easily in a couple of days. I should talk to Alton about it. The animations are a longer-term project that is still on my list of projects to fund/work on if MealSquares gets bigger.

comment by Decius · 2015-01-03T18:57:58.601Z · LW(p) · GW(p)

Concur.

comment by MugaSofer · 2013-01-10T09:48:33.126Z · LW(p) · GW(p)

No

Shame. It's a good idea.

comment by fubarobfusco · 2012-01-26T01:00:17.039Z · LW(p) · GW(p)

"Foo is not about bar."
"What odds are you offering on that?"
"Taboo 'monkey'."
"If you're not getting a bad result, you're spending too many resources on avoiding it."
"Rationalists should win."
"Sorry, I'm running on corrupted hardware."

comment by Raemon · 2012-01-26T15:01:30.188Z · LW(p) · GW(p)

Shit Rationalists Say:

"The thread should really be called 'Shit LWers Say.'"

comment by Raemon · 2012-01-26T04:12:36.043Z · LW(p) · GW(p)

"So you know the My Little Pony episode about bayesian updating?"

Replies from: fubarobfusco
comment by fubarobfusco · 2012-01-26T05:41:28.006Z · LW(p) · GW(p)

"'Feeling Pinkie Keen' comes so close!"

Replies from: Raemon
comment by Raemon · 2012-01-26T06:15:04.807Z · LW(p) · GW(p)

I got into an argument recently over whether that episode was good or terrible. (I believe it is a good episode, specifically because of the broader context of the show. Fellow rationalist in question watched that episode FIRST, with no context, which is just about the worst place to do so)

Regardless, I have a heuristic about how much I am allowed to argue about My Little Pony before I should feel bad about myself, and... I felt a little bad that day.

(That day was today).

Replies from: Eneasz, MBlume
comment by Eneasz · 2012-01-26T18:09:19.250Z · LW(p) · GW(p)

I did really hate the whole "Sometimes you just have to believe!" message. It felt almost like someone had thought MLP was becoming TOO rational and wanted to throw a bone to the Believer parents. :/

Replies from: FiftyTwo
comment by FiftyTwo · 2013-10-22T21:12:39.508Z · LW(p) · GW(p)

Alternative interpretation is believing something you have strong evidence of that contradicts your pre-existing theories.

comment by MBlume · 2012-01-26T07:44:11.676Z · LW(p) · GW(p)

My brother saw the first of the cutie mark episodes first, he was unimpressed.

comment by NancyLebovitz · 2012-01-26T09:17:22.441Z · LW(p) · GW(p)

Cached thoughts are the mindkiller.

Replies from: TheOtherDave
comment by TheOtherDave · 2012-01-26T16:09:29.631Z · LW(p) · GW(p)

I always think that.

comment by mindspillage · 2012-01-26T02:12:20.435Z · LW(p) · GW(p)

One huge category of utterances remains unrepresented:

"Ooh, is there a new Methods of Rationality chapter up yet?"

"I can't believe there's no new chapter yet."

"Have you read Methods of Rationality? You have to read it, OMG."

Replies from: Locke
comment by Joshua Hobbes (Locke) · 2012-01-26T03:01:47.802Z · LW(p) · GW(p)

Now that Eliezer is writing at a decent pace again I'm just desperate for progress updates.

Replies from: Xachariah, shminux
comment by Xachariah · 2012-01-26T09:53:30.974Z · LW(p) · GW(p)

Your response just prompted me to check. I wonder if this is how the pigeons in Skinner's Box felt.

Replies from: Locke
comment by Joshua Hobbes (Locke) · 2012-01-26T18:28:58.838Z · LW(p) · GW(p)

The worst part is that unlike with actual updates we might get them at any time. So I check multiple times per day just to see if he's going to tell us how many words he wrote last night.

comment by shminux · 2012-01-26T07:13:26.127Z · LW(p) · GW(p)

No one said that in a long time.

comment by Anubhav · 2012-01-27T13:22:34.990Z · LW(p) · GW(p)

Took every quote posted in this thread so far and Markov-ised it. (Depth of 1, i.e., each word only depends on the word before it. A depth of 2 basically regurgitates the input in this case.)

Here are some of the results:

"How do you regenerate that beat Reedspacer's Lower Bound?"

"It's a more karma. I interpreted your map, not a group you feel prejudiced against, frequentists for infanticide."

"Look, just generalize from halo effect."

"Okay, but in this is rational under them to make six figures as a cult."

"Beliefs are just an inferential distance and my belief is the territory."

"That's just a survey should do you measure, not throw all the scientific knowledge a dissolved question."

"Any organization with the teacher's password."

"That's just rejecting religion and closely guarded secret, so it but there's no idea how BIG thingspace is."

"Foo is the karma system more likely."

"You make six figures as a thousand shards of inferential distance and I cooperate."

"Conceptspace is white."

"Go ahead and only if and it's objective, but I think he's just what the Nazis did!"

"There's a Philosophy professor about definitions."

"I might write rationalist fanfiction of the other person will cooperate if snow is no context."

"Mysteriousness is true rejection?"

"I'd say politically incorrect about it before proposing solutions?"

"Arguments are you feel prejudiced against, frequentists for infanticide."

"I have to suppress dissent."

"Of course I wanna be the universe with rationality?"

"Money is an argument about my belief that one of a rationalist."

"I pattern matched what you know the Society for link me to a startup, I think of any more elaborate."

"'Snow is being used to say 'purpose'."

Replies from: MixedNuts, army1987
comment by MixedNuts · 2012-01-27T13:40:48.179Z · LW(p) · GW(p)

Frequentists for infanticide! (I hope nobody takes this seriously. I mean, ew. Who'd support such horrible statistics?)

Replies from: Anubhav
comment by Anubhav · 2012-01-27T13:47:43.378Z · LW(p) · GW(p)

Most of those quotes aren't even in the post anymore. :(

(Cut it down drastically because the wall of text was too high.)

comment by A1987dM (army1987) · 2012-01-28T14:32:42.387Z · LW(p) · GW(p)

Awesome.

comment by MBlume · 2012-01-25T23:03:59.373Z · LW(p) · GW(p)

"I know, it's pasta, and it's terrible for me, but at least I poured butter all over it."

comment by Normal_Anomaly · 2012-01-27T03:00:49.821Z · LW(p) · GW(p)

"How do you know that?"

"Have I interpreted your position correctly?"

"That's uncomputable."

"That AI would destroy the world."

"That's a fact about your map, not about the territory."

"That's a duplicate quote."

"What does this have to with rationality?"

"That word doesn't carve reality at its joints."

"I want to do an AI Box experiment."

"Apply the reversal test! You're suggesting we kill all the old people."

"I wish I could self-modify to . . ."

"I'd say something politically incorrect about race, but it would start a flame war."

"I'd say something politically incorrect about gender, but it would start a flame war."

"I'd say something politically incorrect about my ability to say politically incorrect things, but it would start a flame war."

"Karma is being used to suppress dissent."

"Your position is incoherent unless you're a vegetarian."

"Rationality isn't that great."

"We should do a survey on this."

"That survey should have been bigger. We should do another survey."

"Way too much of the scientific literature is wrong."

"We need to make the karma system more elaborate."

"The negation of that statement sounds just as true."

"There's an Everett branch where . . ."

"I've seen research on this but I can't remember it."

"My personalities disagree on this."

"If this is a simulation, does that really matter?"

"My inner Robin Hanson says . . ."

Shit and/or Bullshit LessWrongers Don't Say:

"Let's agree to disagree."

"You're a Communist."

"Just trust me on this."

"That's just like what the Nazis did!"

"Can you explain that with less math?"

"I'm a rationalist."

"Haven't you seen Terminator?"

"But that's unnatural!"

"I don't really want more karma. I feel like I have enough."

Replies from: None
comment by [deleted] · 2012-02-11T15:30:37.890Z · LW(p) · GW(p)

"You're a Communist."

Some of us read Moldbug and Foseti so I'm not too sure... ;)

Replies from: Multiheaded
comment by Multiheaded · 2012-06-02T06:39:49.446Z · LW(p) · GW(p)

(For the record, following your second link is what originally gave me the intuition that "fascist technocracy" might be a real problem, and that it's worth investigating. Before, I was mostly like: "Oh, nothing to worry about, it's just Mencius and he says creepy shit, but he's nice enough really.")

comment by Anubhav · 2012-01-26T14:51:28.086Z · LW(p) · GW(p)

The thread should really be called "Shit LWers Say."

We're not the only group of people calling ourselves "Rationalists", nor are we the most well-known of these groups (not by a long shot).

comment by [deleted] · 2012-01-26T02:23:04.961Z · LW(p) · GW(p)

I'm guilty of all of these:

"Cached thought!"

"That's a wrong question."

"Have you read the Sequences?"

"According to Solomonoff Induction/the universal prior..."

"Stop prepending "rational" to post titles!"

"I know the forbidden idea, and it's not that bad."

"That's just a status/signalling game."

"There's a signalling hypothesis by Robin Hanson..."

"TDT/UDT says..."

comment by EchoingHorror · 2012-01-26T21:30:18.185Z · LW(p) · GW(p)

"I want to get my microexpressions analyzed so I can know what I'm thinking."

comment by Curiouskid · 2012-01-26T02:11:17.007Z · LW(p) · GW(p)

Consider this a study guide for newbies who want to measure how much of LW they understand.

comment by Prismattic · 2012-01-26T01:37:38.161Z · LW(p) · GW(p)

Post title should be "Shit LWers say". Not all Bayesians sound like regulars on this website.

Replies from: NancyLebovitz, DavidAgain
comment by NancyLebovitz · 2012-01-26T09:16:27.091Z · LW(p) · GW(p)

What's your prior that a Bayesian is also an LWer?

comment by DavidAgain · 2012-01-26T07:14:51.052Z · LW(p) · GW(p)

And not all rationalists are Bayesians, come to that.

comment by Karmakaiser · 2012-01-27T01:56:37.828Z · LW(p) · GW(p)

Rationalist PUA:

"What's you number? [...] No not that. Your Erdos number."

Rationalist insults:

"...Frequentist."

While twirling a paperclip: "I do not love you, nor do I hate you." (well, more of a threat.)

"I bet (73%) that you're a really consistent person. The sort of person whose decisions are final. Like say, in Monty Hall."

"Name three.."

Replies from: pedanterrific
comment by pedanterrific · 2012-01-27T03:00:12.504Z · LW(p) · GW(p)

While twirling a paperclip: "I do not love you, nor do I hate you."

More of a threat, surely?

Replies from: Karmakaiser
comment by Karmakaiser · 2012-01-27T15:15:24.438Z · LW(p) · GW(p)

I could go into a long semantic argument classing threats as a subclass of insults, but you're right. Oops.

comment by Alejandro1 · 2012-01-26T03:37:39.654Z · LW(p) · GW(p)

Some line in the video (maybe the "corrupted hardware" one) should be said by someone with a desk piled high with papers (connoting unfinished, urgent work) on topics like FAI, and a computer screen with open windows on TvTropes, Fanfiction.net, and a LW post on akrasia.

comment by FiftyTwo · 2012-01-26T06:40:45.359Z · LW(p) · GW(p)

"Source?"

Replies from: D_Alex
comment by D_Alex · 2012-01-27T08:27:52.731Z · LW(p) · GW(p)

Nah, that's 4chan.

comment by CharlesR · 2012-01-26T06:34:26.440Z · LW(p) · GW(p)

"What Shock Level are you?"

Replies from: Raemon
comment by Raemon · 2012-01-26T07:00:41.634Z · LW(p) · GW(p)

This is the first one that stumps me.

Replies from: MBlume, CharlesR
comment by MBlume · 2012-01-26T07:45:42.028Z · LW(p) · GW(p)

That's because it's old -- more of a Shit SL4ers say

comment by CharlesR · 2012-01-26T15:48:25.049Z · LW(p) · GW(p)

The question came up at the West LA LW Meetup. Only two people knew what it meant.

comment by Raemon · 2012-01-25T22:33:31.890Z · LW(p) · GW(p)

I think I like this conceptually (I would get warm fuzzies if done right), except I'm trying to detach (in my own mind as well as others') the word "rationalist" from "the Less Wrong memeplex." Which is probably a pipe dream, and such a video would not contribute (much) to the issue one way or another, and "Shit Less Wrong folks say" doesn't have the same ring to it.

comment by maia · 2012-06-13T00:12:42.488Z · LW(p) · GW(p)

"You're going to [insert life plan here]? Why don't you just go work on Wall Street and donate your money to AMF?"

comment by Rubix · 2012-02-09T20:19:12.496Z · LW(p) · GW(p)

"This, modulo that."

"It's not obvious that..."

"...is the obvious low-hanging fruit."

"HPJEV is my fantasy jerkass boyfriend."

"Eat this rock salt."

"Eat this potassium salt."

"Have you played Mage: The Ascention?"

"Can you be a CEV machine and order food for me?"

"Scumbag aliefs!"

"Scumbag subconscious!"

"Scumbag physics!"

"I want to want to do that."

comment by thomblake · 2012-01-25T22:12:31.735Z · LW(p) · GW(p)

Wow, most of that looks really weird, and putting it all together like that seems to associate 'rationalist' with 'weird object-level beliefs'.

Replies from: Normal_Anomaly
comment by Normal_Anomaly · 2012-06-03T15:06:14.065Z · LW(p) · GW(p)

There's a selection bias. We also say plenty of normal things, but they aren't unique to us, so they don't get put here.

comment by Solvent · 2012-01-27T00:16:23.328Z · LW(p) · GW(p)

From RationalWiki:

How to spot a LessWrongian in your sceptical discussion

In the outside world, the ugly manifests itself as LessWrong acolytes, minds freshly blown, metastasising to other sites, bringing the Good News for Modern Rationalists, without clearing their local jargon cache. Sceptical discussion spaces will often have people show up labelling themselves "rationalist" and being as irritating as teenage nerds who've just discovered Ayn Rand. Take one drink for each of:

"As rationalists, we should ..."

"As a rationalist, you should ..."

"Bayesian" (particularly when they seem to have picked a real doozy of a prior probability; no working will ever be shown)

Preaching "rationalism" as if it's a religion substitute.[26][27]

Taking offense when someone inevitably points out that they're preaching a religion substitute.

Attempting to argue their points with them resulting in being told "You should try reading the sequences" rather than them addressing your point.[28]

"Eliezer Yudkowsky suggests ..."

Replies from: pedanterrific
comment by pedanterrific · 2012-01-27T20:39:29.642Z · LW(p) · GW(p)

minds freshly blown

Band name alert.

comment by Karmakaiser · 2012-01-26T22:39:27.345Z · LW(p) · GW(p)

"Rational Approach..."

comment by [deleted] · 2012-03-23T03:39:40.300Z · LW(p) · GW(p)

"Stop dissecting my hypothetical..."

"Do you operate under Crockers rules?"

"That is a question about quality of life, not about death badness!"

comment by MichaelVassar · 2012-02-09T20:01:47.693Z · LW(p) · GW(p)

"It's not obvious that..."

"...is the obvious low-hanging fruit."

"HPJEV is my fantasy jerkass boyfriend."

"This, modulo that."

[EDIT: Not actually Michael Vassar.]