Meetup report: Darwin, Some Rationalists and the Joker [link]

post by XFrequentist · 2011-05-09T19:01:14.541Z · 7 comments

This past weekend the Ottawa LessWrong meetup hosted Venkat Rao, who is on a nomadic journey to promote his (good) new book and to find enlightenment through deliberate idleness.

I thought the resulting post might be of some interest.

Here's an excerpt:

For those of you who aren’t familiar with Less Wrong, it is a community of rationalists associated with the Singularity Institute. I admit I was rather wary and curious at the same time.

[...]

I suspect I am in a sort of evil-twin relationship with the Less Wrong philosophy of cognition and decision-making. When I said this at the meetup, one of the attendees remarked, “…and you’re the evil twin.”

[...]

To my pleasant surprise though, the meeting was a great deal of fun. I guess I was sort of expecting an inquisition by a panel of Spocks given the views I espouse, but it was mainly a freewheeling open-ended discussion that went down plenty of interesting rabbit holes. Beer, nachos and bad geek jokes flowed freely.

[...]

I am now less wary of the lesswrongers than I used to be. They bring a healthy sense of doubt, irony, aesthetics and skepticism to their passion for rationality.

He goes on to say that he'll continue to make fun of LessWrongers, but it says something about Venkat's charming wit that he was able to drop the "Spock" and "faith" bombs (both of which irk me) and I still like the guy.

On a related note, having a special guest was a lot of fun. If anyone's passing through, please drop a line! I'd recommend either coming in the summer or bringing your ice skates!

7 comments


comment by cousin_it · 2011-05-09T19:38:31.590Z

I enjoyed Venkat's theory of corporate life and his post about right questions. Unfortunately I don't know how to switch someone over from the dangerous attractor of "narrative rationality" (basically, attachment to self-generated deep wisdom) to actual rationality. Mencius Moldbug is another example of such a person, very smart but so far gone that trying to switch him over would be a lost cause.

Replies from: Vladimir_M
comment by Vladimir_M · 2011-05-10T17:47:40.697Z

Unfortunately I don't know how to switch someone over from the dangerous attractor of "narrative rationality" (basically, attachment to self-generated deep wisdom) to actual rationality.

Why would you want to do this anyway? What we have in terms of "actual rationality," as you call it, is often excellent for detecting bullshit, but it's still largely impotent when it comes to generating interesting hypotheses and novel insight about many (if not most) interesting questions. In contrast, smart people who follow the "narrative" path will inevitably end up producing lots of nonsense in the process, but as long as you avoid getting carried away and take care to apply a bullshit filter to their output consistently, what remains will often contain otherwise unreachable gems of insight. Even when the "narrative" attractor lowers the average accuracy of beliefs of people who fall into it, the value of their output for a careful reader may still be higher than if they were restrained by more stringent intellectual standards.

Replies from: cousin_it
comment by cousin_it · 2011-05-10T18:57:04.166Z

For generating content that is pleasant to consume, many things have a better track record than rationality, e.g. some musicians can write really interesting music while drunk or high. But if you need actionable information, most instances of beautiful-sounding insight generated by "narrative-minded" people fail when you try to apply them, because they weren't selected for correctness.

Replies from: Vladimir_M
comment by Vladimir_M · 2011-05-10T20:09:37.588Z

I mostly had in mind insight that's interesting for reasons of intellectual curiosity rather than practical usefulness (which is still a higher bar than content that's merely pleasant to consume, like music). You are of course right that practically useful insight is much rarer, but I don't think the output of most narrative-minded people would be improved in this regard by making them adhere to stricter intellectual standards.

It would be great if their creative intellectual excursions could somehow be made to home in on correct insight more frequently, but I doubt this could be accomplished in any practical way. At best, you could make them apply a stricter bullshit filter to their existing output, but this isn't much of an improvement over filtering it yourself. At worst, this could make them more cautious and improve the average accuracy of their output only at the cost of lowering its peak quality.

comment by atucker · 2011-05-10T01:45:47.954Z

What's this about Batman vs. Joker rationality?

Replies from: XFrequentist
comment by XFrequentist · 2011-05-10T15:02:07.574Z

The Batman archetype describes (in Venkat's view) much of LessWrong: world-saving do-gooders. He thinks that groups of Batmans (Batmen?) are vulnerable to Jokers, who essentially have the sole goal of producing chaos for their own amusement. Discussing the contrast, who you'd expect to win at various timescales, and which view makes more sense, was surprisingly illuminating; it led us to explore historicism, Winning, whether cryonics is a good bet, and much more. Progress vs. Decay would have produced much of the same conversation, I suspect.

Clearly, to call the Joker approach "rationality" is to use the term very differently than we do.

Replies from: TAG
comment by TAG · 2018-04-12T17:00:32.589Z

The Joker is the ultimate skeptic.