The Craft and the Community

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-26T17:52:21.611Z · 10 comments

This sequence ran from March to April of 2009 and dealt with the topic of building rationalist communities that could systematically improve on the art, craft, and science of human rationality. This is a highly forward-looking sequence - not so much an immediately complete recipe as a list of action items and warnings for anyone setting out in the future to build a craft and a community.

10 comments

Comments sorted by top scores.

comment by SirBacon · 2009-04-26T20:03:33.682Z

"...then there's the idea that rationalists should be able to (a) solve group coordination problems, (b) care a lot about other people and (c) win..."

Why should rationalists necessarily care a lot about other people? If we are to avoid circular altruism and the nefarious effects of other-optimizing, the best amount of caring might be less than "a lot."

Additionally, caring about other people in the sense of seeking emotional gratification primarily in tribe-like social rituals may be truly inimical to dedicating one's life to theoretical physics, math, or any other far-thinking discipline.

Caring about other people may entail involvement in politics, and local politics can be just as mind-killing as national politics.

Replies from: ryleah, cabalamat, astray, Cameron_Taylor
comment by ryleah · 2014-10-08T00:09:00.515Z

Sorry to answer a 5-year-old post, but apparently people read these things. You asked "Why should rationalists necessarily care a lot about other people," but all the post said was that they should be able to.

comment by cabalamat · 2009-04-28T03:08:51.077Z

"Why should rationalists necessarily care a lot about other people?"

They shouldn't, particularly. End goals are not a part of rationality; rationality exists to achieve them.

However, many end goals can be more easily achieved by getting help from others. If your end goals are like this, it's rational for you to solve group coordination problems and care about other people.

comment by astray · 2009-04-28T21:49:05.443Z

I don't think that (b) is necessarily an immediate entailment of rationality, but rather a condition that can be met simultaneously with (a) and (c). The post presents a situation where (c) is satisficed only through (a) and (b). (It does not take much finagling to suppose that a lonesome mountain-man existence in a world ruled by barbarians is inferior, in warm fuzzies and utilons, to the expectation of the world where (a), (b), and (c) all hold.)

comment by Cameron_Taylor · 2009-04-27T15:32:59.124Z

Good point, Bacon. I've been wondering where the implicit assumption that rational agents have an altruistic agenda came from. The assumption seems to permeate a rather large number of posts.

When Omega offers to save lives, why do I care? To be perfectly honest, my own utility function suggests that those extra billions are a liability to my interests.

When I realise that my altruistic notions are in conflict with my instinctive drive for status and influence, why do I "need to move in the direction of joining groups more easily, even in the face of annoyances and apparent unresponsiveness"? If anything it seems somewhat more rational to acknowledge the drive for status and self-interest as the key component and satisfy those criteria more effectively.

This isn't to say I don't have an altruistic agenda that I pursue. It is just that I don't see that agenda itself as 'rational' at all. It is somewhere between merely arbitrary and 'slightly irrational'.

With that caveat, this summary and plenty of the posts contained within are damn useful!

Replies from: SirBacon
comment by SirBacon · 2009-04-27T19:47:32.040Z

"With that caveat, this summary and plenty of the posts contained within are damn useful!"

I resoundingly agree.

That said, Eliezer is attempting to leverage the sentiments we now call "altruistic" into efficient other-optimizing. What if all that people are really after is warm fuzzies? Mightn't they then shrink from the prospect of optimally helping others?

Hobbes gives us several possible reasons for altruism, none of which seem to be conducive to effective helping:

"When the transferring of right is not mutual, but one of the parties transferreth in hope to gain thereby friendship or service from another, or from his friends; or in hope to gain the reputation of charity, or magnanimity; or to deliver his mind from the pain of compassion [self-haters give more?]; or in hope of reward in heaven; this is not contract, but gift, free gift, grace: which words signify one and the same thing."

There is also the problem of epistemic limitations around other-optimizing. Charity might remove more utilons from the giver than it bestows upon the receiver, if only because it's difficult to know what other people need and easier to know what one needs oneself.

Replies from: adamisom
comment by adamisom · 2012-02-12T00:48:32.161Z

"Mightn't" we shrink from optimal helping? "Might" charity be usually an imbalance of utilons?

Yes, we might, it might.

These are important considerations--I don't mean to denigrate clear thinking. But to rest content with hypothetical reasons why something wouldn't work (reasons that stem from a common hidden laziness in most humans, though we convince ourselves they are nobler and more reasonable) is to completely miss the most crucial point of this entire Sequence: actually doing something, and testing it.

I think it's safe to say that the natural inclination of most humans isn't initiating large projects with high but uncertain reward. It's to "just get by", a fact which I must thank you, good sir, for illustrating... it was intentional, right?

comment by MrHen · 2009-04-27T14:05:29.886Z

I like summary posts like this, by the way. They make it much easier to find what I am looking for later and help get the wiki started.

comment by ESRogs · 2011-07-15T23:19:01.278Z

"Donating hours worked within a professional specialty and paying-customer priority, whether directly, or by donating the money earned to hire other professional specialists, is far more effective than volunteering unskilled hours."

What does "paying-customer priority" refer to in the above sentence? Is 'paying' being used as a verb or is "paying-customer priority" something that is being donated?

comment by evtujo · 2009-05-02T05:39:39.581Z

There's an interesting Google Tech Talk video on how to create an online community with desired attributes (à la stackoverflow.com): http://www.youtube.com/watch?v=NWHfY_lvKIQ