post by [deleted] · GW

Comments sorted by top scores.

comment by Jiro · 2022-06-27T14:45:58.093Z · LW(p) · GW(p)

There are plenty of “successful people” I don’t trust.

Being successful is a necessary condition for trusting them, not a sufficient condition.

comment by Viliam · 2022-06-27T21:41:24.102Z · LW(p) · GW(p)

I had trouble understanding your thesis, and I am still not sure that I do. It feels like... uhm, let me express it with a modified version of the Litany of Tarski [? · GW]:

If rich people approve of rationality, I desire to be rational.

If rich people disapprove of rationality, I desire to be irrational.

Let me not become attached to epistemic strategies that may be uncool.

Replies from: Matt Goldwater
comment by UtilityMonster (Matt Goldwater) · 2022-06-28T23:47:37.622Z · LW(p) · GW(p)

I meant to convey that I was evaluating my trust in the rationalist community, not rationality itself. 

And I concluded that the opinions of really successful people are only a minor factor affecting my trust in the rationalist community.

comment by [deleted] · 2022-06-28T08:13:21.121Z · LW(p) · GW(p)

How much gain do you think is actually available to someone who is still limited by the performance of human brain tissue and just uses the best available consistently winning method?

The consistently winning method with the best chance of success isn't very interesting. It just means you go to the highest-rated college you can get accepted to, take the highest-paying job you can get at the highest-tier company that will hire you, take out low-interest loans, buy index funds as investments, stay away from lower-gain assets like real estate, and so on.

Over a lifetime, most people who do this will accumulate a few million dollars in net worth and hold jobs that pay ~10 times the median (~$400k for an ML engineer with ~10 years of experience, or for a specialist doctor).

I would argue that this, for most humans, is the most rational strategy. Better to have 75-90 percent of your futures be good ones than 5 percent, even though your average income is far higher in the second case (the few futures where you became a billionaire pull the average up).
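To make that trade-off concrete, here is a minimal sketch (all probabilities and dollar figures are invented for illustration) comparing a reliable career path against a long-shot gamble: the gamble has a far higher expected payoff, yet only a small fraction of its futures are good ones.

```python
# Toy comparison of two life strategies (all numbers are invented).
# "safe": reliable career path; "gamble": long-shot bet on a startup.

strategies = {
    "safe":   [(0.85, 3_000_000), (0.15, 500_000)],       # (probability, net worth)
    "gamble": [(0.05, 1_000_000_000), (0.95, 100_000)],
}

GOOD_OUTCOME = 1_000_000  # arbitrary threshold for a "good" future

for name, outcomes in strategies.items():
    expected = sum(p * w for p, w in outcomes)
    p_good = sum(p for p, w in outcomes if w >= GOOD_OUTCOME)
    print(f"{name:6s}  expected net worth: ${expected:>13,.0f}   "
          f"P(good future): {p_good:.0%}")

# Output:
# safe    expected net worth: $    2,625,000   P(good future): 85%
# gamble  expected net worth: $   50,095,000   P(good future): 5%
```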

Extraordinary success - billions in net worth - generally requires someone to either be born with millions in seed capital (not a product of rationality), or to gamble it all on a long shot and get lucky (also not a product of rationality), or both.

Billionaire-level success is non-replicable. It requires taking specific actions, during narrow and limited windows of opportunity, that would have failed in most futures - and having had the option to take those actions at all. This is essentially the definition of luck.

Replies from: Matt Goldwater, dkirmani
comment by UtilityMonster (Matt Goldwater) · 2022-06-29T03:42:36.984Z · LW(p) · GW(p)

I'd say it's rational to maximize expected utility [LW · GW]. The small probability of an enormous success could outweigh the larger probability of a failure that won't ruin your life.
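Whether the long shot wins on expected utility depends entirely on the shape of the utility function. A minimal sketch, reusing the invented numbers from the sketch above: under linear utility (utility = dollars) the gamble dominates, while under log utility (a common stand-in for diminishing returns to wealth) the safe path wins.

```python
import math

# Same invented outcome distributions as above: (probability, net worth).
safe   = [(0.85, 3_000_000), (0.15, 500_000)]
gamble = [(0.05, 1_000_000_000), (0.95, 100_000)]

def expected_utility(outcomes, u):
    """Expected utility of a lottery under utility function u."""
    return sum(p * u(w) for p, w in outcomes)

for name, outcomes in [("safe", safe), ("gamble", gamble)]:
    linear = expected_utility(outcomes, lambda w: w)  # risk-neutral
    log    = expected_utility(outcomes, math.log)     # diminishing returns
    print(f"{name:6s}  linear EU: {linear:>13,.0f}   log EU: {log:.2f}")

# Output:
# safe    linear EU:     2,625,000   log EU: 14.65
# gamble  linear EU:    50,095,000   log EU: 11.97
```

On these toy numbers, the disagreement between the two comments largely reduces to how quickly utility flattens as wealth grows.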

comment by dkirmani · 2022-06-28T12:46:32.511Z · LW(p) · GW(p)

How much gain do you think is actually available to someone who is still limited by the performance of human brain tissue and just uses the best available consistently winning method?

Quite a bit.

https://www.gwern.net/on-really-trying

comment by TAG · 2022-06-28T06:22:18.722Z · LW(p) · GW(p)

I said that because effective altruist / rationalist writers seemed smart, I cautiously trusted many of their opinions.

Just generally smart? Why not trust people with qualifications and experience in specific fields?

Replies from: Matt Goldwater
comment by UtilityMonster (Matt Goldwater) · 2022-06-29T03:02:57.540Z · LW(p) · GW(p)

I was paraphrasing. I agree it makes sense to trust people when they're talking about things they seem to know more about.