[Link] "Where are all the successful rationalists?"

post by ioannes (ioannes_shade) · 2020-10-17T19:58:33.560Z · LW · GW · 10 comments

https://applieddivinitystudies.com/2020/09/05/rationality-winning 

From near the end:

The primary impacts of reading rationalist blogs are that 1) I have been frequently distracted at work, and 2) my conversations have gotten much worse. Talking to non-rationalists, I am perpetually holding myself back from saying "oh yes, that’s just the thing where no one has coherent meta-principles" or "that’s the thing where facts are purpose-dependent". Talking to rationalists is not much better, since it feels less like a free exchange of ideas, and more like an exchange of "have you read post?"

There are some specific areas where rationality might help, like using Yudkowsky’s Inadequate Equilibria to know when it’s plausible to think I have an original insight that is not already "priced into the market", but even here, I’m not convinced these beat out specific knowledge. If you want to start a defensible monopoly, reading about business strategy or startup-specific strategy will probably be more useful than trying to reason about "efficiency" in a totally abstract sense.

And yet, I will continue reading these blogs, and if Slate Star Codex ever releases a new post, I will likely drop whatever I am doing to read it. This has nothing to do with self-improvement or "systematized winning."

It’s solely because weird blogs on the internet make me feel less alone.

10 comments


comment by mingyuan · 2020-10-18T02:53:37.123Z · LW(p) · GW(p)

There's no way to comment on the original, so I'll say here that I'm so insanely incredibly sick of the 'rationalists don't actually win' meme.

The classic rebuttal these days is that we were better prepared for COVID than most and have caught it at lower rates than base rates would predict; that seems like a win. We also created Epidemic Forecasting (which had rationalist members advising national governments) and the microCOVID Project. You also might count signing up for cryonics as a win, or becoming a prolific writer. Multiple rationalists have been internationally ranked Magic: The Gathering players (at least Zvi Mowshowitz and Aaron Gertler; there might be others).

And even if the navel-gazey type of success 'doesn't count,' I still think it's meaningful that we successfully put on major events like EAG and get people like Elon Musk to come. There's all the work we've done to put AI safety on the map, which might turn out to be really important. There's the revitalization of LessWrong thanks to the dedicated work of a few individuals. I'd wager we have more books published per capita than most communities. 

For something more mainstream, plenty of rationalists have been very financially successful. Sam Bankman-Fried, who runs a crypto trading company, is perhaps the biggest example, but he's not the only one. Multiple rationalists made millions off of Bitcoin. Conor White-Sullivan's Roam got a crazy high seed valuation of $200M last month. ZeroCater was also founded by a rationalist-adjacent guy. So yes, rationalists do run successful startups and hedge funds.

What do people want us to do, become president of the US? Start our own country? Politics is not really our arena, but Open Phil's access to $10 billion translates into not-insignificant influence on many parts of reality. I just don't get what would count as obvious success, short of being solely and visibly responsible for a positive singularity. Agh!!!

Replies from: ChristianKl, ioannes_shade
comment by ChristianKl · 2020-10-27T01:30:00.229Z · LW(p) · GW(p)

Multiple rationalists have been internationally ranked Magic: The Gathering players (at least Zvi Mowshowitz and Aaron Gertler; there might be others).

Zvi entered the Magic Hall of Fame in 2007, which suggests he was a good Magic player before the rationalist community existed. In any case, Magic isn't a highly rewarding activity, and "becoming a professional Magic player" doesn't seem like a rationalist career choice. It's likely that rationalist thinking got him out of being a Magic pro.

comment by ioannes (ioannes_shade) · 2020-10-18T17:53:27.796Z · LW(p) · GW(p)

Yes! I'm also reminded of Romeo's comment [LW(p) · GW(p)] about rationality attracting "the walking wounded" on a similar post from a couple years back.

I think rationality is doing pretty well, all things considered, though I definitely resonate with Applied Divinity Studies' viewpoint. Tsuyoku Naritai!

comment by Viliam · 2020-10-18T16:29:57.697Z · LW(p) · GW(p)

Two questions:

  • how should we measure the "success of rationalists"?
  • how big a success should we expect?

These are related, because the bigger the success, the easier it would be to measure it. Like, if rationalists openly took over the world, created a Global Rationalist Government, fixed poverty worldwide, made cryonics mandatory, etc., no one would be asking whether the rationalists are truly winning.

On the other hand, if I had a magical button which guaranteed that every rationalist would have twice as much wealth in five years (compared to the counterfactual world where the magic button does not work), its effects would be mostly invisible. Maybe the affected people would notice, but even they could easily attribute the effects to something else, such as their own hard work or luck. From the outside, millionaires would remain millionaires, employees would mostly remain employees, and the poor would mostly remain poor.

This gives us two versions of the complaint:

  • if we expect big success, why don't we obviously observe it?
  • if we expect small success, what makes us believe it exists, if we are not measuring it?

I believe that many people have had the "small" success, but yeah, I don't have solid data to back this statement. I wish I did.

Also, we have had some "big" successes, specifically AI safety becoming a mainstream topic and the existence of effective altruism. These are easy to downplay, but I think that 10 years ago, few people would have predicted that AI safety would become a mainstream topic.

I also believe that we could do much better, and I hope that one day we will. I think an important task will be learning to cooperate better. (Which includes learning how to avoid being exploited by malicious actors, who unfortunately sometimes target our community. But the solution is not keeping everyone at internet distance. The solution is to develop some mechanism to "trust but verify".)

On the other hand, the stories about the Beisutsukai, Anasûrimbor Kellhus, or recursively self-improving AI primed us to expect a "rationalist fast takeoff", of which there is no evidence so far. The rationalist takeoff, if there is such a thing, is definitely slow. I wonder if there is a way to make it faster; I think our behavior is far from optimal, but I wonder whether it can realistically get better.

comment by norswap · 2020-10-22T17:51:42.245Z · LW(p) · GW(p)

The question pops up regularly. Jacob (Jacobian on here) wrote an answer: https://putanumonit.com/2019/12/08/rationalist-self-improvement/

One issue I see is the narrow definition of winning used here. I think people reflective enough to embrace rationality are also more likely to reconsider the winning criteria, so that winning isn't just "become filthy rich and/or famous". Consider that maybe the prize is not worth the price. I'd be more interested in people who have become wealthy/established/successful in their fields (without becoming rock stars, I mean, just plain old successful, enough to be free of worries and pursue their own direction).

comment by waveman · 2020-10-19T06:05:47.703Z · LW(p) · GW(p)

It’s solely because weird blogs on the internet make me feel less alone.


Numerous people have said this to me about the whole rationality movement: that feeling of "I am not the only one".

comment by ChristianKl · 2020-10-27T11:29:08.791Z · LW(p) · GW(p)

Talking to rationalists is not much better, since it feels less like a free exchange of ideas, and more like an exchange of "have you read post?"

What kind of rationalists are you talking to?

So where are all the winners?

Dominic Cummings basically outed himself. Foreign Policy wrote about him having had "almost total power over Britain" for some period of time; that was back in June. An article from two weeks ago still describes him as one of the most powerful people in UK politics.

comment by athom · 2020-10-18T03:32:20.795Z · LW(p) · GW(p)

Is there any data on how people feel rationality has changed their lives? Somewhat separately from what we're doing as a community / what some of the most successful rationalists are up to, it seems worth trying to work out how engaging more with rationality changes things for people. I'm pretty sure my engagement with rationality has helped me become stronger; if it turned out not to help most people, that would definitely change how I present things.

Replies from: thomas-kwa, mary-chernyshenko
comment by Thomas Kwa (thomas-kwa) · 2020-10-18T20:17:26.539Z · LW(p) · GW(p)

I'm planning to write a sequence on all the positive and negative effects the practice of rationality has had on my life, and I already have one post [LW · GW] on pitfalls. Future posts will probably be about things like

  • forming realistic expectations
  • reading LW in a way that's less reality-masking
  • a list of all the low-hanging fruit I've found
  • a list of interventions I've tried
  • weird effects from learning some rationality techniques but not others
  • the benefits of diversifying one's sources of knowledge
  • re-learning how to interact with non-rationalist society

These will take a while to gestate and write up, but I expect to have a significant amount of content out before this time next year. Of course, this doesn't replace a community survey, but I think a case study with emphasis on the practical will contribute significantly.

comment by Mary Chernyshenko (mary-chernyshenko) · 2020-10-18T10:08:31.627Z · LW(p) · GW(p)

I kind of view this as a process of sequential self-sorting.

Some people go on to become obviously successful and develop a culture with norms, one I think is almost fully described by the ideals proclaimed in early-to-middle LW. I am not among them and can't say for certain; I am only certain that they keep diversifying.

Other people trickle off into the other branch, which keeps branching. I am among them. I keep running into a need to express thoughts according to virtuous rules, a need which overshadows others most starkly, but I don't feel inclined to make more things (including immaterial ones). Some thoughts have been nesting in me for years, and I don't expect them to leave soon.