SBF x LoL

post by Nicholas / Heather Kross (NicholasKross) · 2022-11-15T20:24:52.041Z · LW · GW · 6 comments

6 comments

Comments sorted by top scores.

comment by cata · 2022-11-15T21:44:44.199Z · LW(p) · GW(p)

I have really different priors than it seems like a lot of EAs and rationalists do about this stuff, so it's hard to have useful arguments. But here are some related things I believe, based mostly on my experience and common sense rather than actual evidence. ("You" here is referring to the average LW reader, not you specifically.)

  • Most of the abilities that matter for doing useful work (like running a hedge fund) are not fixed at e.g. age 25, and can be greatly improved upon. FTX didn't fail because SBF lacked "working memory." It seems to have failed because he sucked at a bunch of stuff you could easily get better at over time. (Reportedly he was a bad manager and didn't communicate well; he was clearly bad at making decisions under pressure; he clearly behaved overly impulsively; etc.)
  • Trying to operate on 5 hours of sleep with constant stimulants is idiotic. You should have an incredibly high prior that this doesn't work well, and trying it out and it feeling OK for a little while shouldn't convince you otherwise. It blows my mind that any smart person would do this. The potential downside is so much worse than "an extra 3 hours per day" is good.
  • Common problems with how your mind works, like "can't pay attention, can't motivate myself, irrationally anxious," aren't always things where you either need to find a silver-bullet quick fix or else live with them forever. They are typically amenable to gradual, directional improvement.
  • If you are e.g. 25 years old and you have serious problems like that, now is a dumb time to try to launch yourself as hard as possible into an ambitious, self-sacrificing career where you take a lot of personal responsibility. Get your own house in order.
  • If you want to do a bunch of self-sacrificing, speculative burnout stuff anyway, I don't believe for a minute that it's because you are making a principled, altruistic, +EV decision due to short AI timelines, or something. That's totally inhuman. I think it's probably basically because you have a kind of outsized ego and you can't emotionally handle the idea that you might not be the center of the world.

P.S. I realize you were trying to make a more general point, but I have to point out that all this SBF psychoanalysis is based on extremely scanty evidence, and having a conversation framed as if it is likely basically true seems kind of foolish.

Replies from: NicholasKross
comment by Nicholas / Heather Kross (NicholasKross) · 2022-11-16T02:17:13.279Z · LW(p) · GW(p)

I agree with the first two points and partly the 4th point. Also the P.S. (I tried to hedge with words like "likely" but I didn't really proofread this a lot).

The 5th point seems like it could apply to me specifically, but like... I don't really know how I'd solve my ego problem, and it's still not clear how bad that is, or whether it's bad at all, in my situation (again, one of my broader points). I know this is likely to be a defense mechanism but... I'm okay with it? Is there decision theory about whether I should try to become less inhuman?

Replies from: cata
comment by cata · 2022-11-16T07:20:37.543Z · LW(p) · GW(p)

If that resembles you, I don't know if it's a problem for you. Maybe not, if you like it. I was just expressing that when I see someone who appears to be doing that, like the FTX people, I don't take very seriously their suggestion that the way they're going about it is really good and important.

Replies from: NicholasKross
comment by Nicholas / Heather Kross (NicholasKross) · 2022-11-17T02:05:48.604Z · LW(p) · GW(p)

Alright, well thanks for engaging with it and me!

comment by JBlack · 2022-11-16T00:21:26.596Z · LW(p) · GW(p)

Commenting only on the last paragraph: existing decision theory addresses this adequately. If only a big win is acceptable and everything lesser is essentially equivalent, then according to pretty much any decision theory you should maximize your probability of achieving that big win, even when that probability is small.

Kelly betting maximizes long-run expected growth rate, but not the probability of reaching a threshold. The strategy that maximizes the chance of reaching a (large) target is more complex and depends on the context, but it is generally less conservative than Kelly.
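A minimal Monte Carlo sketch of that contrast (the specific numbers are my own illustration, not from the comment: an even-money bet with a 60% win probability, a 100x target, and a 20-round horizon). It compares Kelly-fraction betting against reckless all-in betting on the "only a big win counts" objective:

```python
import random

def p_hit_target(frac, p_win=0.6, target=100.0, horizon=20, trials=100_000, seed=0):
    """Estimate the chance that betting a fixed fraction `frac` of bankroll on an
    even-money bet with win probability `p_win` turns 1.0 into `target` within
    `horizon` rounds (stopping once the target is reached or the bankroll hits 0)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        bankroll = 1.0
        for _ in range(horizon):
            stake = frac * bankroll
            bankroll += stake if rng.random() < p_win else -stake
            if bankroll >= target:
                hits += 1
                break
            if bankroll <= 0:
                break
    return hits / trials

kelly = 2 * 0.6 - 1  # Kelly fraction for an even-money bet: f* = 2p - 1 = 0.2
print("Kelly  (f=0.2):", p_hit_target(kelly))  # 0.0 -- best case is 1.2**20 ~= 38x, so 100x is unreachable
print("All-in (f=1.0):", p_hit_target(1.0))    # ~0.028 -- needs (and only needs) 7 straight wins, 0.6**7
```

Over this short horizon the Kelly bettor literally cannot reach 100x, while the all-in bettor gets there roughly 2.8% of the time, which is the sense in which a threshold objective pushes toward less conservative play than Kelly.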

comment by zjx39062 · 2022-11-16T07:48:06.835Z · LW(p) · GW(p)

The claim about LoL has some merit. At my peak I was top 1% on the ranked ladder, and the best advice I would give to new players is: play 3 games with every champion so you know what they do. The most important skill in LoL is being able to simulate a few seconds forward in time so you're not surprised by anything. As you play longer, you will naturally get better as you encounter more situations.

If you haven't played much, you might not understand that Bronze is an extremely low rank for people that play seriously. When I started playing, I was sorted into mid-Silver after my first 10 matches. I've had numerous friends, including one who rarely plays games, who have quickly gone from no experience and low Bronze to Silver and sometimes Gold. I don't know a single person who's remained in Bronze and plays ranked games with any regularity. SBF's performance is unusually, though believably, bad -- and not entirely due to poor sleep and drug use; he played thousands of games at MIT and wasn't much better then.