## Posts

A LessWrong Crypto Autopsy 2018-01-28T09:01:49.799Z

Comment by Scott Alexander (scott-alexander) on 2018 Prediction Contest - Propositions Needed · 2018-03-29T17:19:41.892Z · LW · GW

You might want to try adapting some of the ones from http://slatestarcodex.com/2018/02/06/predictions-for-2018/ and the lists linked at the bottom.

Comment by scott-alexander on [deleted post] 2018-03-20T23:19:52.567Z

Agreed that some people were awful, but I still think this problem applies.

If somebody says "There's an 80% chance of rain today, you idiot, and everyone who thinks otherwise deserves to die", then it's still not clear that a sunny day has proven them wrong. Or rather, they were always wrong to be a jerk, but a single run of the experiment doesn't do much to prove they were wronger than we already believed.
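To make "doesn't do much" concrete, here is a minimal sketch of the update one sunny day buys you. The setup is an assumption, not from the comment: the 80%-rain forecaster is compared against a rival who said 20%, starting from even prior odds.

```python
# How much does one sunny day discredit an "80% rain" forecaster?
# Hypothetical setup: compare against a rival model that said 20% rain,
# starting from even (1:1) prior odds.
p_sunny_given_80 = 1 - 0.80   # 0.2: probability of sun if the 80% model is right
p_sunny_given_20 = 1 - 0.20   # 0.8: probability of sun if the 20% model is right

bayes_factor = p_sunny_given_80 / p_sunny_given_20   # 0.25
posterior = bayes_factor / (bayes_factor + 1)         # drops from 50% to 20%

print(bayes_factor, posterior)
```

One observation moves your credence in the 80% model from 50% to 20%: real evidence, but nowhere near a refutation.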

Comment by scott-alexander on [deleted post] 2018-03-20T20:59:08.951Z

The weatherman who predicts a 20% chance of rain on a sunny day isn't necessarily wrong. Even the weatherman who predicts 80% chance of rain on a sunny day isn't *necessarily* wrong.

If there's a norm of shaming critics who predict very bad outcomes, of the sort "20% chance this leads to disaster", then after shaming them the first four times their prediction fails to come true, they're not going to mention it the fifth time, and then nobody will be ready for the disaster.

I don't know exactly how to square this with the genuine beneficial effects of making people have skin in the game for their predictions, except maybe for everyone to be more formal about it and have institutions that manage this sort of thing in an iterated way using good math. That's why I'm glad you were willing to bet me about this, though I don't know how to solve the general case.
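One candidate for the "good math" an institution could apply in an iterated way is a proper scoring rule such as the Brier score; the example numbers below (a ten-day run with two rainy days) are illustrative assumptions, not anything from the comment.

```python
# Sketch: scoring iterated predictions with the Brier score (lower is better).
# A proper scoring rule rewards honest probabilities over many forecasts,
# instead of shaming anyone whose single prediction failed to come true.
def brier(forecasts, outcomes):
    """Mean squared error between probabilistic forecasts and 0/1 outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical ten-day run where it actually rained on two days:
outcomes = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0]
print(brier([0.2] * 10, outcomes))  # 0.16 -- the calibrated 20% forecaster
print(brier([0.8] * 10, outcomes))  # 0.52 -- the 80% forecaster, penalized over time
```

Over many rounds the miscalibrated forecaster accumulates a clearly worse score, without anyone having to declare them "wrong" on the strength of one sunny day.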

Comment by Scott Alexander (scott-alexander) on God Help Us, Let’s Try To Understand Friston On Free Energy · 2018-03-06T07:26:02.881Z · LW · GW

Re: the "when friends and colleagues first come across this conclusion..." quote:

A world where everybody's true desire is to rest in bed as much as possible, but where they grudgingly take the actions needed to stay alive and maintain homeostasis, seems both very imaginable, and also very different from what we observe.

Comment by Scott Alexander (scott-alexander) on A LessWrong Crypto Autopsy · 2018-01-29T02:47:14.611Z · LW · GW

I think it says something good about our community that whoever implemented this feature assumed people would be more likely to want to write mathematics than to discuss amounts of money.

Comment by Scott Alexander (scott-alexander) on A LessWrong Crypto Autopsy · 2018-01-29T01:07:27.026Z · LW · GW

I can't click your link, but I disagree. MIRI got most of its money from Vitalik, who I think was into crypto first and then found rationality/LW. We don't get any credit for that.

Also, in 2014 MIRI got a Ripple donation worth 500,000 dollars (why can't I make the dollar sign on this site?). If they had kept it as Ripple, it would be worth 50 million now. Instead they sold it for 500,000 dollars (I'm not blaming them; this made sense at the time).

So although MIRI and CFAR lucked out into getting some money from crypto, I don't think it was primarily because of their (or our) great decisions. And if people had made great decisions they could have gotten much more.