The Apocalypse Bet

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-08-09T17:23:33.000Z · LW · GW · Legacy · 51 comments

A problem with betting on engineered superplagues, physics disasters, nanotechnological warfare, or intelligence explosions of both Friendly and unFriendly type, is that all these events are likely to disrupt settlement of trades (to put it mildly).  It's not easy to sell a bet that pays off only if the prediction market ceases to exist.

And yet everyone still wants to know the year, month, and day of the Singularity.  Even I want to know, I'm just professionally aware that the knowledge is not available.

This morning, I saw that someone had launched yet another poll on "when the Singularity will occur".  Just a raw poll, mind you, not a prediction market.  I was thinking of how completely and utterly worthless this poll was, and how a prediction market might be slightly less than completely worthless, when it occurred to me how to structure the bet - bet that "settlement of trades will be disrupted / the resources gambled will become worthless, no later than time T".

Suppose you think that gold will become worthless on April 27th, 2020, between four and four-thirty in the morning.  I, on the other hand, think this event will not occur until 2030.  We can sign a contract in which I pay you one ounce of gold per year from 2010 to 2020, and then you pay me two ounces of gold per year from 2020 to 2030.  If gold becomes worthless when you say, you will have profited; if gold becomes worthless when I say, I will have profited.  We can have a prediction market on a generic apocalypse, in which participants who believe in an earlier apocalypse are paid by believers in a later apocalypse, until they pass the date of their prediction, at which time the flow reverses with interest.  I don't see any way to distinguish between apocalypses, but we can ask the participants why they were willing to bet, and probably receive a decent answer.
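A minimal sketch of this payment schedule in Python; the crossover year and the one-ounce/two-ounce amounts follow the gold example above, and everything else (the function name, netting flows by year) is illustrative:

```python
# Sketch of the symmetric apocalypse bet described above.
# The "early" party predicts settlement assets become worthless by
# `crossover`; the "late" party predicts a later date. Amounts and
# years follow the gold example in the post.

def bet_cash_flows(start, crossover, end, early_gets=1.0, late_gets=2.0):
    """Return {year: net flow, in ounces, to the early-apocalypse bettor}."""
    flows = {}
    for year in range(start, crossover):
        flows[year] = +early_gets   # late party pays early party
    for year in range(crossover, end):
        flows[year] = -late_gets    # flow reverses, with interest
    return flows

# If gold really becomes worthless in 2020, the early bettor has collected
# 10 oz of still-valuable gold and repays in worthless gold; if it stays
# valuable through 2030, the late bettor nets 20 - 10 = 10 oz.
flows = bet_cash_flows(2010, 2020, 2030)
print(sum(v for v in flows.values() if v > 0))   # 10.0 oz received early
print(-sum(v for v in flows.values() if v < 0))  # 20.0 oz owed later
```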

I would be quite interested in seeing what such a market had to say.  And if the predicted date was hovering around 2080, I would pick up as much of that free money as I dared.


EDIT:  Robin Hanson pointed out why this wouldn't work.  See comments.

51 comments

Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).

comment by Ned · 2007-08-09T18:00:00.000Z · LW(p) · GW(p)

There is no need for multiple settlement dates. If someone firmly believes the end-of-the-world event is going to occur on Apr. 1st, 2020, he should be ready to issue promissory notes to pay any amount on Apr. 2nd, in return for a very small (no?) charge.

comment by Sebastian_Hagen2 · 2007-08-09T20:07:12.000Z · LW(p) · GW(p)

If someone firmly believes the end-of-the-world event is going to occur on Apr. 1st, 2020, he should be ready to issue promissory notes to pay any amount on Apr. 2nd, in return for a very small (no?) charge.

Probably not for no charge, since writing the notes takes some effort, and any rationalist will assign a nonzero (though possibly very small) probability to the apocalypse not occurring in time.

A possible problem with bets made over such long periods is that people who don't expect gold to become worthless in general by date X, but who do expect it to become worthless for them by date X, may also take the bet, skewing the results. A simple example would be an old hedonist, who strongly expects to die within a few years, doesn't care at all about what happens to his heirs, and would like some more money to spend while he is alive. Assuming that his prediction holds, by taking a bet that the apocalypse will happen by X, he gets some more money to spend while he is alive, and can then default on his debt by dying.

comment by Robin_Hanson2 · 2007-08-09T20:11:55.000Z · LW(p) · GW(p)

I'm afraid all the bets like this will just recover interest rates, which give the exchange rate between resources on one date and resources on another date. The interest rate combines both preferences to have stuff in some dates over others, and beliefs about whether people will actually have to make good on their promises to pay in the future. The problem is that it is very hard to disentangle these two effects.
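One way to see Robin's point: the gold schedule in the post is just a loan, and solving for the rate that zeroes its present value recovers an ordinary interest rate. A hedged sketch, where only the cash flows come from the post and the bisection solver is illustrative:

```python
# The 1-oz-in / 2-oz-out schedule is just a loan; the rate that zeroes
# its present value is an ordinary interest rate, bundling time
# preference with perceived default/apocalypse risk, as Robin says.

def present_value(rate, flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

# Early bettor's view: receive 1 oz/yr for ten years, pay 2 oz/yr after.
flows = [1.0] * 10 + [-2.0] * 10

lo, hi = 0.0, 1.0
for _ in range(60):                  # bisect: PV rises with the rate here
    mid = (lo + hi) / 2
    if present_value(mid, flows) < 0:
        lo = mid
    else:
        hi = mid
print(f"implied rate: {lo:.2%}")     # about 7.18%, i.e. 2**(1/10) - 1
```

The answer has a closed form: the break-even rate is exactly the one at which 1 oz today trades for 2 oz ten years hence, indistinguishable from a market discount rate.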

comment by Will_M · 2007-08-09T21:24:06.000Z · LW(p) · GW(p)

A related problem to that envisioned by Sebastian is that a diehard believer in the apocalypse would likely be “judgment proof” in a legal sense. I think it’s fair to assume that anyone who believes that there will be no April 2nd, 20XX will also not plan to need assets on that date and will spend accordingly (perhaps they will give their remaining assets to their religion to curry last-minute goodwill or perhaps they will blow it in Vegas). As a result, the winner of the bet will not be able to collect from someone who has little to nothing left. Most modern societies’ lenient stances on bankruptcy will allow the loser to effectively start anew and not be required to pay the winner regardless of whether the loser goes on to make enough money later to satisfy their obligation. Asking the loser to put up collateral to protect against default would also defeat the purpose of the agreement; the person convinced of the apocalypse would then be deprived of some portion of the value of the payments they receive ahead of the purported doomsday. Like Robin indicated, the arrangement will have to be tweaked in a way that makes it function like an ordinary interest-driven exchange.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-08-09T22:01:07.000Z · LW(p) · GW(p)

Robin, even if this market theoretically should recover interest rates, don't you think it might be interesting to have market participants who are actually thinking about Singularity-type issues, to see if that market recovers a different interest rate? Or to look at it another way, with this market you could formally issue "put your money where your mouth is" challenges that wouldn't be quite so unethical as taking a loan from a banker because you don't expect to ever pay it back.

I know that I wouldn't be comfortable with taking out a loan because I believed the due date in 2080 was post-Singularity, unless this were explicitly understood by both parties to the agreement. For that matter, I'd want a clause saying that a Friendly intelligence explosion obviated the loan even if the event otherwise preserved markets.

Sebastian, the agreement would have to obligate the estate or descendants of a 50-year-old who wanted to bet on 2080. Again, the main objective is to let people put their money where their mouth is - let Kurzweil bet against Hofstadter's "more than 100 years to AI" pronouncement, even if the payoff is made by Kurzweil's estate to Hofstadter's estate (do they have kids?).

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2012-09-25T01:10:38.394Z · LW(p) · GW(p)

Dear past Eliezer: Robin is just right here, your idea doesn't work, accept it and move on.

Replies from: army1987
comment by A1987dM (army1987) · 2013-08-09T10:51:14.313Z · LW(p) · GW(p)

Do you remember what exactly made you change your mind?

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-08-09T23:42:06.394Z · LW(p) · GW(p)

I'm not sure. Probably had something to do with (a) reading econblogs in the meantime (b) getting in more practice on abandoning bad ideas and recognizing the 'reluctance'.

comment by CarlShulman · 2007-08-09T22:45:59.000Z · LW(p) · GW(p)

"Sebastian, the agreement would have to obligate the estate or descendants of a 50-year-old who wanted to bet on 2080. Again, the main objective is to let people put their money where their mouth is - let Kurzweil bet against Hofstadter's "more than 100 years to AI" pronouncement, even if the payoff is made by Kurzweil's estate to Hofstadter's estate (do they have kids)?"

Surely a bet obligating one's estate (beyond the reach of most of the incentive effects of near-term bets) won't add much value over this:

http://www.longbets.org/1

comment by Robin_Hanson2 · 2007-08-09T22:57:05.000Z · LW(p) · GW(p)

Eliezer, there are thick existing markets revealing short term interest rates. I would indeed be interested to create markets that more explicitly revealed very long term interest rates. But it would be hard to ensure that participants in such markets thought about any particular topic; they would think about whatever they thought was most relevant to estimating such prices.

comment by Michael2 · 2007-08-09T23:36:34.000Z · LW(p) · GW(p)

I wanted to comment on your statement that there is an ethical issue with taking out a loan if you think the payment date is post-singularity. I strongly disagree. Do you believe that a bank that believed, due to a proprietary interpretation of public data, that we would soon enter a period of rapid deflation, would disclose this to you when you signed up for your 7% 30-year fixed-rate mortgage? Unless you have some sort of private data, such as an existing Singularity on your home PC just waiting to be unleashed, you are just a person with a different opinion and have no obligation to disclose why you are choosing a certain payment date.

comment by michael_vassar3 · 2007-08-10T00:26:29.000Z · LW(p) · GW(p)

I will second Michael here, but will also ask why you would worry about having to pay back after a Friendly singularity. Surely the opportunity cost to you of needing to pay someone back post-singularity is so much greater than that pre-singularity that you can ignore the latter.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-08-10T00:47:19.000Z · LW(p) · GW(p)

Vassar, I want the market to yield a time estimate of an apocalypse meta-event that includes Friendly singularities.

comment by Tom_McCabe · 2007-08-10T03:51:50.000Z · LW(p) · GW(p)

I'm not convinced that prediction markets supply data that's any more accurate than the predictions of individuals. Consider the market in crude oil futures; crude oil is much simpler and therefore much easier to predict than a Friendly intelligence explosion, and yet the data (http://www.durangobill.com/OilChart.html) shows that futures are horrifically inaccurate at predicting future prices. In fact, for most of the past six years, you could have done better at predicting the price of crude oil by using the current price instead of the futures-market price. Does anyone have a link to a paper studying how accurate prediction markets are, compared to individual guessing?
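For what it's worth, here is a sketch of the comparison Tom describes, scored on made-up numbers rather than the linked chart's actual data; it just pits the futures price against the naive "no-change" forecast:

```python
import math

# Hypothetical (spot_at_forecast_time, futures_price, realized_spot)
# triples; Tom's claim is that column 1 often beats column 2 as a
# forecast of column 3. These numbers are invented for illustration.
data = [
    (20.0, 22.0, 28.0),
    (28.0, 26.0, 35.0),
    (35.0, 33.0, 55.0),
    (55.0, 50.0, 65.0),
]

def rmse(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

naive_err   = [spot - realized for spot, _, realized in data]
futures_err = [fut - realized for _, fut, realized in data]

print("naive 'no-change' RMSE:", round(rmse(naive_err), 2))
print("futures-price RMSE:   ", round(rmse(futures_err), 2))
```

On a steadily rising price series like this one, the lagging futures price does score worse than the no-change forecast, which is the pattern Tom reports.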

Replies from: AspiringRationalist
comment by NoSignalNoNoise (AspiringRationalist) · 2012-07-23T20:18:28.358Z · LW(p) · GW(p)

In the case of oil, the current price is (sort of) a prediction of the price at a later date. If the market believes that the price of oil will reach a particular level on a particular date, the futures price will reflect that expectation. This sets a floor on the current price, specifically, the price at which it is profitable for a well-financed institution to take out a loan payable on the futures settlement date, buy physical oil now, pay for storage costs, and simultaneously sell the oil for a future date on the futures market.

Note that all prediction markets, especially those where trades are tied to a physical or financial asset that is regularly bought and sold, share a similar property: prices are driven by arbitrage opportunities and participants' hedging needs just as much as by actual predictions.
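A sketch of that floor computation, with illustrative numbers; none of these prices or rates come from the comment:

```python
# Cash-and-carry floor: if spot drops below this level, an institution
# can borrow, buy physical oil, pay storage, and sell futures for a
# sure profit. All inputs are illustrative assumptions.

futures_price = 80.0   # $/bbl, settling in one year
interest_rate = 0.05   # annual borrowing cost
storage_cost  = 3.0    # $/bbl for the year, paid at settlement

# Buying spot at S costs S*(1+r) + storage at settlement, against a
# locked-in sale at futures_price; arbitrage pays whenever
# S*(1+r) + storage < futures_price.
floor = (futures_price - storage_cost) / (1 + interest_rate)
print(f"no-arbitrage floor on spot: ${floor:.2f}/bbl")  # about $73.33
```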

comment by Peter_McCluskey · 2007-08-10T04:22:09.000Z · LW(p) · GW(p)

The treasury bond market appears to be as close to such a market as we can expect to get. It shows interest rates for bonds maturing in 2027 with a yield about 0.20% higher than those maturing in 2017, and bonds maturing in 2037 have a lower interest rate than those maturing in 2027. That's a clear prediction that apocalypse isn't expected. Markets for more than a few years into the future normally say that the best forecast is that conditions will stay the same and/or that existing trends will continue.
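For illustration, the implied forward rate between two such maturities can be backed out from the yields. Peter gives only the ~0.20% spread, so the absolute levels in this sketch are assumptions:

```python
# Implied forward rate between two zero-coupon maturities:
# (1+y_long)^t_long = (1+y_short)^t_short * (1+f)^(t_long - t_short).
# Peter cites only the ~0.20% spread; the 4.75% base is an assumption.

def forward_rate(y_short, t_short, y_long, t_long):
    growth = (1 + y_long) ** t_long / (1 + y_short) ** t_short
    return growth ** (1 / (t_long - t_short)) - 1

y_2017, y_2027 = 0.0475, 0.0495   # 10-yr and 20-yr yields (assumed levels)
f = forward_rate(y_2017, 10, y_2027, 20)
print(f"implied 2017-2027 rate: {f:.2%}")  # about 5.15% - no doom premium
```

An unremarkable forward rate like this is the market saying, as Peter reads it, that no apocalypse is priced in for that decade.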

Replies from: gwern
comment by gwern · 2011-07-06T20:43:20.695Z · LW(p) · GW(p)

It shows interest rates for bonds maturing in 2027 with a yield about 0.20% higher than those maturing in 2017, and bonds maturing in 2037 have a lower interest rate than those maturing in 2027.

I'll admit, I don't understand this. Why do bonds which mature a decade later have lower interest rates? Shouldn't they have higher interest rates because you're taking on more risk? (The further out bonds mature, the higher the chance of an apocalypse or a huge runup in non-fixed instruments sometime during that period, right?)

comment by Tom_McCabe · 2007-08-10T06:52:05.000Z · LW(p) · GW(p)

"Markets for more than a few years into the future normally say that the best forecast is that conditions will stay the same and/or that existing trends will continue."

Judging from the crude oil data, futures markets tend to lag current prices; i.e., if the spot price was $20 a few days/weeks/months/years ago, the futures price will be $20 today. Markets have short-term memories; they think of "normal" as what conditions have been like for the past few years, and so if there's a deviation from "normal" (in either direction), people predict that the deviation will correct itself over time.

comment by michael_vassar3 · 2007-08-10T13:13:28.000Z · LW(p) · GW(p)

Peter McCluskey: I would say rather that markets more than a few years into the future normally say an incoherent, arbitrage-filled mix of "things will stay the same" and "existing trends will continue". These are two very different but not generally mentally differentiated statements. In practice, the time required to build a strong reputation as a money manager via long-term prediction is too great for the market to provide any selective pressure in favor of managers capable of closing such arbitrage opportunities.

comment by Stuart_Armstrong · 2007-08-10T15:38:41.000Z · LW(p) · GW(p)

I remember reading a (parody, I think) story a few years ago about a setup for Rapture-worried Christians to leave messages to their less devout loved ones, reassuring them after the Christians were raptured. Those in charge of passing on the message would guarantee that they themselves would not be raptured by "blaspheming the holy spirit" or something of that nature. People have been trying to arrange post-apocalypse deals for some time (someone even sought ways of ensuring that taboos over nuclear waste would survive a civilization collapse). But the Christians above at least knew the nature of the Rapture; to get a sensible bet on the singularity, you'd have to know a lot more about the post-singularity world than is generally allowed. It sounds likely that gold will be valueless after the singularity - but we can construct scenarios, not too implausible-sounding, that end up with gold being incredibly valuable after a singularity. I feel it is ignorance not of the date, but of the nature of the singularity, that is the true barrier to sensible betting on it.

comment by Peter_McCluskey · 2007-08-10T19:10:29.000Z · LW(p) · GW(p)

Michael, I don't understand what opportunities you're referring to that could qualify as arbitrage. Also, reputation isn't necessarily needed - there are many investors who would use their own money to exploit the relevant opportunities if there were good reason to think they could be identified, without needing to convince clients of anything. One of the reasons I don't try to exploit opportunities that I can imagine involving apocalypse in the 2020s is that I think it's unlikely that markets will see any new information in the next few years that would make those opportunities less profitable if I wait to try exploiting them.

comment by michael_vassar3 · 2007-08-10T19:23:56.000Z · LW(p) · GW(p)

I won't try to argue for particular opportunities here on OB. Suffice it to say that if markets price in a random mix of the assumption that things will stay the same and the assumption that trends will continue, because participants don't distinguish between those two statements, then incoherence exists and arbitrage is possible.

comment by Douglas_Knight2 · 2007-08-10T22:33:07.000Z · LW(p) · GW(p)

(parody, I think) story

The story was for real. The site, I dunno, but it does accept money through paypal.

comment by Joshua_Fox · 2007-08-16T08:55:13.000Z · LW(p) · GW(p)

Someone who believes that a Singularity is likely at, say, 2030, might save less for retirement than they otherwise would. I wonder if Singularitarians really do so?

(Some believe that the rich will be the first and only ones to afford the technologies of Transcendence. I don't believe that, but one who did might save up after all.)

Replies from: gwern
comment by gwern · 2011-07-06T21:43:05.707Z · LW(p) · GW(p)

I suppose that depends on what kind of Singularity you are expecting. If you're expecting a Singleton which will pension off all the humans, then sure, eat, drink and be merry - for tomorrow our investments die. If you're expecting a Hansonian 'crack of a future dawn', you'd better be saving and investing in equities so as to ride the coming bull market when ems eat everyone's wage-jobs.

Replies from: shminux
comment by Shmi (shminux) · 2012-09-23T21:42:49.279Z · LW(p) · GW(p)

What should you do if you have no clue whatsoever what will happen?

Replies from: gwern
comment by gwern · 2012-09-23T22:20:00.276Z · LW(p) · GW(p)

My best thought is a Pascalian one, about what the quasi-dominant option is:

  1. if the singleton Singularity happens, then your investment is just wasted and your loss is bounded at whatever that money could have bought you pre-Singularity;
  2. if the Hanson Singularity happens, your gains are potentially huge over the long term (assuming property rights are respected...);
  3. if no Singularity at all happens, then your early all-equity investments (for a young person) were a good-to-great way to save for retirement (or a post-death bequest or expense, like cryonics) and you gain a lot.

So in 2 out of the 3 outcomes, you are much better off investing heavily in equities, and in the first you are not that badly off.
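A toy expected-value rendering of those three cases; the probabilities and payoff multiples below are entirely made up for illustration:

```python
# gwern's three scenarios as a toy decision table. Probabilities and
# payoff multipliers (on money put into equities) are invented.
scenarios = {
    "singleton Singularity": (0.2, 0.0),   # investment simply wasted
    "Hansonian em economy":  (0.2, 50.0),  # huge runup, if property rights hold
    "no Singularity":        (0.6, 5.0),   # ordinary long-horizon returns
}

expected = sum(p * payoff for p, payoff in scenarios.values())
print(f"expected multiple on equities: {expected:.1f}x")  # 13.0x here
```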

Replies from: Desrtopa, shminux
comment by Desrtopa · 2012-09-23T22:23:13.920Z · LW(p) · GW(p)

I think judging on the basis of those three possibilities is a premature narrowing of the hypothesis space.

Replies from: gwern
comment by gwern · 2012-09-23T22:26:54.733Z · LW(p) · GW(p)

One has to start somewhere. And the possibility-space isn't that large: the investments will be valuable or not, some sort of Singularity will happen or not, etc.

Replies from: johnsonmx
comment by johnsonmx · 2012-09-24T02:26:20.165Z · LW(p) · GW(p)

I would suggest that a breakdown in social order (without a singularity occurring) is another scenario that might be roughly as probable as the others you mentioned. In such a case, it would seem the manner by which you invest in equities would matter. I.e., the value of most abstract investments may vanish, and the value of equities held in trust by various institutions (or counterparties) may also vanish.

Replies from: gwern
comment by gwern · 2012-09-24T02:42:33.000Z · LW(p) · GW(p)

Which falls in the 'not valuable'/'not Singularity' cell of the 2x2 table.

comment by Shmi (shminux) · 2012-09-23T22:27:53.005Z · LW(p) · GW(p)

So one should carry on disregarding the possibility of God existing... err... Singularity happening?

Replies from: gwern
comment by gwern · 2012-09-23T22:40:41.751Z · LW(p) · GW(p)

Dunno. What do you think of the thumbnail analysis?

comment by timtyler · 2011-11-04T12:25:32.080Z · LW(p) · GW(p)

After an apocalypse it would not matter much who has "profited", since all forms of currency would then be worthless.

Replies from: pedanterrific, lessdazed
comment by pedanterrific · 2011-11-04T13:16:23.866Z · LW(p) · GW(p)

This is why the proposed system has the payments happen before the apocalypse. Did you actually read the post?

Replies from: timtyler
comment by timtyler · 2011-11-04T14:42:39.146Z · LW(p) · GW(p)

For most evolved creatures, it matters very little if their germ line is extinguished in 10 years or 20 years - these outcomes are both germ-line extinction; they are both equally bad.

You have to use an accounting scheme which discounts extremely heavily for a few years for happiness to matter very much in the face of rapid eternal oblivion. Note that Yudkowsky advocates not discounting at all.

Replies from: pedanterrific
comment by pedanterrific · 2011-11-04T21:50:01.148Z · LW(p) · GW(p)

For most evolved creatures, it matters very little if their germ line is extinguished in 10 years or 20 years - these outcomes are both germ-line extinction; they are both equally bad.

If they're equally bad, then congratulations - you've managed to make germ-line extinction not matter at all. Cause, see, that was going to happen eventually anyway, with the heat death if nothing else. If your reaction to learning that the universe is probably going to end wasn't suicidal despair, I have to think you don't actually believe what you're saying.

Replies from: timtyler, lessdazed
comment by timtyler · 2011-11-05T12:36:15.507Z · LW(p) · GW(p)

Extinction in 10 or 20 years would be regarded as being roughly equally bad - since these are small figures - smaller than the lifespan of humans and within their planning horizon. So an evolved creature acting in their genetic self-interest can be expected to regard both outcomes as being roughly equally bad.

In the case of 10 years and universal heat death, most evolved creatures would strongly prefer to avoid immediate extinction, since universal heat death is far outside both their experience and their planning horizon. As a bonus, there may be ways of avoiding the heat death - by creating large new low-entropy regions by using known inflationary processes.

Replies from: pedanterrific, army1987
comment by pedanterrific · 2011-11-05T16:28:18.760Z · LW(p) · GW(p)

Whoa, hold up - what does the lifespan and planning horizon of humans have to do with "most evolved creatures"? Plenty of things live less than ten years.

Replies from: timtyler
comment by timtyler · 2011-11-06T12:28:32.102Z · LW(p) · GW(p)

So: a mouse may not be able to conceive of "extinction of all its relatives in 20 years". That might conceivably hinder it in making an adaptive decision - its brain is too small to understand the options.

comment by A1987dM (army1987) · 2013-08-09T10:50:48.662Z · LW(p) · GW(p)

Extinction in 10 or 20 years would be regarded as being roughly equally bad - since these are small figures - smaller than the lifespan of humans and within their planning horizon. So an evolved creature acting in their genetic self-interest can be expected to regard both outcomes as being roughly equally bad.

People don't act in their genetic self-interest alone, and myself, I'd very much rather die childless in 20 years than die childless in 10 years.

comment by lessdazed · 2011-11-05T13:36:04.834Z · LW(p) · GW(p)

I have to think you don't actually believe what you're saying.

I basically agree. I had said:

Humans aren't "naturally" like that - at least only a small subset of memes convinces normal ones of the intellectual truth of the proposition "life has no meaning because/if you merely die at the end".

I think it is important to untangle belief and belief in belief here. timtyler is talking about his beliefs and so is sharing his beliefs about his beliefs, and I don't think he's likely lying.

comment by lessdazed · 2011-11-04T13:19:20.993Z · LW(p) · GW(p)

This reminds me of "life has no meaning because/if you merely die at the end."

Replies from: timtyler
comment by timtyler · 2011-11-04T14:48:05.597Z · LW(p) · GW(p)

This reminds me of "life has no meaning because/if you merely die at the end."

That would only apply to those who assign no value to their genetic / memetic legacy.

Since humans and their culture evolved, there are not many humans that are very much like that.

Replies from: lessdazed, Nornagest
comment by lessdazed · 2011-11-04T16:00:15.281Z · LW(p) · GW(p)

Since humans and their culture evolved, there are not many humans that are very much like that.

Humans aren't "naturally" like that - at least only a small subset of memes convinces normal ones of the intellectual truth of the proposition "life has no meaning because/if you merely die at the end".

It is a sickness from the valley of bad rationality. If I get money now and there is an apocalypse later - such as perhaps the sun expanding to swallow the Earth (especially if we never colonize other places), or the heat death of the universe - I still consider it all worthwhile. The future in which I get a lot of stuff and then everyone dies in a cataclysm is much preferable to me to the one in which I don't get a lot of stuff and everyone dies in a cataclysm.

comment by Nornagest · 2011-11-04T16:57:23.492Z · LW(p) · GW(p)

I feel compelled to point out here that selection pressure pointing towards genetic/memetic fitness doesn't necessarily imply explicit values oriented towards the same goal, merely values correlated with it, and even that can sometimes get fuzzy based on moral fashion or political forces. Explicitly legacy-oriented values clearly do exist, but they're far from universal: compare Havamal 75 to Analects of Confucius 1:11 to Epicurus' letter to Menoeceus to get three very different perspectives on the general topic of an enduring legacy.

Replies from: timtyler
comment by timtyler · 2011-11-04T17:14:31.101Z · LW(p) · GW(p)

Humans' "explicit values" are not to be trusted either. Humans invent all kinks of bullshit stories about their values for the purpose of signalling to prospective partners how great they are so as to better manipulate them. They are like the priests who preach chastity and fidelity - and then roger the altar boy in "a moment of weakness".

By their works ye shall know them.

Replies from: Nornagest
comment by Nornagest · 2011-11-04T17:26:21.715Z · LW(p) · GW(p)

Sure, but the same goes for anyone talking about how life is meaningless because of the period at the end. We're generally pretty terrible at making our explicit values consistent with our implicit objectives, but it doesn't do much good to invoke an implicit goal to resolve explicit existential angst if we don't acknowledge that goal in the first place.

comment by timtyler · 2011-11-04T16:06:12.844Z · LW(p) · GW(p)

I think this procedure would be more likely to recover the chances of a personal disaster rather than a global one.

I would borrow from my distant future self, offering to repay with interest later, not because I believe the end of the world is imminent, but because of the chance that my future self will never have to repay the debt, thanks to personal catastrophes involving things like aging and dying. Since aging and dying are commonplace, while the end of the world is highly speculative, it seems likely that the former effects will dominate the signal, making it very difficult to say anything useful about the probability of the end of the world at particular points in time.

comment by pnrjulius · 2012-06-23T20:41:36.577Z · LW(p) · GW(p)

Surely there's a better way of predicting and preventing apocalypses than betting on them?

It's an interesting idea, to be sure; and I like the way it forces you to put your money where your mouth is. But the point of predicting existential risks is to avoid them, not to profit from them (especially since once the world ends, no matter what you're using to hold your wealth, it will still become worthless).

By the way, 2020 and 2030 are not implausible guesses for the point at which asteroid mining picks up and floods the market with shiny yellow rocks. I for one eagerly await, and plan to buy into asteroid mining ventures.