Comments sorted by top scores.
comment by Vladimir_Nesov · 2010-07-23T20:21:39.737Z
The problem with this post is the raving madness of its presentation, even if the conclusions you present are defensible. You can't just state absurd things without clearly explaining them; it's not a matter of not saying what you believe to be true, it's a matter of a rational mode of communication. People shouldn't believe absurd things, much less statements of confusing meaning, unless all the steps are laid out (or unless they are willing to spend time on research).
It's fine to discuss such things in comments with people who you know share the necessary background (as I did a few weeks ago on this same topic), but a top-level post requires much more background-building work.
Replies from: Roko
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-07-24T09:21:13.846Z
We're not worried about superintelligences figuring it out. They figure it out automatically. If you don't understand what we're worrying about then I'm not going to explain it, though, because I don't want anyone to worry about it.
The other reason why you don't discuss this sort of thing is that all discussions of it sound absolutely crazy.
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-07-24T05:35:38.366Z
One might think that the possibility of CEV punishing people couldn't possibly be taken seriously enough by anyone to actually motivate them. But in fact one person at SIAI was severely worried by this, to the point of having terrible nightmares, though ve wishes to remain anonymous.
I don't usually talk like this, but I'm going to make an exception for this case.
Listen to me very closely, you idiot.
YOU DO NOT THINK IN SUFFICIENT DETAIL ABOUT SUPERINTELLIGENCES CONSIDERING WHETHER OR NOT TO BLACKMAIL YOU. THAT IS THE ONLY POSSIBLE THING WHICH GIVES THEM A MOTIVE TO FOLLOW THROUGH ON THE BLACKMAIL.
There's an obvious equilibrium to this problem where you engage in all positive acausal trades and ignore all attempts at acausal blackmail.
Until we have a better worked-out version of TDT and we can prove that formally, it should just be OBVIOUS that you DO NOT THINK ABOUT DISTANT BLACKMAILERS in SUFFICIENT DETAIL that they have a motive to ACTUALLY BLACKMAIL YOU.
If there is any part of this acausal trade that is positive-sum and actually worth doing, that is exactly the sort of thing you leave up to an FAI. We probably also have the FAI take actions that cancel out the impact of anyone motivated by true rather than imagined blackmail, so as to obliterate the motive of any superintelligences to engage in blackmail.
Meanwhile I'm banning this post so that it doesn't (a) give people horrible nightmares and (b) give distant superintelligences a motive to follow through on blackmail against people dumb enough to think about them in sufficient detail, though, thankfully, I doubt anyone dumb enough to do this knows the sufficient detail. (I'm not sure I know the sufficient detail.)
You have to be really clever to come up with a genuinely dangerous thought. I am disheartened that people can be clever enough to do that and not clever enough to do the obvious thing and KEEP THEIR IDIOT MOUTHS SHUT about it, because it is much more important to sound intelligent when talking to your friends.
This post was STUPID.
(For those who have no idea why I'm using capital letters for something that just sounds like a random crazy idea, and worry that it means I'm as crazy as Roko, the gist of it was that he just did something that potentially gives superintelligences an increased motive to do extremely evil things in an attempt to blackmail us. It is the sort of thing you want to be EXTREMELY CONSERVATIVE about NOT DOING.)
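The no-blackmail equilibrium is easy to see in a toy model, sketched below; all payoff numbers are illustrative assumptions, not anything from TDT itself:

```python
# Toy model: a blackmailer decides whether to threaten, based on a
# prediction of the target's fixed policy. Numbers are illustrative.

FOLLOW_THROUGH_COST = 1.0   # what executing the threat costs the blackmailer
CONCESSION = 10.0           # what the blackmailer gains if the target caves

def best_blackmailer_payoff(target_policy: str) -> float:
    """Blackmailer's payoff from its best response to the target's policy."""
    if target_policy == "comply":
        return CONCESSION         # threaten; the target pays up, no execution
    # Target ignores all threats: threatening leads either to backing down
    # (payoff 0) or to executing at a pure loss, so the best response is
    # not to threaten at all.
    return max(0.0, -FOLLOW_THROUGH_COST)

print(best_blackmailer_payoff("comply"))  # 10.0 -> compliance invites threats
print(best_blackmailer_payoff("ignore"))  # 0.0  -> ignoring removes the motive
```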
Replies from: timtyler
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-07-24T05:24:30.684Z
Wow... that's like such a backhanded compliment it turns around and becomes a fronthanded compliment again.
comment by [deleted] · 2010-07-23T17:50:36.679Z
"Well, there is another way to make $300,000,000. Start with $30,000, and using a quantum random number generator, gamble it on the forex markets at a 10,000:1 ratio. Then in the branches where your gamble pays off, start an AGI company, hire the best people and build an FAI yourself"
This is kind of glossed over, but I don't think it works at all. Here is what I think you mean to do:
1. Construct 10,000 trades that each pay off 10,000:1 and that together cover the entire space of possible future prices of some set of currency pairs, so that no matter what, one of them will pay off.
2. Roll a quantum die to decide which one to bet on.
3. Make the bet and sit back to collect your quantum winnings.
If that is what you meant, then you are wrong. You certainly can make bets with a payoff of 10,000:1 or greater with forex options, for some scenarios, but all of those scenarios are probably much less likely than 1 in 10,000, because sane people don't take the other side of bets like that without a lot of edge. And there is no way you can get that much leverage in the more plausible scenarios. For instance, how would you bet on EUR/USD (or whatever cross you want) not moving in the next year, or the next few years? You could sell a shitload of strangles or straddles, but no one will let you sell $300M of strangles with only $30K of capital, because any movement (or just an unfavorable settle) will cost all your capital and a hell of a lot more.
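For concreteness, the arithmetic behind this objection, as a minimal sketch; the 1/15,000 figure is an assumption standing in for the counterparty's edge:

```python
STAKE = 30_000
ODDS = 10_000            # payout ratio the plan requires
TRUE_PROB = 1 / 15_000   # assumption: counterparties only quote 10,000:1
                         # on scenarios they believe are rarer than that

payout = STAKE * ODDS                # $300,000,000 in the winning branch
expected_value = TRUE_PROB * payout  # averaged over all branches
print(f"winning-branch payout: ${payout:,}")
print(f"expected value: ${expected_value:,.0f} vs. stake ${STAKE:,}")
# 1/15,000 * $300M = $20,000 < $30,000: the plan does reach $300M in some
# branches, but on a branch-weighted average it loses money.
```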
Replies from: timtyler, Roko
↑ comment by timtyler · 2010-07-23T22:40:44.008Z
Three spins of a roulette wheel should do it. (1/60)^3 is 1/216,000. (59/60)^3 > 0.95 - so the casino's cut would be around 5%. You might have to visit multiple casinos, but the fees would not do much damage to the project. This all seems manageable - so your assertion seems implausible.
Replies from: None, Roko
↑ comment by [deleted] · 2010-07-24T00:41:51.606Z
Agreed. Typically roulette wheels pay 35 to 1 with either 37 or 38 spots, but that doesn't change the validity of your point. The only difficulty would be finding a place where you could place such a huge bet on a roulette wheel. I don't think there is a casino where you could place even a $1M bet on a single number.
I see no reason in principle that it should be unreasonably difficult to become a quantum billionaire; I just didn't think that the specific plan Roko presented would work, though when he explained it more it did seem more plausible to me. And I think that you'll have to give up a decent amount of expected value to do it. Maybe Powerball should move to a quantum mechanism for picking numbers to attract more many-worlds believers!
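Using the corrected wheel numbers, the chained-bet arithmetic works out as below (a sketch assuming a European wheel: 37 slots, 35:1 single-number payout):

```python
SLOTS = 37        # European wheel; an American wheel has 38
PAYOUT = 35       # single-number bet pays 35:1, stake returned on a win
STAKE = 30_000

p_win = (1 / SLOTS) ** 3                 # all three spins must hit
final = STAKE * (PAYOUT + 1) ** 3        # stake multiplied by 36 per win
ev_kept = ((PAYOUT + 1) / SLOTS) ** 3    # fraction of EV the house leaves you

print(f"P(win all three): 1/{round(1/p_win):,}")       # 1/50,653
print(f"winning-branch bankroll: ${final:,}")          # $1,399,680,000
print(f"EV retained after house edge: {ev_kept:.1%}")  # ~92.1%
# Close to timtyler's hypothetical 1/60-wheel figures: three spins get you
# well past 10,000:1, and the casino's total cut stays under ~8%.
```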
↑ comment by Roko · 2010-07-23T18:06:28.924Z
Use leverage and iterate the bets. GBP/JPY has 2-3% daily volatility. Leverage by a factor of 25, and within a week you will be wiped out in half the branches and have doubled your money in the other half. If you win, repeat. If you lose, end. Iterate 14 times.
eToro or any of a number of highly advertised online trading platforms make this so easy anyone can do it. They even give you 25% free bonus money, so your expected value is positive. There are web services that give you qubits for free.
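Taking the comment's numbers at face value (a clean double-or-wipeout each week; costs ignored here, they come up in the replies below), the iteration arithmetic is:

```python
STAKE = 30_000
ITERATIONS = 14   # double-or-nothing rounds, as in the comment

p_survive = 0.5 ** ITERATIONS       # win every weekly coin flip
final = STAKE * 2 ** ITERATIONS     # bankroll in the surviving branch

print(f"P(surviving branch): 1/{2**ITERATIONS:,}")  # 1/16,384
print(f"bankroll there:      ${final:,}")           # $491,520,000
# 14 doublings overshoot the $300M target (2^14 = 16,384 > 10,000),
# so 14 rounds suffice if each week really is a clean double-or-wipeout.
```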
Replies from: andreas
↑ comment by [deleted] · 2010-07-23T20:03:40.892Z
Are you really? I am curious to hear about your experience. How wide a market does eToro make in GBP/JPY? That's not a huge cross, so my guess would be at least 4 bips. Also, what kind of carry do you have to pay? My guess would be that you will be murdered by costs if you don't just get stopped out by an adverse market move. Overall, I think you could definitely get rich with this plan, but I can't imagine that you actually have positive EV.
Replies from: Roko
↑ comment by Roko · 2010-07-23T20:26:26.382Z
Sorry, you obviously know more about this than me. What do you mean by "How wide a market does eToro make in GBP/JPY?", and what do you mean by "That's not a huge cross, so my guess would be at least 4bips". What is 4 bips? Is that basis points per second? Do you know of a more volatile asset to hold?
The bid-offer spread with x25 leverage is a few percent, so that doesn't seem so bad. And note that they give you 25% extra money.
Replies from: None
↑ comment by [deleted] · 2010-07-24T01:35:46.066Z
Sorry, I meant pip, not bip. By 4 pips I just meant the size of the bid-ask spread; for instance, the market might be 134.88 @ 134.92 or so.
I think the chained-bet thing fundamentally makes sense; I just think it will cost money on net.
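A rough cost check under the figures discussed above; every input here is an assumption (GBP/JPY near 134.90, a 4-pip spread, 25x leverage, one position entry per doubling):

```python
PRICE = 134.90        # assumed GBP/JPY level, from the example quote above
SPREAD_PIPS = 4       # bid-ask width guessed in the parent comments
PIP = 0.01            # pip size for JPY crosses
LEVERAGE = 25
ROUNDS = 14           # doublings in the chained-bet plan

spread_frac = SPREAD_PIPS * PIP / PRICE    # cost as fraction of notional
cost_per_round = spread_frac * LEVERAGE    # as fraction of equity
kept = (1 - cost_per_round) ** ROUNDS      # drag along the winning path

print(f"spread cost per round: {cost_per_round:.2%} of equity")  # ~0.74%
print(f"equity kept after {ROUNDS} rounds: {kept:.1%}")          # ~90%
# Spread alone eats roughly 10% along the winning branch; overnight carry,
# financing, and slippage on stop-outs come on top of that, which is where
# the negative expected value would bite.
```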
comment by Mass_Driver · 2010-07-24T03:20:11.185Z
I stopped reading after the first three paragraphs looked like post-modern Calvinism without any kind of disclaimer. Not proud of that, but I did.
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-07-24T10:52:23.361Z
You don't think about that either. DUH.
You don't worry about the justification. You just do it until such time as we prove that immunity to blackmail is the equilibrium solution.
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-07-24T20:44:57.110Z
The original version of this post caused actual psychological damage to at least some readers. This would be sufficient in itself for shutdown even if all issues discussed failed to be true, which is hopefully also the case.
Please discontinue all further discussion of the banned topic.
All comments on the banned topic will be banned.
Exercise some elementary common sense in future discussions. With sufficient time, effort, knowledge, and stupidity it is possible to hurt people. Don't.
As we used to say on SL4: KILLTHREAD.
↑ comment by orthonormal · 2010-07-24T17:06:53.438Z
(Comment deleted by author)
NOTE: What I said here was either nonsense or might be a further harmful meme, so goodbye to it.
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-07-24T11:20:00.567Z
It doesn't work that way. I'm not going to explain how it does work for obvious reasons. But vague thoughts about blackmailers don't give anyone an incentive to blackmail you. If you were worried that they did, you'd just try to apply an equal and opposite impulse in the other direction to reduce the incentive to blackmail you, and then rely on an FAI to try to cancel it out exactly later.
But mostly the winning move is to just go think about something else. So take my word for it, I know more than you do, no really I do, and SHUT UP.
I'm going to ask Tricycle to delete this post. Apparently people are really that incapable of shutting up.
comment by Roko · 2010-07-23T21:03:30.760Z
So the consensus seems to be that I explained my ideas in an unclear and overly brief way in this post. I'd appreciate it if people could post, as sub-comments of this comment, the bits that they think are poorly explained.
People could also suggest a good way of breaking the material down. I don't have a good idea of what things are common knowledge here, versus what things I picked up in various less publicized fora.
Together, we can contribute to the creation of an improved version of this post, perhaps as a series.
comment by timtyler · 2010-07-23T20:09:03.088Z
Betting large quantities at long odds is usually a very bad thing for a trader to do - due to the diminishing utility of money. The occasional big win fails to compensate for all the losses.
It is highly unclear why anyone would think that changes in future circumstances would be likely to make such risk-seeking behaviour any less stupid.
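The diminishing-utility point can be made concrete with log utility; the wealth figures below are illustrative assumptions:

```python
from math import log

WEALTH = 100_000          # assumed starting wealth, purely illustrative
STAKE = 30_000
ODDS = 10_000
p_win = 1 / (ODDS + 1)    # win probability that makes the bet fair in dollars

u = log                   # log utility: each extra dollar matters less
eu_bet = p_win * u(WEALTH + STAKE * ODDS) + (1 - p_win) * u(WEALTH - STAKE)
eu_pass = u(WEALTH)

print(f"E[u] if you bet:  {eu_bet:.4f}")   # ~11.157
print(f"E[u] if you pass: {eu_pass:.4f}")  # ~11.513
# Even with zero house edge, the long-odds bet loses badly in expected
# utility: the rare huge win cannot offset near-certain loss of 30% of
# your wealth.
```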