Evidential Decision Theory and Mass Mind Control
post by DanielLC · 2010-10-23T23:26:42.124Z · LW · GW · Legacy · 10 comments
Required Reading: Evidential Decision Theory
Let me begin with something similar to Newcomb's Paradox. You're not the guy choosing whether or not to take both boxes. You're the predictor. You're not actually prescient; you can only make an educated guess.
You watch the first person play. Let's say they pick one box. You know they're not an ordinary person; they're a lot more philosophical than most. But that doesn't mean that knowing what they chose is completely useless later on. The later people might be just as weird. Or they might be normal, but not completely independent of this outlier. You can use their decision to help predict the others', if only by a little. What's more, this still works if you're reading through archives and trying to "predict" decisions people already made in earlier trials.
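The predictor described above can be sketched as a simple Bayesian updater: each observed choice shifts its estimate of how likely any other player, later or earlier, is to one-box. This is a minimal illustration, not anything from the post itself; the Beta(1, 1) prior and the function names are assumptions.

```python
# Hypothetical sketch of the predictor: a Beta-Bernoulli model of how
# likely players are to one-box. The uniform Beta(1, 1) prior is an
# illustrative assumption, not something specified in the post.

def update(alpha, beta, one_boxed):
    """One observed choice shifts the predictor's belief slightly."""
    return (alpha + 1, beta) if one_boxed else (alpha, beta + 1)

def p_one_box(alpha, beta):
    """Posterior mean probability that some other player one-boxes."""
    return alpha / (alpha + beta)

# Before seeing anyone play, the predictor has no idea.
alpha, beta = 1, 1
assert p_one_box(alpha, beta) == 0.5

# Watch the first (unusual) player one-box.
alpha, beta = update(alpha, beta, one_boxed=True)

# The estimate for everyone else -- later trials or archived earlier
# ones -- moves a little toward one-boxing.
assert p_one_box(alpha, beta) > 0.5
```

The same posterior applies whether the predictor is guessing future trials or "predicting" archived past ones, which is the point the paragraph above is making: the evidence flows in both temporal directions.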
The player's choice of box affects whether the predictor will predict that later (or earlier) people take the box. According to EDT, one should act in the way that produces the most evidence for what one wants. Since the predictor is completely rational, the player's choice effectively changes the decisions other people make, or actually changes them, depending on your interpretation of EDT. One can even affect people's decisions in the past, provided one doesn't already know what they were.
In short, the decisions you make affect the decisions other people will make and have made. I'm not sure of the exact number, but there have probably been 50 to 100 billion people so far, and that's not counting the people who haven't been born yet. Even if you change the decisions of only one person in a thousand, that's at least 50 million people.
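The arithmetic behind that estimate is straightforward; here it is worked out, using the post's own lower-bound figures.

```python
# Rough arithmetic behind the estimate above (the figures are the
# post's own lower bounds, not independent data).
people_so_far = 50_000_000_000   # ~50 billion people who have ever lived
affected = people_so_far // 1000  # change one person in a thousand

assert affected == 50_000_000     # 50 million people
```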
Like I said: mass mind control. Use this power for good.
10 comments
Comments sorted by top scores.
comment by Relsqui · 2010-10-23T23:28:55.713Z · LW(p) · GW(p)
I think at least some of your "effect"s were meant to be "affect"s. Also, I didn't follow the logical jump to "you can affect decisions other people made in the past."
Replies from: DanielLC
↑ comment by DanielLC · 2010-10-24T00:39:37.489Z · LW(p) · GW(p)
I think at least some of your "effect"s were meant to be "affect"s.
Fixed
Also, I didn't follow the logical jump to "you can affect decisions other people made in the past."
I added more to that, although I can't guarantee it's easy to follow.
Replies from: Relsqui
↑ comment by Relsqui · 2010-10-24T04:18:54.043Z · LW(p) · GW(p)
I still don't follow it, personally. Specifically, I don't see why one should equate "you can affect someone's prediction of the past" with "you can affect the actual event that person is predicting." ("Predict" is probably the wrong word for this, but I can't think of a better one offhand.)
Besides. Even if you DO make that jump, in order to get from there to
In short, the decisions you make affect the decisions other people will make and have made.
it seems that you would need to propose that reality fits your model, i.e. that there actually exists a perfectly rational observer making predictions about us.
Replies from: DanielLC
↑ comment by DanielLC · 2010-10-24T05:41:49.892Z · LW(p) · GW(p)
How do you interpret Evidential Decision Theory? Perhaps that would make this easier to explain.
I interpret it as saying that by making the evidence point to something, you cause it to happen. This isn't what most people mean by "cause", but as an Eternalist, I never mean that. I consider the future every bit as fixed as the past. If you drop a rock, you're not causing it to fall in the normal sense, since it's already either going to fall or it isn't.
In any case, it treats correlation and causation the same as far as decisions are concerned. If two-boxing correlates with the million-dollar box being empty, you treat two-boxing as if it empties the box, and so you take only one box.
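The decision rule described in the comment above can be sketched directly: EDT picks the action with the highest expected payoff *conditional on* taking that action, which is exactly treating the correlation as if it were causation. The predictor accuracy of 0.9 and the payoff amounts are illustrative assumptions, not numbers from the thread.

```python
# Hypothetical EDT sketch for Newcomb's problem. The 0.9 predictor
# accuracy and the dollar amounts are illustrative assumptions.
ACCURACY = 0.9  # P(prediction matches the player's actual choice)

def edt_value(action):
    """Expected payoff conditional on taking the action (EDT)."""
    if action == "one-box":
        # Conditional on one-boxing, the opaque box is probably full.
        return ACCURACY * 1_000_000
    else:
        # Conditional on two-boxing, the opaque box is probably empty,
        # but you always get the visible $1,000.
        return (1 - ACCURACY) * 1_000_000 + 1_000

# EDT conditions on the correlation and so one-boxes.
assert edt_value("one-box") > edt_value("two-box")
```

Note that a causal decision theorist would deny that the choice changes the box's contents and would two-box regardless; the disagreement in the thread is over whether conditioning like this is a feature or a bug of EDT.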
Replies from: Relsqui
comment by magfrump · 2010-10-24T06:19:48.965Z · LW(p) · GW(p)
So is this just a sarcastic remark about EDT? Or are you positing something?
Replies from: DanielLC
↑ comment by DanielLC · 2010-10-24T20:49:37.010Z · LW(p) · GW(p)
It's a serious remark about EDT. I agree with EDT and with this conclusion about it.
Replies from: magfrump
↑ comment by magfrump · 2010-10-24T21:50:48.910Z · LW(p) · GW(p)
Your remark seems ridiculous. As Relsqui mentions, there is a distinction between changing your probability estimate of others' decisions and changing their decisions.
The fact that EDT has troubles with these sorts of distinctions is, afaik, generally considered a weakness of EDT.
If I read your argument correctly, you are saying that "given that Omega uses EDT, your decisions acausally affect others' decisions which you are unaware of." Again, this seems patently ridiculous, and an argument for nothing except that "Omega doesn't use EDT and neither should you."