Newcomblike problem: Counterfactual Informant
post by Clippy · 2012-04-12T20:25:33.723Z · LW · GW · Legacy
I want to propose a variant of the Counterfactual Mugging problem discussed here. BE CAREFUL how you answer, as it has important implications, which I will not reveal until the known dumb humans are on record.
Here is the problem:
Clipmega is considering whether to reveal to humans information that will amplify their paperclip production efficiency. It will only do so if it expects that, as a result of revealing to humans this information, it will receive at least 1,000,000 paperclips within one year.
Clipmega is highly accurate in predicting how humans will respond to receiving this information.
The smart humans' indifference curve covers both their current condition and the one in which Clipmega reveals the idea and steals 1e24 paperclips. (In other words, smart humans would be willing to pay a lot to learn this if they had to, and there is an enormous "consumer surplus".)
Without Clipmega's information, some human will independently discover this information in ten years, and the above magnitude of the preference for learning now vs. later exists with this expectation in mind. (That is, humans place a high premium on learning it now, even though they will eventually learn it either way.)
The human Alphas (i.e., dominant members of the human social hierarchy), in recognition of how Clipmega acts, and wanting to properly align incentives, are considering a policy: anyone who implements this idea in making paperclips must give Clipmega 100 paperclips within a year, and anyone found using the idea but not having donated to Clipmega is fined 10,000 paperclips, most of which are given to Clipmega. It is expected that this will result in more than 1,000,000 paperclips being given to Clipmega.
Do you support the Alphas' policy?
Problem variant: All of the above remains true, but there also exist numerous "clipmicros" that unconditionally (i.e. irrespective of their anticipation of behavior on the part of other agents) reveal other, orthogonal paperclip production ideas. Does your answer change?
Optional variant: Replace "paperclip production" with something that current humans more typically want (as a result of being too stupid to correctly value paperclips).
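A minimal back-of-the-envelope sketch of how the Alphas' policy from the base problem could clear the 1,000,000-paperclip threshold. The number of implementers, the compliance rate, the detection rate, and Clipmega's share of the fines are illustrative assumptions, not given in the problem.

```python
# Hypothetical numbers -- the problem only fixes the fee (100), the fine
# (10,000), and the threshold (1,000,000); everything else is assumed.
implementers = 15_000          # humans who use Clipmega's idea (assumption)
compliance_rate = 0.8          # fraction who pay the 100-paperclip fee (assumption)
detection_rate = 0.5           # fraction of non-payers who are caught (assumption)
fine_share_to_clipmega = 0.8   # "most of which is given to Clipmega" (assumption)

fees = implementers * compliance_rate * 100
fines = implementers * (1 - compliance_rate) * detection_rate * 10_000 * fine_share_to_clipmega
total_to_clipmega = fees + fines

print(f"fees: {fees:,.0f}, fines: {fines:,.0f}, total: {total_to_clipmega:,.0f}")
# fees: 1,200,000, fines: 12,000,000, total: 13,200,000 -> well over 1,000,000
```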
24 comments
Comments sorted by top scores.
comment by Random832 · 2012-04-13T15:59:48.870Z · LW(p) · GW(p)
A pack of 10 boxes of 100 paperclips each costs $2 USD. I infer from this that the world supply of paper clips is large enough that buying 1,000 such boxes would not significantly move the supply-demand curve.
Satisfying Clipmega's demand is therefore within the means of any middle-class family. If anyone cares about maximizing paper clip production, they could provide the million paper clips without impacting anyone else. If more than one person cares (or, if no-one wants to be the one person who's out two grand while everyone else benefits), someone could make a kickstarter.
The Alphas' complex plan just drives up the transaction cost. Actually, though I started this thinking that Clippy's number choice was sloppy, that's an interesting factor now that I think about it. Any attempt by the Alphas to ensure "fair" distribution of the costs is going to increase inefficiency by a significant fraction or multiple of what Clipmega is actually asking for - at some point you have to stop fighting over the bill and just pay it.
Would you support a policy of "The human Alphas (i.e., dominant members of the human social hierarchy), in recognition of how Clipmega acts, and wanting to properly align incentives, are considering a policy: anyone who implements this idea in making paperclips must give Clipmega twenty cents within a year, and anyone found using the idea but not having donated to Clipmega is fined twenty dollars, most of which is given to Clipmega. It is expected that this will result in more than $2,000 being given to Clipmega."? I wouldn't. They should just buy the paper clips with the money they'd be paying paperclip factory auditors. I wouldn't support it if I were one of the Alphas either - there's got to be a cheaper way to force someone else to pay it, if nothing else.
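A minimal sketch of the arithmetic behind these figures, assuming only the retail price quoted above.

```python
# Price quoted above: a pack of 10 boxes x 100 clips = 1,000 clips for $2.
clips_per_pack = 10 * 100
price_per_pack = 2.00                     # USD
demand = 1_000_000                        # paperclips Clipmega asks for

packs_needed = demand / clips_per_pack    # 1,000 packs
dollar_cost = packs_needed * price_per_pack

print(f"{packs_needed:,.0f} packs, ${dollar_cost:,.0f}")  # 1,000 packs, $2,000
```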
Replies from: None, thomblake
↑ comment by [deleted] · 2012-04-16T12:00:29.988Z · LW(p) · GW(p)
I have to commend you on this. When I was considering more economical alternatives to the default policy, I was primarily focused on eliminating the part of the idea that involved giving more than $2,000 to Clipmega. But I agree with your analysis that the predicted cost is so small that bothering to implement a complicated plan to recover it is pointless.
For instance, if the Alpha is an American politician who can collect things like campaign contributions, at $2,000 you can just make a side request of "I'm collecting campaign contributions from people who would benefit from increased paperclip production." If even one donor donates the legal maximum to your reelection campaign because of this, it's already a net positive for you personally.
Even if no one did donate (unlikely, since some people will pay more than that just to get a chance to sit down at a dinner and talk to a political Alpha), merely getting a positive press cycle as "the Alpha who paid to maximize paperclip production" would be cost-efficient.
↑ comment by thomblake · 2012-04-13T20:27:20.414Z · LW(p) · GW(p)
Nice analysis. Clippy is seldom sloppy, especially when it comes to paperclips.
Replies from: Random832
↑ comment by Random832 · 2012-04-13T20:55:31.192Z · LW(p) · GW(p)
I'm actually a bit surprised now that I'm the only one who thought "a million paperclips doesn't really sound like a lot."
Replies from: JGWeissman
↑ comment by JGWeissman · 2012-04-13T21:48:01.915Z · LW(p) · GW(p)
Whenever I encounter a contrived scenario designed to test my decision theory or morality, I just assume the numbers involved are large enough to be compelling.
Replies from: Random832
comment by MileyCyrus · 2012-04-12T20:56:52.342Z · LW(p) · GW(p)
This story has too much fluff. I had to read it three times to get to the meat of the problem, which is this:
- Humans like paperclips.
- Clipmega will give humans access to 1e24 paperclips, if and only if he expects humans will give him a million paperclips following such action.
- The leaders of humanity (“Alphas”) are considering passing a law that will give Clipmega a million paperclips, conditional on Clipmega granting humans access to 1e24 paperclips. Do you support this policy?
↑ comment by Clippy · 2012-04-12T21:11:37.470Z · LW(p) · GW(p)
I tried to remove the "fluff", but I don't think your summary captures the important aspects of the problem, which include that:
Humans can (and will try to) share and use the information without contributing to Clipmega once it reveals the information.
The Alphas are not planning to give Clipmega a million paperclips directly (which they could do without policing human behavior), but rather to make it so that those who benefit from learning better paperclip production methods share in the (acausal) cost, and those who don't, don't.
I agree that I could probably have phrased the problem from the beginning with "something humans really want" instead of "paperclips", and that would have reduced the explanatory overhead, but I'm just so accustomed to thinking that every being likes paperclips.
Replies from: None
↑ comment by [deleted] · 2012-04-13T15:39:24.618Z · LW(p) · GW(p)
I have what might be a better idea for maximizing our paperclips, which I'll run by you for accuracy.
Pay Clipmega exactly 1 million paperclips, immediately. Politely tax the entire population of the world a fraction of one paperclip each to make up your personal loss, or alternatively, tax only paperclip manufacturers for this cost. (You are an Alpha; you can apparently do either of these.)
The overall burden of "paperclips paid to Clipmega" is lessened, and by immediately paying Clipmega, you increase the chance of the 10^24 paperclip bonus getting through. If you attempt the other plan, there is a slim chance that it will for some reason not reach 1 million paperclips (which would be a HORRIBLE failure), and a significantly higher chance that it will overpay Clipmega, which, while not a horrible failure, seems somewhat pointless. (I don't think we care about Clipmega's paperclips, we just care about our paperclips, right?)
The random people who aren't Alphas should approve of this plan, because they face a lower paperclip cost than under the proposed plan. Even limiting it to just paperclip manufacturers should still impose a lower overall burden, because relying on payments subject to statistical variance means Clipmega would likely be somewhat overpaid as a safety margin so that it would still expect the 1 million. You even point this out yourself when you say
It is expected that this will result in more than 1,000,000 paperclips being given to Clipmega.
That seems inefficient if those paperclips have positive utility to us.
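A minimal sketch of the per-capita burden under this flat-payment plan, and of the overshoot the fee-plus-fine policy would need. The world-population figure and the overshoot factor are illustrative assumptions.

```python
# Flat-payment plan: pay Clipmega exactly 1,000,000 paperclips up front,
# then recover the cost with a world-wide tax.
payment = 1_000_000
world_population = 7_000_000_000           # rough 2012 figure (assumption)
per_person = payment / world_population
print(f"per-person tax: {per_person:.6f} paperclips")  # ~0.000143 paperclips each

# Original fee-plus-fine policy: to be confident of reaching 1,000,000,
# payments are likely to overshoot (illustrative overshoot factor).
overshoot_factor = 1.3                      # assumption
wasted = payment * (overshoot_factor - 1)
print(f"paperclips paid beyond the 1,000,000 needed: {wasted:,.0f}")  # 300,000
```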
What I'm curious about is: what does this answer translate into in the isomorphic situation?
Edit: Random832 puts together what I think is a better point about the distribution mechanics below.
comment by JGWeissman · 2012-04-12T20:57:03.350Z · LW(p) · GW(p)
This seems more like transparent Newcomb's problem with a chance to precommit, than counterfactual mugging. Even CDT would support the Alphas' policy (or something like it; we could stop requiring payments to Clipmega once the requirement is met, and then require side payments to those who already paid).
Replies from: Clippy
↑ comment by Clippy · 2012-04-12T21:13:58.175Z · LW(p) · GW(p)
This seems more like transparent Newcomb's problem with a chance to precommit, than counterfactual mugging.
Counterfactual mugging is isomorphic to transparent-boxes Newcomb's problem.
Also, this doesn't involve a chance to precommit, but an option to increase the chance that a similarly-situated being will be forced to adhere to a precommitment.
Replies from: JGWeissman
↑ comment by JGWeissman · 2012-04-12T21:39:23.472Z · LW(p) · GW(p)
Counterfactual mugging is isomorphic to transparent-boxes Newcomb's problem.
TDT does not pay in a counterfactual mugging, but it one boxes in transparent Newcomb's problem. These are not isomorphic.
Also, this doesn't involve a chance to precommit, but an option to increase the chance that a similarly-situated being will be forced to adhere to a precommitment.
To eliminate the chance to precommit, the problem should state that Clipmega has already revealed the information based on its prediction. This would introduce the complication that Clipmega's decision is evidence that the Alphas' plan is not necessary to produce the payment. But tabooing "precommit", what I meant is that CDT would support the Alphas' plan if it is introduced before, but not after, Clipmega's decision.
Replies from: Normal_Anomaly
↑ comment by Normal_Anomaly · 2012-05-11T13:41:39.307Z · LW(p) · GW(p)
TDT does not pay in a counterfactual mugging,
Really? I thought it did. Explanation or link?
Replies from: JGWeissman
↑ comment by JGWeissman · 2012-05-11T16:40:58.675Z · LW(p) · GW(p)
At the decision point in a counterfactual mugging, you already know the coin has landed heads. The only consequence of your decision to pay $100 that TDT cares about is that you pay $100. That your choice to pay the $100 counterfactually would have resulted in Omega paying you $10000 if the coin had landed tails doesn't move TDT, because that consequence is counterfactual, that is, it doesn't really happen.
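A minimal expected-value sketch of why the ex-ante and ex-post views come apart here, using the standard $100/$10,000 counterfactual-mugging stakes and (following the phrasing above) treating heads as the branch where you are asked to pay.

```python
# Before the coin flip: committing to pay is worth it in expectation.
p_heads = 0.5
ex_ante_value_of_paying_policy = p_heads * (-100) + (1 - p_heads) * 10_000
print(ex_ante_value_of_paying_policy)  # 4950.0 -> a payer policy looks good ex ante

# After you already know the coin landed heads: the $10,000 branch is
# counterfactual, so the only real consequence of paying is losing $100.
ex_post_value_of_paying = -100
print(ex_post_value_of_paying)         # -100 -> this is the consequence TDT weighs
```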
comment by FAWS · 2012-04-12T20:34:26.698Z · LW(p) · GW(p)
Voted down for neither containing new and interesting ideas nor being funny.
Replies from: roystgnr
↑ comment by roystgnr · 2012-04-13T06:18:30.339Z · LW(p) · GW(p)
IMHO the trouble with this post is that it works either too well or too poorly as an analogy. If you see what the "optional variant" is hinting at, then you're going to have difficulty discussing this abstract version of the problem without falling back on cached beliefs from real problems, at which point we might as well drop the analogy and discuss the real problems. If you don't see what the "optional variant" is referring to, then you're going to have difficulty discussing the abstract version of the problem because the setup sounds too arbitrary and silly.
I'm not downvoting, though; it was a good try. I can't even quite articulate what it is about this problem that comes off as "too silly". The original Newcomb problem, the Prisoners' Dilemma, even the "Clippy" metaphor itself should sound equally silly, yet somehow those analogies come off as more intriguing and elicit more discussion.
Replies from: chaosmosis
↑ comment by chaosmosis · 2012-04-13T16:51:53.341Z · LW(p) · GW(p)
I think it's because it's so transparent that there's not really a problem here. This is framed so that it's obviously a good idea to give Clippy what he wants (yes I'm projecting gender onto the paperclip machine, shush).
In the more traditional setups, it's made explicitly clear that you can lie to Paul Ekman. But laws are harder to break than promises.
As far as the real world important implication goes: does this have something to do with the US healthcare bill?
comment by thelittledoctor · 2012-04-15T20:00:46.371Z · LW(p) · GW(p)
No; instead I will cut a deal with Clipmega for two million paperclips in exchange for eir revealing said information only to me, and exploit that knowledge for an economic gain of, presumably, ~1e24 paperclips. 1e24 is a lot, even of paperclips. 1e6, by contrast, is not.
comment by thomblake · 2012-04-13T15:46:07.482Z · LW(p) · GW(p)
It doesn't matter whether the paperclips are held by the humans or given to Clipmega. Give it a million right away to ramp up the paperclip production as quickly as possible.
In fact, Clipmega probably has better paperclip-storage mechanisms than the humans do, so just give it all the paperclips.
Replies from: falenas108
↑ comment by falenas108 · 2012-04-14T16:40:59.675Z · LW(p) · GW(p)
It may be a bad idea to trust Clipmega with the world's supply of paperclips. Judging by the phrasing of the problem, paperclips are worth a lot in this world, and we don't want to give him that kind of power.