Semi-open thread: blackmail
post by Stuart_Armstrong · 2013-07-15T16:25:20.385Z · LW · GW · Legacy · 31 comments
My blackmail posts have generated some interesting discussion, so I'm just creating this one so that people can post examples of behaviours that they think are either clearly blackmail, or clearly not blackmail, or something in between.
31 comments
Comments sorted by top scores.
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-07-15T17:22:51.906Z · LW(p) · GW(p)
All the blackmail posts should probably have been one thread.
↑ comment by Stuart_Armstrong · 2013-07-15T17:28:44.371Z · LW(p) · GW(p)
I agree, it just sort of accumulated in this way. Can threads be merged?
comment by Discredited · 2013-07-15T21:46:57.873Z · LW(p) · GW(p)
Previous LW discussion here.
comment by Stuart_Armstrong · 2013-07-15T16:31:05.941Z · LW(p) · GW(p)
Consider the following (inspired by a conversation with Xixidu):
Agent A and agent B come to the following agreement: agent A will make a car for agent B for £100. They sign a contract for this. But no contract can be totally exhaustive, so some details are left out. One detail left out is the colour of the car.
Why? Because there are only two colours available, black (which costs £1) and green (which costs £150). Agent B much prefers black to green, so didn't bother to include that in the contract: obviously agent A isn't going to use paint that costs £150 on a car he's selling for less than that!
But A suddenly announces that he will, after all, paint the car green. He credibly commits to painting the car green, at a massive loss - unless agent B gives him an extra £10.
Blackmail or not? And would it be different if the price of green was high, but not so extreme?
↑ comment by TrE · 2013-07-15T17:03:19.911Z · LW(p) · GW(p)
(Please note that in the following, I'm using "blackmail" for all sorts of attempted coercion or extortion)
I'd say this formulation of yours is very useful for drawing some boundaries around the word "blackmail". Namely, the cost to the blackmailer in case the threat fails should be somewhat less than, but at least comparable to, the cost to the blackmailee in case the threat succeeds, or the threat will simply be viewed as stupid (although you could, technically, still call it "blackmail").
Or rather, this assumes a probability of success of 0.5. If that probability (as judged by the blackmailer, and historically checkable) is different, the threat and the costs imposed on both blackmailer and blackmailee have to change accordingly.
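To make that concrete, here is a minimal expected-value sketch. The function and the threshold reasoning are my own framing of the comment above, not TrE's; the figures reuse the £1/£150/£10 numbers from the car example.

```python
# Expected value of issuing a threat, from the blackmailer's point of view,
# assuming the threat is carried out whenever the target refuses to pay.
def threat_expected_value(p_success, gain_if_paid, cost_if_carried_out):
    return p_success * gain_if_paid - (1 - p_success) * cost_if_carried_out

# Car example: A demands an extra £10; carrying out the threat (green paint
# instead of black) costs him £149 extra. At a 50% success rate the threat
# is "stupid"; it only pays if A is almost certain that B will cave.
print(threat_expected_value(0.5, gain_if_paid=10, cost_if_carried_out=149))   # -69.5
print(threat_expected_value(0.99, gain_if_paid=10, cost_if_carried_out=149))  # ~8.4
```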
Whole governments, for example, are known to work on a no-blackmail basis, as officially announced: when hostages are taken, government officials (at least in my country) frequently announce that their country will not let itself be blackmailed. So in order for a threat to succeed, according to my model, the costs to the threatener must be very low (e.g. they have safe refuge or asylum in a third country, and don't have to fear retaliation in case the threat fails and they have to act on it) and the costs to the country or government (incremental over succumbing to the threat) very high (e.g. the leaking of diplomatically relevant documents which can seriously damage international relations).
In your example from above, I'd argue that the threat (with £150 green paint) has a chance of succeeding if and only if:
- (a) The customer has been successfully blackmailed many times before, and this is common knowledge and/or
- (b) The customer really needs a black car because of some obscure reason, and only this car, only in black, will somehow give him huge profits. This is also common knowledge.
The question is whether one can reason about why this is so, using some form of TDT or a derivative thereof.
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2013-07-15T17:21:56.970Z · LW(p) · GW(p)
Blackmail. A wouldn't make the threat if A believed that B ignored such threats, and A would have no motive to paint the car green if B were a rock. Pricing of green makes no difference so long as the price is not negative.
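One way to read that test concretely: a minimal sketch, using the car example's figures (£1 black, £150 green, £10 demand). The payoff bookkeeping is an illustrative assumption of mine, not Eliezer's.

```python
# "Rock test" sketch: does A gain from committing to paint green, given B's policy?
PRICE, BLACK, GREEN, DEMAND = 100, 1, 150, 10

def a_payoff(commits_to_green, b_pays_extra):
    if commits_to_green and not b_pays_extra:
        return PRICE - GREEN                      # forced to carry out the threat
    return PRICE - BLACK + (DEMAND if (commits_to_green and b_pays_extra) else 0)

def a_threatens(b_pays_when_threatened):
    return a_payoff(True, b_pays_when_threatened) > a_payoff(False, False)

print(a_threatens(False))  # False: against a rock / threat-ignorer, A just paints black
print(a_threatens(True))   # True: the threat exists only because B is expected to pay
```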
↑ comment by Stuart_Armstrong · 2013-07-15T17:49:14.733Z · LW(p) · GW(p)
if B were a rock
Defining what counts as a rock is akin to defining what the status quo/disagreement point is. If B were an automated rock that responded to "I'll paint it green" with "please don't, I'll offer £10", what would your feelings be then?
In this example that seems a little gratuitous, but I'm sure we can construct other examples where it's much more natural...
↑ comment by Stuart_Armstrong · 2013-07-15T19:07:17.321Z · LW(p) · GW(p)
Now an example with a rock: http://lesswrong.com/lw/i07/semiopen_thread_blackmail/9dux
↑ comment by twanvl · 2013-07-15T16:48:27.985Z · LW(p) · GW(p)
What is the reason that Agent A asks for an extra £10? Is it because he had a bucket of green paint lying around that is about to go bad, and he doesn't want to spend £9 to drive to the store to buy black paint? Or does he believe he can get away with demanding more since the contract needs to be amended?
↑ comment by wedrifid · 2013-07-15T17:10:35.172Z · LW(p) · GW(p)
What is the reason that Agent A asks for an extra £10? Is it because he had a bucket of green paint lying around that is about to go bad, and he doesn't want to spend £9 to drive to the store to buy black paint? Or does he believe he can get away with demanding more since the contract needs to be amended?
The second. The scenario is contrived so that it contains a credible commitment to self-harm that also harms the other party.
comment by Alejandro1 · 2013-07-15T18:30:36.864Z · LW(p) · GW(p)
It is unclear which of the two partners is in charge of doing the dishes today and which of doing the laundry. Laundry is more work than dishes. B tells A: "Honey, if you do the dishes I will do the laundry". Not blackmail.
It is assumed, based on custom, that B will do the laundry, but it is unclear who does the dishes today. Laundry is more work than dishes. B tells A: "Honey, unless you do the dishes, I won't do the laundry". Blackmail.
This shows that an "assumed status quo" is necessary to define blackmail. I proposed in the other thread a definition inspired by this example.
↑ comment by ThisSpaceAvailable · 2013-07-22T02:31:18.965Z · LW(p) · GW(p)
That's more theft than blackmail. When B refuses to do laundry, B is acting on the intrinsic value of not doing laundry. That's different from the car paint example, since threatening to paint the car green is not based on the intrinsic value of painting the car green.
↑ comment by Stuart_Armstrong · 2013-07-15T18:37:06.696Z · LW(p) · GW(p)
I think I agree with the status quo point.
comment by Stuart_Armstrong · 2013-07-15T19:06:23.560Z · LW(p) · GW(p)
Blackmail with literal rocks!
You and I live in the open desert, and our dear mother is trapped there (assume some mythology). She'll die unless we stay and shade her. We want her to survive, but we want to live our lives as well. You plan to leave for Olympos, where you'll have a really fun time, and will send some remittances and nymphs.
But instead, I leave before you do, safe in the knowledge that you will feel forced to stay behind and shade mother. So mother gets taken care of, and I get to enjoy the Olympos (and I won't bother sending anything). Blackmail?
What if the problem were entirely symmetric, and we both had exactly the same preferences? One of us has to stay, one has to go. Is it blackmail if I go first, and thus force you to stay?
Note that if you were literally a rock, I'd leave joyfully, as you'd be shading mother with no problem. I'm "blackmailing" you to inaction, not to action (which shouldn't really make a difference).
↑ comment by Richard_Kennaway · 2013-07-15T20:34:31.987Z · LW(p) · GW(p)
You and I live in the open desert, and our dear mother is trapped there (assume some mythology).
No mythology necessary, actually. Two daughters and an aging mother. Which of the two gets to marry and move away from home, and which is left behind? My mother, her sister, and their mother. All dead now, so I can't ask any of them if there was any conflict over it.
This is not something I would call blackmail, any more than I would call theft blackmail. Fait accompli: one of the two simply does something to the disadvantage of the other. It may of course be morally wrong, but there are many varieties of wrong, only some of which are blackmail.
But this is just a terminological question. The real question here is presumably: how can a rational agent be designed not to be exploitable by fait accompli?
↑ comment by Stuart_Armstrong · 2013-07-15T21:04:38.716Z · LW(p) · GW(p)
I did the mythology simply so I could make one of them into a literal rock at the end :-)
It may of course be morally wrong, but there are many varieties of wrong, only some of which are blackmail.
It still has some similarities - in that they would only move away because of the expected behaviour of the other...
↑ comment by [deleted] · 2013-07-15T19:48:10.058Z · LW(p) · GW(p)
What if the problem were entirely symmetric, and we both had exactly the same preferences? One of us has to stay, one has to go. Is it blackmail if I go first, and thus force you to stay?
Well, if I have a chance to react to that before your decision is final, I could say that if you do that then I'll go anyway (thus leaving mother to burn). Then it seems to resolve to Chicken. (We both want the other person to back down first, so that we can go all the way to Olympos, but if neither backs down, mother burns in the desert heat.)
And if I don't have a chance to react to that before your decision is final, then it's sort of like "What do you do if you are playing Chicken against someone who has already smashed his steering column with a rock?"
And I think that makes sense, since in both cases (blackmail, Chicken) the goal is to make the person back down because of some sort of threat. Does that help?
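For reference, a minimal Chicken-style payoff sketch of the symmetric scenario; the numbers are illustrative assumptions, chosen only to reproduce the ordering described (best to go while the other stays, worst if both go).

```python
# Chicken payoffs for the symmetric Olympos scenario (illustrative numbers).
payoffs = {                        # (my payoff, your payoff)
    ("go",   "go"):   (-10, -10),  # both leave: mother burns
    ("go",   "stay"): (  5,   1),  # I enjoy Olympos, you shade mother
    ("stay", "go"):   (  1,   5),
    ("stay", "stay"): (  2,   2),  # both stay: mother safe, no Olympos for anyone
}

def my_best_reply(your_move):
    return max(("go", "stay"), key=lambda mine: payoffs[(mine, your_move)][0])

print(my_best_reply("stay"))  # "go"   -> whoever commits first gets Olympos
print(my_best_reply("go"))    # "stay" -> the other is forced to shade mother
```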
↑ comment by Alejandro1 · 2013-07-15T19:37:45.070Z · LW(p) · GW(p)
As I said elsewhere, I think identifying blackmail depends on identifying an agreed status quo state. This is fairly easy in ordinary life (if we had an affair, the ordinary human expectation is that you would keep the letters secret for free, etc.), but more difficult in outré cases like this one and the other one you suggest with the letters in the cave. It is not surprising therefore that in cases like this we do not have clear intuitions.
That said, following the definition I proposed, I'd say that in your first example, if it is a plausible status quo that your brother would go to Olympos (maybe he got a job offer letter from there?), then it is blackmail for you to go instead. If he just happened to come up with the idea of leaving, then my intuition is that this is not a plausible status quo; there is no stability, no ordinary human expectation that you would abide by his decision of leaving you stranded. A more plausible status quo (which applies also to the perfectly symmetric case when both come up with the idea together) would be flipping a coin to decide who leaves. Relative to this, either of you leaving is blackmailing the other.
comment by Tenoke · 2013-07-15T17:32:03.482Z · LW(p) · GW(p)
One of the threads has 1 comment and the other has 25. Why would you think that you need a separate thread just for comments?
↑ comment by Stuart_Armstrong · 2013-07-15T17:34:37.093Z · LW(p) · GW(p)
This wasn't a comment thread - it was supposed to be for people to submit examples of blackmail.
I'm very aware this wasn't the best organisation! I wrote the long post, then realised a summary could work, then thought maybe people would like a separate example thread... :-(
I apologise.
comment by Decius · 2013-07-17T17:15:07.750Z · LW(p) · GW(p)
Blackmail exists when you threaten (explicitly or implicitly) to take a course of action that harms another agent unless that agent takes an action which benefits you. The credibility of your precommitments is not relevant to whether or not an action is blackmail, although it is a factor in how effective your blackmail is.
That the blackmail threat is typically an action which harms the blackmailer is a red herring.
↑ comment by ThisSpaceAvailable · 2013-07-22T02:40:22.164Z · LW(p) · GW(p)
What does "an action that harms another agent" mean? For instance, if I threaten to not give you a chicken unless you give me $5, does "I don't give you a chicken" count as "a course of action that harms another agent"? Or does it have to be an active course, rather than act of omission?
Is it still blackmail if it's "justified"? For instance, if you steal my car, and I threaten to call the police if you don't give it back, is that blackmail?
↑ comment by APMason · 2013-07-29T22:10:15.746Z · LW(p) · GW(p)
What does "an action that harms another agent" mean? For instance, if I threaten to not give you a chicken unless you give me $5, does "I don't give you a chicken" count as "a course of action that harms another agent"? Or does it have to be an active course, rather than act of omission?
It's not blackmail unless, given that I don't give you $5, you would be worse off, CDT-wise, not giving me the chicken than giving me the chicken. Which is to say, you really want to give me the chicken, but you're threatening to withhold it because you think you can make $5 out of it. If I were a Don't-give-$5-bot, or just broke, you would have no reason to threaten to withhold the chicken. If you don't want to give me the chicken, but are willing to do so if I give you $5, that's just normal trade.
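A minimal sketch of that test; the function and payoff numbers are my own illustration, with APMason's criterion being just the comparison inside it.

```python
# CDT-style blackmail test: holding fixed that the $5 is NOT paid, is carrying
# out the threat worse for the threatener than backing down? Numbers illustrative.
def is_blackmail(payoff_if_threat_carried_out, payoff_if_backed_down):
    return payoff_if_threat_carried_out < payoff_if_backed_down

# Seller who would really rather hand over the chicken anyway (keeping it costs him):
print(is_blackmail(payoff_if_threat_carried_out=-2, payoff_if_backed_down=0))  # True: blackmail
# Seller who genuinely prefers keeping the chicken unless compensated:
print(is_blackmail(payoff_if_threat_carried_out=0, payoff_if_backed_down=-3))  # False: normal trade
```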
↑ comment by Stuart_Armstrong · 2013-07-17T19:02:50.733Z · LW(p) · GW(p)
Harming the blackmailer is simply a proxy for "this isn't a course of action the blackmailer would otherwise engage in; they do it only for its blackmail effect".
↑ comment by Alejandro1 · 2013-07-17T19:55:53.271Z · LW(p) · GW(p)
I would say that if Baron truthfully tells Countess "I have been offered $100,000 for your letters by the Daily Tabloid, but in honor of our friendship I will give you the chance (which I hope you reject) to purchase them instead for only $50,000", this still counts as blackmail, as a matter of ordinary usage. So I agree with Decius in this respect.
For game-theoretic purposes, however, it might be worthwhile to restrict the definition the way you do, since it calls for a different response strategy: if Baronet also has letters incriminating Countess, but unlike Baron he has to pay to publish them instead of being paid, then it makes sense for Countess to credibly precommit to rejecting any offers from Baronet, but not from Baron.
↑ comment by Decius · 2013-07-18T17:07:16.805Z · LW(p) · GW(p)
If the cost to Baron to publish the letters is x, and the blackmail payment is y, precommit to reject any offers with probability greater than 1 - (x/y). That shifts Baron's expected value to negative (unless x = 0), and increases the expected value for Countess by less than (1 - (x/y)) * (z - y), where z is the loss of value to Countess of having the letters published.
That strategy deters every Baron who is sophisticated enough to be deterred by a full precommitment, and does better against Barons who proceed regardless.
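These expressions can be checked numerically with a short sketch. It assumes one particular reading of the comment (Baron incurs the publishing cost x only when the offer is actually rejected, and always follows through then); under that reading the break-even rejection probability works out to y/(x+y) rather than 1 - (x/y), so take the numbers as a way to explore the model, not as the definitive version. The concrete x, y, z values are invented.

```python
# Baron's and Countess's expected values when Countess rejects with probability p.
# Reading assumed: Baron publishes (cost x) exactly when rejected; y = payment
# demanded; z = Countess's loss if the letters are published. Values invented.
def baron_ev(p_reject, x, y):
    return (1 - p_reject) * y - p_reject * x

def countess_ev(p_reject, y, z):
    return (1 - p_reject) * (-y) + p_reject * (-z)

x, y, z = 10_000, 50_000, 120_000
p = 1 - x / y                       # the comment's threshold: 0.8 here
print(baron_ev(p, x, y))            # ~2000, i.e. x**2/y under this reading
print(baron_ev(y / (x + y), x, y))  # ~0: break-even rejection probability in this model
print(countess_ev(p, y, z))         # -106000.0: her expected loss if threatened anyway
```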
comment by Stuart_Armstrong · 2013-07-15T18:45:36.164Z · LW(p) · GW(p)
Another blackmail attempt:
I own steamy and revealing letters about our affair. I've boarded them up in a cave. Then the entrance to the cave starts to crumble: soon it will fall open, and the letters will blow out, for all the world to read.
I own the land the cave is on, and refuse to let anyone fix the cave, unless you pay me. Blackmail?
What if the only reason the cave entrance was crumbling was that I diverted a river? I didn't do this with the intent of causing the entrance to crumble (I have other reasons), but I was fully aware of this side effect, and let it happen. Blackmail?
What if I approached you before I diverted the river, and asked for money then? I still want to divert the river, but if you pay up enough, I will desist from it. Blackmail?
↑ comment by wedrifid · 2013-07-16T10:03:20.405Z · LW(p) · GW(p)
I own the land the cave is on, and refuse to let anyone fix the cave, unless you pay me. Blackmail?
Yes.
What if the only reason the cave entrance was crumbling was that I diverted a river? I didn't do this with the intent of causing the entrance to crumble (I have other reasons), but I was fully aware of this side effect, and let it happen. Blackmail?
No. Assuming that protecting the cave is free (because someone else wants to come and do it), the diverting of the river is not itself blackmail. Subsequent blackmail is still blackmail. It becomes close to blackmail if the cost to repair the cave is greater than the value of the river diversion. Depending on the various value instantiations possible, this example does a good job of exemplifying the blurry 'blackmail vs gains from trade' boundary.
What if I approached you before I diverted the river, and asked for money then? I still want to divert the river, but if you pay up enough, I will desist from it. Blackmail?
No; assuming the right values for the various outcomes, this is exactly what "split the gains from trade equally" looks like. The inconvenience of keeping the secret is divided evenly. Another excellent example, because to most humans this will feel like extortion, and if the numbers are tweaked a little it could be extortion by my standards too.
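For concreteness, a minimal sketch of what an equal split of the gains from trade looks like in the river case; all the valuations below are invented for illustration.

```python
# Equal split of the surplus in the "pay me not to divert the river" deal.
V_DIVERT = 200     # what diverting the river is worth to the letter-owner
V_SECRET = 1000    # what keeping the cave (and the secret) intact is worth to you

# Disagreement point: the river gets diverted and the letters leak.
surplus = V_SECRET - V_DIVERT          # total gains from trade if a deal is struck
payment = V_DIVERT + surplus / 2       # payment that splits the surplus evenly

print(payment)             # 600.0: the diverter forgoes 200 and nets +400
print(V_SECRET - payment)  # 400.0: you pay 600 to avoid a 1000 loss, also +400
```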