A Possible Solution to Parfit's Hitchhiker

post by Dorikka · 2011-01-28T19:21:15.400Z · 5 comments


I had what seemed to me to be a bit of insight regarding trade between selfish agents. I should disclose that I have not read TDT or any books on decision theory, so what I say may be blatantly incorrect. However, I judged that posting this here was of higher utility than waiting until I had read up on decision theory -- which I have no intention of doing any time soon, because I have more important (to me) things to do. This is not meant to deter criticism of the post itself -- please tell me why I'm wrong if I am. The following paragraph is primarily an introduction.

When a rational agent predicts that he is interacting with another rational agent who has a motive for deceiving him (and both have a large amount of computing power), he will not use any emotional basis for 'trust.' Instead, he will treat the other agent's commitments as truth claims whose truth or falsity depends on which action will optimize the other agent's utility function at the time the commitment is to be fulfilled. Agents that know something of each other's utility functions may bargain directly on such terms, even when each of their utility functions is largely (or completely) dominated by selfishness.
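To make this concrete, here is a minimal sketch (Python; the function name and payoff numbers are purely illustrative assumptions, not part of the original problem) of treating a commitment as a truth claim about the other agent's incentives at fulfillment time:

```python
def commitment_credible(payoff_if_kept: float, payoff_if_broken: float) -> bool:
    """A selfish agent's promise is believed only if keeping it will
    optimize that agent's own utility at the time of fulfillment."""
    return payoff_if_kept >= payoff_if_broken

# In the bare Parfit's hitchhiker, a promise to "pay once we reach town"
# fails this test: once in town, paying costs money and gains nothing.
print(commitment_credible(payoff_if_kept=-100, payoff_if_broken=0))  # False
```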

This leads to a solution to Parfit's hitchhiker, allowing selfish agents to precommit to future trade. Give Ekman all of your clothes and state that you will buy them back when you arrive in town, at a price higher than the clothes' worth to him but lower than their worth to you. Furthermore, tell him that because you have nothing else on you, he cannot extract any more money from you than an amount infinitesimally smaller than what your clothes are worth to you, and accurately tell him how much that is (you must tell the truth here because of his microexpression-reading ability). He should judge your words as true, since they are. Of course, you lose regardless if the value of your clothes to you is less than the utility he loses by taking you to town.

Assumptions made regarding Parfit's hitchhiker:

1. Physical assault is judged to be of very low utility by both agents, and so isn't a factor in the problem.
2. Trades in the present time may be executed without prompting an infinite cycle of "No, you give me X first."
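Here is a minimal sketch of the buy-back trade under these assumptions (Python; the value and price figures are illustrative assumptions, not part of the original problem):

```python
def deal_works(clothes_value_to_driver: float,
               clothes_value_to_you: float,
               drive_cost: float,
               price: float) -> bool:
    """True if every step of the clothes-hostage trade is individually rational.

    The driver holds your clothes during the trip; in town you buy them
    back at `price`.
    """
    # In town, you prefer paying the price to losing your clothes:
    you_pay = price < clothes_value_to_you
    # The driver prefers selling the clothes back to keeping them:
    driver_sells = price > clothes_value_to_driver
    # Anticipating that sale, the driver's net gain covers the trip:
    driver_drives = price - drive_cost > 0
    return you_pay and driver_sells and driver_drives

# Illustrative numbers: clothes worth 10 to him, 60 to you; trip costs him 40.
print(deal_works(10, 60, drive_cost=40, price=50))  # True: 10 < 50 < 60, and 50 > 40
# The failure case from the post: the clothes are worth less to you
# than the trip costs him, so no credible price exists.
print(deal_works(10, 30, drive_cost=40, price=25))  # False: 25 - 40 < 0
```

Note how the failure case matches the caveat above: the most you can credibly promise is just under what the clothes are worth to you, and if that doesn't cover his cost of the trip, no mutually beneficial price exists.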

5 comments


comment by Vladimir_Nesov · 2011-01-28T19:41:50.023Z

Assume you're clothed in rags, and they aren't worth anything to anyone.

Replies from: SilasBarta
comment by SilasBarta · 2011-01-30T17:08:57.987Z

And (per my least-convenient standard) that you can't communicate except for knowledge of your conditional behaviors.

comment by Jack · 2011-01-28T20:03:26.904Z

This is a solution for a large set of problems that might be said to resemble Parfit's Hitchhiker. But the real problem involves cases where you have nothing for Ekman to hold hostage -- and this doesn't solve those cases. Whether the original Parfit's Hitchhiker is of the first or the second set doesn't really matter.

I do remember having teachers who demanded temporary possession of students' shoes in exchange for the loan of a pencil. This is a useful bargaining technique to keep in mind.

comment by Oscar_Cunningham · 2011-01-28T19:53:29.859Z

Of course, you lose regardless if the value of your clothes to you is less than the utility he loses by taking you to town.

Unless, of course, you just pay him when you get to town.

Replies from: Broggly
comment by Broggly · 2011-01-29T15:12:20.282Z

I think that's the point of Parfit's Hitchhiker: being a jerk and breaking deals because you can isn't really that rational.