On Robin Hanson’s Board Game
post by Zvi · 2018-09-08T17:10:00.263Z · LW · GW · 15 comments
Previously: You Play to Win the Game, Prediction Markets: When Do They Work?, Subsidizing Prediction Markets
An Analysis Of (Robin Hanson at Overcoming Bias): My Market Board Game
Robin Hanson’s board game proposal has a lot of interesting things going on. Some of them are related to calibration, updating and the price discovery inherent in prediction markets. Others are far more related to the fact that this is a game. You Play to Win the Game.
Rules Summary
Dollars are represented by poker chips.
Media that contains an unknown outcome, such as that of a murder mystery, is selected, and suspects are picked. Players are given $200 each. At any time, players can exchange $100 for a contract in all possible suspects (one of which will pay $100, the rest of which will pay nothing).
A market is created for each suspect, with steps at 5, 10, 15, 20, 25, 30, 40, 50, 60 and 80 percent. At any time, each step in the market either contains dollars equal to its probability, or it holds a contract good for $100 if that suspect is guilty. At any time, any player can exchange one for the other: if a step holds a contract, they can buy it for the listed price; if it holds chips, they can exchange a contract for those chips. Whoever physically makes the exchange first wins the trade.
At the end of the game, the winning contract pays out, and the player with the most dollars wins the game.
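To make the mechanism concrete, here is a minimal sketch of a single suspect's ladder, as I read the rules. The representation and names are mine, not part of Robin's rules, and which steps begin holding contracts versus chips is a setup choice I've left as a parameter.

```python
# A minimal sketch of one suspect's market ladder, as I read the rules.
# Each step holds either chips equal to its price or a contract paying $100
# if that suspect is guilty; any player may swap one for the other at any time.

STEPS = [5, 10, 15, 20, 25, 30, 40, 50, 60, 80]

class Ladder:
    def __init__(self, steps_with_contracts):
        # Which steps start with a contract (versus chips) is a setup choice.
        self.slot = {p: ("contract" if p in steps_with_contracts else "chips")
                     for p in STEPS}

    def buy(self, price):
        """Pay `price` chips into the step and take the contract sitting there."""
        assert self.slot[price] == "contract", "no contract at this step"
        self.slot[price] = "chips"
        return "one contract"

    def sell(self, price):
        """Leave a contract in the step and take the `price` chips sitting there."""
        assert self.slot[price] == "chips", "no chips at this step"
        self.slot[price] = "contract"
        return price

# With four equally likely suspects, one plausible start: contracts sit at 25
# and above, chips below. The cheapest available buy is then $25.
alice = Ladder(steps_with_contracts={25, 30, 40, 50, 60, 80})
assert alice.sell(20) == 20                # sell a contract for the $20 at that step
assert alice.buy(25) == "one contract"     # buy the cheapest available contract
```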
Stages of Play
We can divide playing Robin’s game into four distinct stages.
In stage one, Setup, the source material we’ll be betting on is selected, and the suspects are generated.
In stage two, the Early Game, players react to incremental information and try to improve their equity, while keeping an eye out for control of various suspects.
In stage three, the Late Game, players commit to which suspects they can win with and lock them up, selling off anything that can’t help them win.
In stage four, Resolution, players again scramble to dump now-worthless contracts for whatever they can get and to buy up the last of the winning contracts. Then they see who won.
Setup
Not all mysteries will be good source material. Nor do you obviously want a ‘certified good’ source, because knowing that the source material makes for a good game is itself a huge update.
A proper multiple-suspects who-done-it that keeps everyone in suspense by design keeps the scales well-balanced, ensuring that early resolutions are fake outs. That can still make a good game, but an even more interesting game carries at least some risk that suspects will be definitively eliminated early, or even the case solved quickly. Comedy routines sometimes refer to the issue where they arrest someone on Law & Order too early in the episode, so you know they didn’t do it!
When watching sports, a similar dilemma arises. If you watch ‘classic games’ or otherwise ensure the games will be good, then the first half or more of the game is not exciting. Doing well early means the other team will catch up. So you want to choose games likely to be good, but not filter out bad games too aggressively, and learn to enjoy the occasional demolition.
The setup is also a giveaway, if it was selected by someone with knowledge of the material. At a minimum, it tells us that the list is reasonably complete. We can certainly introduce false suspects that should rightfully trade near zero from the start, to mix things up, and likely should do so.
One solution would be to have an unknown list of contracts at the start, and introduce the names as you go along. This would also potentially help with preventing a rush of trades at the very start.
In this model, you can always exchange $100 for a contract on each existing suspect, and a contract for ‘Someone You Least Suspect!’ Then, when a new suspect is introduced, everyone with a ‘Someone You Least Suspect!’ contract gets a contract in the new suspect for free for each such contract they hold. There are several choices for how one might introduce new suspects. They might unlock at fixed times, or players could be allowed to introduce them by buying a contract.
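As a sketch of the bookkeeping this variant needs (the data layout and names are mine, and I'm assuming the wildcard is kept after it pays out a new contract, so a full set always still pays exactly $100):

```python
# Sketch of the hidden-suspect variant's bookkeeping. Holdings are contract
# counts per player; the layout and names are mine, not part of the proposal.
from collections import defaultdict

WILDCARD = "Someone You Least Suspect!"

def introduce_suspect(holdings, new_suspect):
    """When a new suspect is revealed, every wildcard contract a player holds
    spawns a free contract in that suspect. The wildcard itself is retained
    (my assumption), so it keeps covering suspects revealed even later."""
    for counts in holdings.values():
        counts[new_suspect] += counts[WILDCARD]

# Example: a player holding two wildcards receives two free contracts in the
# newly introduced suspect.
holdings = {"Player 1": defaultdict(int, {WILDCARD: 2})}
introduce_suspect(holdings, "The Gardener")
assert holdings["Player 1"]["The Gardener"] == 2
```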
The complexity cost of hiding the suspects, or letting them be determined by the players, seems too high for the default version. It protects the fun of the movie and has some nice properties, but for the base game you almost certainly want to lay out the suspects at the start. This gives a lot away, but that’s also part of the game.
For the first few games played, it probably makes sense to choose mysteries ‘known to be good’ such as a classic Agatha Christie.
The game would presumably come with a website that allowed you to input a movie, show or other media, and output a list of suspects. It would also want to advise players on whether their selection was a good choice, or suggest good choices based on selected criteria. Both will need to be balanced to avoid giving too much away, as noted above; I’ll talk more about the general version of this problem another time.
If you are in charge of setup, I would encourage including at least one suspect that obviously did not do it, in a way that is easy to recognize early. This prevents players from assuming that all suspects will remain in play the whole time, and rewards those paying attention early. Keep people on their toes.
The Early Game
The market maker is intentionally dumb, although in default mode they are smart enough to know who the suspects are. All suspects start out equal.
There are a bunch of good heuristics, many of which should be intuitive to many viewers of mysteries, that create strong trading opportunities right away. To state the most basic, the earlier a suspect first appears on the screen, the more likely they are to have done the deed. So the moment one of the suspects appears – ‘That’s Bob!’ – everyone should rush to buy Bob, and perhaps sell everyone else if trading costs make that a good idea. How far up to buy him, or sell others, is an open question.
That will be the first of many times when there will be an ‘obvious’ update. There will also be non-obvious updates. Staying physically close to the board, chips and/or contracts ready to go, is key to make sure you get the trade first. This implies that making a race depend on the physical exchange of items might be a problem. Letting it be verbal (e.g. whoever first says ‘I buy Bob’) prevents that issue, but risks ambiguity.
What characterizes the early game, as opposed to the late game, is that the focus is on ‘make good trades’ rather than on winning. There’s no reason to worry too much about who owns how many of each contract, unless someone is invested heavily in one particular suspect. We can think of that as a player choosing to enter the endgame early.
Attention
Robin notes an interesting phenomenon, that players got caught up in the day trading and neglected to watch the mystery. Where should the smart player direct the bulk of their attention?
That depends upon your model of murder mysteries.
One model says that murder mysteries are ‘fair’. Clues are introduced. If you pay attention to those clues, you can figure out who did it before the detective does. When the detective solves the mystery, you can verify that the solution is correct once you hear their logic. If you can solve the mystery first, you can sell every worthless contract and buy all the worthwhile contracts. Ideally, that should be good enough to win the game, especially if you execute properly, selling and buying in balance without giving away that you believe you’ve solved the mystery.
Another related model says that murder mysteries follow the rules of murder mysteries, and that this often is good enough to narrow down or identify the killer. That way-too-famous-for-his-role actor is obviously the killer. Another would-be suspect was introduced at the wrong time, so she’s out. A third could easily have done it, but that wouldn’t work with the thematic elements on display.
A third model says that the detective, or others in the movie, have a certain credibility. Thus, when Sherlock Holmes says that Bob is innocent, that is that. Bob is innocent. You don’t need to know why. Evidence otherwise might not mean much, but there’s someone you can trust.
Functionally, these three are identical once you know what factors you’re reacting to. They say that (some kinds of) evidence count as evidence, and resulting updates are real. The more you believe this, the more you should pay attention to the movie. This includes trying early on to figure out what type of movie this is. Be Genre Savvy! Until you know what rules apply, don’t worry too much about day trading, unless people are going nuts.
A fourth model says that the mystery was chosen for a reason, and written to keep up suspense, so nothing you learn matters much beyond establishing who the suspects are. The game already did that for you. Unless the game followed my advice and included an obviously fake suspect or two, to punish players who only look at trading.
If you believe this model, and don’t think there is real news or that there are fake suspects (or think any fakes will be obvious enough that you’ll know anyway, if only from how others talk and act), then you won’t put as much value on watching the movie. If trading has good action, trading might be a better bet.
A fifth model that can overlap with the previous models says that others will watch the movie and process that information, so there’s no need to watch yourself if there are enough other players. You might think that there is then a momentum effect, where players are unwilling to trade aggressively enough on new information. Or you might think that players overreact to new information, especially if you’re a fourth-model (nothing matters, eat at Arby’s) kind of theorist.
If you feel others can be relied on to react to news, you might trade on news even if you don’t think it matters, because others will trade after you, and you can then cash out at a quick profit. Just like in the real markets.
Or you might concentrate on arbitrage. Robin observed that players would focus on buying good suspects rather than selling poor suspects, and this often resulted in probabilities that summed to more than 100%. This offers the chance for risk-free profit, plus the chance to place whatever bet you like best while cashing in.
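As a quick worked example with made-up numbers: if the best available sell prices across the suspects add up to more than $100, you can buy a set from the bank and immediately unload it at a riskless profit, assuming you physically reach those steps before anyone else.

```python
# Toy arbitrage check with invented prices: the best chip-holding step on each
# suspect's ladder, i.e. the best price you can sell a contract for right now.
best_sell_price = {"Alice": 40, "Bob": 30, "Carol": 20, "Dave": 20}

set_cost = 100                            # $100 buys one contract per suspect
proceeds = sum(best_sell_price.values())  # 110 chips from selling the set off
print(proceeds - set_cost)                # 10 chips of profit, whoever is guilty
```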
In my mind, the question boils down to where the game will be won and lost. Is there enough profit in day trading to beat the person who placed the largest bet on the guilty party? What happens in the endgame?
The End Game
A player enters the endgame when they attempt to ensure that they win if a particular suspect is guilty.
This is not as difficult as it looks, and could be quite difficult to fight against. Suppose I want to bet on Alice being the culprit. I could sell all other suspects and buy her contracts. As a toy example, let’s say there are four suspects, that I’ve exchanged my $200 for two full sets (two contracts in each suspect), and that I decide to butcher my executions. I sell both contracts in each of the other suspects, one for $20 and one for $15, and buy three more Alice contracts for $25, $30 and $40.
If the game ends and Alice is guilty, I made $105 selling worthless contracts and $205 buying Alice contracts, and my two original Alice contracts pay back what the sets cost, so I end with $510, a net profit of $310. If she’s innocent, I collect nothing: I paid $95 for the extra Alice contracts and made $105 selling the other contracts, so I’m left with $10 of my initial $200 and die essentially broke. That’s really, really terrible odds if I chose Alice at random!
But if it’s a 10 person game and I do that, even if I chose at random, 25% of the time Alice is guilty. Can someone else make more than $310 to beat me?
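Writing that toy example out explicitly (the numbers are the ones above, and I'm assuming I start by converting my $200 into two full sets):

```python
# Checking the arithmetic of the all-in-on-Alice toy example above.
cash = 200
cash -= 2 * 100                   # buy two full sets: 2 contracts in each of 4 suspects
alice_contracts = 2

cash += 3 * (20 + 15)             # sell both contracts in each of the 3 other suspects
cash -= 25 + 30 + 40              # buy three more Alice contracts off the ladder
alice_contracts += 3

print(cash)                              # $10 left over
print(cash + alice_contracts * 100)      # $510 if Alice is guilty: a $310 profit
print(cash)                              # $10 if she's innocent: essentially broke
```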
If after I finish, others return all the prices to normal, then someone else could profit from my initial haste, then execute the same trades I did at better prices. If that happens, I’m shut out.
That works if you jump the gun, and enter the endgame too early. That’s true even if Alice is the most likely suspect.
In particular, others now need to make a choice. Let’s say I went all in on Alice. There are three basic ways to respond:
- Abandon Alice. If Alice is guilty, you’ve lost. So it’s safe to assume Alice is innocent, and sell any Alice contracts at their new higher prices, especially once you’re broke and no longer can buy any more. If this encourages someone else to take option 2 and also move in on Alice, even better, that’s one less person who can beat you if Alice is innocent. The majority of players should do this.
- Attack Alice. If others are abandoning Alice in droves, her price might collapse to well below $25 as people rush to sell. You can then pick contracts up cheap, sell other contracts at better prices, and end up with a strictly better position.
- Arbitrage. Try to make as much money off the situation as possible, without committing to a direction. If people are being ‘too strategic’ and too eager to get where they want to go, rather than focusing on getting the best price, then by making good trades (sell Alice when someone buys her up too quickly, buy when others dump too quickly) and forcing others to get worse prices, you can end up with more value, then decide later what to do.
If you only engage in arbitrage, and others commit to suspects, you’ll be in a lot of trouble unless you’ve already made a ton, because you won’t have anyone to trade against. Your only option becomes to trade with the market, which limits how much you can get on the suspect you finally decide to go with, even if the mystery is solved while the market is open, and you’re the only one left with cash.
The good and bad news is that’s unlikely to happen, as others will also ‘stick around’ with flexible portfolios. That means that you won’t be able to make that much when the mystery gets solved, but it does mean you can divide the spoils. If six players commit to suspects while four make good trades, not only are two of the six already shut out, it’s likely the remaining four can coordinate (or just do sensible trades) to win if two or three of the suspects are found guilty, and sometimes should be able to nail all four.
When you have four (or six) suspects and ten players, there are not enough suspects for everyone to own one, and there certainly aren’t enough for anyone to own two. That means that even if a suspect looks likely to be guilty, if you know you can’t win that scenario, you’ll be dumping, and that means at least seven of ten people are dumping any given suspect if they understand the game.
The logical response to this is to stay far enough ahead on your suspect that you clearly win if they’re guilty, and if you get a good early opportunity to dump other contracts you should definitely do that. Good trades are generally good, and those trades just got even better, especially if everyone focuses on buying rather than selling. What you don’t want to do is overpay, or run out of cash (and/or run out of things you can sell).
Thus, I might buy the Alice $20, $25 and $30 contracts, and start selling contracts on suspects I think are trading rich. What I’m worried about is competition – I don’t want other players buying Alice contracts, so if they do, I’ll make sure they don’t get size off by buying at least at their price, and I’ll make sure to stay ahead of them on size. I’ll also think about whether the remaining players are sophisticated enough to sell what they have, even at lousy prices; if they are, I’ll be careful to hold a bunch of cash in reserve. If there are ten players, I can expect 16-25 Alice contracts to exist, and I want to be sure not to run out of money.
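Here is a sketch of the ‘control’ check I keep running in my head throughout the endgame: given everyone's cash and holdings, who ends up with the most chips if a given suspect turns out to be guilty? The data layout and the example numbers are mine.

```python
# Sketch of the endgame "control" calculation: who wins if suspect X did it?
def final_chips(player, guilty_suspect):
    return player["cash"] + 100 * player["contracts"].get(guilty_suspect, 0)

def winner_if_guilty(players, guilty_suspect):
    return max(players, key=lambda name: final_chips(players[name], guilty_suspect))

players = {
    "Me":    {"cash": 10,  "contracts": {"Alice": 5}},
    "Bob":   {"cash": 120, "contracts": {"Carol": 3, "Alice": 1}},
    "Carol": {"cash": 60,  "contracts": {"Dave": 4}},
}

# I "control" Alice if I win whenever she is guilty: here my $510 beats Bob's $220.
print(winner_if_guilty(players, "Alice"))   # Me
print(winner_if_guilty(players, "Carol"))   # Bob
```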
Rank Ordering
This suggests each player has a few different goals.
You want to accumulate contracts in suspects you ‘like’ (which mostly means the ones you think are good bets), so you can get ‘control’ of one or more of them. Control means that if they did it, you win.
You want to get rid of contracts in the suspects you don’t like. The trick here is that sometimes the price will go super high (relative to the probability they did it) as multiple players compete to gain ‘control’ of the suspect. Other times, the price will collapse because there is only one bidder for control of that suspect. If one player gets a bunch of contracts, and is in good overall shape, then no one else will compete.
That in turn might drive the price so low – $10 or even $5 – that the value of their portfolio shrinks a lot, tempting another player to enter, but doing so would drive the price up right away, so it often doesn’t make sense to compete. If Bob is buying up Alice contracts and Carol now buys one at $10, who is going to sell one now? Much better to wait to see if the price goes higher, which in turn puts Bob back in control. The flip side of that is, if Carol can buy a $10 and a $15 contract, and force Bob to then pay $20, Carol can sell back to Bob at a profit. It’s a risky bluff if others are actively selling, but it can definitely pay off.
The key in these fights is who has more overall portfolio value, plus the transaction costs of moving into more contracts. If Carol can make $100 trading back and forth in other contracts, Bob is going to have a tough time keeping control, and mostly has to hope that Carol chooses to go after a different suspect. By being in as good shape as possible, Bob both is more likely to win the fight, and (if others realize this) more likely to avoid the fight.
With a lot of players engaged in active day trading who aren’t strategically focused, transaction costs could be low. If they’re sufficiently low, then it could be a long time before it is hard to buy and sell what you want at a reasonable price, postponing the end game until quite late. The more other players are strategically focused, and the more strategy determines price, the harder it is to trade, the more existing positioning matters, and the less you can day trade for profit beyond anticipating a fight over a suspect, or a dump of one.
Rich Player, Poor Player
Suppose you’re a poor player. You made some trades, and they didn’t work out. Perhaps you held on to a suspect or two too long, and others dumped them, either strategically or for value. Perhaps you had a hunch, got overexcited, and others disagreed, and now you’re looking foolish. Now you only have (let’s say) $120 in equity, down from $200.
You Play to Win the Game. How do you do that? There are more players than suspects, several of whom have double your stake. So you’ll need to find a good gambit.
A basic gambit would be to buy up all the contracts you can of a suspect everyone has dismissed. Even if there are very good reasons they seem impossible and are trading at $5, you can still get enough of them to win if it works out, and you might have no competition since players in better shape have juicier targets. Slim chance beats none.
But if even that’s out of the question, you’ll have to rebuild your position. You will need to speculate via day trading. Any other play locks in a loss. You find yourself in a hole, and have no choice but to keep digging. Small arbitrage won’t work. Your best bet is likely to watch the screen and listen, and try to react faster than everyone else in the hopes that the latest development is huge or seen as huge, then turn around and sell your new position to others to make a quick buck. Then hope there’s still enough twists to do this again.
If the endgame has arrived, and rich players are sitting on or fighting for all the suspects, you’ve lost. Your best bet is to consolidate into cash, and hope some suspect crashes down to $5 for you.
Now suppose you’re a rich player. You have $300 in equity. How do you maximize your chance of winning?
The basic play is to corner the market on the most likely suspect, or whoever you think is most likely. If you make a strong move in, you should be able to scare off competition, and even if you don’t do so, you can use that as an opportunity to make more profit if they drive the price up. At some point, others will have to dump, and you can afford to give them a good price if you have to. It’s hard to win a fight when outgunned. The key is not to engage too much too soon, as this risks letting a third player take advantage of an asset dump later. So you’ll want to hold some cash for that, if possible. Remember that you’ll need something like 12-14 contracts to feel safe from a dump, depending on how much equity you’ve built, if you’re out of cash. That shuts out other players.
The advanced play, if you’re sufficiently far ahead, is to try to win on multiple suspects. That’s super hard. Even if you had $400 in equity, if you divide it in half, there are still multiple other players over $200. It seems unlikely you can get control of multiple worthwhile suspects. There’s no point in trying for multiple bargain basement suspects at the expense of one good one, even if it works. So is there any hope here?
I think there is, in the scenario where there is a clear prime suspect.
In this scenario, the prime suspect was bid high early on. Given Robin’s notes about player behavior and tendency to push prices too high, and the battle for control of the suspect, prices might get very high very quickly. There also may be players who will refuse to sell their contracts in the prime suspect, because they don’t realize that they’re shut out of winning in that case. Either they’re maximizing expected value rather than chance of winning, or they don’t realize the problem, or both.
This could open up an opportunity where the ‘net profit’ on the prime suspect isn’t that high for any player. Suppose they start at $25, and everyone starts with their two contracts. They then trade at $30, $40, $50 and $60 in a row, not all to the same player. So there are few chances to buy contracts that make you much money. If you buy an $80 contract your maximum profit is $20, which is easy to beat by day trading.
So what you can do is go for the block. Hopefully you helped drive the price up early, which is part of how you got your equity. Then contracts only really traded at $60 and $80. So even if the suspect is guilty, someone who moved in on this without day trading first is not going to end up with many contracts. They start with $200, so let’s say they end up with 3 contracts and a little cash.
It’s not crazy for you to sell the suspect at the top, do some successful day trading, and then have over $300 in cash. You could win without any contracts that pay out, if you know you’re the most successful day trader and no one can have that many contracts.
That’s a better position than having 5 or 6 contracts in the prime suspect, since you still have cash if they’re innocent. The trick is then having that be enough to win on another suspect as well, or splitting your efforts by holding onto contracts elsewhere. Tricky. But perhaps not impossible, especially if people are dumping contracts at $5.
At a minimum, what you can do is be in a strong position to respond to new developments, and be able to choose which other suspect to back later in the game if you now think they’re more likely, while still winning if the situation doesn’t change. That’s very strong.
A final note is that it is legal, in the game, to trade with another player without going through the market. This could be used to buy out a player’s position in a suspect, shifting control of that suspect. It avoids the issue where the price collapses once a player starts dumping a position, and it prevents other players from ‘intercepting’ the transfer and ruining the buyer’s attempt to accumulate a new position and take control. Thus, players should learn that if they have a bunch of contracts and want out, they should check for a bulk buyer, and if they want in they should consider doing the same. The risk of course is that you tip your hand, which makes doing it on the buy side less attractive.
Flexible Structures
It’s also worth noting that you can extend the idea easily to other prediction markets, and to an online version.
You could trade on the outcome of a sporting event, or an election, or any other real-world prediction market, using the same rules. You could play a board game, and also play the contract game on the outcome of your board game. That gives players something to do between turns and extra things to think about, and gives extra players or eliminated players something to do.
You could trade over a series of outcomes or events (for example, all the football games played today, or both the winner of the game and the combined number of points scored, or even obscure stuff like the number of punts) in order to reward more trading ‘for value’ and place less emphasis on being right. Or just keep track of funds between games, watch multiple shows, and reward the overall winner.
That raises the question of what we can learn about prediction markets from the game.
Market Accuracy and Implications
Early in the game, market prices should roughly reflect fair probabilities of being guilty. Anyone who jumps the gun for strategic positioning will lose out to a more patient player. That won’t stop players from being overeager, and bidding suspects up too high, but as Robin noted that opens the door for others to do arbitrage and sell the contracts back down to reasonable prices.
Later in the game, prices will grow increasingly inaccurate as players jockey for position, and let strategic considerations override equity considerations.
This is a phenomenon we see in many strategic games. Early in the game, players mostly are concerned about getting good value, as most assets are vaguely fungible and will at some point be useful or necessary. As the game progresses, players develop specialized strategies, and start to do whatever they need to do to get what they need, often at terrible exchange rates, and dump or ignore things they don’t need, also at terrible exchange rates.
If we wanted to improve accuracy, we’d need to make the game less strategic and more tactical, by rewarding players who maximize expected profits. There’s a dumb market that is handing out Free Money when news occurs. We’d like players to battle for a share of that pie, rather than competing for control of suspects. If the game was played over many rounds, the early rounds would mostly focus on expected value and doing good trades. If the game was played for real money, and settled in actual dollars, then we’d definitely have a lot more accurate pricing!
If a market has traders purely motivated by expected value and profit, then its pricing will be as good as the pricing ability of the traders.
If a market has a few ‘motivated’ traders, or noise traders, that are doing something for reasons other than expected value, that is good. You need a source of free money to make the market work. Thus, the existence of the bank, as a source of free money, is great, because it motivates the trading in the game. You can imagine a version of the game with no bank, where players can only trade when another player agrees to take the other side. There would still be trades, since the prize for winning should overcome frictions and adverse selection, but volume of trading would plummet.
If a market has a few people who have poor fair values, that works like motivated traders.
If a market has too many traders who have poor fair values, or who in context hold fair values that are not based on the expected payout, then the relationship breaks down. There’s now profit for those who bet against them, but that doesn’t mean there’s enough money in the wings to balance things out. At some point that money is exhausted, and no one paying attention has spare funds. Prices stop being accurate, to varying degrees.
In particular, this illustrates that if those managing the money have payouts that are non-linear functions of their profits, then very weird things will happen. If I get fired for missing my benchmark, and so do my colleagues, but we don’t get extra fired for missing them by a lot, then this will lead us to discount tail risks. In the game, this takes the form of dumping suspects you can’t control – if Alice did it, you’ve already lost, and third prize is you’re fired. There are many other similar scenarios. If we want accurate prices, we need traders to have linear utility functions, or a reasonable approximation thereof.
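A toy illustration of the point, with numbers of my own invention: compare how a pure expected-value trader and a play-to-win player who is already shut out of the ‘Alice guilty’ worlds each value the same Alice contract.

```python
# Why a shut-out player dumps a contract a linear trader would hold.
p_alice_guilty = 0.30     # probability Alice did it
offer = 15                # chips someone will pay for your Alice contract

# Linear (expected-value) trader: compare the offer to the expected payout.
expected_payout = p_alice_guilty * 100       # 30 chips
sells_if_linear = offer > expected_payout    # False: an EV maximizer holds

# Play-to-win player who cannot win any world where Alice is guilty: the
# contract only pays in worlds already lost, while the 15 chips still help
# in the worlds that can be won. Its value to them is zero, so they dump it.
value_if_shut_out = 0
sells_if_shut_out = offer > value_if_shut_out
print(sells_if_linear, sells_if_shut_out)    # False True
```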
Overall
This game sounds like a lot of fun, and seems to have lots of opportunities for deep tactical and strategic play, for bluffing, and to do things that players should find fun. I really hope that it gets made. You can take one or more of many roles – arbitrageur, mystery solver, genre-savvy logician, momentum or value trader, tactician or strategist – or just hang out and watch the fun and/or the mystery, and if you later get a hunch, you can go for it.
I hope to talk to a few friends of mine who have small game companies, in the hopes one of them can help. Kickstarter ho?
If anyone out there is interested in the game and making it happen, please do talk to Robin Hanson about it. I’m sure he’d be happy to help make it a reality. And if you’re looking to play, I encourage you to give it a shot, and report back.
Comments (sorted by top scores)
comment by lionhearted (Sebastian Marshall) (lionhearted) · 2018-09-09T02:33:43.233Z · LW(p) · GW(p)
Loved this. The vast majority of analysis on games is shallow, tending to look at the stated rules and explicit mechanics, and ignoring derived and implied rules (the vastly different property value-payoff heatmaps in Monopoly, the "genre awareness" here), ignoring tempo/timing issues, and ignoring win conditions / endgame considerations.
I love when stuff like this gets boiled down elegantly:
>You want to accumulate contracts in suspects you ‘like’ (which mostly means the ones you think are good bets), so you can get ‘control’ of one or more of them. Control means that if they did it, you win.
Ah, cool, that's a win condition.
And then the logical corollaries:
>... Suppose you’re a poor player. ... A basic gambit would be to buy up all the contracts you can of a suspect everyone has dismissed. Even if there are very good reasons they seem impossible and are trading at $5, you can still get enough of them to win if it works out, and you might have no competition since players in better shape have juicier targets. Slim chance beats none. But if even that’s out of the question, you’ll have to rebuild your position. You will need to speculate via day trading. Any other play locks in a loss.
And tempo:
>This is a phenomenon we see in many strategic games. Early in the game, players mostly are concerned about getting good value, as most assets are vaguely fungible and will at some point be useful or necessary. As the game progresses, players develop specialized strategies, and start to do whatever they need to do to get what they need, often at terrible exchange rates, and dump or ignore things they don’t need, also at terrible exchange rates.
I love this stuff. It's so cool. Hats off.
Incidentally, did you ever read "The best Puerto Rico strategy"?
A gem of a broad overview that similarly looks in depth across the whole stack. Puerto Rico is a wonderful game in that it's incredibly simple and satisfying for new players — no overtly destructive actions against fellow players, everyone gets a turn, you're always building up your island no matter what — but has incredibly deep mathematical and behavioral complexity underneath the hood. Rare that games can hit both of those notes.
Definitely recommend that one if you haven't read it. Seriously, huge respect and hats off for this article — being able to intuit game dynamics, strategic and tactical considerations, elucidating win conditions and relevant play adjustments when winning or losing before having played the game extensively is... damn impressive.
I'm totally hoping you teach a class on game analysis at Stanford or MIT or whatever someday, and put the lectures online. I'd watch 'em. What you do is, frankly, really damn impressive. No idle flattery, but where do you reckon you're at percentile-wise in this skill? Top 0.00001% of analysts/theorists would be 1 in 10 million or so. There's probably not more than 30 better analysts in the USA than you, no? No flattery, more like a factual statement. If someone gave me an under/over bet of "100 more talented game analysts than Zvi in the USA", I'm leaning super strongly to the under; under/over of 1000 people and I take the under without hesitating.
↑ comment by Zvi · 2018-09-09T12:06:35.296Z · LW(p) · GW(p)
That's some strong praise there. It's great to hear and I hope I can live up to it. I think I'm one of the strongest at some aspects of analysis, and this task here plays into a lot of my strengths including my trading and market making experience. In other ways, I'm not as strong.
I really enjoy doing this type of analysis not only on games but on real world situations, problems, mechanisms, business opportunities, and so forth. If I could get fairly compensated for that type of consulting I'd love to do it, but alas the consulting business is mostly a self-selling business and the internal corporate politics business as far as I can tell - you get advice like "never improve things more than 10%, and if you do improve it more make sure to hide it."
In much more promising news, I'm exploring a potential game design opportunity, but it's too early for me to say more than that yet.
I'll check out the guide, looks cool at first glance.
comment by RobinHanson · 2018-09-09T13:14:52.942Z · LW(p) · GW(p)
Note that all this analysis is based on thinking about the game, not from playing the game. From my observing game play, I'd say that price accuracy does not noticeably suffer in the endgame.
For game design, yes good to include a few characters who will be excluded early, so people attend to story in early period.
↑ comment by Zvi · 2018-09-10T03:34:23.008Z · LW(p) · GW(p)
What types of players did you test the game on, and how many games did they each play?
I can think of many other games where this distortion effect doesn't happen with new players, as they don't think about the game ending or the strategic layer, but does pick up as players gain experience and improve. So this result isn't that surprising for players on their first game, especially if they're not hardcore game players. But it would be surprising if it was a stable equilibrium.
↑ comment by RobinHanson · 2018-09-15T12:59:16.207Z · LW(p) · GW(p)
Yes, we only did a half dozen trials, and mostly with new players, so players were inexperienced.
comment by Vanessa Kosoy (vanessa-kosoy) · 2018-09-08T21:30:48.920Z · LW(p) · GW(p)
I might be missing something, but I got the impression that your interpretation is, it's an all-or-nothing game where the winner is the person who made the most money. But the more natural interpretation seems to be: your utility function is just the amount of money you made. So, it's not important whether you made more than another player, only the absolute amount is important. The latter is more in line with simulating prediction markets (and is most easily achieved if the game is on real money).
↑ comment by Zvi · 2018-09-09T11:58:53.731Z · LW(p) · GW(p)
Robin explicitly said "the person with the most money wins" and that's the most natural way of viewing it as a game. Of course, there's nothing *wrong* with doing it the other way, and it creates more accurate (realistic?) prices and markets, as I note. But it's important to note that *as a game* it's more interesting to try and get the most money, than it is to simply make good trades. If it's normal trading you're all tactics and no strategy. This way you get both, plus the thrill of victory and the agony of defeat.
↑ comment by Vanessa Kosoy (vanessa-kosoy) · 2018-09-09T12:20:26.523Z · LW(p) · GW(p)
Oh, now I see that you do mention it. I missed that part, my apologies.
comment by Bucky · 2018-09-10T15:17:41.539Z · LW(p) · GW(p)
I wonder if something like this could be used in the workplace.
If we get an unexpected test result we all get together and come up with hypotheses as to what might be the cause. Each hypothesis has a column plus an "other" column and the game begins.
You'd need a robust method of adding new hypotheses (possibly one which rewarded the hypothesiser - maybe giving them the first chance to purchase stocks), plus a way of making it fair when new experimental evidence comes in (not just first come first served!).
I realise this is essentially just a prediction market at work but I think the visual elements would help people get it even if they aren't familiar with the theory. I imagine it would help with considering what to test next and to make sure people keep an open mind.
comment by ParanoidAltoid · 2018-09-09T11:01:42.431Z · LW(p) · GW(p)
I'd love to know what else this could work for. Slasher flicks might work, or any movie with an ensemble cast that gets killed off one by one until only one or two remain. Players would probably have to know in advance exactly how many survive, such that the initial trading values can be set appropriately (if 2 of 8 survive, then the initial trading value should start at 30 to buy, 20 to sell.) The number of survivors would also set the cost of the "buy all contracts" option.
Slasher flicks would have more action early on as characters are eliminated in gruesome fashion, but the winner might also be determined well before the end of the movie.
↑ comment by Zvi · 2018-09-09T11:56:22.598Z · LW(p) · GW(p)
You don't need to know that one and only one contract pays. However, if you don't know that, then you can't allow people to exchange $100 for a set of contracts (or vice versa). So you could have contracts on each person surviving, although you'd need to clarify carefully what did and didn't count, in advance. And the strategy would be different, since there's the chance no one survives, or multiple people survive. You could also have contracts on other stuff (e.g. who dies first, then second, etc, or what kills them, or what not).
Although in most slasher movies the virgin would just trade at $90 and everyone else super cheap...
But yeah, the logic expands.
↑ comment by RobinHanson · 2018-09-09T13:13:19.602Z · LW(p) · GW(p)
If more than one person "did it", you could pay off that fraction of $100 to each. So if two did it, each card is worth $50 at the end.
↑ comment by ParanoidAltoid · 2018-09-09T15:26:01.305Z · LW(p) · GW(p)
That works. Though now the card price doesn't actually reflect a character's implied probability of surviving. E.g. buying a card at $40 is a confident move if there's a chance of two survivors, and always loses money if there are three.
Instead it'd be $100*p(you survive and there's exactly 1 survivor) + $50*p(you survive and there are 2 survivors) + $33*p(you survive and there are 3 survivors)... which makes it a lot harder to think about whether to buy or sell. Could make things more interesting though.
↑ comment by ParanoidAltoid · 2018-09-09T15:39:17.505Z · LW(p) · GW(p)
In addition to not being able to exchange $100 for a set of contracts, there'd be some awkwardness over determining where to set the starting trade value. Notice in the original article it's initially $15 to sell and $20 to buy, reflecting the fact that a 6 character mystery implies a base probability of 16.6% for each character. If you naively started a slasher at $15 sell/$20 buy, someone who believes two survivors is likely would immediately bid all characters up to $30 sell/$40 buy.
Though you're right, most slashers don't try nearly as hard to surprise you as murder mysteries do. A movie that establishes a romance subplot 10 minutes in would immediately see those two characters bid up to at least $50.