Predictable Outcome Payments
post by Tristan H (Kythe) · 2023-01-14T15:20:01.556Z
This Tuesday, January 17th, I'll be running a meetup on how predictable, tradeable payments that follow an outcome can create a specialized financial ecosystem that converts an uncertain future payment into a payment now for its expected value. That general description encompasses useful things we have today, like insurance, crop futures, stocks, and venture capital. But there are many other areas of life where society could in theory apply the same idea, which could be great if implemented.
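As a toy illustration of that core move (paying now for the expected value of an uncertain future payment), here is a minimal sketch in Python. The payoff, probability, and discount rate are made-up numbers for illustration, not from any real contract:

```python
# Toy example: price an uncertain future payment at its expected value today.
# All numbers are hypothetical.
payoff = 100_000        # payment received only if the outcome occurs
p_outcome = 0.30        # estimated probability of the outcome
discount_rate = 0.05    # annual discount for waiting one year

expected_value = p_outcome * payoff               # 30,000
price_now = expected_value / (1 + discount_rate)  # ~28,571

print(f"Expected value of the future payment: ${expected_value:,.0f}")
print(f"Fair price to pay today:              ${price_now:,.0f}")
```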
We'll start by talking about why this kind of system is good:
- Trading risk to a larger pool diversifies it and smooths out the returns (see the simulation sketch after this list)
- Markets turn the incentive to capture these smoothed returns into a contest to be the best predictor
- When individuals want to fund development (investment) or avoid catastrophes (insurance), competition among these predictors keeps the price well-calibrated
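Here's the sketch referenced in the first bullet: a minimal simulation, with made-up numbers, comparing the cost an individual bears alone against the per-member cost when the same risk is shared across a pool. The mean cost is the same either way, but pooling shrinks the variance dramatically:

```python
import random

random.seed(0)

LOSS = 1_000       # cost of the bad outcome, if it happens (hypothetical)
P_LOSS = 0.10      # probability of the bad outcome (hypothetical)
POOL_SIZE = 1_000  # members sharing losses equally
TRIALS = 2_000

def one_loss() -> int:
    return LOSS if random.random() < P_LOSS else 0

# Cost borne by an individual alone, vs. per-member cost in a pool.
alone = [one_loss() for _ in range(TRIALS)]
pooled = [sum(one_loss() for _ in range(POOL_SIZE)) / POOL_SIZE for _ in range(TRIALS)]

def mean(xs):
    return sum(xs) / len(xs)

def stdev(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

print(f"Alone:  mean ${mean(alone):.0f}, stdev ${stdev(alone):.0f}")
print(f"Pooled: mean ${mean(pooled):.0f}, stdev ${stdev(pooled):.0f}")
```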
And then we'll talk about other areas this concept could be applied to:
- Impact markets: Altruistic individuals and governments pay for the impact equity of ventures that have already clearly improved the world, and investors seek to fund charities they predict will be successful at doing good.
- Prediction markets: A general way to turn many kinds of future uncertainty into insurance or an investment (see the sketch after this list).
- Medical markets: If hospitals were paid for good health outcomes, the financial pressures would be much better aligned with what we want.
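And here's the sketch referenced in the prediction-markets bullet: a minimal, hypothetical example of how a binary market share (paying $1 if the event occurs) can serve as either insurance or an investment. The prices, probabilities, and loss sizes are assumptions for illustration:

```python
# A binary prediction market share pays $1 if the event happens, $0 otherwise.
# All numbers here are hypothetical.

def hedge_cost(loss_if_event: float, yes_price: float) -> float:
    """Cost of buying enough YES shares to fully offset a loss if the event occurs."""
    shares_needed = loss_if_event  # each share pays $1 on the event
    return shares_needed * yes_price

def expected_profit(my_probability: float, yes_price: float, shares: float) -> float:
    """Expected profit from holding YES shares, under my own probability estimate."""
    return shares * (my_probability - yes_price)

# Insurance: someone who loses $10,000 if the event happens can hedge for $2,000
# when the market prices the event at 20%.
print(hedge_cost(loss_if_event=10_000, yes_price=0.20))                   # 2000.0

# Investment: someone who believes the true probability is 35% expects to profit
# by buying shares at the 20% price.
print(expected_profit(my_probability=0.35, yes_price=0.20, shares=100))   # 15.0
```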
I'll introduce each idea and then we can discuss whether they're good ideas and how they might work out. I look forward to seeing people there!
=== WHEN+WHERE ===
7:00pm Tuesday, January 17th
The Solarium (Join the mailing list at https://groups.google.com/g/overcomingbiasnyc and say you came from LessWrong to access the post with the address)
1 comment
comment by Ofer (ofer) · 2023-01-15T11:11:52.877Z
Re impact markets: there's a problem regarding potentially incentivizing people to do risky, net-negative things (that can end up being beneficial). I co-authored this [EA · GW] post about the topic.