Posts

Jakub Halmeš's Shortform 2025-01-11T17:47:44.771Z
The Inner Alignment Problem 2024-02-24T17:55:55.649Z

Comments

Comment by Jakub Halmeš (jakub-halmes-1) on Jakub Halmeš's Shortform · 2025-01-11T17:47:44.982Z · LW · GW

If Alice thinks X happens with a probability of 20% while Bob thinks it's 40%, what would be a fair bet between them? 

I created a Claude Artifact that calculates the stakes such that the expected value of the bet, under each player's own beliefs, is the same for both.

In this case, Bob wins if X happens (he thinks it is more likely). If Alice bets $100, Bob should bet $42.86, and the EV of such a bet for both players (each according to their own beliefs) is $14.29.
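For concreteness, here is a minimal sketch of one way to compute such a bet (my own reconstruction, not the artifact's code; I'm assuming a winner-take-all side bet, and the function name is just for illustration):

```python
def fair_bet(p_alice: float, p_bob: float, alice_stake: float) -> tuple[float, float]:
    """Return Bob's stake and the shared EV for an equal-EV bet on event X.

    Assumes p_bob > p_alice, so Bob bets that X happens and wins
    alice_stake if it does; Alice wins bob_stake if it does not.

    Bob's EV (his beliefs):    p_bob * alice_stake - (1 - p_bob) * bob_stake
    Alice's EV (her beliefs):  (1 - p_alice) * bob_stake - p_alice * alice_stake
    Setting the two equal and solving for bob_stake gives the formula below.
    """
    bob_stake = (p_alice + p_bob) * alice_stake / (2 - p_alice - p_bob)
    ev = p_bob * alice_stake - (1 - p_bob) * bob_stake
    return bob_stake, ev


if __name__ == "__main__":
    stake, ev = fair_bet(p_alice=0.2, p_bob=0.4, alice_stake=100.0)
    print(f"Bob's stake: ${stake:.2f}, shared EV: ${ev:.2f}")
    # Bob's stake: $42.86, shared EV: $14.29
```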

Comment by Jakub Halmeš (jakub-halmes-1) on The Inner Alignment Problem · 2024-02-24T10:20:52.877Z · LW · GW

I wrote this mostly for my own benefit. I wanted to organize my thoughts about the problem while reading the paper, and publishing the notes, even if no one reads them, forces me to write more clearly and precisely.

I would like feedback on whether posts like this one are valuable to other people. Please let me know! Thank you.